About
I am a postdoctoral research associate at the School of Engineering at the University of Edinburgh, working with Elliot J. Crowley.
My research interests include:
- Automated machine learning (AutoML)
- Representation learning
- Multi-modal learning
- Responsible applications in climate and healthcare
I have a PhD from the University of Edinburgh, focused on “Self-Supervised Learning for Transferable Representations”. This work was supervised by Tim Hospedales and funded in part by the EPSRC.
News
[09/2024] einspace accepted to NeurIPS 2024. See you in Vancouver!
[09/2024] I presented einspace at the AutoML Seminars! Watch the talk here.
[06/2024] Released our work on einspace, a new expressive NAS search space. Check out the preprint or our project page.
[01/2024] I am co-organising this year’s NAS Unseen-Data competition as part of AutoML 2024.
[11/2023] Started a postdoc position with Elliot J. Crowley at the University of Edinburgh.
[10/2023] Our AutoML 2023 paper received a best paper award!
[10/2023] Presented a paper on domain adaptation at AutoML 2023!
[12/2022] Presented a paper at the NeurIPS 2022 workshop on Self-Supervised Learning - Theory and Practice!
[11/2022] Presented a paper at BMVC 2022!
[09/2022] Started an internship at Samsung AI Centre Cambridge with Tim Hospedales & Da Li.
[10/2021] Started an internship at Huawei Noah’s Ark Lab with Steven McDonagh & Ales Leonardis.
[08/2021] Our new survey paper is published in the IEEE Signal Processing Magazine!
[06/2021] Presented a paper at CVPR 2021!
Publications
Check out my Google Scholar page.
einspace: Searching for Neural Architectures from Fundamental Operations [paper][code][project page][video]
Ericsson L., Espinosa Minano M., Yang C., Antoniou A., Storkey A., Cohen S. B., McDonagh S., Crowley E. J., In NeurIPS, 2024.
Self-Supervised Disentanglement by Leveraging Structure in Data Augmentations [paper][code]
Eastwood C., von Kügelgen J., Ericsson L., Bouchacourt D., Vincent P., Schölkopf B., Ibrahim M., In Causal Representation Learning and Self-Supervised Learning - Theory and Practice, Workshops at NeurIPS, 2023.
Parameter-Efficient Fine-Tuning for Medical Image Analysis: The Missed Opportunity [paper]
Dutt R., Ericsson L., Sanchez P., Tsaftaris S. A. and Hospedales T. M., In MIDL, 2024.
Better Practices for Domain Adaptation [paper][code]
Ericsson L., Li D. and Hospedales T. M., In AutoML, 2023 (Best paper award).
Region Proposal Network Pre-Training Helps Label-Efficient Object Detection [paper]
Ericsson L., Dong N., Yang Y., Leonardis A. and McDonagh S., In Neurocomputing, and Self-Supervised Learning - Theory and Practice, Workshop at NeurIPS, 2022.
Why Do Self-Supervised Models Transfer? Investigating the Impact of Invariance on Downstream Tasks [paper][code]
Ericsson L., Gouk H. and Hospedales T. M., In BMVC, 2022.
How Well Do Self-Supervised Models Transfer? [paper][code]
Ericsson L., Gouk H. and Hospedales T. M., In CVPR, 2021.
Self-Supervised Representation Learning: Introduction, Advances and Challenges [paper]
Ericsson L., Gouk H., Loy C.C. and Hospedales T. M., In IEEE Signal Processing Magazine, 2021.
Other Experience
I have worked as an intern with Steven McDonagh & Ales Leonardis at Huawei Noah’s Ark Lab, and with Tim Hospedales & Da Li at Samsung AI Centre. Previously, I completed a Master’s degree in Computer Science at Durham University, where I worked with Magnus Bordewich on multi-task and transfer learning in RL. During my undergraduate degree I developed an algorithmic composition system with Steven Bradley. I also spent a summer working with Professor Toby Breckon at Durham, developing dense stereo vision and visual odometry for robotics. During this time I had the chance to collaborate with the Centre for Vision and Visual Cognition on a project applying deep learning to Brain-Computer Interfaces.
Teaching
I have undertaken tutoring, demonstrating and marking roles while studying at both Edinburgh and Durham. Courses I’ve taught include Introduction to Programming (in Python/Java), Introductory Applied Machine Learning and Theory of Computation.