Meta-learning is gaining popularity in machine learning as a step toward Artificial General Intelligence (AGI). A substantial body of work has accumulated over the past two decades. This repository aims to track progress in Meta-Learning (MtL) and give an overview of the state of the art (SOTA) across the most common MtL problems and research topics, covering both traditional and core MtL tasks.
The goal of this repository is to become a central source of information for anything related to meta-learning. We are also building an online community on our Reddit page. Please join the subreddit to share news and articles related to meta-learning.
- Books
- Papers
- Tutorials, Blogs and Talks
- Code, Datasets and Tools
- Researchers, Labs and Workshops
- Resources for Students
- Collaborations
- Wish List
Theory
- Metalearning: Applications to Data Mining - Authors: Brazdil, P., Giraud-Carrier, C., Soares, C., Vilalta, R. (2009)
- Learning to Learn - Author: Thrun, S. (1998)
- Automated Machine Learning - Authors: Hutter, F., Kotthoff, L., Vanschoren, J. (2019)
Practical
- Hands-On Meta Learning with Python - Author: Ravichandiran, S. (2018)
Literature Library
Click here to see a wider selection of important meta-learning literature.
Recent Top Impact
- Online Meta-Learning - Authors: Finn, C., Rajeswaran, A., Kakade, S., Levine, S. (2019)
- How to train your MAML - Authors: Antoniou, A., Edwards, H., Storkey, A. (2019)
- Towards learning-to-learn - Authors: Lansdell, B. J., Kording, K. P. (2019)
- Probabilistic Mixture of Model-Agnostic Meta-Learners - Authors: Sattigeri, P., Ghosh, S., Kumar, A., Ramamurthy, N. K., Hoffman, S., Padhi, I., Drissi, Y. (2018)
- On First-Order Meta-Learning Algorithms - Authors: Nichol, A., Achiam, J., Schulman, J. (2018)
- A Simple Neural Attentive Meta-Learner - Authors: Mishra, N., Rohaninejad, M., Chen, X., Abbeel, P. (2018)
- Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks - Authors: Finn, C., Abbeel, P., Levine, S. (2017)
- Meta-SGD: Learning to Learn Quickly for Few-Shot Learning - Authors: Li, Z., Zhou, F., Chen, F., Li, H. (2017)
- Learning to learn by gradient descent by gradient descent - Authors: Andrychowicz, M., et al., Google DeepMind (2016)
The most recent or highest-impact guides on practical meta-learning.
Tutorials
- OpenML Tutorials - Author: The OpenML Team (2019)
- Meta-Learning - Author: Vanschoren, J. (2018)
- Intro to AutoML - Authors: Vanschoren, J., Hutter, F. (2018)
- Metalearning - A Tutorial - Author: Giraud-Carrier, C. (2008)
Blog Posts
- Paper Review: Meta-Transfer-Learning - Author: Meskhi, M M. (2019)
- An introduction to Meta-learning - Author: Wolf, T. (2018)
- Learning to Learn - Author: Finn, C. (2017)
- Meta-Learning: Learning to Learn Fast - Author: Weng, L. (2018)
- What’s New in Deep Learning Research: Understanding Meta-Learning - Author: Rodriguez, J. (2018)
Talks
- Introduction to Meta-Learning at the Houston Machine Learning Meet Up - Author: Meskhi, M M. (2019)
Code
Code here usually comes from the papers mentioned above or from other popular GitHub repositories.
- Model-Agnostic Meta-Learning - Author: Finn, C.
- Reptile - Author: OpenAI
- Hands-On Meta Learning with Python - Author: Ravichandiran, S. (2018)
- Meta-Learning with Latent Embedding Optimization - Author: DeepMind (2018)
- How to train your MAML - Author: Antoniou, A. (2019)
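The MAML and Reptile repositories above implement gradient-based meta-learning, which shares a simple two-loop structure: adapt to a sampled task with a few inner SGD steps, then nudge the meta-parameters toward the adapted solution. The following is a toy first-order (Reptile-style) sketch, not taken from any of the repositories above; the scalar task family, learning rates and step counts are illustrative assumptions.

```python
import random

def inner_sgd(theta, target, steps=5, lr=0.1):
    """Adapt to one task: a few SGD steps on the task loss (theta - target)^2."""
    for _ in range(steps):
        grad = 2.0 * (theta - target)    # d/dtheta of (theta - target)^2
        theta -= lr * grad
    return theta

def reptile(meta_steps=2000, meta_lr=0.1, seed=0):
    """Reptile outer loop: theta <- theta + eps * (phi - theta)."""
    rng = random.Random(seed)
    theta = 5.0                          # meta-parameter initialisation
    for _ in range(meta_steps):
        target = rng.gauss(1.0, 0.5)     # sample a task (here: its optimum)
        phi = inner_sgd(theta, target)   # task-adapted parameters
        theta += meta_lr * (phi - theta) # Reptile meta-update
    return theta

theta = reptile()
# theta ends up near the mean task optimum (about 1.0), i.e. an
# initialisation from which any sampled task is reached in a few steps
```

In real implementations theta is a neural-network weight vector and the inner loop minimises a few-shot training loss, but the update rule is the same.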
Datasets
Popular datasets used in publications and for testing algorithm efficacy.
Tools
- OpenML-Python - Description: Meta-data, flows, tasks & experiments. - Author: Vanschoren, J.
- AutoML Benchmarking - Author: OpenML
- mfe - Description: Meta-Feature Extractor in R. - Author: Rivolli, A.
- metalearn - Description: BYU's Python library of usable tools for metalearning. - Authors: BYU-DML.
- mtlSuite - Description: Meta-learning basic suite for machine learning experiments in R. - Author: Mantovani, R.
- pymfe - Description: Meta-Feature Extractor in Python. - Author: Alcobaça, E. (2019)
- DCoL - Description: Topological meta-feature extractor in C++. - Author: Macià, N.
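The meta-feature extractors listed above (mfe, pymfe, metalearn, DCoL) characterise a dataset with descriptive statistics that a meta-learner can use to recommend algorithms. As a self-contained illustration of what the "simple" group of meta-features looks like, here is a sketch in plain Python; the function name and the list-of-rows input format are assumptions for this example, and real tools compute far larger feature sets.

```python
import math
from collections import Counter

def simple_meta_features(X, y):
    """Compute a few classic 'simple' meta-features of a labelled dataset.

    X: list of feature rows, y: list of class labels.
    """
    n, p = len(X), len(X[0])
    counts = Counter(y)
    probs = [c / n for c in counts.values()]
    # Class entropy in bits; 1.0 for a balanced two-class sample.
    class_entropy = -sum(q * math.log2(q) for q in probs)
    return {
        "n_instances": n,
        "n_attributes": p,
        "n_classes": len(counts),
        "class_entropy": class_entropy,
        "instances_per_attribute": n / p,
    }

X = [[5.1, 3.5], [4.9, 3.0], [6.2, 2.9], [5.9, 3.2]]
y = ["a", "a", "b", "b"]
mf = simple_meta_features(X, y)
# mf["n_instances"] == 4, mf["n_classes"] == 2, mf["class_entropy"] == 1.0
```

A vector of such features per dataset becomes the meta-data on which a metalearning model is trained.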
The following is a list of prominent and active researchers working on meta-learning around the world:
- Mikhail M. Meskhi, University of Houston, Pattern Analysis Lab
- Ricardo Vilalta, University of Houston, Pattern Analysis Lab
- Joaquin Vanschoren, Eindhoven University of Technology, OpenML
- Pieter Gijsbers, Eindhoven University of Technology, GAMA
- Matthias Feurer, University of Freiburg, Machine Learning Lab
- Christopher Giraud-Carrier, Brigham Young University, BYU-DML
- Brandon Schoenfeld, Brigham Young University, BYU-DML
- Youssef Drissi, IBM, AI Research Lab
- Prasanna Sattigeri, IBM, AI Research Lab
- Antreas Antoniou, University of Edinburgh, BayesWatch
- Chelsea Finn, University of California at Berkeley
- Sachin Ravi, Princeton University
- Hugo Larochelle, Google Brain
- Adam Santoro, DeepMind
Workshops
- Meta-learning Workshop - Description: NeurIPS 2019, Vancouver, Canada
- One of the best resources I have found for graduate students, covering how to deal with Ph.D.-related stress, how to conduct good research, how to write and read papers, and much more, is this list compiled by a professor at Rice University.
- Sometimes it is hard to focus or feel motivated for various reasons. But do not feel down, dear colleagues; read this and keep working hard!
I welcome anyone willing to work with me on keeping this repository up to date, as well as on publishing and experimenting on the research problems mentioned above. Just send me an email and we can take it from there.
Things that everyone would like to see here:
- Create road map for this repo
- Create research topic list
- Add more datasets that are standard in publications
- Add more regarding meta-learning research directions
- Add more topics
This repository was inspired by awesome-meta-learning and NLP-Progress. All work here is open source and copyright-free. The goal is to help the research community prosper and communicate better.