Never-Ending Learning (NEL) systems are, at a very high level, computer systems that learn over time to become better at solving one or more specific tasks. In recent decades, different NEL approaches have been proposed and applied to tasks in Artificial Intelligence, Machine Learning, Natural Language Processing (NLP), and Natural Language Understanding (NLU). NEL approaches are succeeding more and more often, encouraging us to address the problem of how to build computer systems that can take advantage of NEL principles. In addition, the variety of names (never-ending learning, continuous learning, lifelong learning, etc.) used to describe systems and models that keep learning continuously, as well as recent achievements in self-supervised learning and multi-task learning (both closely related to NEL principles) in large pre-trained language models, make a tutorial on NEL timely and relevant.
In this tutorial we will explore NEL ideas and principles, the different approaches (and variations) found in the literature, their similarities to and differences from traditional ML approaches (such as semi-supervised learning, reinforcement learning, etc.), and their applications. Another interesting aspect to be explored is how to formulate a problem following a NEL approach. Thus, we will also show how to model a problem in a NEL fashion and help the audience become familiar with such approaches.
In summary, this tutorial aims to enable attendees to:
Estevam Hruschka is the Lab Director and a Staff Research Scientist at Megagon Labs in Mountain View, CA. Before Megagon Labs, Estevam was co-founder and co-leader of the Carnegie Mellon Read the Web project (in which the Never-Ending Language Learner (NELL) was developed) and head of the Machine Learning Lab (MaLL) at the Federal University of Sao Carlos, Brazil (where he was an associate professor from 2004 to 2019). From 2016 through 2022, he was also an adjunct professor in the Machine Learning Department at Carnegie Mellon University. Estevam was a "young research fellow" at FAPESP (Sao Paulo state research agency, Brazil) and a "research fellow" at CNPq (Brazilian national research agency). He has also received a Google Research Award (for Latin America). From 2017 to 2020, Estevam was with Amazon, helping Alexa learn to read the Web. His main research interests are never-ending learning, natural language understanding, knowledge representation, machine learning, and conversational learning.
Different versions of this tutorial have been presented to different communities at various venues; the duration, focus, and main aspects vary by venue. Please find below a link to the material presented at each venue:
Title: Never-Ending Learning, Lifelong Learning and Continual Learning in the Era of Large Pre-Trained Language Models
WWW 2023 Tutorial slides
Title: Never-Ending Learning, Lifelong Learning and Continual Learning: Systems, Models, Current Challenges and Applications
AAAI 2023 Tutorial slides