Towards dynamic adaptation in Natural Language Processing - where are we now?
Humans adapt their language use to fit an unbounded variety of contexts. In Natural Language Processing (NLP), however, the dominant paradigm is to treat language as uniform and static: models are trained on scarce and biased data. The consequences are serious: these models work well only on text types similar to their training data, and their performance degrades dramatically on other text types or on texts by authors with different demographic backgrounds. The central problem is that NLP models cannot dynamically adapt: each system is developed from scratch every time a new language, task or domain is encountered. In this talk I will survey some current approaches to learning from diverse sources (domains, tasks and languages), including some of my own work on multi-task learning from fortuitous data sources such as keystrokes, on cross-lingual learning, and on data selection for transfer learning. I will outline current challenges and give some preliminary directions on where to go next for dynamic learning with limited (or no) annotated resources.
Barbara Plank is Assistant Professor (tenured) in Natural Language Processing at the University of Groningen. She previously held assistant professor and postdoc positions at the University of Copenhagen and a postdoc position at the University of Trento. Her main research interests include learning under sample selection bias (domain adaptation, transfer learning), annotation bias, and, more generally, semi-supervised, weakly supervised and multi-task learning for cross-domain and cross-lingual NLP, applied to a range of NLP tasks including tagging, parsing, relation extraction, opinion mining and personality detection. She is on the editorial board of the Computational Linguistics journal, serves as an area chair for NAACL 2018, and is chair for Language and Computation at ESSLLI 2018. She received her PhD in 2011 from the University of Groningen.
24 November 2017 - Barbara Plank: Seminar
Informatics Forum 4.31/4.33