11 September 2018 - Suzanne Stevenson: Seminar
How Languages Carve Up the World: Modeling Developmental and Linguistic Relativity Effects
Languages vary in how they structure the terms for a semantic domain, such as colors or spatial relations. For example, in English we say “the cup is on the table”, “the ring is on the finger”, and “the painting is on the wall”, while Dutch speakers use a different preposition in each situation, and other languages use one preposition for the first two and a different one for the third. This kind of crosslinguistic variation raises important cognitive questions: Are all such lexical semantic systems equally easy to learn, and if not, what factors are at play? Does acquiring a particular system influence other parts of cognition – a position known as linguistic relativity? We study these issues using a computational cognitive model of word learning. We show that a novel vector-based meaning representation – based on crosslinguistic data over a domain – can be used to approximate a “universal” semantic space that captures cognitive biases. This approach to semantic representation can provide an explanation for both the developmental trajectory of words in a domain and subsequent behavior on a non-verbal task in the domain. This is joint work with Barend Beekhuizen, University of Toronto.
Suzanne Stevenson received a bachelor's degree in Computer Science and Linguistics from William and Mary, and master's and Ph.D. degrees in Computer Science from the University of Maryland, College Park. From 1995 to 2000, she was on the faculty at Rutgers University, holding joint appointments in the Department of Computer Science and in the Rutgers Center for Cognitive Science (RuCCS). She joined the University of Toronto in July 2000, where she is now Professor of Computer Science. She was a Visiting Professor in Linguistics at the University of California, Santa Barbara, in 2010-11 and 2015-16. Dr. Stevenson's research is in computational linguistics (CL) and cognitive science, integrating computational theories and techniques with insights from the fields of linguistics and psycholinguistics. In cognitive science, she works on computational models of child language acquisition and adult processing, taking probabilistic approaches that learn from and adapt to the environment. Her work in CL often focuses on machine learning of semantic and syntactic information from text data, showing how linguistic knowledge or cognitive principles can help guide the learning process.