Noyce Conference Room
Seminar
US Mountain Time
Speaker: Martha Lewis

Our campus is closed to the public for this event.

Abstract: We interact with computers every day, often using something like human language, so a huge amount of research goes into how to represent human language computationally. Statistical approaches to language modelling, in which words are represented as vectors based on their distribution in large bodies of text, have been the most successful in recent years; in particular, deep neural language models have proven extremely effective at modelling language at the word level and above. However, deep neural models are somewhat opaque. In contrast, rule-based models such as formal semantics give a clear account of how words compose, and we would like to use this knowledge to model language more transparently and, potentially, more compositionally. I will give an overview of the model I work with, which shows how to combine vector-based word representations with formal semantics. I will then show how this framework can be extended to model lexical phenomena such as entailment and ambiguity, and discuss directions in metaphor, analogy, and grounded language learning.
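For readers unfamiliar with the general idea of combining vector representations with compositional structure, the toy sketch below illustrates one common scheme from compositional distributional semantics: nouns are vectors, and relational words such as adjectives act on them as linear maps. This is a minimal illustration only, using random stand-in vectors rather than corpus-derived ones; the names and helper functions are hypothetical and are not the speaker's code.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 4  # toy dimensionality; real models use corpus-derived distributional vectors

# Hypothetical noun vectors (stand-ins for distributional representations).
noun = {
    "dog": rng.random(dim),
    "cat": rng.random(dim),
}

# An adjective is modelled as a matrix, i.e. a linear map on the noun space.
adjective = {
    "fluffy": rng.random((dim, dim)),
}

def compose_adj_noun(adj: str, n: str) -> np.ndarray:
    """Compose an adjective with a noun by matrix-vector multiplication."""
    return adjective[adj] @ noun[n]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity, the usual way to compare word or phrase vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# A composed phrase vector lives in the same space as noun vectors,
# so phrases and words can be compared directly.
fluffy_dog = compose_adj_noun("fluffy", "dog")
print(cosine(fluffy_dog, noun["cat"]))
```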

SPEAKER

Martha Lewis, Assistant Professor
SFI Host: Melanie Mitchell