Vectorial and neural models of compositional semantics

Lecturer: 
Teaching assistant: 
Topic: 
Language & Computation
Level: 
Foundational
Abstract: 

This course reviews the recent tradition of models of vector-based semantic composition. In these models, lexical semantics is typically represented with vectors (as in lexical distributional semantics), and composition is represented using techniques from vector, matrix or tensor algebra. These models are now seen by many as filling the gap between (corpus-based and probabilistic) distributional lexical semantics and (intuition-driven, logic-based) formal semantics. We focus in particular on neural models, where the vectors become activation vectors and the composition functions can be learned from data using standard techniques from neural network modelling. We discuss recent successes with such models for language modelling (Mikolov et al., 2012), constituency parsing (Socher et al., 2013) and dependency parsing (Le et al., 2014). We also relate these models to foundational neural network models of hierarchical structure in language (Elman, 1990; Pollack, 1990), and to recent developments in understanding how the brain processes such structure.
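
To give a flavour of the kind of composition function the course covers, below is a minimal illustrative sketch (not part of the course materials) of a Socher-style recursive composition step: two child word vectors are combined into a parent phrase vector via a learned matrix and a nonlinearity. The vector dimension, the example words and the random parameters are assumptions made purely for illustration; in practice the parameters are learned from data.

```python
import numpy as np

# Illustrative sketch only: one recursive composition step,
# p = tanh(W [left; right] + b), with word vectors of dimension d
# and a composition matrix W of shape (d, 2d).
d = 4
rng = np.random.default_rng(0)

# Hypothetical word vectors for a two-word phrase, e.g. "red car".
v_red = rng.standard_normal(d)
v_car = rng.standard_normal(d)

# Composition parameters (random here; learned from data in real models).
W = rng.standard_normal((d, 2 * d))
b = np.zeros(d)

def compose(left, right):
    """Combine two child vectors into one parent vector."""
    return np.tanh(W @ np.concatenate([left, right]) + b)

phrase_vec = compose(v_red, v_car)  # vector for the phrase "red car"
print(phrase_vec)
```

Applying this step recursively along a parse tree yields a vector for every constituent, which is the basic idea behind the recursive neural network models discussed in the course.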

Week: 
Second week
Slot: 
17:00 - 18:30 - slot 4
Room: 
52.015