Recursive Deep Learning for Modeling Semantic Compositionality
16:00, 29th July 2013 (Week 15, Trinity Term 2013), Lecture Theatre B
Compositional and recursive structure is commonly found in different modalities, including natural language sentences and scene images. I will introduce several recursive deep learning models that, unlike standard deep learning methods, can learn compositional meaning vector representations for phrases, sentences and images. These recursive neural network based models obtain state-of-the-art performance on a variety of syntactic and semantic language tasks, such as parsing, paraphrase detection, relation classification and sentiment analysis.
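To make the idea of compositional meaning vectors concrete, the core operation of a recursive neural network can be sketched as follows. This is a minimal illustration, not the talk's actual models: it assumes a single shared composition matrix W applied bottom-up over a binary parse tree, with all dimensions, parameter values, and the toy vocabulary chosen purely for demonstration.

```python
import numpy as np

d = 4                                         # embedding dimensionality (illustrative)
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(d, 2 * d))    # shared composition matrix
b = np.zeros(d)                               # bias

def compose(left, right):
    """Merge two child vectors into a parent vector: p = tanh(W [c1; c2] + b)."""
    return np.tanh(W @ np.concatenate([left, right]) + b)

def encode(tree, embeddings):
    """Recursively encode a binary parse tree given as nested tuples of words."""
    if isinstance(tree, str):
        return embeddings[tree]
    left, right = tree
    return compose(encode(left, embeddings), encode(right, embeddings))

# Toy vocabulary and parse tree for the phrase "not very good"
emb = {w: rng.normal(scale=0.1, size=d) for w in ["not", "very", "good"]}
phrase = ("not", ("very", "good"))
vec = encode(phrase, emb)
print(vec.shape)  # the phrase vector lives in the same space as the word vectors
```

Because every phrase vector has the same dimensionality as the word vectors, the same composition function applies at every level of the tree, and the resulting vectors can feed a downstream classifier, e.g. for sentiment.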
Besides their strong performance, the models capture interesting phenomena in language such as compositionality. For instance, the models learn different types of high-level negation and how it can change the meaning of longer phrases containing many positive words. They can learn that the sentiment following a "but" usually dominates that of phrases preceding the "but." Furthermore, unlike many other machine learning approaches that rely on human-designed feature sets, here the features are learned as part of the model.