Recursive Deep Learning for Modeling Semantic Compositionality
Richard Socher, Stanford
Jan 17, 2014

Great progress has been made in natural language processing thanks to many different algorithms, each often specific to one application. Most algorithms force language into simplified representations such as bag-of-words or fixed-sized windows, or require human-designed features. I will introduce two general models based on recursive neural networks that can learn linguistically plausible representations of sentences.
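For readers unfamiliar with the architecture mentioned above, the following is a minimal sketch of the composition step at the heart of Socher-style recursive neural networks: two child vectors (for words or phrases) are merged into one parent vector by a shared weight matrix and a nonlinearity, and this step is applied bottom-up along a parse tree. The dimensionality, matrix initialization, and variable names here are illustrative assumptions, not the talk's exact setup.

```python
import numpy as np

d = 50                                       # assumed dimensionality of word/phrase vectors
rng = np.random.default_rng(0)
W = rng.normal(scale=0.01, size=(d, 2 * d))  # shared composition matrix
b = np.zeros(d)                              # bias term

def compose(left, right):
    """Merge two child vectors into a parent vector of the same size."""
    return np.tanh(W @ np.concatenate([left, right]) + b)

# Example: a representation for a two-word phrase from (random stand-in) word vectors.
v_very, v_good = rng.normal(size=d), rng.normal(size=d)
p = compose(v_very, v_good)                  # vector for the phrase "very good"
```

Because the parent vector has the same dimensionality as its children, the same `compose` function can be reapplied recursively to build a single vector for an entire sentence.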