Conclusion
Slides from the Lisbon Machine Learning School 2015. The topic is natural language processing.
Summary
* supervised machine reading is a viable research direction with the available data,
* LSTM based recurrent networks constantly surprise with their ability to encode dependencies in sequences,
* attention is a very effective and flexible modelling technique.
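To make the last point concrete, here is a minimal sketch of dot-product attention in plain Python (my own illustration, not code from the slides): each value is weighted by the softmax-normalised similarity between its key and the query, and the context vector is the weighted sum.

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attend(query, keys, values):
    """Dot-product attention: weight each value vector by the
    similarity of its key to the query, then sum."""
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    weights = softmax(scores)
    dim = len(values[0])
    context = [sum(w * v[d] for w, v in zip(weights, values))
               for d in range(dim)]
    return context, weights

# Toy example: the query matches the first key, so the first
# value dominates the context vector.
keys = [[1.0, 0.0], [0.0, 1.0]]
values = [[10.0, 0.0], [0.0, 10.0]]
context, weights = attend([1.0, 0.0], keys, values)
```

The flexibility the slide refers to comes from this being differentiable end to end: the same mechanism plugs into recurrent readers, alignment models, and memory addressing without changing its form.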
Future directions
* more and better data, corpus querying, and cross document queries,
* recurrent networks incorporating long-term and working memory are well suited to NLU tasks.
2015-11-19
Teaching Machines to Read and Comprehend (slides)
http://lxmls.it.pt/2015/lxmls15.pdf