2015-11-19

Teaching Machines to Read and Comprehend (slide)

http://lxmls.it.pt/2015/lxmls15.pdf
Conclusion

Summary
* supervised machine reading is a viable research direction with the available data,
* LSTM-based recurrent networks consistently surprise with their ability to encode dependencies in sequences,
* attention is a very effective and flexible modelling technique.
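To make the attention point concrete, here is a minimal sketch of dot-product attention in plain Python (the function name, the use of bare float lists instead of tensors, and the toy vectors are my own illustration, not code from the slides): a query is scored against each key, the scores are softmax-normalized, and the values are averaged with those weights.

```python
import math

def attention(query, keys, values):
    """Dot-product attention: weight each value by softmax(query . key).

    Minimal illustrative sketch; vectors are plain lists of floats,
    not tensors from any particular framework.
    """
    # Score each key against the query with a dot product
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Weighted sum of the value vectors gives a single context vector
    dim = len(values[0])
    return [sum(w * v[i] for w, v in zip(weights, values)) for i in range(dim)]

# Toy example: the query aligns most strongly with the second key,
# so the context vector leans toward the second value
keys = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
values = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
ctx = attention([0.0, 2.0], keys, values)
```

In the reading-comprehension setting of the slides, the query would come from the question and the keys/values from token representations of the document, so the model can focus on the passage tokens relevant to the answer.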

Future directions
* more and better data, corpus querying, and cross-document queries,
* recurrent networks incorporating long-term and working memory are well suited to NLU tasks.
Slides from the Lisbon Machine Learning School 2015. The topic is natural language processing.