
2015-12-02

[1506.03340] Teaching Machines to Read and Comprehend

http://arxiv.org/abs/1506.03340
In this work we define a new methodology that resolves this bottleneck and provides large scale supervised reading comprehension data.
The Impatient Reader
The Attentive Reader is able to focus on the passages of a context document that are most likely to inform the answer to the query. We can go further by equipping the model with the ability to reread from the document as each query token is read.
The model is trained on documents to solve cloze (fill-in-the-blank) questions. Compares the Impatient Reader and the Attentive Reader.
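
A minimal numpy sketch of the soft attention behind the Attentive Reader. The weight names and toy dimensions are mine, not the paper's, and the bidirectional LSTM encoders are replaced by random vectors, so this only illustrates the scoring-and-reading step:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Toy sizes (hypothetical): d = hidden size, T = document length.
d, T = 8, 5

# Stand-ins for the encoder outputs the paper gets from bidirectional
# LSTMs: y[t] encodes document token t, u encodes the whole query.
y = rng.standard_normal((T, d))
u = rng.standard_normal(d)

# Randomly initialised attention parameters (names are illustrative).
W_ym = rng.standard_normal((d, d))
W_um = rng.standard_normal((d, d))
w_ms = rng.standard_normal(d)

# Score every document position against the query, normalise with a
# softmax, and read the document as the attention-weighted sum.
m = np.tanh(y @ W_ym.T + u @ W_um.T)  # (T, d) per-token match features
s = softmax(m @ w_ms)                 # (T,)  attention over tokens
r = s @ y                             # (d,)  attended document vector

print("attention weights:", np.round(s, 3))
print("document read:", np.round(r, 3))
```

The Impatient Reader extends this idea by recomputing such an attention-weighted read of the document after each query token, accumulating the result as it "rereads".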

For the slides, see the post below.

2015-11-19

Teaching Machines to Read and Comprehend (slide)

http://lxmls.it.pt/2015/lxmls15.pdf
Conclusion

Summary
* supervised machine reading is a viable research direction with the available data,
* LSTM based recurrent networks constantly surprise with their ability to encode dependencies in sequences,
* attention is a very effective and flexible modelling technique.

Future directions
* more and better data, corpus querying, and cross document queries,
* recurrent networks incorporating long term and working memory are well suited to NLU tasks.
Slides from the Lisbon Machine Learning School 2015. The topic is natural language processing.
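
The slides also walk through how the large-scale supervised data behind these results is built from news articles. A hypothetical sketch of that cloze construction, assuming the paper's anonymised @entity / @placeholder markers (the function and variable names are my own):

```python
def make_cloze(summary_sentence, entities):
    # Map each entity string to an abstract marker (@entity0, @entity1, ...),
    # mirroring the paper's entity anonymisation.
    markers = {e: f"@entity{i}" for i, e in enumerate(entities)}
    anonymised = summary_sentence
    for entity, marker in markers.items():
        anonymised = anonymised.replace(entity, marker)
    # Blank out one entity to form the query; the blanked marker is the
    # answer the model must recover from the (anonymised) article.
    answer = markers[entities[0]]
    query = anonymised.replace(answer, "@placeholder", 1)
    return query, answer

query, answer = make_cloze("Obama met Merkel in Berlin",
                           ["Obama", "Merkel", "Berlin"])
print(query)   # @placeholder met @entity1 in @entity2
print(answer)  # @entity0
```

Anonymising the entities forces the model to answer from the given context document rather than from world knowledge, which is the point of the construction.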