
2015-04-08

[1504.00941] A Simple Way to Initialize Recurrent Networks of Rectified Linear Units

http://arxiv.org/abs/1504.00941
In this paper, we propose a simpler solution that uses recurrent neural networks composed of rectified linear units.
Key to our solution is the use of the identity matrix or its scaled version to initialize the recurrent weight matrix.
Uses ReLU (rectified linear units) in a recurrent net to learn long-range temporal structure; a minimal sketch of the initialization trick follows below.
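
The initialization trick quoted above is simple enough to show directly. Below is a minimal NumPy sketch (function and variable names are my own, not from the paper) of a plain ReLU RNN whose recurrent weight matrix starts as the identity, or a scaled identity, with zero biases; the input weights here are simply drawn from a small Gaussian as an assumption, not a detail taken from the paper.

import numpy as np

def init_relu_rnn(input_dim, hidden_dim, scale=1.0, seed=0):
    # Sketch of the paper's initialization: recurrent weights start as the
    # (optionally scaled) identity matrix, biases start at zero.
    rng = np.random.default_rng(seed)
    W_in = rng.normal(0.0, 0.001, size=(hidden_dim, input_dim))  # assumed small random input weights
    W_rec = scale * np.eye(hidden_dim)                           # identity (or scaled identity) recurrent weights
    b = np.zeros(hidden_dim)                                     # zero bias
    return W_in, W_rec, b

def relu_rnn_step(h_prev, x_t, W_in, W_rec, b):
    # ReLU hidden update: h_t = max(0, W_rec @ h_{t-1} + W_in @ x_t + b)
    return np.maximum(0.0, W_rec @ h_prev + W_in @ x_t + b)

# Usage: run a short random sequence through the recurrence.
W_in, W_rec, b = init_relu_rnn(input_dim=3, hidden_dim=8)
h = np.zeros(8)
for x_t in np.random.default_rng(1).normal(size=(5, 3)):
    h = relu_rnn_step(h, x_t, W_in, W_rec, b)

With identity recurrent weights and ReLU units, a hidden state that receives no new input is simply copied forward, which is why this initialization helps gradients survive over long time spans before training moves the weights away from the identity.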