2015-04-27

GloVe: Global Vectors for Word Representation

http://nlp.stanford.edu/projects/glove/
GloVe is an unsupervised learning algorithm for obtaining vector representations for words. Training is performed on aggregated global word-word co-occurrence statistics from a corpus, and the resulting representations showcase interesting linear substructures of the word vector space.
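A minimal sketch of the "aggregated global word-word co-occurrence statistics" the excerpt mentions — the counts GloVe's weighted least-squares objective is fit to. The toy corpus and window size are illustrative, not from the project page, and this is not the GloVe training code itself.

```python
from collections import Counter

# Toy corpus and symmetric context window (both illustrative).
corpus = "the cat sat on the mat the dog sat on the rug".split()
window = 2

# Global co-occurrence counts: cooc[(w, c)] = number of times context
# word c appears within `window` tokens of w, summed over the corpus.
cooc = Counter()
for i, w in enumerate(corpus):
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if i != j:
            cooc[(w, corpus[j])] += 1

print(cooc[("sat", "on")])  # -> 2 ("on" follows both occurrences of "sat")
```

GloVe then fits word vectors so that dot products of vector pairs approximate the logarithms of these counts, which is what gives rise to the linear substructures (analogies) in the resulting space.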

2015-04-20

Echo state network - Wikipedia, the free encyclopedia

http://en.wikipedia.org/wiki/Echo_state_network
The echo state network (ESN) is a recurrent neural network with a sparsely connected hidden layer (with typically 1% connectivity). The connectivity and weights of hidden neurons are randomly assigned and are fixed. The weights of output neurons can be learned so that the network can (re)produce specific temporal patterns.
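The defining trick in the excerpt — random fixed reservoir, trained linear readout — can be sketched in a few lines of numpy. Reservoir size, connectivity, spectral radius, and the ridge penalty below are illustrative choices, not values from the article; the task (next-step prediction of a sine wave) is likewise just a demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res = 100

# Sparsely connected reservoir with fixed random weights, rescaled so its
# spectral radius is below 1 (a common sufficient condition for the echo
# state property). These weights are never trained.
W = rng.standard_normal((n_res, n_res))
W[rng.random((n_res, n_res)) > 0.05] = 0.0            # ~5% connectivity
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()
W_in = rng.uniform(-0.5, 0.5, n_res)                  # fixed input weights

# Drive the reservoir with a sine wave; the target is the next sample.
u = np.sin(np.arange(500) * 0.2)
x = np.zeros(n_res)
states = []
for t in range(len(u) - 1):
    x = np.tanh(W @ x + W_in * u[t])
    states.append(x.copy())
X, y = np.array(states[100:]), u[101:]                # drop warm-up steps

# Only the linear output layer is learned (ridge regression).
ridge = 1e-6
w_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
pred = X @ w_out
print("train MSE:", float(np.mean((pred - y) ** 2)))
```

Because only `w_out` is fit, training reduces to one linear solve — the practical appeal of ESNs over backpropagation through time.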

Reservoir computing - Wikipedia, the free encyclopedia

http://en.wikipedia.org/wiki/Reservoir_computing
Reservoir computing is a framework for computation derived from recurrent neural network theory: inputs are mapped into a fixed, randomly generated dynamical system (the reservoir), and only a simple readout is trained.

The “echo state” approach to analysing and training recurrent neural networks – with an Erratum note

http://web.info.uvt.ro/~dzaharie/cne2013/proiecte/tehnici/ReservoirComputing/EchoStatesTechRep.pdf
This is a corrected version of the technical report H. Jaeger (2001): The "echo state" approach to analysing and training recurrent neural networks. GMD Report 148, German National Research Center for Information Technology, 2001.

An overview of reservoir computing: theory, applications and implementations

https://www.elen.ucl.ac.be/Proceedings/esann/esannpdf/es2007-8.pdf
This tutorial will give an overview of current research on theory, application and implementations of Reservoir Computing.

Frontiers | MACOP modular architecture with control primitives | Frontiers in Computational Neuroscience

http://journal.frontiersin.org/article/10.3389/fncom.2013.00099/full
2.4.2. Echo state networks
We use an Echo State Network (ESN) (Jaeger, 2001) as inverse model. An ESN is composed of a discrete-time recurrent neural network [commonly called the reservoir because ESNs belong to the class of Reservoir Computing techniques (Schrauwen et al., 2007)] and a linear readout layer which maps the state of the reservoir to the desired output.
Download PDF
http://journal.frontiersin.org/article/10.3389/fncom.2013.00099/pdf

2015-04-14

青空朗読 | Audio readings of books held in 青空文庫 (Aozora Bunko)

http://aozoraroudoku.jp/
「青空朗読」 provides audio readings of books published in 「青空文庫」, the library on the internet ...
MP3 files of the 「青空文庫」 readings can be downloaded.
Pairing this audio with the corresponding 「青空文庫」 texts could yield a usable Japanese speech corpus.

2015-04-08

[1504.00941] A Simple Way to Initialize Recurrent Networks of Rectified Linear Units

http://arxiv.org/abs/1504.00941
In this paper, we propose a simpler solution that uses recurrent neural networks composed of rectified linear units.
Key to our solution is the use of the identity matrix or its scaled version to initialize the recurrent weight matrix.
Uses recurrent nets with ReLU (rectified linear units) to learn long-range temporal structure.
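The paper's key idea — initialize the recurrent weight matrix as the identity in a ReLU RNN — can be sketched directly. Hidden size, input scale, and the zero-input rollout below are illustrative; the point is that at initialization the untrained network simply carries its state forward, which is the long-memory behavior the paper exploits.

```python
import numpy as np

rng = np.random.default_rng(0)
n_h = 8

W_hh = np.eye(n_h)                           # identity initialization
W_xh = rng.standard_normal(n_h) * 0.01       # small random input weights
b = np.zeros(n_h)                            # biases start at zero

def step(h, x):
    # ReLU recurrence: h_t = max(0, W_hh h_{t-1} + W_xh x_t + b)
    return np.maximum(0.0, W_hh @ h + W_xh * x + b)

h0 = np.abs(rng.standard_normal(n_h))        # a non-negative initial state
h = h0.copy()
for _ in range(1000):                        # 1000 steps with zero input
    h = step(h, 0.0)

# Identity recurrence + ReLU on a non-negative state leaves it unchanged:
# gradients neither explode nor vanish at initialization.
print(np.allclose(h, h0))                    # -> True
```

A random Gaussian `W_hh` in the same loop would typically blow up or collapse the state within a few dozen steps, which is why the initialization matters.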

Learning a Deep Convolutional Network for Image Super-Resolution

http://mmlab.ie.cuhk.edu.hk/projects/SRCNN.html
Test code for SRCNN
The SRCNN source code is available here.

[1501.00092] Image Super-Resolution Using Deep Convolutional Networks

http://arxiv.org/abs/1501.00092
The mapping is represented as a deep convolutional neural network (CNN) that takes the low-resolution image as the input and outputs the high-resolution one.
A convolutional neural network takes a low-resolution image as input and outputs the high-resolution image.
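A shape-level sketch of the SRCNN pipeline (patch extraction, non-linear mapping, reconstruction). The 9-1-5 filter sizes and 64/32 channel counts follow the paper; the weights here are random, purely to show the three-layer structure, and the input is assumed to be a bicubically pre-upscaled luminance patch.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x, w):
    """Naive valid convolution: x is (C_in, H, W), w is (C_out, C_in, k, k)."""
    c_out, _, k, _ = w.shape
    h, wd = x.shape[1] - k + 1, x.shape[2] - k + 1
    out = np.zeros((c_out, h, wd))
    for o in range(c_out):
        for i in range(h):
            for j in range(wd):
                out[o, i, j] = np.sum(w[o] * x[:, i:i + k, j:j + k])
    return out

x = rng.random((1, 33, 33))                       # pre-upscaled low-res patch

w1 = rng.standard_normal((64, 1, 9, 9)) * 0.01    # patch extraction (9x9)
w2 = rng.standard_normal((32, 64, 1, 1)) * 0.01   # non-linear mapping (1x1)
w3 = rng.standard_normal((1, 32, 5, 5)) * 0.01    # reconstruction (5x5)

h1 = np.maximum(0.0, conv2d(x, w1))               # ReLU after layers 1 and 2
h2 = np.maximum(0.0, conv2d(h1, w2))
y = conv2d(h2, w3)                                # no activation on output
print(y.shape)                                    # -> (1, 21, 21)
```

In the actual method the weights are learned end-to-end with an MSE loss against ground-truth high-resolution patches; valid convolutions shrink each 33x33 training sub-image to a 21x21 output, as the shapes above show.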

Random Ponderings: A Brief Overview of Deep Learning

http://yyue.blogspot.ca/2015/01/a-brief-overview-of-deep-learning.html
(This is a guest post by Ilya Sutskever on the intuition behind deep learning as well as some very useful practical advice. Many thanks to Ilya for such a heroic effort!)

2015-04-07

Stanford University CS231n: Course Projects Winter 2015

http://cs231n.stanford.edu/reports.html
CS231n: Convolutional Neural Networks for Visual Recognition

CS231n Course Project Reports