Our results show that deep networks can be trained using only 16-bit wide fixed-point number representation when using stochastic rounding, and incur little to no degradation in the classification accuracy.

Apparently even a 16-bit fixed-point representation is good enough, as long as the rounding is done stochastically.
Showing posts with label IBM.
2015-03-21
[1502.02551] Deep Learning with Limited Numerical Precision
http://arxiv.org/abs/1502.02551
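The idea in the paper can be sketched roughly as follows: instead of always rounding to the nearest fixed-point value, round up or down at random, with the probability of rounding up proportional to how close the value is to the upper grid point, so the rounded value is unbiased in expectation. A minimal sketch (the function name and `frac_bits` parameter are illustrative, not from the paper):

```python
import random

def stochastic_round(x, frac_bits=8):
    """Stochastically round x to a fixed-point grid with `frac_bits`
    fractional bits. The result is the grid point just below or just
    above x, chosen so that the expected value equals x."""
    eps = 2.0 ** -frac_bits      # spacing of the fixed-point grid
    lower = (x // eps) * eps     # nearest grid point at or below x
    p_up = (x - lower) / eps     # probability of rounding up
    return lower + eps if random.random() < p_up else lower
```

Because the rounding error has zero mean, tiny gradient updates that would always be lost to round-to-nearest still get applied a proportional fraction of the time, which is why training survives the low precision.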
2014-08-10
Yann LeCun - My comments on the IBM TrueNorth neural net...
https://www.facebook.com/yann.lecun/posts/10152184295832143
Neuron states are binary (spikes) and synaptic weights are 9-bit signed integers with a 4-bit time delay. The overall peak performance is 4096x256x256x1000 = 268 GSops (billion synaptic operations per second).
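A quick sanity check on the arithmetic in the quote above (cores × neurons per core × synapses per neuron × 1 kHz tick rate):

```python
cores = 4096          # TrueNorth cores per chip
neurons = 256         # neurons per core
synapses = 256        # synaptic inputs per neuron
ticks_per_sec = 1000  # 1 kHz time step

peak_sops = cores * neurons * synapses * ticks_per_sec
print(peak_sops)  # -> 268435456000, i.e. ~268 GSops
```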