Comment by alkonaut
1 year ago
I feel old. I did my master's thesis on RNNs for learning dynamic systems, e.g. for control purposes (quite a novelty at the time, around 2000). We wrote the backprop in C++ and ran it overnight. Yes, it was slow as hell with the tiny gradients. The network architectures were e.g. 5 or 10 neurons in a single hidden layer. NNs were a niche subject that you were lucky to find courses in. Then I closed my eyes for two seconds and looked at the subject again in 2015. Wow.
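For a sense of scale, here is a minimal sketch (not the author's code) of the kind of network described: an Elman-style RNN with a single 5-neuron hidden layer, forward pass only. The layer sizes, tanh activation, and weight initialization are assumptions for illustration.

```cpp
#include <array>
#include <cmath>
#include <iostream>
#include <random>
#include <vector>

constexpr int kInput = 1;   // one input signal
constexpr int kHidden = 5;  // tiny hidden layer, as in the comment
constexpr int kOutput = 1;  // one predicted output

struct TinyRNN {
    // Flat weight arrays: input->hidden, hidden->hidden (recurrent),
    // hidden->output.
    std::array<double, kHidden * kInput>  Wxh{};
    std::array<double, kHidden * kHidden> Whh{};
    std::array<double, kOutput * kHidden> Why{};
    std::array<double, kHidden> h{};  // recurrent state carried between steps

    // One time step: h = tanh(Wxh*x + Whh*h), y = Why*h.
    double step(double x) {
        std::array<double, kHidden> h_new{};
        for (int i = 0; i < kHidden; ++i) {
            double a = Wxh[i] * x;
            for (int j = 0; j < kHidden; ++j)
                a += Whh[i * kHidden + j] * h[j];
            h_new[i] = std::tanh(a);
        }
        h = h_new;
        double y = 0.0;
        for (int j = 0; j < kHidden; ++j)
            y += Why[j] * h[j];
        return y;
    }
};

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> dist(-0.5, 0.5);
    TinyRNN net;
    for (auto& w : net.Wxh) w = dist(rng);
    for (auto& w : net.Whh) w = dist(rng);
    for (auto& w : net.Why) w = dist(rng);

    // Feed a short input sequence and print the network's outputs.
    std::vector<double> inputs = {0.0, 0.5, 1.0, 0.5, 0.0};
    for (double x : inputs)
        std::cout << net.step(x) << '\n';
}
```

Training such a network with backpropagation through time repeatedly multiplies gradients by the recurrent weights at each step, which is where the "tiny gradients" the comment mentions come from.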