Comment by torginus
5 days ago
I've wondered if some kind of smart pruning is possible during evaluation.
What I mean by that is: if a neuron implements a sigmoid function and its input weights are 10, 1, 2, 3, then if the first input is active, evaluating the other ones is practically pointless, since the sigmoid is already saturated and they barely change the result, which recursively means that evaluating the neurons feeding those other inputs is pointless as well.
I have no idea how feasible or practical it is to implement such an optimization at full network scale, but I think it's interesting to think about.
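
To make the short-circuit idea concrete, here is a minimal sketch in Python/NumPy, assuming inputs bounded in [0, 1]. The function name `prune_eval_neuron` and the `eps` threshold are illustrative, not from any library; the bound on the remaining contribution is deliberately conservative, so inputs are only skipped when they provably cannot move the output by more than `eps`.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def prune_eval_neuron(weights, fetch_input, bias=0.0, eps=1e-4):
    """Evaluate sigmoid(w . x + b) lazily, skipping inputs that can no
    longer change the output by more than `eps`.

    `fetch_input(i)` returns input i in [0, 1]; in a recursive setting it
    would itself call prune_eval_neuron on the upstream neuron, so a
    skipped input means its whole sub-network is never evaluated.
    """
    # Visit the largest-magnitude weights first, e.g. 10 before 3, 2, 1.
    order = np.argsort(-np.abs(weights))
    remaining = np.abs(weights)[order].sum()  # max possible future change
    acc = bias
    for i in order:
        lo, hi = sigmoid(acc - remaining), sigmoid(acc + remaining)
        if hi - lo < eps:
            break  # output is already pinned down: skip the rest
        remaining -= abs(weights[i])
        acc += weights[i] * fetch_input(i)
    return sigmoid(acc)

# The example from the comment: weights 10, 1, 2, 3 with the first input active.
w = np.array([10.0, 1.0, 2.0, 3.0])
x = np.array([1.0, 0.7, 0.3, 0.9])
print(prune_eval_neuron(w, lambda i: x[i]))  # ~1.0, smallest weights skipped
print(sigmoid(w @ x))                        # full evaluation for comparison
```

With these hypothetical numbers the loop stops after consuming the two largest weights, since the sigmoid output can no longer change by more than `eps` regardless of the remaining inputs; whether the bookkeeping ever pays for itself at full network scale is exactly the open question in the comment.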