
Comment by fooker

7 hours ago

> I still don't know exactly what you mean

Straightforward quantization, just to one bit instead of 8, 16, or 32. Training a one-bit neural network from scratch is apparently an unsolved problem, though.
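One-bit quantization of this kind can be sketched in a few lines. The scaling factor below is an assumption borrowed from XNOR-Net-style binarization, not something specified above; it just keeps the binarized weights at roughly the original magnitude:

```python
import numpy as np

def binarize(w):
    # One-bit quantization sketch: map each weight to +/-1 by its sign.
    # Scaling by the mean absolute value is an XNOR-Net-style choice,
    # assumed here for illustration.
    alpha = float(np.abs(w).mean())
    return alpha * np.sign(w), alpha

w = np.array([0.3, -1.2, 0.05, -0.7])
wb, alpha = binarize(w)  # each entry of wb is +alpha or -alpha
```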

> The trees that correspond to the neural networks are huge.

Yes, if the task is inherently 'fuzzy'. Many neural networks are effectively large decision trees in disguise, and those are the ones with potential for this kind of approach.

> Training a one-bit neural network from scratch is apparently an unsolved problem, though.

I don't think it's correct to call it unsolved. The established methods are much less efficient than those for "regular" neural nets, but they do exist.
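One such established method is the straight-through estimator (as in BinaryConnect): binarize the weights on the forward pass, but backpropagate through the sign function as if it were the identity. A minimal sketch, assuming the usual conventions (a single linear layer, gradients masked where |w| > 1):

```python
import numpy as np

def forward(w, x):
    # Forward pass uses the one-bit weights.
    wb = np.sign(w)
    return wb @ x

def ste_grad_w(w, x, grad_out):
    # Straight-through estimator: treat sign() as the identity in the
    # backward pass, zeroing gradients where |w| > 1 (BinaryConnect-style).
    return np.outer(grad_out, x) * (np.abs(w) <= 1.0)

w = np.array([[0.5, -0.3], [2.0, 0.1]])
x = np.array([1.0, 2.0])
y = forward(w, x)
g = ste_grad_w(w, x, np.ones(2))
```

The full-precision weights are kept only for the gradient updates; at inference time you can discard them and store just the signs.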

Also note that the usual approach when going binary is to make the units stochastic. https://en.wikipedia.org/wiki/Boltzmann_machine#Deep_Boltzma...
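The stochastic-unit idea is simple to state: instead of thresholding its input, a binary unit fires with probability given by the logistic sigmoid of its input, as in Boltzmann machines. A sketch (the `rng` parameter is a hypothetical hook, here only to make the behavior illustrable deterministically):

```python
import math
import random

def stochastic_binary_unit(activation, rng=random.random):
    # Fire (output 1) with probability sigmoid(activation); otherwise 0.
    p = 1.0 / (1.0 + math.exp(-activation))
    return 1 if rng() < p else 0
```

At activation 0 the unit fires half the time; as the activation grows large in either direction, the unit becomes effectively deterministic.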

  • Interesting.

    By unsolved I guess I meant: this looks like it should be easy and efficient but we don't know how to do it yet.

    Usually this means we are missing some important science in the classification/complexity of problems. I don't know what it could be.