Comment by ted_dunning
1 year ago
Minsky and Papert showed that single-layer perceptrons suffer from exponentially bad scaling: for certain problems, reaching a given accuracy requires exponentially growing resources. Multi-layer networks substantially change that scaling.
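The XOR function is the textbook illustration of this point. Below is a minimal sketch, assuming scikit-learn and numpy are available (the Perceptron and MLPClassifier classes are my choice for illustration, not something from the comment): a single-layer perceptron cannot do better than 3 out of 4 on XOR, while one hidden layer is enough to represent it exactly.

    # Minimal sketch (assumes scikit-learn and numpy): contrast a single-layer
    # perceptron with a one-hidden-layer network on XOR.
    import numpy as np
    from sklearn.linear_model import Perceptron
    from sklearn.neural_network import MLPClassifier

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
    y = np.array([0, 1, 1, 0])  # XOR labels: not linearly separable

    # Single-layer perceptron: a linear separator can get at most 3/4 correct.
    single = Perceptron(max_iter=1000, tol=None).fit(X, y)
    print("single-layer accuracy:", single.score(X, y))

    # One hidden layer lets the network represent XOR exactly.
    multi = MLPClassifier(hidden_layer_sizes=(4,), activation="tanh",
                          solver="lbfgs", random_state=0).fit(X, y)
    print("multi-layer accuracy:", multi.score(X, y))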