Comment by ted_dunning
2 years ago
Minsky and Papert showed that single-layer perceptrons suffer exponentially bad scaling when trying to reach a given accuracy on certain problems (parity is the classic example). Multi-layer networks substantially change that scaling.
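A minimal sketch of the textbook example behind this point: XOR is not computable by any single linear threshold unit, but one hidden layer handles it. The grid search and the specific OR/NAND weights below are illustrative choices, not anything from Minsky and Papert's proof itself.

```python
import numpy as np

def step(z):
    return (z > 0).astype(int)

# XOR truth table
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])

# Illustrative check: no single threshold unit step(w.x + b)
# reproduces XOR; a coarse grid of weights finds nothing.
found = False
for w1 in np.linspace(-2, 2, 9):
    for w2 in np.linspace(-2, 2, 9):
        for b in np.linspace(-2, 2, 9):
            if np.array_equal(step(X @ np.array([w1, w2]) + b), y):
                found = True
print(found)  # False: XOR is not linearly separable

# One hidden layer fixes it: hidden units compute OR and NAND,
# and the output unit ANDs them together.
W1 = np.array([[1, -1], [1, -1]])     # columns feed the two hidden units
b1 = np.array([-0.5, 1.5])            # OR: x1+x2 > 0.5 ; NAND: x1+x2 < 1.5
h = step(X @ W1 + b1)
out = step(h @ np.array([1, 1]) - 1.5)  # AND of the hidden units
print(out)  # [0 1 1 0]
```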