Comment by jcims

4 days ago

I wonder about this for things like self-driving cars. If a thousand people decide to drive the wrong way down a particular stretch of highway, or slam on the brakes every time they see a particular person's political sign, could that surreptitiously poison the training data and spread to other vehicles?

As someone who isn't in the USA or Canada, I worry more that cars developed there will learn to "turn on red".