Comment by Dylan16807

10 hours ago

Asking whether an entity has modeled and evaluated a specific situation, using that evaluation to inform its decisions, is not about subjective experience.

If you're asking whether their training data includes situations like this, and whether the trained model and the other runtime components that drive the car incorporate that as part of their model, the answer is yes. But not in the way a normal human driver would think about it: many of the details of its decision-making process come from large statistical aggregates rather than an explicit thought like "I'm in a school zone and need to anticipate children who may be obscured running out into traffic." There are many places where the car needs to exercise caution without specifically knowing it's within 50 feet of a school zone.
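To make the contrast concrete, here's a toy sketch (purely illustrative, not how Waymo's system actually works; all names and numbers are made up): a hand-written rule keys on an explicit "school zone" concept, while a statistics-driven score keys only on observable scene features, and caution near schools falls out of the data without the concept ever appearing in the code.

```python
# Illustrative toy only -- contrasting an explicit human-style rule
# with caution derived from aggregated driving statistics.

def rule_based_caution(in_school_zone: bool) -> float:
    """Explicit rule: slow down when a 'school zone' flag is set."""
    return 0.9 if in_school_zone else 0.2

# Hypothetical aggregate: rate of pedestrians suddenly entering the
# roadway, collected over many miles, keyed by observable features.
SUDDEN_PEDESTRIAN_RATE = {
    ("parked_cars", "narrow_road"): 0.031,
    ("parked_cars", "wide_road"): 0.012,
    ("clear_sightlines", "wide_road"): 0.002,
}

def learned_caution(scene_features: tuple) -> float:
    """Caution emerges from data: scenes that statistically resemble
    past incidents score high, with no 'school zone' concept anywhere."""
    rate = SUDDEN_PEDESTRIAN_RATE.get(scene_features, 0.005)
    return min(1.0, rate * 25)  # arbitrary scaling for the sketch

# A street near a school often *looks like* (parked cars, narrow road),
# so the learned score is high there without any school-zone flag.
print(learned_caution(("parked_cars", "narrow_road")))    # high
print(learned_caution(("clear_sightlines", "wide_road"))) # low
```

The point of the sketch is only that the second function behaves cautiously in school-like scenes as a side effect of the statistics, which is roughly the distinction the comment is drawing.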

While the deep details are not public, Waymo has shared a fair amount of detail about their system, from which you can glean some ideas about the world model it builds and the actions it takes in specific situations:

https://waymo.com/blog/2024/10/ai-and-ml-at-waymo

https://waymo.com/blog/2025/12/demonstrably-safe-ai-for-auto...

https://waymo.com/blog/2024/10/introducing-emma