Yellow Box Problem
Glossary
| Term | Definition |
|---|---|
| Yellow Box Problem | An ethical dilemma in self-driving cars: in a situation where a collision is unavoidable, the AI must choose between harmful outcomes. |

The "Yellow Box Problem" is not a widely used formal term in the AI world, but it refers to a specific ethical dilemma faced by self-driving cars when an accident is unavoidable. It takes its name from the yellow box markings painted at some intersections to prevent vehicles from blocking them, areas that highlight the potential for dangerous conflicts. Here's a breakdown of the problem:

The Scenario: Imagine a self-driving car approaching a yellow box intersection. Suddenly, it detects an obstacle (e.g., a pedestrian or another car) that it cannot avoid colliding with, regardless of its actions.

The Dilemma: The AI system controlling the car has two main options: stay its course and brake, accepting the collision with the detected obstacle, or swerve away, potentially endangering its occupants or other road users instead.
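The dilemma can be made concrete with a minimal sketch. This is not how any real autonomous-vehicle stack decides; it simply assumes a purely utilitarian policy in which each candidate action carries a hypothetical `expected_harm` score (imagined here as coming from some perception/prediction module) and the controller picks the action with the lowest score:

```python
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    expected_harm: float  # hypothetical harm estimate; assumed, not from a real AV stack


def choose_action(actions: list[Action]) -> Action:
    """Utilitarian sketch: pick the action that minimizes expected harm.

    Ties and every ethical subtlety the text discusses are ignored here;
    this only illustrates why the choice of objective IS the dilemma.
    """
    return min(actions, key=lambda a: a.expected_harm)


options = [
    Action("brake and stay in lane", expected_harm=0.7),
    Action("swerve into adjacent lane", expected_harm=0.4),
]
print(choose_action(options).name)  # picks the lower-harm option
```

Note that the hard part is not the `min()` call but the harm scores themselves: a different ethical framework (say, always prioritizing occupants) would assign different numbers and flip the decision, which is precisely why the problem has no easy answer.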
No Easy Answer: There is no universally accepted solution to the Yellow Box Problem. Each option has its drawbacks, and the "best" choice depends on various factors, including the ethical framework applied (e.g., minimizing total harm versus protecting the vehicle's occupants), legal and liability considerations, and the specific circumstances of the collision.