Self-Driving Cars Are Safe Until They Encounter The Unexpected (Which Should Be Expected)
I don't understand why police officials rush to make statements not based on the facts.
On Monday, the day after the fatal Uber crash in Arizona, police chief Sylvia Moir told reporters that, from her viewing of the video, it appeared that neither the driver nor the self-driving car was at fault. "It's very clear it would have been difficult to avoid this collision in any kind of mode," Moir stated. "The driver said it was like a flash, the person walked out in front of them." We were led to believe the victim entered the roadway suddenly, right before the collision.
Now we learn that (1) the victim was already in the roadway when struck, (2) the victim could be seen from a considerable distance before she was struck, (3) the driver in the Uber self-driving car was not acting as a backup, i.e., she was not looking at the road, and (4) the Uber vehicle neither slowed down nor swerved to avoid the collision. The car's technology (video, radar, and other systems) failed.
It is a basic duty of every driver, when a person is visible in the roadway, whether or not she is in a crosswalk, to stop or take other action to avoid a collision.
The idea that autonomous vehicles will make us all safer and reduce vehicle fatalities places greater faith in computers than I believe is warranted, at least for the next decade.