Yep. In this case, neither the robot NOR the human driver was able to prevent the accident. It probably was genuinely unavoidable.
That was the story about two cars, no fatalities. This is a story about a pedestrian killed by the Uber car. We don't know nearly enough about the circumstances here, other than that the decedent was not in a crosswalk. If it were a human driver, this would be a case of an "error in judgement." That's a big problem for the AI people to overcome. It's what I've been trying to get across in this whole thread. A self-driving car has to exercise split-second judgement calls, and I don't think AI is up to that yet.
Someday it will be, but until it can do that, it's dangerous to put these on the same streets as human beings. Humans make mistakes, and a self-driving car had better learn how to drive defensively or it's a menace. The crash between the two cars last year (the story I apparently should not have linked, because people are confusing the two) was probably unavoidable, but this pedestrian fatality is going to get a much closer look from the people who have to decide whether AI-driven cars are an acceptable risk. Next time it might be you or me out there with a car arrogantly asserting its right of way.