Mercedes SLK World

1 - 3 of 3 Posts

Premium Member · 14,703 Posts · Discussion Starter #1
The car blames the driver when they read out the program's log. The driver blames the car because it is self-driving and he/she had to take over. What a perfect world. :grin:

Driverless cars are the next big thing these days, and some startups have begun to test their prototypes in the field as well.

Since driving a car is a serious and risky business, accidents do happen. Even to cars that can drive themselves. However, this story is not about an autonomous vehicle that got into a fender-bender in California. It is more a “what’s the worst that can happen?” type of situation with an ironic twist.

According to a story on The Verge, an autonomous prototype belonging to Cruise Automation was involved in an accident in San Francisco on January 8, 2016.

So far, nothing outstanding, right? Well, it turns out that the vehicle in question had a human driver who decided to turn off the self-driving mode and take the wheel. Furthermore, this seems to be the first ever accident involving an autonomous vehicle in that particular city. Not something to be proud of, but still.

For some reason, things did not work out as expected, and the converted autonomous Nissan Leaf rear-ended a parked Toyota Prius. The hybrid was parallel parked on the side of 7th Street in San Francisco, and the self-driving prototype was traveling in the right lane (the one closest to the curb) at around 20 MPH (32 km/h) when it began moving to the left.

The vehicle then began correcting itself and veered back to the right. The human driver decided to take control but failed to alter the vehicle’s trajectory, and it collided with the parked Prius. Thankfully, nobody was injured.

Kyle Vogt, the CEO of Cruise Automation, stated that “there was enough room to avoid the accident, but the operator made a mistake.” He told IDN News Service that the aftermarket system developed by his company had alerted the driver several seconds before the incident.

Unfortunately, we do not know how many seconds before the potential impact the driver was warned, what the operator was doing at the time, or whether another factor was involved in the incident.

The average human driver’s reaction time to an external alert is about two seconds. A professional racing driver can manage around one second between the moment the eyes spot imminent danger and the moment braking or avoidance maneuvers begin.

If we assume that the “several seconds” mentioned by the company’s CEO means more than two, this incident was most likely caused by human error. After all, the car crashed into a parked vehicle at 20 MPH, in clear visibility, in the right lane.
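To put those reaction-time figures in rough perspective, here is a back-of-the-envelope sketch (not from the article) of how far a car travels before the driver even starts to react, using the 20 MPH speed and the two-second average reaction time mentioned above:

```python
# Back-of-the-envelope: distance covered during driver reaction time.
# The 20 MPH speed and 2 s average reaction time come from the
# discussion above; everything else is plain unit conversion.
speed_mph = 20
speed_ms = speed_mph * 0.44704       # 1 mph = 0.44704 m/s
reaction_time_s = 2.0                # average driver, external alert

reaction_distance_m = speed_ms * reaction_time_s
print(f"Travelled before reacting: {reaction_distance_m:.1f} m")
# Roughly 17.9 m (about four car lengths) before any braking begins.
```

With several seconds of warning, that distance would have been covered more than once over, which is why the "human error" reading seems plausible.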

It is worth noting that the guys at Cruise Automation are the first of their kind: a company that developed an aftermarket system which attaches to a standard car and turns it into a self-driving vehicle.

The prototype RP-1 system is still being tested, but the company has big plans for it and intends to sell it for around $10,000 apiece once it is ready for the public.


 

Super Moderator · 24,227 Posts
10K, sheesh, could easily buy two R170s and have them updated for that.
The driver will always be to blame; they decide to engage/disengage the system. So even if it were on, it would be driver error.
 

Super Moderator · 24,227 Posts
Just watching a TV ad for Toyota, the one with a fighter pilot sitting next to the driver.


Kapow! NCIS moment.


So it looks like switching off the auto also takes out the pre-collision system.
Cars like that must have pre-collision, so it would be part of the auto drive.


That makes it a programming error. The type that gets a huge class action. Like the “you must wear a helmet” sticker on bikes,
or “this end faces forward” on shooters.
 