Tesla Autopilot Crash Exposes Industry Divide – IEEE Spectrum

http://spectrum.ieee.org/cars-that-think/transportation/self-driving/what-next-for-teslas-autopilot

The article “Tesla Autopilot Crash Exposes Industry Divide” says:

The U.S. National Highway Traffic Safety Administration (NHTSA) classifies automation systems from Level 1, sporting basic lane-keeping or anti-lock brakes, through to Level 4, where humans need never touch the wheel (if there is one).

The first death of a driver in a Tesla Model S with its Autopilot system engaged has exposed a fault line running through the self-driving car industry. In one camp, Tesla and many other carmakers believe the best route to a truly driverless car is a step-by-step approach in which the vehicle gradually extends its control over more functions and in more settings (for example, via over-the-air software updates, as Tesla does). A Level 2 system like Tesla’s Autopilot can take over in certain circumstances, such as on highways, but requires human oversight to cope with situations the car cannot handle.
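
As a quick reference, the sketch below restates that scale as a small Python mapping. The descriptions are paraphrased from the article above, not NHTSA’s official wording, and Level 3 is omitted because the article does not describe it.

    # Paraphrase of the NHTSA automation scale as described in the article above;
    # not NHTSA's official definitions. Level 3 is omitted because the article skips it.
    NHTSA_LEVELS = {
        1: "Basic driver assistance such as lane-keeping or anti-lock brakes",
        2: "Car takes over in certain settings (e.g. highways) but needs human oversight",
        4: "Fully driverless: humans never need to touch the wheel (if there is one)",
    }

    def describe(level: int) -> str:
        """Return the article's description of a given automation level."""
        return NHTSA_LEVELS.get(level, "Not described in the article")

    print(describe(2))  # Tesla's Autopilot is a Level 2 system in this scheme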

“What Tesla was thinking, I believe, is that maybe a lidar sensor wasn’t necessary because you have the human operator in the loop, acting as a fail-safe input.”—Karl Iagnemma, nuTonomy

Google and most self-driving car startups take the opposite view, aiming to deliver vehicles that are fully autonomous from the start, requiring passengers to do little more than tap in their destinations and relax. The road to full autonomy has had its own challenges, echoing the complexities and bureaucratic wrangling that have dogged the nearly decade-long effort to roll out vehicle-to-vehicle (V2V) technologies.

 

Related postings:

Tesla’s Autopilot being investigated by the government following fatal crash

Automated to Death

12 Comments

  1. Tomi Engdahl says:

    Questions About Tesla Autopilot Safety Hit Stone Wall
    http://www.eetimes.com/document.asp?doc_id=1330477&

    A fatal accident in China thrust Tesla’s transparency into sharp focus this week, posing fresh and daunting questions as to how safe Tesla’s Autopilot really is.

    New reports surfaced this week in China about a crash that killed the 23-year-old driver of a Tesla Model S in Handan, a city about 300 miles south of Beijing.

    This took place on January 20, 2016 — four months before Joshua Brown died in Florida, in a Tesla Model S on Autopilot.

    The Chinese government news channel CCTV reported that the Chinese driver, Gao Yaning, borrowed his father’s Tesla Model S. He was driving on the highway when his car hit a street-sweeper truck on the side of the road at highway speed.

    CCTV showed video footage of the accident captured by the Tesla Model S driver’s dash camera.

    The police found no sign that the vehicle applied the brakes before hitting the truck. Local media reported that the Autopilot was engaged at the time of the accident.

    According to the Chinese reports, the crash was under investigation for the first half of this year; that investigation resulted in a lawsuit filed in July by the victim’s family against Tesla China.

    Tesla’s credibility and transparency in question
    If reports are true, China’s Tesla fatality in January presents a problem for Tesla.

  2. Tomi Engdahl says:

    Are Tesla Crashes Balanced Out By The Lives That They Save?
    https://tech.slashdot.org/story/16/11/13/2245224/are-tesla-crashes-balanced-out-by-the-lives-that-they-save

    Friday EE Times shared the story of a Tesla crash that occurred during a test drive. “The salesperson suggested that my friend not brake, letting the system do the work. It didn’t…” One Oregon news site even argues autopiloted Teslas may actually have a higher crash rate.

    Tesla’s own numbers show Autopilot has higher crash rate than human drivers
    http://katu.com/news/auto-matters/teslas-own-numbers-show-autopilot-has-higher-crash-rate-than-human-drivers

    A couple of weeks ago, I wrote about Tesla’s claim that its Autopilot driver-assistance software is safer than a human driver.

    After a fatal Autopilot crash last May, the company said the death was the first in 130 million miles of Autopilot driving—and noted that, “among all vehicles, in the U.S., there is a fatality every 94 million miles.”

    The clear implication: Autopiloted Teslas are safer than human-piloted cars, and lives would be saved if every car had Autopilot.

    But Tesla’s statistics are questionable at best. The small sample size—one crash—makes any calculation of Autopilot fatality rate almost meaningless.

    Furthermore, Tesla compared its Autopilot crash rate to the overall U.S. traffic fatality rate—which includes bicyclists, pedestrians, buses and 18-wheelers.

    A better yardstick for comparison is the fatality rate for U.S. drivers of cars and light trucks compiled by the Insurance Institute for Highway Safety.

    By that yardstick, the Tesla Autopilot driver fatality rate is almost four times higher than that of typical passenger vehicles.

  3. Tomi Engdahl says:

    Adding Some Statistical Perspective To Tesla Autopilot Safety Claims
    http://www.forbes.com/sites/samabuelsamid/2016/07/05/adding-some-statistical-perspective-to-tesla-autopilot-safety-claims/#70f1f2a2f8f6

    “Figures never lie, but liars always figure.”

    “Tell me which side of the argument you are on and I will give you the statistics to prove you are right.”

    Variations of those quotes have been around for ages and the origins are debatable. However, there is a great deal of truth to both idioms. Whether discussing unemployment numbers, economic growth, the latest poll numbers or safety claims, the first thing you must always do before accepting the data is to understand the question that was asked to get that data. Subtle changes in the question can have a huge impact on the results.

    Based on Tesla’s statements, we can assume the 130 million miles is the total of all miles traveled in Autopilot mode by all Model S and X vehicles globally since the update was released in October 2015. If we assume that, we must also assume there have been no fatal accidents in other parts of the world that we don’t know about yet. Given the amount of telemetry that Tesla collects from their vehicles, let’s give them the benefit of the doubt on this one. So one fatality in 130 million miles stands for the moment.

    How about the one fatality every 94 million miles in the United States? The best source for such data is the Department of Transportation’s Fatality Analysis Reporting System (FARS) which compiles accident data from state and local agencies nationwide.

    In 2014, Americans traveled 3.026 trillion miles on the road and a total of 32,675 people died along the way. That actually works out to one death every 92.6 million miles.

    That last part is important because the FARS data includes all traffic deaths, those in cars and trucks as well as those riding motorized or pedal cycles and pedestrians struck by a vehicle. As far as we know, no Autopilot equipped vehicle has struck and killed a pedestrian or cyclist. So Tesla’s comparison is actually looking at two quite different data sets. In 2014, 4,586 motorcyclists and 5,813 pedestrians/cyclists were killed.

    That leaves 22,276 vehicle occupants (drivers and passengers) who died. This latter set is probably the one we should be comparing to Tesla’s one death in 130 million miles.

    Based on that statistic, humans are actually better drivers than computers. However, even that isn’t necessarily a valid comparison.
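
    The arithmetic behind that argument is easy to check. The short Python sketch below simply reproduces the figures quoted above (the 2014 FARS mileage and death counts, and Tesla’s one fatality in 130 million Autopilot miles); it is only an illustration of the comparison, not an independent analysis.

        # Reproduce the comparison using the figures quoted above (2014 FARS data
        # and Tesla's claimed Autopilot mileage). Illustrative only.
        US_MILES_2014 = 3.026e12        # total US vehicle miles traveled in 2014
        ALL_DEATHS_2014 = 32_675        # all traffic deaths, incl. motorcyclists, pedestrians, cyclists
        MOTORCYCLIST_DEATHS = 4_586
        PED_CYCLIST_DEATHS = 5_813
        OCCUPANT_DEATHS = ALL_DEATHS_2014 - MOTORCYCLIST_DEATHS - PED_CYCLIST_DEATHS  # 22,276

        AUTOPILOT_MILES = 130e6         # Autopilot miles at the time of the claim
        AUTOPILOT_DEATHS = 1

        print(f"All road users:     1 death per {US_MILES_2014 / ALL_DEATHS_2014 / 1e6:.1f} million miles")
        print(f"Vehicle occupants:  1 death per {US_MILES_2014 / OCCUPANT_DEATHS / 1e6:.1f} million miles")
        print(f"Tesla on Autopilot: 1 death per {AUTOPILOT_MILES / AUTOPILOT_DEATHS / 1e6:.1f} million miles")

    On those (very small-sample) numbers, the occupant benchmark works out to roughly 136 million miles per death, slightly better than Autopilot’s 130 million, which is exactly the point the author makes.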

  4. Tomi Engdahl says:

    Darrell Etherington / TechCrunch:
    NHTSA closes probe into June 2016 Tesla crash, clearing Autopilot of fault and praising its safety features, including a ~40% drop in crashes since introduction — The U.S. National Highway Traffic Safety Administration has released its full findings following the investigation …

    NHTSA’s full final investigation into Tesla’s Autopilot shows 40% crash rate reduction
    https://techcrunch.com/2017/01/19/nhtsas-full-final-investigation-into-teslas-autopilot-shows-40-crash-rate-reduction/

    The U.S. National Highway Traffic Safety Administration has released its full findings following the investigation into last year’s fatal crash involving a driver’s use of Tesla’s semi-autonomous Autopilot feature. The report clears Tesla’s Autopilot system of any fault in the incident, and in fact at multiple points within the report praises its design in terms of safety, and highlights its impact on lowering the number of traffic incidents involving Tesla vehicles overall.

    NHTSA notes that crash rates involving Tesla cars have dropped by almost 40 percent since the wide introduction of Autopilot.
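
    For context, that headline number is simply the relative drop in the airbag-deployment crash rate before and after Autosteer installation. The before/after rates used below (roughly 1.3 and 0.8 crashes per million miles) are the approximate figures commonly cited from NHTSA’s report; treat them as assumptions for this rough check.

        # Relative crash-rate reduction: airbag-deployment crashes per million miles
        # before vs. after Autosteer installation. The two rates are approximate
        # figures cited from the NHTSA report, used here only to show the calculation.
        rate_before = 1.3   # crashes per million miles, pre-Autosteer
        rate_after = 0.8    # crashes per million miles, post-Autosteer
        reduction = (rate_before - rate_after) / rate_before
        print(f"Crash rate reduction: {reduction:.0%}")  # ~38%, i.e. the "almost 40 percent"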

    It’s essentially as good a result as Tesla could have hoped for from the U.S. traffic safety agency’s investigation, which took place over the last six months. Reuters reported earlier on Thursday that the investigation would not result in a recall of Tesla vehicles, but the full findings show that, in fact, the federal regulatory body found plenty to praise while conducting its inquiry.

    The investigation does conclude with a minor admonition that Tesla could perhaps be more specific about the limitations of its driver-assist features.

    U.S. traffic safety agency to close Tesla Autopilot investigation without recall request
    https://techcrunch.com/2017/01/19/u-s-traffic-safety-agency-to-close-tesla-autopilot-investigation-without-recall-request/

    The U.S. National Highway Traffic Safety Administration (NHTSA) will close the investigation it began six months ago into a driver death that occurred while using Tesla’s Autopilot highway semi-autonomous driving feature, Reuters reports. The investigation did not find cause for a recall of Tesla vehicles with Autopilot, the report claims.

    U.S. regulator finds no evidence of defects after Tesla death probe
    http://www.reuters.com/article/us-tesla-safety-idUSKBN1532F8

    U.S. auto safety regulators said on Thursday they found no evidence of defects in a Tesla Motors Inc (TSLA.O) car involved in the death of a man whose Model S collided with a truck while he was using its Autopilot system.

    The case has been closely watched as automakers race to automate more driving tasks without exposing themselves to increased liability risks.

  5. Tomi Engdahl says:

    Sam Thielman / The Guardian:
    After accidents, Tesla has selectively released driver data logs to media that even drivers can’t access, raising concerns over privacy and incomplete storytelling

    The customer is always wrong: Tesla lets out self-driving car data – when it suits
    https://www.theguardian.com/technology/2017/apr/03/the-customer-is-always-wrong-tesla-lets-out-self-driving-car-data-when-it-suits

    The luxury car maker is quick to divulge data to suggest its technology was not responsible for crashes but refuses to let drivers themselves see the data logs

  6. Tomi Engdahl says:

    Bloomberg:
    NTSB’s preliminary report on fatal Tesla Model X crash in March: Autopilot was being used, and the car sped up and steered left seconds before impact

    Tesla Model X in California Crash Sped Up Prior to Impact
    https://www.bloomberg.com/news/articles/2018-06-07/tesla-model-x-in-california-crash-sped-up-seconds-before-impact

    The Tesla Inc. Model X that crashed in California earlier this year while being guided by its semi-autonomous driving system sped up to 71 miles an hour in the seconds before the vehicle slammed into a highway barrier, investigators said Thursday.

    A U.S. National Transportation Safety Board preliminary report on the March 23 accident in Mountain View raises new questions about the capabilities of Tesla’s semi-autonomous driving system and the actions of the driver. His hands were detected on the steering wheel for only 34 seconds during the last minute before impact, and he had programmed the car to drive at 75 mph, the report said.

    The investigation is the latest to shine a spotlight into potential flaws in emerging autonomous driving technology.

    Another NTSB probe of a self-driving Uber Technologies Inc. car that killed a pedestrian March 18 in Arizona found that the car’s sensors picked up the victim, but the vehicle wasn’t programmed to brake for obstructions.

  7. Tomi Engdahl says:

    Jordan Novet / CNBC:
    DMV report: Apple, which has 66 self-driving cars approved for road testing in CA, says one of the SUVs was rear-ended, its first self-driving collision in CA — Apple said that its first autonomous vehicle crash in California occurred last week. — The accident during …

    Apple reports first autonomous vehicle collision in California
    https://www.cnbc.com/2018/08/31/apple-reports-first-autonomous-vehicle-collision-in-california.html

    Apple now has 66 autonomous vehicles approved for testing in California.
    There were no injuries in the crash.

  8. Tomi Engdahl says:

    Researchers Trick Tesla to Drive into Oncoming Traffic
    https://www.bleepingcomputer.com/news/security/researchers-trick-tesla-to-drive-into-oncoming-traffic/

    Steering a Tesla car out of its normal driving lane, potentially onto a collision path, is possible without hacking the vehicle’s advanced driver-assistance system, better known as Enhanced Autopilot.

    By painting interference patches on the road, researchers demonstrated that a Tesla Model S 75 can follow a fake path without asking the driver for permission, as the Autopilot component does in the case of changing lanes.

    Road stickers allow “fake-lane” attack

    Researchers at Tencent’s Keen Security Lab examined how lane detection technology works when the Autosteer mode is active and experimented with the Autopilot hardware version 2.5 and firmware revision 2018.6.1.

    The function “detect_and_track” is responsible for this procedure; it also feeds data to CUDA kernels to correctly determine and interpret the markings on the road.
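
    To make the attack surface concrete, here is a deliberately naive lane-marking detector built with OpenCV. It is not Tesla’s “detect_and_track” pipeline or anything derived from it; it only illustrates why small, high-contrast patches painted on the pavement can produce the same edge responses as genuine lane paint.

        # Generic, naive lane-marking detector (OpenCV). NOT Tesla's pipeline;
        # purely an illustration of how painted patches can mimic lane paint.
        import cv2
        import numpy as np

        def detect_lane_candidates(frame_bgr):
            """Return line segments a naive detector would treat as lane markings."""
            gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
            blurred = cv2.GaussianBlur(gray, (5, 5), 0)
            edges = cv2.Canny(blurred, 50, 150)  # high-contrast edges, including stickers/patches
            lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=40,
                                    minLineLength=40, maxLineGap=20)
            return [] if lines is None else [tuple(seg[0]) for seg in lines]

    Because a purely vision-based detector keys on exactly this kind of edge evidence, a few well-placed stickers can be enough to pull the perceived lane toward the oncoming side, which is what the Keen Lab researchers demonstrated.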

  9. Tomi Engdahl says:

    Worn Out eMMC Chips Are Crippling Older Teslas
    https://hackaday.com/2019/10/17/worn-out-emmc-chips-are-crippling-older-teslas/

    Another advantage, at least in theory, is reduced overall maintenance cost. While a modern EV will of course be packed with sensors and complex onboard computer systems, the same could be said for nearly any internal combustion engine (ICE) car that rolled off the lot in the last decade. But mechanically, there’s a lot less that can go wrong on an EV.

    Unfortunately, it seems the rise of high-tech EVs is also ushering in a new era of unexpected failures and maintenance woes. Case in point: some owners of older-model Teslas are finding they’re at risk of being stranded on the side of the road by a failure most of us would more likely associate with losing some documents or photos: a disk read error.
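
    A rough back-of-the-envelope sketch of why constant logging eventually wears out a small eMMC chip is below. Every number in it (chip size, endurance, write rate) is an assumption chosen only for illustration, not a measured Tesla figure, and real-world write amplification and uneven wear shorten the lifetime considerably.

        # Back-of-the-envelope flash-wear estimate. All numbers are illustrative
        # assumptions, not Tesla specifications or measured logging rates.
        EMMC_CAPACITY_GB = 8          # assumed chip size
        PE_CYCLES = 3000              # assumed program/erase endurance per cell
        LOG_WRITE_MB_PER_HOUR = 150   # assumed sustained log write rate

        total_writable_mb = EMMC_CAPACITY_GB * 1024 * PE_CYCLES   # ideal, perfectly wear-leveled
        hours = total_writable_mb / LOG_WRITE_MB_PER_HOUR
        print(f"~{hours / (24 * 365):.0f} years of continuous logging before wear-out (ideal case)")
        # Write amplification and uneven wear can easily cut this by several times,
        # which is how always-on logging can exhaust the chip within a car's lifetime.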


  10. Tomi Engdahl says:

    Nearly 4,000 Tesla Cybertrucks are being recalled for flimsy pedals at risk of causing unintentional acceleration. https://trib.al/3jWpJBy

