
Tuesday, January 23, 2018

Federal Officials Investigating Crash Involving Tesla on Autopilot in Culver City

Federal investigators are looking into a crash in Culver City after the driver of a Tesla involved in the collision said the car was on Autopilot at the time, the National Transportation Safety Board tweeted Tuesday.

The luxury car slammed into the back of a fire engine while traveling at 65 mph on Monday, Culver City firefighters tweeted. No injuries were reported, but a day later, the collision had drawn the attention of federal investigators.

Firefighters said the Tesla driver claimed the car was on Autopilot — a feature that gives “full self-driving capability at a safety level substantially greater than that of a human driver,” as the company explains on its website.

Other crashes involving Tesla’s self-driving feature have been blamed on the “over reliance” drivers develop on the Autopilot system, the National Transportation Safety Board said in September following an investigation into a fatal Florida collision. The board advised Tesla to make a number of changes to the Autopilot system to make it safer, and the carmaker made some of them.

The company currently classifies Autopilot as a “driver assistance system,” meaning it still relies on the human driver. In a statement to KTLA, the company said: “Autopilot is intended for use only with a fully attentive driver.”

After the fatal crash in Florida, the National Transportation Safety Board said Tesla’s Autopilot system was either not designed to identify, or simply unable to identify, the truck crossing its path that the car ultimately struck.

“Therefore, the system did not slow the car, the forward collision warning system did not provide an alert, and the automatic emergency braking did not activate,” the safety board said of the moments leading up to the fatal crash.

Notably, the safety board also cited the Tesla driver’s over reliance on the Autopilot system as a “probable cause” of the collision. But that reliance was a result of Autopilot’s design, which allowed the driver to disengage from “the driving task” for prolonged periods of time, the board said.


The self-driving hardware and software inside the Tesla Model S include an onboard computer and a set of eight cameras that are supposed to provide 360-degree visibility around the car at a range of up to 820 feet, the company website states. There are also “ultrasonic” sensors all around the vehicle to detect hard and soft objects.

Along with calendar integration and other equipment, all of this creates a driving experience that can at least feel entirely dependent upon the vehicle.

“All you will need to do is get in and tell your car where to go,” the company states. “If you don’t say anything, the car will look at your calendar and take you there as the assumed destination or just home if nothing is on the calendar.”

In the Tesla Model S owner’s manual, however, the company warns drivers of issues that may arise with Autopilot, such as the car changing lanes or suddenly accelerating when the driver puts on a turn signal.

Still, following the fatal Florida crash, national transportation investigators said the design of Autopilot “enables” drivers to ignore such warnings. Tesla made some changes after that investigation, the board said, including reducing the amount of time before the Autopilot system alerts the driver when his or her hands are off the steering wheel.

The transportation safety board also issued safety recommendations to the U.S. Department of Transportation, National Highway Traffic Safety Administration and two other manufacturers of Level 2 vehicle automation systems.

“While automation in highway transportation has the potential to save tens of thousands of lives, until that potential is fully realized, people still need to safely drive their vehicles,” said the board’s chairman, Robert L. Sumwalt III, in a statement.

Jennifer Thang contributed to this story.



from KTLA http://ift.tt/2n84V4n
