WASHINGTON (WLS) -- There are new demands for Tesla to update its Autopilot feature as federal investigators examine a series of crashes.
Consumer Reports called on the automaker to disable the feature until it is updated to ensure that the driver's hands stay on the steering wheel at all times, saying Autopilot gives drivers a false sense of security.
Feds examine how Tesla Autopilot reacts to crossing traffic
Federal investigators looking into electric car maker Tesla Motors' Autopilot system after a fatal crash in Florida are zeroing in on the limitations of the system and how it reacts when obstacles cross its path.
The National Highway Traffic Safety Administration on Tuesday posted a nine-page letter seeking information from Tesla about Autopilot and why it failed to detect a tractor-trailer that crossed in front of a Model S sedan May 7 in Williston, Florida.
Much of the letter seeks information on how the system works at intersections with crossing traffic, but it also asks Tesla to describe how the system detects "compromised or degraded" signals from cameras and other sensors and how such problems are communicated to drivers.
The crash in Williston killed former Navy SEAL Joshua Brown, 40, of Canton, Ohio. Tesla, which collects data from its cars via the Internet, says the cameras on Brown's Model S sedan failed to distinguish the white side of a turning tractor-trailer from a brightly lit sky, and the car didn't automatically brake.
The safety agency also asked Tesla for its reconstruction of the Brown crash, and for details of all known crashes, consumer complaints and lawsuits filed or settled because the Autopilot system didn't brake as expected.
NHTSA said Tesla must comply with its request by Aug. 26 or face penalties of up to $21,000 per day, to a maximum of $105 million.
NHTSA spokesman Bryan Thomas said the agency hasn't determined whether a safety defect exists with Autopilot, and that the information request is a routine step in an investigation.
Tesla's Autopilot system uses cameras, radar and computers to detect objects and automatically brake if the car is about to hit something. It also can steer the car to keep it centered in its lane. The company says that before Autopilot can be used, drivers must acknowledge that it's an "assist feature" that requires both hands on the wheel at all times. Drivers also must be prepared to take over at any time, Tesla has said.
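Tesla has not published Autopilot's internals, but the behavior described above amounts to a familiar control loop: fuse detections from multiple sensors, brake if a collision looks imminent, otherwise steer toward the lane center. A minimal sketch of that kind of loop follows; every name, structure and threshold here is hypothetical, not Tesla's actual code.

```python
# Illustrative sketch only -- not Tesla's code. All names, thresholds and
# structures are hypothetical, based on the article's description of
# Autopilot: fuse camera/radar detections, brake for imminent obstacles,
# and steer to keep the car centered in its lane.
from dataclasses import dataclass

@dataclass
class Detection:
    distance_m: float          # distance to the object ahead
    closing_speed_mps: float   # how fast the car is approaching it

def time_to_collision(d: Detection) -> float:
    """Seconds until impact if nothing changes; infinite if not closing."""
    if d.closing_speed_mps <= 0:
        return float("inf")
    return d.distance_m / d.closing_speed_mps

def control_step(camera: Detection | None, radar: Detection | None,
                 lane_offset_m: float) -> dict:
    """One control cycle: brake if either sensor reports an imminent
    obstacle, otherwise apply a proportional correction toward lane center."""
    BRAKE_TTC_S = 2.0   # hypothetical time-to-collision threshold
    STEER_GAIN = 0.5    # hypothetical proportional steering gain

    obstacles = [d for d in (camera, radar) if d is not None]
    if any(time_to_collision(d) < BRAKE_TTC_S for d in obstacles):
        return {"brake": 1.0, "steer": 0.0}

    # Negative offset = left of center, so steer right, and vice versa.
    return {"brake": 0.0, "steer": -STEER_GAIN * lane_offset_m}

# Example: the camera misses a crossing trailer but radar sees it 30 m out,
# closing at 20 m/s -- 1.5 s to impact, so this cycle commands full braking.
print(control_step(None, Detection(30.0, 20.0), lane_offset_m=0.3))
```

The Florida crash illustrates why the fusion step matters: if neither the camera nor the radar registers an obstacle, a loop like this never reaches the braking branch.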
Tesla released Autopilot last fall. The company says the system is still in a "beta" phase, a computer industry term for software testing by customers, and some safety advocates have questioned whether Tesla and NHTSA allowed the public access to the system too soon.
"No safety-significant system should ever use consumers as test drivers on the highways," said Clarence Ditlow, head of the nonprofit Center for Automotive Safety. He said NHTSA lacks the electronic engineers and laboratories needed to keep up with advanced technology such as General Motors air bags or Tesla's Autopilot.
Tesla says customers have safely driven more than 100 million miles with Autopilot engaged, and that its data show drivers who use the system are safer than those who don't.
NHTSA's Thomas said he won't comment on specifics of the investigation. The agency does not currently have legal authority to prevent automakers from rolling out features if they meet basic federal motor vehicle safety standards. It is in the process of developing standards for self-driving cars.
The NHTSA letter came as Tesla disclosed that a second crash occurred while at least part of the Autopilot system was operating.
A driver who was heading from Seattle to Yellowstone National Park told a state trooper that his Tesla Model X SUV was on Autopilot when it crashed early Saturday on a rural two-lane road in Montana, the state's Highway Patrol said.
But Tesla said the driver activated Autosteer, one of the Autopilot features, and no force was detected on the steering wheel for more than two minutes. If no force is detected on the wheel, or if a sharp turn is detected, the vehicle is programmed to gradually reduce speed, stop and turn on the emergency lights, Tesla said in a statement.
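Tesla's statement describes a simple escalation policy: warn the driver when no hands are detected on the wheel, then slow the car to a stop with the hazard lights on if the warnings go unheeded. A minimal sketch of that kind of state machine follows; the function name and timings are assumptions for illustration, not Tesla's published parameters.

```python
# Illustrative sketch of the escalation policy Tesla's statement describes:
# alert on hands-off driving, then gradually slow to a stop with hazard
# lights if the alerts are ignored. All names and timings are assumed.

WARN_AFTER_S = 15.0    # hypothetical: first alert after this long hands-off
STOP_AFTER_S = 120.0   # per the article: no wheel force for over two minutes

def escalation_state(hands_off_s: float, sharp_turn_detected: bool) -> str:
    """Map continuous hands-off time (or a sharp turn the driver should be
    handling) to an action: drive normally, alert, or slow to a stop."""
    if sharp_turn_detected or hands_off_s >= STOP_AFTER_S:
        return "reduce_speed_to_stop_and_flash_hazards"
    if hands_off_s >= WARN_AFTER_S:
        return "alert_driver_hands_on_wheel"
    return "normal_autosteer"

# The Montana sequence as reported: normal driving, then alerts, then more
# than two minutes hands-off before the crash.
for t in (10.0, 30.0, 125.0):
    print(t, escalation_state(t, sharp_turn_detected=False))
```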
The company said the Model X alerted the driver to put his hands on the wheel, but he didn't do it. "As road conditions became increasingly uncertain, the vehicle again alerted the driver to put his hands on the wheel. He did not do so and shortly thereafter the vehicle collided with a post on the edge of the roadway," the statement said.
It wasn't clear whether the Model X had begun that slowdown sequence at the time of the crash.
Neither the driver nor the passenger was injured in the single-vehicle crash, but there was extensive damage to the passenger side and the car lost a wheel, Montana Highway Patrol trooper Jade Shope said.
The car negotiated a right curve, went off the road and traveled about 200 feet along the narrow shoulder, taking out 13 posts, Shope said.
The trooper did not cite the driver, saying he believed any citation would be voided because of the driver's claim that the car was on Autopilot.
The NHTSA investigation, opened June 28, could have broad implications for the auto industry and its path toward self-driving cars. If the probe finds defects in Tesla's system, the agency could seek a recall. Other automakers have developed or are developing similar systems that may need to be changed as a result, and the probe could also affect self-driving car regulations due to be unveiled this summer.
In the letter, NHTSA also asked Tesla for details on any modifications it has made to the Autopilot system.
The Associated Press contributed to this report.