Did Tesla Flunk ‘Interpretation’ Test?

Published: 2018-01-29
Author: Ameya360
Source: Junko Yoshida

  Earlier this week, a Tesla operating on Autopilot at 65 mph crashed into a stalled fire truck on the highway.

  Ouch.

  Luckily, no humans got hurt.

  “The fire truck had been parked in the left emergency lane and carpool lane, blocking off the scene of a previous accident, with a California Highway Patrol vehicle behind it and to the side,” according to the Mercury News.

  The U.S. National Transportation Safety Board (NTSB) has opened an investigation into the matter.

  The safety board sent two investigators to Culver City on Tuesday, while NHTSA confirmed Wednesday that it is also dispatching a special team "to investigate the crash and assess lessons learned." The driver's reported use of Autopilot (which remains unconfirmed by a third party) appears to have gotten the federal agencies interested in the case.

  Details remain sketchy and there are a lot of unanswered questions, since Tesla and authorities have gone mum. Nonetheless, Tesla holds the key to data that could reveal what exactly happened. “Tesla can give NTSB a ton of information because of the black box recording they do with those vehicles,” said Phil Magney, founder and principal advisor for VSI Labs.

  It’s known that each Tesla has an SD card that stores all data onboard. Tesla can also access this information by connecting wirelessly with the car, unless the media unit or onboard LTE is damaged. “Even if you do retrieve the SD card,” Magney said, “it will not give you the information the NTSB needs. Only Tesla can provide this, as it is highly encrypted.”

  We asked automotive experts to weigh in on the questions the crash raises, especially those related to Tesla’s advanced driver assistance system (ADAS).

  Let’s start with what we don’t know for sure.

  Autopilot

  Magney told us, “We don’t actually know if Autopilot (AP) was active or not. The driver might be blaming the accident on AP like that Minnesota case from several months ago.” In a Minnesota crash last July, the human driver initially blamed Tesla’s Autopilot, but later recanted.

  Traffic-Aware Cruise Control

  Mike Demler, senior analyst at The Linley Group, reminded us that “Autopilot” is actually a package of features, including Traffic-Aware Cruise Control, Autosteer, and Auto Lane Change. "If the driver had adaptive cruise control on, I believe that a well-designed system should have detected the stoppage (in the high-speed lane, apparently), but Tesla’s owner's manual warns that it may not."

  Tesla's owner's manual states:

  “Warning: Traffic-Aware Cruise Control can not detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object, bicycle, or pedestrian is in front of you instead.”

  Assuming adaptive cruise control was on, the system apparently gives priority to the radar rather than the cameras to detect objects, Demler theorized. "That would be a software and sensor-fusion problem," he said.

  Did AEB save the driver?

  Magney wonders whether Automatic Emergency Braking (AEB) was activated. He told us, “In Tesla cars, the driver can deactivate this function!” Given that the car was traveling at 65 mph, this was a big hit on a very large fire truck, and the impact could have severely injured the driver. “It is possible AEB mitigated the crash a bit, and this is how the driver could survive,” Magney hypothesized.

  Did sensors work?

  Magney said, “If this was a late model S it would have a more sophisticated set of sensors, mainly camera and radar.” He noted, “The radar should have picked this up since it was in the path of the vehicle.”

  There is one caveat, though. Magney noted, “Radars have a lot of filtering for stationary objects, because if they did not, you would get lots of false positives from parked vehicles on the sides of the road, signs, etc.”
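  The stationary-object filtering Magney describes can be sketched in a few lines of Python. This is an illustrative toy, not any vendor's actual code; the threshold, function names, and the (range, relative speed) target format are all assumptions.

```python
# Illustrative sketch of why an automotive radar tracker commonly
# discards stationary returns. All names and thresholds are hypothetical.

STATIONARY_THRESHOLD_MPS = 2.0  # below this ground speed, a target looks "parked"

def filter_radar_targets(targets, ego_speed_mps):
    """Keep only targets that appear to be moving.

    Each target is (range_m, relative_speed_mps), where relative speed
    is the Doppler range rate reported by the radar (negative = closing).
    """
    moving = []
    for range_m, rel_speed in targets:
        # Ground speed of the target = ego speed + relative speed.
        # A vehicle stopped ahead closes at -ego_speed, so this is ~0.
        ground_speed = ego_speed_mps + rel_speed
        if abs(ground_speed) > STATIONARY_THRESHOLD_MPS:
            moving.append((range_m, rel_speed))
    return moving

ego = 29.0  # ~65 mph in m/s
targets = [
    (80.0, -29.0),  # stalled fire truck: closing at full ego speed
    (60.0, -5.0),   # slower car ahead, still moving
]
print(filter_radar_targets(targets, ego))  # the stalled truck is filtered out
```

  The same filter that suppresses false positives from roadside signs and parked cars also suppresses a stalled truck sitting squarely in the lane.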

  The cameras should also have picked this up and classified it as a vehicle.

  But there is another wrinkle. “It is possible that the orientation of the fire truck was such that the camera did not know it was a large truck,” Magney theorized. “Or possibly the camera picked up the large truck but did not realize it was in the path of the vehicle, a similar situation to the one with the radar.”

  Demler similarly sees potential problems here. Speaking of the radar detecting the stopped fire truck, he said this case reminds him of the Florida fatality. "The software only uses the radar to detect changes in speed. In Florida, the trailer obstruction was moving across the field of view, so Tesla’s software thought it was a stationary overhead road sign and ignored it. If the driver in Culver City wasn’t paying attention, and cars moved out of the lane to go around the fire truck, the Tesla may have initially sped up!" Demler again quoted Tesla's owner's manual, which says:

  “When the vehicle you are following is no longer detected, Traffic-Aware Cruise Control accelerates back to the set speed.”
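  The resume behavior the manual describes, combined with the stationary filtering above, produces the worst-case sequence Demler outlines. A minimal sketch, assuming a hypothetical controller interface (none of these names are Tesla's):

```python
# Hypothetical sketch of the TACC behavior described in the owner's
# manual: once the followed vehicle is "no longer detected", cruise
# control accelerates back toward the set speed, even if a stationary
# obstacle (filtered out by the radar) now sits ahead.

def tacc_target_speed(set_speed, lead_vehicle_speed):
    """Return the speed the cruise controller aims for.

    lead_vehicle_speed is None when no moving lead is detected,
    which is exactly the state after surrounding cars swerve around
    a stopped truck that the radar has dismissed as stationary.
    """
    if lead_vehicle_speed is None:
        return set_speed  # resume: accelerate back to the set speed
    return min(set_speed, lead_vehicle_speed)  # follow the lead car

# Following traffic at 20 m/s with cruise set to 29 m/s (~65 mph):
print(tacc_target_speed(29.0, 20.0))  # 20.0, following the lead car
# Lead car changes lanes to avoid the fire truck; no lead detected:
print(tacc_target_speed(29.0, None))  # 29.0, speeding up toward the truck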

  How far did the radar see?

  Asked whose radar unit was in the Tesla vehicle, Magney said, “We don’t know if this is a Bosch or Continental radar unit, as Tesla recently switched to Conti. If this was a late model (2017 or 2018), it would be the Conti unit. Prior models would have the Bosch.” He added, “These are mid-range radars with ranges of 160 meters. According to some reports, the Conti unit they switched to has longer range, and this is the reason for the switch.”

  Magney believes both camera and radar should have been working with the AP engaged. They both likely saw the truck, he noted.

  But here’s the rub. “The AP motion planner did not think the object was in the path of the vehicle. The radar might have filtered it out since it was stationary while the camera may have failed to classify what it was seeing,” Magney said.

  The left hand, in sum, did not know what the right hand was doing.
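  The failure mode Magney describes can be sketched as a fusion gate: the planner only reacts to objects that at least one sensor has confirmed. This is an illustrative guess at the logic, not Tesla's actual design; all names are hypothetical.

```python
# Hypothetical sketch of the fusion failure Magney hypothesizes: the
# radar filters the truck out as stationary clutter, the camera fails
# to classify it, and a planner that requires a confirmed in-path
# object never commands braking.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    in_path: bool              # does the motion planner see it in our path?
    radar_moving: bool         # did it survive the radar's stationary filter?
    camera_class: Optional[str]  # e.g. "vehicle", or None if unclassified

def should_brake(detections):
    for d in detections:
        # React only to objects confirmed by either sensor AND in-path.
        confirmed = d.radar_moving or d.camera_class == "vehicle"
        if confirmed and d.in_path:
            return True
    return False

# The stalled fire truck: in path, but unconfirmed by both sensors.
fire_truck = Detection(in_path=True, radar_moving=False, camera_class=None)
print(should_brake([fire_truck]))  # False: effectively invisible to the planner
```

  Each sensor behaves "correctly" by its own rules, yet the combined system drives into the obstacle; that is the left-hand/right-hand problem in miniature.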

  What about microphones?

  Demler asked about microphones. "With all the talk about cameras, lidars, and radars, people often ignore another crucial sensor for autonomous vehicles — microphones!"  Demler believes that Google has used them in its test vehicles for a long time, to detect sirens. Maybe the firetruck didn’t have its sirens on, just its flashing lights. "But the cameras should have easily detected that," said Demler.

  Extra layers of safety, reliability?

  One lesson from the accident is that even if a vehicle's onboard sensors have good perception, the system must still grasp and interpret the situation, Hagai Zyss, CEO of the Israeli company Autotalks, told EE Times. Stationary objects are easily misinterpreted. Zyss said, “But if a stationary vehicle sends an alert to other vehicles from a mile away, it’s hard to misinterpret the communication, because the vehicle itself is reporting.” In short, DSRC-based vehicle-to-vehicle (V2V) communication technology could have helped.

  Of course, V2V and vehicle-to-infrastructure (V2I) communication require a network effect: society needs to agree on the technology, and the technology must penetrate the market before people can see its effectiveness. But a self-contained solution that depends solely on sensors inside the car has obvious limitations, Zyss told us. V2V and V2I provide an extra layer of safety. “You don’t need to wait for fully automated vehicles to avoid accidents. You can implement V2V today,” Zyss stressed.
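  Zyss's point is that a V2V alert needs no interpretation: the stalled vehicle states its own condition. A toy sketch of the receiving side, with message fields only loosely modeled on a DSRC safety message (the field names, radius, and coordinates here are all invented for illustration):

```python
# Hedged sketch of the V2V idea: a stalled vehicle broadcasts an
# explicit "I am stopped here" message, and approaching cars warn
# their drivers when the alert is within roughly a mile.

import math

def make_stopped_vehicle_alert(lat, lon, heading_deg):
    # Fields loosely mirror a DSRC Basic Safety Message; illustrative only.
    return {"type": "STOPPED_VEHICLE", "lat": lat, "lon": lon,
            "heading_deg": heading_deg, "speed_mps": 0.0}

def distance_m(lat1, lon1, lat2, lon2):
    # Flat-earth approximation, adequate over a mile or so.
    dy = (lat2 - lat1) * 111_320.0
    dx = (lon2 - lon1) * 111_320.0 * math.cos(math.radians(lat1))
    return math.hypot(dx, dy)

def should_warn(alert, my_lat, my_lon, warn_radius_m=1609.0):
    """Warn the driver if a stopped-vehicle alert is within ~1 mile."""
    d = distance_m(alert["lat"], alert["lon"], my_lat, my_lon)
    return alert["type"] == "STOPPED_VEHICLE" and d <= warn_radius_m

truck = make_stopped_vehicle_alert(34.0211, -118.3965, 90.0)  # example coords
print(should_warn(truck, 34.0211, -118.3850))  # approaching car ~1 km away
```

  The key contrast with the sensor-fusion sketch above is that there is nothing to classify or filter: the message is unambiguous by construction.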

  Meanwhile, Magney pointed out, “This [Tesla] case makes a strong case for L2+ (as Mobileye calls it).”

  To provide common terminology for automated driving, the SAE defines six levels of driving automation, from “no automation” (Level 0) to “full automation” (Level 5).

  During the Consumer Electronics Show earlier this month, Mobileye coined a new level, “Level 2+,” explained as a move toward applying “localization assets” to L2 applications.

  Magney explained, “Level 2 as we know it today is little more than adaptive cruise control coupled with active lane keeping. But relying on lane lines and a lead vehicle eventually proves problematic when the lane lines become hard to detect or the lead car goes away.”

  He said, “L2 systems would be safer with a localization method against a base map to better understand trajectories.” GM has done this with its Super Cruise.

  “The Tesla system currently does this only with its camera and this is not good enough in my opinion,” he said. “If you apply an HD Map and localize against it you will give the car a lot more intelligence and avoid these collisions when a very large object happens to be in the path of a vehicle.”

