Tesla Says Driver at Fault in Fatal Model X Crash

Kirsten Korosec 2018-04-15
Tesla says the only way the accident could have occurred is if the driver was not paying attention to the road, despite the car providing multiple warnings.


Tesla says a Model X owner who died in a crash last month near San Francisco is at fault, not the semi-autonomous Autopilot system that was engaged when the SUV slammed into a highway divider.

The electric carmaker issued the statement blaming the driver, Walter Huang, after his family hired a law firm to explore their legal options. Huang, 38, died when his 2017 Model X drove into the unprotected edge of a concrete highway median that was missing its crash guard.

“According to the family, Mr. Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location,” Tesla said in an emailed statement. “The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so.”

This is the third public statement by Tesla since the March 23 crash—and its strongest language yet in an effort to distance itself from the fatality. Tesla previously said Huang’s hands were not on the steering wheel for six seconds prior to the collision.

Tesla’s complete statement:

We are very sorry for the family’s loss.

According to the family, Mr. Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location. The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so.

The fundamental premise of both moral and legal liability is a broken promise, and there was none here. Tesla is extremely clear that Autopilot requires the driver to be alert and have hands on the wheel. This reminder is made every single time Autopilot is engaged. If the system detects that hands are not on, it provides visual and auditory alerts. This happened several times on Mr. Huang’s drive that day.

We empathize with Mr. Huang’s family, who are understandably facing loss and grief, but the false impression that Autopilot is unsafe will cause harm to others on the road. NHTSA found that even the early version of Tesla Autopilot resulted in 40% fewer crashes and it has improved substantially since then. The reason that other families are not on TV is because their loved ones are still alive.

A Tesla spokesman would not provide additional information about the crash, including how many times the vehicle alerted Huang during the drive or how many warnings are typically issued before the Autopilot system disengages. Autopilot issues visual and audio alerts if the driver’s hands leave the steering wheel for a period of time, and the system is supposed to disengage after several warnings.

The law firm representing the Huang family said that a preliminary review has uncovered complaints by other Tesla drivers of navigational errors by the Autopilot feature. “The firm believes Tesla’s Autopilot feature is defective and likely caused Huang’s death, despite Tesla’s apparent attempt to blame the victim of this terrible tragedy,” according to the law firm, Minami Tamaki.

Autopilot is marketed by Tesla as a semi-autonomous driving system, not a fully autonomous one that handles all aspects of driving in certain conditions with no expectation of driver involvement. Autopilot includes several features, such as automatic steering that helps drivers stay within a lane, adaptive cruise control that maintains the car’s speed relative to surrounding traffic, and an auto lane change feature that moves the vehicle into an adjacent lane when the turn signal is activated and it is safe to do so.

Tesla’s system does give warnings to remind drivers to keep their hands on the wheel when Autopilot is activated. However, not all drivers heed those warnings, and Tesla has been criticized for failing to protect against misuse of the system. A fatal May 2016 crash in Florida was caused in part by the driver’s overreliance on Autopilot, according to a National Transportation Safety Board investigation.

At the time, the NTSB said that “operational limits,” such as Tesla being unable to ensure that drivers pay attention when the car is traveling at high speed, played a major role in the 2016 fatal crash. The agency recommended that Tesla and other automakers take steps to ensure that semi-autonomous systems are not misused. For instance, GM’s semi-autonomous Super Cruise system, an option on the Cadillac CT6, uses a camera system to verify that the driver is watching the road ahead; if not, the system disengages.

The National Highway Traffic Safety Administration, which also investigated the 2016 crash, found no defects with Autopilot.
