This technology will bring big changes to driving
We can look at a friend and tell from his or her face that this person is tired or preoccupied, or perhaps drunk. Soon, our cars will be able to do the same.

Advances in facial recognition technology mean machines can recognize not only different people but also how they are feeling. This means the next generation of automobiles may contain features that scan drivers' faces for fatigue or other signs of impairment.

Companies, including Boston-based Affectiva, are already making software to help the auto industry integrate such technology. According to CEO Rana el Kaliouby, software that reads emotion is focused not only on drivers but on passengers, too.

This means automakers may come to build vehicles that adjust comfort factors like heating, lighting, and entertainment based on visual cues from their individual occupants, features that could be especially appealing as more autonomous cars hit the road.

"It's really important [that] technology not only have IQ, but lots of EQ too," said el Kaliouby, speaking on the morning of June 11 at Fortune's CEO Initiative in New York.

She added that building empathy into machines is especially important given that humans use words for only 7% of their communication. The other 93%, el Kaliouby said, consists of vocal intonation, expression, and body language.

While the auto industry appears well suited to emotion-reading software, it is just one business where facial recognition could have a big impact. According to el Kaliouby, the personal care industry could also benefit from the technology. She described a scenario in which a nurse works in tandem with a team of empathetic robots to take care of patients.

Such a scenario, however, may also exacerbate fears of machines replacing humans, especially since the ability to understand emotions is one of the traits that differentiates us from robots. But el Kaliouby said this should not be a concern.

"It's not a competition between humans and machines. It's more like a partnership," she said, adding that people will always be the ones in charge of the machines.

el Kaliouby also addressed the risk of facial recognition makers perpetuating bias by training their algorithms on a narrow segment of society. She said Affectiva takes pains to avoid this by ensuring its databases are diverse in terms of gender, ethnicity, and age.