Tesla hits Musk’s threshold for ‘safe unsupervised’ driving

We’ve crossed yet another one of Elon Musk’s self-driving thresholds. Tesla’s fleet of vehicles using the company’s Full Self-Driving (Supervised) system has driven over 10 billion miles, according to the company’s updated safety page. That means the company has crossed the line Musk set earlier this year for “safe unsupervised” driving.

But Tesla owners did not suddenly wake up today to find their FSD (Supervised) vehicles transformed into FSD (Unsupervised) ones. FSD is still just a Level 2 system that requires a fully attentive human driver behind the wheel to monitor the road and be prepared to take over at any moment.

In January, Musk said on X that “roughly 10 billion miles of training data is needed to achieve safe unsupervised self-driving” — the implication being that once Tesla reached that milestone, the company would flip the switch and all its customers would suddenly have access to an unsupervised driving system.

Of course, that would have been an enormously risky move by Tesla, especially when there are still so many questions about the company’s willingness to accept legal responsibility for over a million vehicles with FSD. When a Waymo vehicle is responsible for a crash, Waymo assumes liability because it owns the tech and the fleet. But Tesla’s terms of service put the liability on the owner, based mostly on its characterization of FSD as a Level 2 supervised system. What happens when FSD goes unsupervised? Who assumes responsibility for a crash then?

It’s not clear that Tesla has figured that out yet. Over the years, there have been hundreds of crashes involving Tesla’s partially autonomous features and dozens of fatalities. But the company has been able to avoid liability, either by settling with victims or convincing courts to dismiss the lawsuits. On its website, Tesla maintains that FSD (Supervised) “requires active driver supervision and does not make the vehicle autonomous.”

A federal jury in Florida last year found Tesla partly liable for a deadly 2019 crash involving the company’s Autopilot driver assist software, and ordered the company to pay the victims’ families $243 million. Tesla appealed the ruling, but a judge rejected that effort.

Still, it’s worth acknowledging the incredible accomplishment of 10 billion miles driven in FSD (Supervised). Tesla claims that its FSD-equipped vehicles drive 5.5 million miles on average before a major collision, as compared to 660,000 miles for the average US driver. Tesla touts this as evidence that FSD is safer than human driving.

Experts have long questioned Tesla’s methodology. Studies have shown that the company’s safety reports fail to account for basic facts of traffic statistics, such as that crashes are more common on city streets and undivided roads than on highways, where Autopilot is most often used. Some researchers believe that Tesla may be miscounting crashes in order to make Autopilot and FSD seem safer than they actually are.

Unsupervised driving may still be elusive to Tesla’s customers, but the company is ramping up its use of unsupervised vehicles in its robotaxi fleet. After launching in Dallas and Houston with just a pair of vehicles, Tesla has since added more vehicles to its fleet. Dallas now has five unsupervised robotaxis, while Houston has six, according to the Robotaxi Tracker. Austin, where Tesla first launched its robotaxi service, now has 29 supervised vehicles (employees in the front passenger seat) and 22 unsupervised ones.

Naturally, this makes many Tesla owners feel like they are tantalizingly close to getting access to unsupervised driving. But questions around liability are likely to continue to delay their access. In an earnings call last month, Musk said that unsupervised driving was coming when “it is legal to do so.” Asked specifically about unsupervised FSD in customer cars, he predicted it would arrive in the fourth quarter of the year.

Another threshold, or another goalpost that will inevitably move?