Tesla And Owners Get Reckless Again With New Automated Driving Feature

Let me be absolutely clear: any manufacturer of any product that pushes out a safety-critical feature that is not fully developed, tested and validated is guilty of negligence. This applies to Boeing and the Maneuvering Characteristics Augmentation System (MCAS) on the 737 Max, and to Tesla with the roll-out of features for its automated driving system. The latest of these is the supposed ability to detect and respond to traffic signals and signs.

While Tesla calls the package it now charges $7,000 for “Full Self-Driving,” it is in fact anything but. Beyond the preceding sentence, I will no longer use Tesla’s branding, because even the manufacturer acknowledges that the human driver remains fully responsible and must keep their eyes on the road and hands on the wheel while using the system. Any system that mandates human supervision is neither autonomous nor self-driving. It is, at best, partially automated.

Over the past year or so, Tesla has been sending out over-the-air software updates that add some of the features that are eventually supposed to comprise a highly automated driving (AD) system, but every one of these updates has been very problematic. The so-called Smart Summon in fall 2019 was particularly bad. Countless videos are available online of Teslas stumbling around parking lots like drunken sailors. 

The latest example, which just came out this week, is traffic signal recognition. One of the key reasons Autopilot has always been intended only for use on divided highways is its inability to respond to traffic signals or signage. Tesla drivers engaging Autopilot in urban areas would simply sail right through red lights.

Of all the challenges that must be overcome to build a highly automated driving system, detecting traffic signals would seem like a relatively straightforward one. Within a given region, they tend to be fairly well standardized and they rarely change location. Yet developers of AD systems go to great lengths to ensure that they can detect these signals reliably.

One of the reasons almost every company developing AD other than Tesla uses high-definition maps is so the vehicle knows exactly where to look for these signals. If an automated vehicle knows where the signals and signs are, the software doesn’t have to scan the entire scene to find them, reducing processing requirements. Even so, signals and signs can be obscured by other vehicles, trees or other obstructions.
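To make the map-lookup idea concrete, here is a minimal Python sketch of the approach, assuming a hypothetical map record type and a camera projection function supplied by the perception stack; none of the names below come from any real AD system.

```python
# Hypothetical sketch of HD-map-guided signal detection; SignalRecord,
# project_to_image, etc. are illustrative, not from any real AD stack.
from dataclasses import dataclass

@dataclass
class SignalRecord:
    signal_id: str
    position: tuple  # map-frame (x, y, z) in meters

def signals_near(map_db, vehicle_xy, radius_m=100.0):
    """Return mapped signals within radius_m of the vehicle's map position."""
    vx, vy = vehicle_xy
    return [s for s in map_db
            if (s.position[0] - vx) ** 2 + (s.position[1] - vy) ** 2
               <= radius_m ** 2]

def detection_rois(signals, project_to_image, half_px=64):
    """Project each expected signal into the camera image and return small
    crops for the classifier instead of scanning the full frame."""
    rois = []
    for s in signals:
        uv = project_to_image(s.position)  # uses camera intrinsics/extrinsics
        if uv is not None:                 # signal may be out of view
            u, v = uv
            rois.append((s.signal_id, (u - half_px, v - half_px,
                                       u + half_px, v + half_px)))
    return rois
```

The point of the design is simply that the detector runs on a handful of small crops where signals are expected, rather than on the entire frame.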

One solution to this is vehicle-to-infrastructure (V2I) communications. Many recent Audi vehicles have V2I communications that get information about traffic signal state and timing from the local control system, and alerts about signal timing are provided to the driver in the instrument cluster. AD micro-transit developer May Mobility actually installs roadside units with cameras on utility poles, where they aren’t blocked by vehicles or affected by challenging lighting conditions, and transmits the information to its vehicles.
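To illustrate what such a V2I feed might look like on the vehicle side, here is a simplified sketch of consuming a signal phase and timing (SPaT) style message. The JSON field names are invented for this example and do not reflect the actual SAE J2735 message schema.

```python
# Simplified, invented SPaT-style message handling; not the real J2735 format.
import json
from enum import Enum

class Phase(Enum):
    RED = "red"
    YELLOW = "yellow"
    GREEN = "green"

def time_to_change(spat_msg, approach_lane):
    """Return the current phase and seconds remaining for our approach lane."""
    for state in spat_msg["lane_states"]:
        if state["lane"] == approach_lane:
            return Phase(state["phase"]), state["seconds_remaining"]
    raise KeyError(f"no SPaT state for lane {approach_lane}")

# Example message as it might arrive from a roadside unit:
msg = json.loads('''{
  "intersection_id": 4012,
  "lane_states": [
    {"lane": 1, "phase": "green", "seconds_remaining": 8.5},
    {"lane": 2, "phase": "red",   "seconds_remaining": 22.0}
  ]
}''')
phase, remaining = time_to_change(msg, approach_lane=1)
print(f"Approach is {phase.value}, {remaining:.1f}s left")
```

Because the state comes over a radio link rather than a camera, occlusion and lighting simply don’t enter into it.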

This brings us back to what Tesla has just released. As Tesla has done since the 2015 debut of Autopilot, the new software is labeled as a beta. There isn’t another major automaker in the world that sends beta versions of safety-critical systems to customers, and for good reason. Customers vary widely in knowledge and experience and should never, ever be used as testers for these sorts of systems.

Customers don’t know what is in the software or what it cannot do. More importantly, even if the customers have given “informed consent” to be test subjects, other road users have not. Training a neural network should be the job of the engineers developing the system, not paying customers. Flashing a message on the infotainment screen telling drivers to pay attention while using the system is not sufficient cover. After nearly five years of Autopilot, Tesla and its CEO Elon Musk know full well that there are foreseeable misuse cases when they distribute these systems, and they should be held accountable for not doing enough to prevent that misuse.

Instead, we have examples like the opening post in a thread on the Teslamotors subreddit.

“I received the 2020.12.6 update today while at home. I installed it immediately and took a ride with the kids before too long to test it out.”

Installing beta safety-critical software in a car and then immediately going for a ride with kids aboard is absolutely reckless, and also totally consistent with the behavior of many Tesla owners who post on social media.

While this particular owner didn’t have any issues on that first drive, others didn’t fare so well. The very next comment showed why this sort of software needs far more verification before it is tested on public roads.

“It goes full speed up to stop signs and stops really hard at the last second. The new speed limit cap had cars passing me on a double yellow line. The weirdest issue I had was at a flashing red stoplight with a stop sign. It stopped as it should but when I proceeded by pressing the accelerator it immediately stopped again as soon as I let off the pedal. I pressed it again and it went forward a few feet then stopped abruptly again when I let off the pedal. The third time I held the pedal down until I was all the way through the intersection and then it returned to normal. Turning this one off for a few versions.”

Another poster wrote:

“The weirdest thing was at lights where I was going through a green and there was a car in the right turn lane. My car would want to stop behind the car in the turn lane every time. Even if I had confirmed to go through the green. I would have to hold in the pedal to get it to go.”

Having a vehicle randomly stop or otherwise behave erratically in response to entirely predictable and normal traffic scenarios is absolutely unacceptable. During my 17 years working as a development engineer on automotive control systems, one of the most important things we always tried to achieve was predictability. Even if a system doesn’t have optimal performance, as long as a driver knows how it will respond to a given input, they can adapt. However, unpredictable behavior such as random stops at a green signal is dangerous for both the driver and whoever might be following, expecting the car to go through. The problem is further amplified when the vehicle stops suddenly and aggressively, reducing the possible response time of other road users.
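To make the predictability point concrete, here is a deliberately simple sketch, in no way Tesla’s actual logic, of one way to latch a driver’s override so the vehicle never re-applies a stop mid-intersection, which would avoid the repeated-stop behavior the poster described.

```python
# Illustrative override state machine; not Tesla's actual logic.
from enum import Enum, auto

class StopState(Enum):
    APPROACHING = auto()   # slowing for a detected stop line
    STOPPED = auto()       # holding at the line, awaiting driver confirmation
    OVERRIDDEN = auto()    # driver confirmed; hold until intersection cleared

def step(state, driver_accel, cleared_intersection):
    """One control-loop tick of the override state machine."""
    if state is StopState.STOPPED and driver_accel:
        return StopState.OVERRIDDEN      # latch the driver's confirmation...
    if state is StopState.OVERRIDDEN:
        # ...and keep it latched until the intersection is behind us,
        # instead of re-stopping the moment the pedal is released.
        return StopState.APPROACHING if cleared_intersection else state
    return state
```

The specifics don’t matter; what matters is that the driver can form a simple, stable mental model of what the car will do next.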

Why is Tesla pushing out unproven software to customers? Only Tesla can say for sure, but there are several possibilities. The ability to participate in the development of new technology certainly appeals to many of the early adopters who make up the Tesla audience.

While Tesla gets $7,000 up front when it sells its automated driving package, it can’t actually book that as revenue until it ships the product. Sending out updates like Navigate on Autopilot, Smart Summon and traffic signal recognition allows it to book some portion of that revenue, strengthening its bottom line even if the product isn’t actually ready.

Proper testing and validation of software is also difficult. In any robust development process, it takes many engineers and technicians months or years to complete, and that costs money, money that Tesla would apparently rather not spend, relying instead on its customers.

Whatever the reason or combination of reasons, Tesla and its customers are again being reckless in the development and deployment of automated driving technology. This is absolutely the wrong way to go about it, but unfortunately, the regulators responsible for oversight have utterly abdicated their responsibilities under the current administration (and the prior one as well). Until regulators grow some backbone or someone is willing to take Tesla to court over this, the situation will continue.
