This alarming video of terrible Tesla Autopilot driving could actually help make Autopilot safer

Screenshot: YouTube

Tesla, as usual, is being very generous in giving us plenty to talk about, especially when it comes to its Level 2 driver-assistance system, confusingly known as Autopilot and/or Full Self-Driving (FSD). Yesterday a Tesla using Autopilot crashed into a police car, and now a video of a mostly Autopilot-assisted drive through Oakland is making the rounds, drawing a lot of attention because of the often baffling and/or just plain bad decisions the car makes. Strangely, though, it’s the shoddiness of the system’s performance that may actually help people use it safely.

All this follows a letter from the National Transportation Safety Board (NTSB) to the US Department of Transportation (USDOT) regarding the National Highway Traffic Safety Administration’s (NHTSA) “Advance Notice of Proposed Rulemaking” (ANPRM), in which the NTSB essentially asks what the hell (WTF) we should do about testing autonomous vehicles (AVs) on public roads.

From that letter:

Because NHTSA has put in place no requirements, manufacturers can operate and test vehicles virtually anywhere, even if the location exceeds the limitations of the AV control system. For example, Tesla recently released a beta version of its Level 2 Autopilot system, described as having full self-driving capability. By releasing the system, Tesla is testing highly automated AV technology on public roads, but with limited oversight or reporting requirements.

Although Tesla includes a disclaimer that “currently enabled features require active driver supervision and do not make the vehicle autonomous,” NHTSA’s hands-off approach to the oversight of AV testing poses a potential risk to motorists and other road users.

The NTSB/NHTSA/USDOT letter doesn’t propose any solutions just yet, but it raises a point we’ve been making for years: Tesla and other companies are beta-testing self-driving car software on public roads, surrounded by other drivers and pedestrians who never consented to be part of any test, and in this beta test, the crashes have the potential to be literal.

All of this provides good context for the Oakland Autopilot video, highlights of which can be seen in this tweet.

… and the full 13-and-a-half-minute video can be viewed here:

There’s a lot in this video worth watching if you’re interested in Tesla’s Autopilot/FSD system. It uses what I believe is the most recent beta version of FSD, version 8.2, of which many other driving videos are available online.

There is no doubt that the system is technologically impressive; doing any of this is a colossal achievement, and Tesla’s engineers should be proud.

At the same time, though, it’s nowhere near as good as a human driver, at least in many contexts, and yes, as a Level 2 semi-automated system, it requires the driver to stay attentive and ready to take over at any moment, a task humans are notoriously bad at, and the reason I think any L2 system is inherently flawed.

While many FSD videos show the system used on highways, where the overall driving environment is much more predictable and easier to navigate, this video is interesting precisely because city driving presents such a high degree of difficulty.

It’s also interesting because the guy in the passenger seat is such a constant and unflappable apologist, to the point where if the Tesla had swerved and run over a basket of kittens, he’d have praised it for its excellent ability to track a small target.

Along the drive through Oakland there are plenty of places where the Tesla performs very well. There are also places where it makes some truly terrible decisions: driving in the oncoming lane, turning the wrong way down a one-way street, weaving like a drunk robot, cutting curbs, or stopping for no apparent reason right in the middle of the road.

In fact, the video is helpfully divided into chapters based on these interesting events:

0:00 Introduction

0:42 Double-parked cars (1)

1:15 Pedestrian in the crosswalk

1:47 Crossing solid lines

2:05 Disengagements

2:15 Chinatown

3:13 Avoiding a driver

3:48 Unprotected left (1)

4:23 Right turn into the wrong lane

5:02 Near head-on incident

5:37 Acting drunk

6:08 Unprotected left (2)

6:46 Disengagements

7:09 No turn on red

7:26 “Take over immediately”

8:09 Wrong lane; behind parked cars

8:41 Double-parked truck

9:08 Bus-only lane

9:39 Close call (curb)

10:04 Left turn; lane blocked

10:39 The wrong way!!!

10:49 Double-parked cars (2)

11:13 Delayed stop at signal

11:36 Hesitant left

11:59 Near collision (1)

12:20 Near collision (2)

12:42 Close call (wall/fence)

12:59 Verbal review of the drive/beta

This reads like the track list of a very strange concept album.

Nothing in this video, however objectively impressive, suggests that this car drives better than a human. If a human driver had done the things seen here, you’d have wondered aloud, countless times, what the hell was wrong with them.

Some situations are clearly things the software hasn’t been programmed to understand, such as the fact that cars parked with their hazard lights on are obstacles to be driven around carefully. Others are the result of the system misinterpreting camera data, overcorrecting, or struggling to process its environment.

Some of the defenses offered in the video actually help get at the larger issues involved:

The argument that there are many, many more human-caused crashes on any given day is deeply misleading. Sure, there are more, but there are also vastly more humans driving cars, and even if the numbers were comparable, no car company is trying to sell shitty human drivers.

Also, the reminders that FSD is a beta only serve to bring us back to that acronym-laden NTSB letter: should we be letting companies beta-test self-driving car software in public, without oversight?

Tesla’s FSD is still not safer than an ordinary human driver, which is why videos like this one, showing so many unsettling FSD driving moments, are so important and could save lives. These videos erode some confidence in FSD, which is exactly what needs to happen if this beta software is going to be tested safely.

Blind faith in any L2 system is how you end up in a wreck, and possibly dead. L2 systems give little to no warning when they need a human to take over, and a person who doesn’t entirely trust the machine at the wheel is far more likely to be prepared to take control.

I’m not the only one suggesting this, either:

The paradox is that the better a Level 2 system gets, the more the people behind the wheel come to trust it, which means they pay less attention, which in turn leaves them less able to take over when the system actually needs them to.

That’s why most Autopilot wrecks happen on highways, where a combination of generally good Autopilot performance and high speeds leads to reduced driver attention and shorter reaction times, which can end in disaster.

All Level 2 systems, not just Autopilot, suffer from this, and that’s why they’re all garbage.

While this video makes it clear that FSD’s core driving skills still need work, that shouldn’t be Tesla’s focus; instead, Tesla should focus on developing safe, manageable failover procedures that don’t demand the driver’s immediate attention.

Until then, the best recipe for safely using Autopilot, FSD, Super Cruise, or any other Level 2 system is to watch all these videos of the systems screwing up, lose a little confidence in them, and stay tense and alert while the machine is driving.

I know this isn’t what anyone excited about autonomous vehicles wants to hear, but the truth is they’re not here yet. It’s time to accept that and treat them accordingly if we ever want to make real progress.

Defensive apologia and sugarcoating the state of self-driving cars helps no one.

So, if you love your Tesla and love Autopilot and FSD, watch this video carefully. Appreciate the good parts, but really take in the bad parts, too. Don’t try to explain them away. Watch, learn, and keep all that nonsense in the back of your mind when you’re behind the wheel, not actually driving.

It’s not fun, but this stage of any such technology always requires work, and work isn’t always fun.
