Blog posts related to Autopilot

Exclusive look at Enhanced Summon and Autopilot coming in v10

September 5, 2019

I've had access to a few videos of a Model 3 running 2019.28.3.11 - an Early Access only release. As always, I want to share what I think is interesting in these videos with you. These new videos showcase a few features that I expect will be released to the general public under version 10 of the firmware, hopefully in a couple of months.

Past Tesletter articles

2019.20.4.4 and I can now take a 90 degree turn on AP

The Tesla AP team keeps improving the software with every release. Here's an impressive video of a Tesla taking a 90-degree turn that the car could never do before.

AP2 allowing lane changes on roads where it wasn't available before

Multiple people have reported this in Norway, Finland, and the Netherlands.

Read more: TMC Forum

AP1 and AP2 comparison

Chillaban compares AP1 and AP2 running version 2018.21.9. He summarizes his points saying «Overall, my experience so far is that without a doubt in my mind, AP2 is more capable than AP1 now in 2018.21.9 and beyond».

Read more: TMC Forum

Airplane vs. Tesla Autopilot

Ryan is an airline pilot and flight instructor, as well as a Model S owner. In this episode of Supercharged Podcast he compares his experience with autopilot in airliners and Teslas.

Read more: Supercharged Podcast

Announcements from the Tesla Q2 2018 results conference call

We don't cover Tesla's stock, financial results, or other stuff that isn't their product, but this time during the Q2 call they made some really cool product announcements:

  • v9 will include on-ramp to off-ramp AP with automatic lane changes when possible
  • Hardware 3 will be introduced next year. According to the announcement, the new hardware is a chip designed in-house, explicitly for neural network processing. There are drop-in replacements for the Model S, Model X, and Model 3. According to Elon, the current Nvidia hardware can process 200 fps while the new one can process over 2,000 fps.
  • The coast-to-coast trip on AP won’t happen soon. The AP team is focused on «safety features» like recognizing stop signs and traffic lights in a super reliable way. They don’t want to distract the team or do it with a predefined hardcoded route, so it’s going to have to wait.

Autopilot usage survey

Are you curious to see how other Tesla drivers use the AP? BostonGraver has put together a survey to collect this data anonymously. Results are public:

More: Survey

Autopilot/FSD information from Q4 2018 earnings call

u/tp1996 put together a list of what was discussed:

  • NoAP - Option to disable stalk confirmation coming in a few weeks.
  • Traffic Light and intersections - Currently working at about 98% success rate. Goal is 99.999%+.
  • Stop sign detection - Elon said this was very easy and trivial as it can be geo-coded, and stop signs are very easy to recognize.
  • Navigating complex parking lots - Includes parking garages with heavy pedestrian traffic.
  • Advanced Summon - Will be the first example of the car being able to navigate complex parking lots. ETA: «Probably next month».

Read more: Reddit
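To put the traffic-light bullet in perspective, the jump from 98% to 99.999%+ success is much bigger than it looks when expressed as failure rates. A quick back-of-the-envelope calculation:

```python
# Going from 98% to 99.999% success means cutting the failure rate
# from 1-in-50 to 1-in-100,000 - roughly a 2,000x improvement.
current_failure = 1 - 0.98      # 2 failures per 100 intersections
target_failure = 1 - 0.99999    # 1 failure per 100,000 intersections
print(round(current_failure / target_failure))  # 2000
```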

Data sharing and privacy - what's actually collected?

verygreen digs into what data Tesla actually collects. Among other things he noticed: «the anonymization is actually pretty superficial».

Read more: TMC Forum

Deep-dive into Autopilot Shadow Mode

Since there is a lot of misinformation about how Tesla's «shadow mode» works, verygreen went on a Tweetstorm to try to shed some light on what it does and doesn't do. In a sense it's more advanced than some people think -lots of rules and triggers- but not what others thought -it isn't constantly monitoring what the driver does and communicating discrepancies to Tesla. Pretty informative read, as always, from verygreen.

Read more: Twitter

Example of emergency lane departure avoidance working as intended

The owner ended up with his head in the clouds (his words, not mine) for a moment and didn't notice the road curving. The emergency lane departure avoidance feature saved him from an expensive repair.

Explanation of electric assisted steering by BozieBeMe2

BozieBeMe2 says: «The electric steering is nothing more than a servo system. Just like in a robot. (…) When you intervene the computers commands, by offering a resistance in that movements, the amp loads go up in the servo and tells the AP computer to cancel the AP sequencing. It’s the same thing when a robot in the factory strikes some thing, the amp loads go up and the robot faults out.»

Read more: TMC Forum

First look at Tesla’s latest Autopilot (2.5) computer in Model 3, S, and X vehicles

The user Balance reports seeing the following:

  • Two Nvidia GP106-510 GPUs. The GP106 is used in the GTX 1060 (GP106-300 for the 3GB version, GP106-400 for the 6GB version). Looks like a very highly binned version
  • One Nvidia Parker SoC (need better photo) with four 8-gigabit (1 gigabyte) SK Hynix DRAM chips for a total of 4 GB of DRAM
  • Fairly large Infineon and Marvell chips
  • u-blox Neo-M8L GNSS module (GPS/QZSS, GLONASS, BeiDou, Galileo)

Read more:

How to use Tesla Autopilot

In the last earnings call, Elon mentioned that the most common reason to visit a service center is to ask how Autopilot works. Well, Tesla Raj has published a great video describing how to use Autopilot for Tesla beginners.

Human side of Tesla Autopilot

Professor Lex Fridman -we published one of his surveys a while ago- just released his paper on driver functional vigilance during Tesla Autopilot.

Read paper

Multi-task learning in the wilderness

Andrej Karpathy talks about Autopilot and deep learning. As always, it's really interesting to hear him talk about this stuff, even though we are not experts on the subject by any means.

BTW I was lucky enough to get invited to the Neuralink event on Tuesday, and can you guess who I ran into? I didn’t want to bother him too much, I just thanked him as an owner and Tesla enthusiast.

Neural Networks & Autopilot v9 With Jimmy_d

Jimmy_d does an awesome job at explaining Neural Networks and Tesla’s work on Autopilot to us mere mortals. Definitely worth listening!

Listen to the interview: TechCast Daily

New 30 day Enhanced Autopilot trial!

Tesla just raised the price of AP if purchased after delivery to $8k, but the new trial offers a promotional price of $5,500. If you try it and like it, buy it while it's still discounted!

Disclaimer: We love our AP and think it makes road trips much more relaxed even though you still have to pay full attention to the car, traffic, etc.

Read more: Tesla

New Autopilot visualization animation in 2019.28.3.7

People in the Early Access Program now have a new AP visualization when changing lanes.

The same version allows the driver to zoom in and rotate.

Cool stuff!

Seeing the world in AutoPilot, part deux

A few issues ago we included videos of how AP2 sees the world. Now, thanks to verygreen and DamianXVI, there is a more up-to-date and better version. It's pretty awesome to be able to see what the car sees - a rare opportunity, since this isn't a marketing video. We recommend watching the videos and reading - at least - the original post, since it has a lot of interesting info. A few highlights:

  • AP identifies motorcycles, trucks, and pedestrians. Tesla doesn't show them just yet, but it is showing motorcycles in v9
  • The car identifies lanes and turns with no actual road markings
  • AP predicts how the road behind a hill crest is going to be
  • While it doesn't render 3D boxes, it detects vehicles, their speed, and their direction
  • The car is continuously calculating all of this, whether AP is on or off. Some of this state could be matched against the «triggers» to cause snapshots to be generated and sent back to Tesla, but verygreen doesn't seem to think it compares this model to the actual driving input at this time.

Read more: TMC Forum

Seeing the world in Autopilot

Here’s a couple of pretty cool videos showing a visualization of the Tesla interpreted radar data. The size of the circle indicates how close the object is while the color represents the type (green - moving, orange - stationary, yellow - stopped). It is neat to be able to see what the radar sees, great work verygreen and DamianXVI!
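The encoding described above can be sketched in a few lines. The scale factor and exact mapping are our assumptions for illustration, not verygreen and DamianXVI's actual code:

```python
# Circle size encodes proximity (closer = bigger), color encodes state.
def circle_radius(distance_m: float, scale: float = 100.0) -> float:
    """Radius inversely proportional to distance, clamped at 1 m."""
    return scale / max(distance_m, 1.0)

COLOR_BY_STATE = {"moving": "green", "stationary": "orange", "stopped": "yellow"}

# A moving car 20 m ahead renders as a mid-sized green circle.
print(circle_radius(20.0), COLOR_BY_STATE["moving"])  # 5.0 green
```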

Seeing the world in Autopilot V9 (part three)

If you've been following Tesletter or almost any Tesla forum or blog for some time, you probably already know who verygreen is (or greentheonly, depending on the forum) and how much awesome information he keeps revealing from his car. This time he is back with a follow-up on what AP v9 sees. He has captured six of the cameras and the information they detect. As always, this is awesome, great job verygreen!

In a second video we can see how AP can’t see a tire in the middle of the road, so please always be careful while using Autopilot!

Some SW internals of Tesla Autopilot node (HW2+)

Following up on his Tweetstorm, verygreen explained the software internals for AP2+. Since it is pretty hard to summarize, I recommend reading the whole thing on Reddit.

Read more: Reddit

Successfully negotiated 2 roundabouts in quick succession on Autopilot 28.3.1

Ok, it is a small one, with no traffic, and it just continues straight after the roundabout… but this seems like progress!

TRAIN AI 2018 - Building the Software 2.0 Stack, by Andrej Karpathy

Tesla related stuff in this talk starts in minute 15:30. Some interesting bits:

  • Since he joined - 11 months ago - the neural network has been taking over the AP code base
  • Tesla has created a ton of tooling for the people who tag images so they can be more efficient
  • Around minute 20 he shows why labeling something that seemed easy like lane lines isn’t as easy as it seems
  • Min. 22:30 - He shows traffic lights, some are really crazy!
  • He talks about how random data collection doesn't work for Tesla. For instance, if you want to identify when a car changes lanes and you collect images at random, more often than not the blinkers are going to be off
  • Min. 25:40 - He mentions auto wipers and how the dataset is crazy and it «mostly works»

Take me to San Francisco!

Just say “take me to San Francisco” and your Tesla will automatically get on the Bay Bridge, pay the toll, and drive all the way to San Francisco with 0 disengagements.

Tesla AP urban environment 360-degree visualization

A visualization of the data the eight Autopilot cameras see when driving in an urban environment with cross traffic. The data for the visualization was taken directly from an AP2.5 car.

Tesla Autopilot HW3

verygreen has found new information about Tesla's AP HW3 in the firmware. There is too much information for us to condense here; if you like this type of thing, go ahead and read the entire post on TMC.

Read more: TMC Forum

Tesla Autopilot Survey (from MIT)

Lex Fridman is leading an effort at MIT aimed at analyzing human behavior with respect to Autopilot using computer vision, and needs Tesla owners' help answering this Tesla Autopilot Survey.

Read more: TMC Forums

Tesla Autopilot miles

Check out this cool animated GIF of Tesla Autopilot miles since 2015.

Tesla Enhanced AP saves owner

Marc Benton - a well-known Tesla community member - just recorded this while traveling from LA to Nashville.

Tesla autopilot detecting stop lines

greentheonly is back giving us a glimpse of what AP sees. As of 2019.4, stop line detection is enabled in public firmware.

Tesla autopilot stopping at red lights all by itself

Greentheonly - who else? - just discovered that 2019.8.3 comes with the hidden ability to detect and react to red lights and stop signs, and activated it! If you watch the video, it's clear that this feature isn't ready for prime time, but it is exciting to see progress on this front! Look at 0:40 and 1:30 for two successful detections, and 4:05 and 5:12 for failures.

Tesla could have to offer computer retrofits to all AP 2.0 and 2.5 cars

Several comments made by CEO Elon Musk since the launch of the AP 2.0 hardware suite - included in all Tesla vehicles made since October 2016 - indicate that the company might have to update its onboard computer in order to achieve the full self-driving capability it has been promising customers.

Now it looks like Tesla might have to also offer computer retrofits for AP 2.5 cars.

Read more:

Tesla is now giving trials of the FSD

Tesla used to give trials of EAP, but it was unclear what would happen after they redefined the packages as AP and FSD. Well, it seems they have just started giving trials of the FSD features (Auto Lane Change, NoA, and Summon).

Read more: Twitter

Tesla owners raving about AP on version 2018.48

Lots of Tesla owners are reporting that AP on 2018.48 is great.

«This is literally the first time I felt like my car can actually drive itself.»

«I feel like it now handles curves perfectly.»

«This update, even with AP2.5 hardware, made me really hopeful and optimistic about FSD. I didn’t really enjoy my NoAP experience too much before.»

We haven’t really had a chance to fully try it yet, but we’re very excited after reading all these comments :D

Read more: Reddit

Tesla reveals new self-driving Autopilot hardware 3.0 computer diagram ahead of launch

A source told Electrek that Tesla updated the Model 3 wiring diagrams in its internal service documentation on January 9th to include the new Autopilot 3.0 computer as the new standard ECU in Model 3.

An interesting new piece of information: in the diagram we can see that the chip has connectors for a second radar, although current models only have a forward-facing radar mounted.

Read more: Electrek

Tesla vehicle deliveries and projections

In analyzing the Tesla subset of the MIT-AVT dataset, Lex Fridman said: «I came across the need to estimate how many Tesla vehicles have been delivered, and how many of them have Autopilot and which version of Autopilot hardware.»

The post was last updated on June 17th, 2018 and includes an FAQ section with questions received.

The new neural networks (really well) explained

Jimmy_d did an awesome job explaining to us mortals how the new neural networks pushed in 2018.10.04 work. The three main types he observed are called main, fisheye, and repeater. «I believe main is used for both the main and narrow forward facing cameras, that fisheye is used for the wide angle forward facing camera, and that repeater is used for both of the repeater cameras», he says.

Read more:
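Jimmy_d's observed camera-to-network assignment can be summarized as a simple lookup table. The dictionary below is just our restatement of his quote, with illustrative camera names:

```python
# Which of the three observed networks handles which camera,
# per Jimmy_d's analysis of 2018.10.04.
CAMERA_TO_NETWORK = {
    "main": "main",
    "narrow": "main",            # narrow shares the "main" network
    "fisheye": "fisheye",        # wide-angle forward camera
    "left_repeater": "repeater",
    "right_repeater": "repeater",
}

print(CAMERA_TO_NETWORK["narrow"])  # main
```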

This is what Tesla collects in an accident on AP2+

greentheonly (aka verygreen on TMC) came into possession of two Autopilot units from crashed cars and was able to extract the snapshots that the cars stored during their crashes. One is from an AP1 car that crashed in July 2017; its crash snapshots only included 5 cameras - main, narrow, fisheye, and the two pillars - at 1 fps. The other crash is from October 2017, and its snapshot data includes 30 fps footage from the narrow and main cams in addition to the data mentioned earlier.

Read more: Reddit

What Tesla Autopilot sees at night

verygreen is back, this time showing us what AP2 running v9 sees at night. The glare the blinkers put on the repeater cameras is odd - you can see it at 00:32 on the left repeater camera.

Read more: TMC Forum

What's new in beta 2018.21

Tesla release versions with odd numbers have historically been betas, only pushed to a handful of users (3-5, according to TeslaFi) under an NDA. The last time a beta was rolled out was a few weeks before the NN rewrite, so we have high hopes that this one includes something big as well. The lucky ones who have rooted their Teslas are dropping bits of info on TMC, see below (thanks BigD0g and dennis_d for all the info!):

  • Screen showing blind spot messages and detection (read more)
  • Improvement on how AP handles road dividers (read more)
  • AP2 running 2018.21 showing cars in adjacent lanes (ala AP1) (video)
    Read more: TMC Forum