Autopilot

Tesla Autopilot enables your car to steer, accelerate, and brake automatically within its lane. As a driver, you’re required to supervise Autopilot’s decisions. Its main features are Navigate on Autopilot, which suggests lane changes to optimize your route; Autosteer, which keeps improving to navigate more complex roads; and Smart Summon, which brings your car to you in a parking lot, maneuvering around objects as necessary.

Tesla cars come standard with advanced hardware that provides Autopilot features today and full self-driving capabilities in the future. You can learn more about Tesla Autopilot on Tesla’s website: https://www.tesla.com/autopilot

Blog posts related to Autopilot

Exclusive look at Enhanced Summon and Autopilot coming in v10

September 5, 2019

I've had access to a few videos of a Model 3 running 2019.28.3.11 - an Early Access-only release. As always, I want to share what I find interesting in these videos with you. They showcase a few features that I expect will be released to the general public as part of version 10 of the firmware, hopefully in a couple of months.

Videos about Autopilot

Autopilot with heavy rain - Model S

Autopilot handling curves in the rain like a boss. Heavy rain at 1:44 and 2:19. Really great experience; Autopilot is super helpful, especially when you have little visibility.

Past Tesletter articles

2019.20.4.4 and I can now take a 90 degree turn on AP

The Tesla AP team keeps improving the software with every release. Impressive video of a Tesla taking a 90-degree turn that the car could never manage before.

From issue #68

3D printed Tesla Model 3 storage tray with a built-in USB hub and J1772 adapter slot

I’m not sure how useful it is to have the J1772 adapter right there in the console - it probably depends on how much you use it - but I like seeing people design these things.

Read more: Reddit

From issue #80

AP2 allowing lane changes on roads where they weren’t available before

Multiple people have reported this in Norway, Finland, and the Netherlands.

Read more: TMC Forum

From issue #10

AP1 and AP2 comparison

Chillaban compares AP1 and AP2 running version 2018.21.9. He summarizes his points saying «Overall, my experience so far is that without a doubt in my mind, AP2 is more capable than AP1 now in 2018.21.9 and beyond».

Read more: TMC Forum

From issue #15

Airplane vs. Tesla Autopilot

Ryan is an airline pilot and flight instructor, as well as a Model S owner. In this episode of the Supercharged Podcast, he compares his experience with autopilot in airliners and in Teslas.

Read more: Supercharged Podcast

From issue #10

Andrej Karpathy talks about AI at Tesla

Here are some highlights of Karpathy’s talk about AI at Tesla at the PyTorch Developer Conference. That said, I encourage you to watch the entire video; he does a great job explaining this complex subject.

  • [3:45] Camera feeds get stitched together to understand the layout around the car
  • [4:00] The images get stitched together to create the Smart Summon visualization. It reminds me of old strategy games where, under the fog of war, you keep discovering the map as you move forward
  • [5:00] Three cameras predicting intersections
  • Autopilot runs 48 neural networks that make 1,000 distinct predictions and take 70,000 GPU hours to train. On a single node with eight GPUs, you would be training for roughly a year (70,000 / 8 ≈ 8,750 hours)
  • None of the predictions made by Autopilot is allowed to regress. To make sure that doesn’t happen, they use loop validation and other types of validation and evaluation
  • They target all neural networks to the new HW3 processor
  • Project Dojo: a neural network training computer in a chip. The goal is to improve the efficiency of training by an order of magnitude
  • 1B miles on NoA, 200,000 automated lane changes, and 800,000 Smart Summon sessions
From issue #85

Announcements from the Tesla Q2 2018 results conference call

We don’t cover Tesla’s stock, financial results, or other stuff that isn’t their product, but this time, during the Q2 call, they made some really cool product announcements:

  • v9 would include on-ramp to off-ramp AP with automatic lane changes when possible
  • Hardware 3 will be introduced next year. According to the announcement, the new hardware is a chip designed in-house, explicitly for neural network processing. There are drop-in replacements for the Model S, Model X, and Model 3. According to Elon, the current Nvidia hardware can process 200 fps while the new one can process over 2,000 fps.
  • The coast-to-coast trip on AP won’t happen soon. The AP team is focused on «safety features» like recognizing stop signs and traffic lights in a super reliable way. They don’t want to distract the team or do it with a predefined hardcoded route, so it’s going to have to wait.
From issue #19

Autopilot recognizing a dog as a pedestrian

In this video by greentheonly, Autopilot recognizes both the dog and its owner. Even though it incorrectly tags the dog as a pedestrian, the dog does get recognized, which is what matters in order to be able to avoid it.

From issue #85

Autopilot usage survey

Are you curious to see how other Tesla drivers use AP? BostonGraver has put together a survey to collect this data anonymously. The results are public:

More: Survey

From issue #9

Autopilot vs Fog - Sydney to Wollongong - Australia

Tesla AP handling the fog like a boss thanks to the combination of cameras and front radar.

From issue #64

Autopilot/FSD information from Q4 2018 earnings call

u/tp1996 put together a list of what was discussed:

  • NoAP - Option to disable stalk confirmation coming in a few weeks.
  • Traffic Light and intersections - Currently working at about 98% success rate. Goal is 99.999%+.
  • Stop sign detection - Elon said this was very easy and trivial as it can be geo-coded, and stop signs are very easy to recognize.
  • Navigating complex parking lots - Includes parking garages with heavy pedestrian traffic.
  • Advanced Summon - Will be the first example of the car being able to navigate complex parking lots. ETA «probably next month».

Read more: Reddit

From issue #45

Bypassing the Los Angeles rush hour commute, at 116 MPH, underground

I can’t wait for a solution to end the soul-crushing heavy traffic!

From issue #65

Data sharing and privacy - what's actually collected?

By verygreen: «What other things I have noticed: the “anonymization” is actually pretty superficial.»

Read more: TMC Forum

From issue #16

Deep-dive into Autopilot Shadow Mode

Since there is a lot of misinformation about how Tesla’s «shadow mode» works, verygreen went on a tweetstorm to try to shed some light on what it does and doesn’t do. In a sense it’s more advanced than some people think -lots of rules and triggers- but not what other people thought -it isn’t constantly monitoring what the driver does and communicating discrepancies to Tesla. Pretty informative read, as always, from verygreen.

Read more: Twitter

From issue #47

Europe considers relaxing self-driving laws under pressure from Tesla

It’s known that Tesla’s Autopilot doesn’t work as well in Europe as it does in the USA. The reason is that regulators in Europe have some absurd rules. For instance, they have limited how far Autopilot can turn the steering wheel, resulting in the car asking the driver to take over immediately in the middle of turns, and they have limited the amount of time the car can keep the blinker on to change lanes.

Honestly, all of these sound absurd and, in the case of the steering wheel limitation, even dangerous. That is why I hope, as the article says, that regulators in Europe relax their laws soon. The article is behind a paywall; if you want to read it in full, go here.

From issue #82

Example of emergency lane departure avoidance working as intended

The owner ended up with his head in the clouds (his words, not mine) for a moment and didn’t notice how the road was curving. The emergency lane departure feature saved him from an expensive repair.

From issue #62

Explanation of electric assisted steering by BozieBeMe2

BozieBeMe2 says: «The electric steering is nothing more than a servo system. Just like in a robot. (…) When you intervene against the computer’s commands by offering resistance to that movement, the amp loads go up in the servo and tell the AP computer to cancel the AP sequencing. It’s the same thing when a robot in the factory strikes something: the amp loads go up and the robot faults out.»
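
To make the idea concrete, here is a minimal sketch of the kind of servo-current threshold check BozieBeMe2 describes. It is purely illustrative - not Tesla’s actual code - and the threshold value and function names are made up for the example.

```python
# Minimal, illustrative sketch of servo-current-based override detection.
# Not Tesla's implementation; the threshold and names are hypothetical.

OVERRIDE_CURRENT_AMPS = 3.5  # hypothetical limit above the normal servo load


def driver_is_overriding(servo_current_amps: float) -> bool:
    """Resisting the commanded steering motion makes the servo work harder,
    so its motor current rises above the normal operating band."""
    return servo_current_amps > OVERRIDE_CURRENT_AMPS


def autopilot_steering_step(servo_current_amps: float, engaged: bool) -> bool:
    """Cancel the AP steering sequence when an override is detected."""
    if engaged and driver_is_overriding(servo_current_amps):
        return False  # fault out and hand control back to the driver
    return engaged
```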

Read more: TMC Forum

From issue #15

First look at Tesla’s latest Autopilot (2.5) computer in Model 3, S, and X vehicles

The user Balance reports seeing the following:

  • Two Nvidia GP106-510 GPUs. The GP106 is used in the GTX 1060 (GP106-300 for 3 GB and GP106-400 for 6 GB). Looks like a very highly binned version
  • One Nvidia Parker SoC (needs a better photo) with four 8-gigabit (1 gigabyte) SK Hynix DRAM chips for a total of 4 GB of DRAM
  • Fairly large Infineon and Marvell chips
  • A u-blox Neo-M8L GNSS module (GPS/QZSS, GLONASS, BeiDou, Galileo)

Read more: Reddit.com

From issue #1

How Tesla Autopilot detects cars moving into your lane

Good video explaining how Autopilot works (and used to work) at detecting cars cutting into your lane. Worth watching!

From issue #77

How to use Tesla Autopilot

In the last earnings call, Elon mentioned that the most common reason to visit a service center is to ask how Autopilot works. Well, Tesla Raj has published a great video describing how to use Autopilot for Tesla beginners.

From issue #71

Human side of Tesla Autopilot

Professor Lex Fridman - we published one of his surveys a while ago - just released his paper on driver functional vigilance during Tesla Autopilot use.

Read paper

From issue #54

Multi-task learning in the wilderness

Andrej Karpathy talks about Autopilot and deep learning. As always, it’s really interesting to hear him talk about this stuff, even though we are not experts on the subject by any means.

BTW, I was lucky enough to get invited to the Neuralink event on Tuesday, and can you guess who I ran into? I didn’t want to bother him too much, so I just thanked him as an owner and Tesla enthusiast.

From issue #68

Neural Networks & Autopilot v9 With Jimmy_d

Jimmy_d does an awesome job of explaining neural networks and Tesla’s work on Autopilot to us mere mortals. Definitely worth listening to!

Listen to the interview: TechCast Daily

From issue #32

New 30 day Enhanced Autopilot trial!

Tesla just raised the price of AP purchased after delivery to $8k, but the new trial offers a promotional price of $5,500. If you try it and like it, buy it while it’s still discounted!

Disclaimer: We love our AP and think it makes road trips much more relaxed even though you still have to pay full attention to the car, traffic, etc.

Read more: Tesla

From issue #32

New Autopilot visualization animation in 2019.28.3.7

People in the Early Access Program now have a new AP visualization when changing lanes.

The same version allows the driver to zoom in and rotate.

Cool stuff!

From issue #74

PSA: Don't wiggle the steering wheel to stop the nag, apply force in one direction

Really good tip by wickedsight on Reddit; at least, this is how I do it.

From issue #79

Seeing the world in AutoPilot, part deux

A few issues ago we included videos of how AP2 sees the world. Now, thanks to verygreen and DamianXVI, there is a more up-to-date and better version. It’s pretty awesome to be able to see what the car sees; this is a rare opportunity since it isn’t a marketing video. We recommend watching the videos and reading - at least - the original post, since it has a lot of interesting info. A few highlights:

  • AP identifies motorcycles, trucks, and pedestrians. Tesla doesn’t show them just yet, but it is showing motorcycles in v9
  • The car identifies lanes and turns with no actual road markings
  • AP predicts what the road beyond a hill crest is going to look like
  • While it doesn’t render 3D boxes, it detects vehicles, their speed, and their direction
  • The car is continuously calculating all of this, whether AP is on or off. Some of this state could be matched against the «triggers» that cause snapshots to be generated and sent back to Tesla, but verygreen doesn’t seem to think it actually compares this model to the actual driving input at this time.

Read more: TMC Forum

From issue #26

Seeing the world in Autopilot

Here are a couple of pretty cool videos showing a visualization of Tesla’s interpreted radar data. The size of the circle indicates how close the object is, while the color represents the type (green - moving, orange - stationary, yellow - stopped). It is neat to be able to see what the radar sees. Great work, verygreen and DamianXVI!
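
As a rough sketch of that circle size and color scheme - purely illustrative; the distance scaling and field names below are our assumptions, not how the actual visualization is built:

```python
# Illustrative sketch of the radar visualization scheme described above.
# Scaling and field names are assumptions made for the example.

STATE_COLORS = {
    "moving": "green",
    "stationary": "orange",
    "stopped": "yellow",
}


def circle_for_radar_object(distance_m: float, state: str, max_radius_px: int = 40) -> dict:
    """Closer objects get bigger circles; the color encodes the object state."""
    radius_px = max(4, int(max_radius_px / max(distance_m, 1.0)))
    return {"radius_px": radius_px, "color": STATE_COLORS.get(state, "gray")}


# Example: a moving car 10 m ahead renders as a small green circle.
print(circle_for_radar_object(10.0, "moving"))
```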

From issue #13

Seeing the world in Autopilot V9 (part three)

If you’ve been following Tesletter or almost any Tesla forum or blog for some time, you probably already know who verygreen is (or greentheonly, depending on the forum) and how much awesome information he has been revealing from his car. This time he is back with a follow-up on what AP v9 sees. He has captured six of the cameras and the information that they detect. As always, this is awesome. Great job, verygreen!

In a second video we can see that AP can’t see a tire in the middle of the road, so please always be careful while using Autopilot!

From issue #30

Some SW internals of Tesla Autopilot node (HW2+)

Following up on his tweetstorm, verygreen explained the software internals of AP2+. Since it is pretty hard to summarize, I recommend you go and read the whole thing on Reddit.

Read more: Reddit

From issue #47

Successfully negotiated 2 roundabouts in quick succession on Autopilot 28.3.1

OK, it is a small one, there is no traffic, and it just continues straight after the roundabout… but this seems like progress!

From issue #74

TRAIN AI 2018 - Building the Software 2.0 Stack, by Andrej Karpathy

The Tesla-related stuff in this talk starts at minute 15:30. Some interesting bits:

  • Since he joined - 11 months ago - the neural network has been taking over the AP code base
  • Tesla has created a ton of tooling for the people who tag images so they can be more efficient
  • Around minute 20 he shows why labeling something that seems easy, like lane lines, isn’t as easy as it seems
  • Min. 22:30 - He shows traffic lights, some are really crazy!
  • He talks about how random data collection doesn’t work for Tesla. For instance, if you want to identify when a car changes lanes and you collect images at random, more often than not the blinkers are going to be off
  • Min. 25:40 - He mentions auto wipers, how crazy that dataset is, and that it ‘mostly works’
From issue #12

Take me to San Francisco!

Just say “take me to San Francisco” and your Tesla will automatically get on the Bay Bridge, pay the toll, and drive all the way to San Francisco with 0 disengagements.

From issue #53

Tesla AP urban environment 360-degree visualization

A visualization of the data the eight Autopilot cameras see when driving in an urban environment with cross traffic. The data for the visualization was taken directly from an AP2.5 car.

From issue #39

Tesla Autopilot HW3

verygreen has found new information about Tesla’s AP HW3 in the firmware. There is too much information for us to condense here; if you like this type of thing, go ahead and read the entire post on TMC.

Read more: TMC Forum

From issue #41

Tesla Autopilot Survey (from MIT)

Lex Fridman (MIT) is leading an effort at MIT aimed at analyzing human behavior with respect to Autopilot using computer vision, and he needs Tesla owners to help by answering this Tesla Autopilot Survey.

Read more: TMC Forums

From issue #3

Tesla Autopilot miles

Check out this cool animated GIF of Tesla Autopilot miles driven since 2015.

From issue #15

Tesla Enhanced AP saves owner

Marc Benton - a well-known Tesla community member - just recorded this while traveling from LA to Nashville.

From issue #52

Tesla autopilot detecting stop lines

greentheonly is back, giving us a glimpse of what AP sees. As of 2019.4, stop line detection is enabled in the public firmware.

From issue #49

Tesla autopilot stopping at red lights all by itself

Greentheonly - who else? - just discovered that 2019.8.3 comes with the hidden ability to detect and react to red lights and stop signs, and he activated it! If you watch the video, it’s clear that this feature isn’t ready for prime time, but it is exciting to see progress on this front! Look at 0:40 and 1:30 for two successful detections, and at 4:05 and 5:12 for failures.

From issue #52

Tesla could have to offer computer retrofits to all AP 2.0 and 2.5 cars

Several comments made by CEO Elon Musk since the launch of the AP 2.0 hardware suite - included in all Tesla vehicles made since October 2016 - indicate that the company might have to upgrade the onboard computer in order to achieve the full self-driving capability that it has been promising to customers.

Now it looks like Tesla might have to also offer computer retrofits for AP 2.5 cars.

Read more: Electrek.com

From issue #7

Tesla is now giving trials of the FSD

Tesla used to give trials of EAP, but it was unclear what was going to happen after they redefined the packages as AP and FSD. Well, it seems like they have just started giving trials of the FSD features (Auto Lane Change, NoA, and Summon).

Read more: Twitter

From issue #57

Tesla owners raving about AP on version 2018.48

Lots of Tesla owners are reporting that AP on 2018.48 is great.

«This is literally the first time I felt like my car can actually drive itself.»

«I feel like it now handles curves perfectly.»

«This update, even with AP2.5 hardware, made me really hopeful and optimistic about FSD. I didn’t really enjoy my NoAP experience too much before.»

We haven’t really had a chance to fully try it yet, but we’re very excited after reading all these comments :D

Read more: Reddit

From issue #39

Tesla reveals new self-driving Autopilot hardware 3.0 computer diagram ahead of launch

A source told Electrek that Tesla updated the Model 3 wiring diagrams in its internal service documentation on January 9th to include the new Autopilot 3.0 computer as the new standard ECU in Model 3.

An interesting new piece of information is that in the diagram we can see that the new computer has connectors for a second radar, although current models only have a forward-facing radar mounted.

Read more: Electrek

From issue #43

Tesla shuffles the EAP and FSD definitions around

Tesla just changed the definitions going forward, but announced that customers who already purchased EAP will still get those features, even though they are now only included in the FSD package. Thanks, Troy Teslike, for the visualization.

See more: Tesla

From issue #49

Tesla vehicle deliveries and projections

In analyzing the Tesla subset of the MIT-AVT dataset, Lex Fridman said: «I came across the need to estimate how many Tesla vehicles have been delivered, and how many of them have Autopilot and which version of Autopilot hardware.»

The post was last updated on June 17th, 2018 and includes an FAQ section with questions received.

From issue #13

The new neural networks (really well) explained

Jimmy_d did an awesome job of explaining to us mortals how the new neural networks pushed in 2018.10.04 work. The three main types he observed are called main, fisheye, and repeater. «I believe main is used for both the main and narrow forward facing cameras, that fisheye is used for the wide angle forward facing camera, and that repeater is used for both of the repeater cameras», he says.
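
As a rough illustration of the camera-to-network mapping Jimmy_d describes, here is a minimal sketch; the camera identifiers are our own labels for the example, not official names.

```python
# Illustrative sketch of the camera-to-network mapping Jimmy_d describes.
# The camera identifiers are made up for the example, not official names.
CAMERA_TO_NETWORK = {
    "main_forward": "main",
    "narrow_forward": "main",
    "wide_angle_forward": "fisheye",
    "left_repeater": "repeater",
    "right_repeater": "repeater",
}


def network_for_camera(camera: str) -> str:
    """Return which of the three observed network types handles a given camera."""
    return CAMERA_TO_NETWORK[camera]


print(network_for_camera("wide_angle_forward"))  # fisheye
```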

Read more: Teslamotorsclub.com

From issue #1

This is what Tesla collects in an accident on AP2+

greentheonly (aka verygreen on TMC) came into possession of two Autopilot units from crashed cars and was able to extract the snapshots that the cars stored during the crashes. One of the crashes is from an AP1 car that crashed in July 2017; its crash snapshots only included 5 cameras - main, narrow, fisheye, and the two pillars - at 1 fps. The other crash is from October 2017, and its snapshot data includes 30 fps footage from the narrow and main cams in addition to the data mentioned earlier.

Read more: Reddit

From issue #35

What Tesla Autopilot sees at night

verygreen is back, and this time he is showing us what AP2 running v9 sees at night. The glare the blinkers cast on the repeater cameras is weird; you can see it at 00:32 on the left repeater camera.

Read more: TMC Forum

From issue #34

What's new in beta 2018.21

Tesla release versions with odd numbers have historically been betas, pushed only to a handful of users (3 to 5, according to TeslaFi) under an NDA. The last time a beta was rolled out was a few weeks before the NN rewrite, so we have high hopes that this one includes something big as well. The lucky ones who have rooted their Teslas are dropping some bits of info on TMC; see below (thanks BigD0g and dennis_d for all the info!):

  • Screen showing blind spot messages and detection (read more)
  • Improvement on how AP handles road dividers (read more)
  • AP2 running 2018.21 showing cars in adjacent lanes (ala AP1) (video)

Read more: TMC Forum
From issue #10