Tesla Autopilot enables your car to steer, accelerate and brake automatically within its lane. As a driver, you’re required to supervise Autopilot’s decisions. Its main features are Navigate on Autopilot, to suggest lane changes to optimize your route; Autosteer, which keeps on being improved to navigate more complex roads; and Smart Summon, which will take your car to you in a parking lot, maneuvering around objects as necessary.
Tesla cars come with advanced hardware as standard, providing Autopilot features today and full self-driving capabilities in the future. You can learn more about Tesla Autopilot on their website: https://www.tesla.com/autopilot
Blog posts related to Autopilot
I've had access to a few videos of a Model 3 running 2019.28.3.11, an Early Access-only release. As always, I want to share what I think is interesting in these videos with you. These new videos showcase a few features that I expect will be released to the general public as part of version 10 of the firmware, hopefully in a couple of months.
Videos about Autopilot
Autopilot with heavy rain - Model S
Autopilot handling curves in the rain like a boss. Heavy rain at 1:44 and 2:19. Really great experience; Autopilot is super helpful, especially when you have little visibility.
Past Tesletter articles
With 2019.40.1.1 Tesla pushed code to our cars that they are not currently using for the public. According to @greentheonly, there’s code tagged «CityStreetsBehavior», which is brand new. In a follow-up tweet, he mentioned that most of its logic is missing and some comments indicate it’s targeting HW3, which isn’t a surprise since city streets will only be for folks who purchased FSD, all of whom should get the AP processor swapped in the next few months.
Read more: Twitter
From issue #88
The Tesla AP team keeps improving the software with every release. Impressive video of a Tesla taking a 90-degree turn that the car could never make before. #68
I’m not sure how useful it is to have the J1772 adapter right there in the console - probably it depends on how much you use it - but I like seeing people designing these things.
Read more: Reddit
From issue #80
Multiple people have reported this in Norway, Finland, and the Netherlands.
TMC Forum
From issue #10
Here are some highlights of Karpathy’s talk about AI at Tesla at the PyTorch Developer Conference. That said, I encourage you to watch the entire video; he does a great job explaining this complex subject.
- [3:45] Camera feeds get stitched together to understand the layout around the car
- [4:00] The images get stitched up to create the Smart Summon visualization. It reminds me of old strategy games when, under the fog of war, you keep discovering the map as you move forward
- [5:00] Three cameras predicting intersections
- Autopilot has 48 neural networks that make 1,000 distinct predictions and take 70,000 GPU hours to train. On a single node with eight GPUs, you would be training for a year
- None of the predictions made by Autopilot can regress. To make sure that doesn’t happen, they use loop validation and other types of validation and evaluation
- They target all neural networks to the new HW3 processor
- Project Dojo: Neural Network training computer in a chip. The goal is to improve the efficiency of training by an order of magnitude
- 1B miles on NoA and 200,000 automated lane changes, and 800,000 sessions of Smart Summon
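As a back-of-the-envelope check of the training figure above, here is a minimal sketch. The 70,000 GPU-hour and eight-GPU numbers come from the talk; everything else is just arithmetic:

```python
# Back-of-the-envelope: how long would 70,000 GPU hours take
# on a single node with eight GPUs?
TOTAL_GPU_HOURS = 70_000
GPUS_PER_NODE = 8

node_hours = TOTAL_GPU_HOURS / GPUS_PER_NODE  # 8,750 hours of wall-clock time
node_days = node_hours / 24                   # ~364.6 days

print(f"{node_days:.0f} days")  # prints: 365 days
```

Which matches the "training for a year" claim in the talk.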
We don’t cover Tesla’s stock, financial results, or other stuff that isn’t their product, but this time during the Q2 call they made some really cool product announcements:
- v9 would include on-ramp off-ramp AP with automatic lane change when possible
- Hardware 3 will be introduced next year. According to the announcement, the new hardware is a chip designed in-house. This chip is designed explicitly for neural network processing. There are drop-in replacements for the Model S, Model X, and Model 3. According to Elon, the current Nvidia hardware can process 200fps while this new one can process over 2000fps.
- The coast-to-coast trip on AP won’t happen soon. The AP team is focused on «safety features» like recognizing stop signs and traffic lights in a super reliable way. They don’t want to distract the team or do it with a predefined hardcoded route, so it’s going to have to wait.
In this video by greentheonly, Autopilot recognizes both the dog and the owner. Even though it incorrectly tags the dog as a pedestrian, note that it does get recognized, and that’s what matters in order to be able to avoid it.
Looks pretty elaborate and not a single frame fluke? Or am I imagining things?— green (@greentheonly) November 11, 2019
Somebody, walk your dog in front of your car and see if it shows up on IC to test (put car in D)?
All cams but pillars it seems. pic.twitter.com/8u846Vsq1A
In the recent Elon Musk interview by Third Row Tesla (part 2), we learned that the Autopilot team is going through a foundational rewrite, moving from separate networks for different cameras to an approach that combines all of them, plus planning and perception. This Reddit post goes over why this is a big deal and why the change was needed.
Read more: Reddit
From issue #98
Tesla AP handling the fog like a boss thanks to the combination of cameras and front radar. #64
u/tp1996 put together a list of what was discussed:
- NoAP - Option to disable stalk confirmation coming in a few weeks.
- Traffic Light and intersections - Currently working at about 98% success rate. Goal is 99.999%+.
- Stop sign detection - Elon said this was very easy and trivial as it can be geo-coded, and stop signs are very easy to recognize.
- Navigating complex parking lots - Includes parking garages with heavy pedestrian traffic.
- Advanced Summon - Will be the first example of the car being able to navigate complex parking lots. ETA «probably next month».
Read more: Reddit
From issue #45
I can’t wait for a solution to end the soul-crushing heavy traffic! #65
By verygreen: «What other things I have noticed: the "anonymization" is actually pretty superficial.»
Read more: TMC Forum
From issue #16
Since there is a lot of misinformation about how Tesla’s «shadow mode» works, verygreen went on a tweetstorm to shed some light on what it does and doesn’t do. In a sense it’s more advanced than what people think (lots of rules and triggers), but not in the way others thought (it isn’t constantly monitoring what the driver does and communicating discrepancies to Tesla). Pretty informative read, as always, from verygreen.
Read more: Twitter
From issue #47
It’s known that Tesla’s Autopilot doesn’t work as well in Europe as it does in the USA. The reason is that regulators in Europe have some absurd rules. For instance, they have limited the radius that Autopilot can turn the steering wheel, resulting in the car asking the driver to take over immediately in the middle of turns, and they have limited the amount of time that the car can engage the blinker to change lanes.
Honestly, all of these sound absurd, and in the case of the steering wheel limitation, even dangerous. That is why I hope, as the article says, that regulators in Europe relax their rules soon. The article is behind a paywall; if you want to read it in full, go here.
From issue #82
Owner ended up with his head in the clouds (his words, not mine) for a moment and didn’t notice how the road was curving. The emergency lane departure feature saved him from an expensive repair. #62
BozieBeMe2 says: «The electric steering is nothing more than a servo system. Just like in a robot. (…) When you intervene the computers commands, by offering a resistance in that movements, the amp loads go up in the servo and tells the AP computer to cancel the AP sequencing. It’s the same thing when a robot in the factory strikes some thing, the amp loads go up and the robot faults out.»
Read more: TMC Forum
From issue #15
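The servo-current idea in the quote above can be sketched as a simple threshold check. To be clear, this is an illustration of the general concept, not Tesla’s actual logic; the threshold value and function names are made up:

```python
# Hypothetical sketch of torque-override detection: when the driver resists
# the steering servo's commanded movement, motor current rises; past a
# threshold, the system treats it as an override and cancels Autopilot.
OVERRIDE_CURRENT_AMPS = 3.0  # made-up threshold for illustration


def should_cancel_autopilot(servo_current_amps: float) -> bool:
    """Return True when measured servo current suggests a driver override."""
    return servo_current_amps > OVERRIDE_CURRENT_AMPS


# Normal steering load stays engaged; a hard driver input cancels.
print(should_cancel_autopilot(1.2))  # False
print(should_cancel_autopilot(4.5))  # True
```

The same pattern is what the quote describes for factory robots: a spike in amp load is interpreted as a collision (or here, an intervention) and faults the system out.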
The user Balance reports seeing the following:
- Two Nvidia GP106-510 GPUs. The GP106 is used in the GTX 1060 (GP106-300 for 3GB and GP106-400 for 6GB). Looks like a very highly binned version
- One Nvidia Parker SoC (need better photo) with four 8-gigabit (1-gigabyte) SK Hynix DRAM chips for a total of 4 GB of DRAM
- Fairly large Infineon and Marvell chips
- u-blox Neo-M8L GNSS module (GPS/QZSS, GLONASS, BeiDou, Galileo)
Read more: Reddit.com
From issue #1
Good video explaining how Autopilot detects (and used to detect) cars cutting into your lane. Worth watching! #77
The car did well in some of the experiments, not so much in others. Interesting thread!
We'll start with the ideal case. Car on AP, driving steadily when we encounter pedestrian firmly in our way.— green (@greentheonly) December 6, 2019
We can see him from far away so we gracefully slow down, just like a real human! Perfect score! This happened 3 times out of 4. pic.twitter.com/w3ttt3ieAx
Andrej Karpathy talks about Autopilot and deep learning. As always, it’s really interesting to hear him talk about this stuff, even if we are not experts on the subject by any means.
BTW, I was lucky enough to get invited to the Neuralink event on Tuesday, and can you guess who I ran into? I didn’t want to bother him too much; I just thanked him as an owner and Tesla enthusiast. #68
Tesla just raised the price of AP if purchased after delivery to $8k but the new trial offers a promotional price of $5,500. If you try it and like it, buy it while it’s still discounted!
Disclaimer: We love our AP and think it makes road trips much more relaxed even though you still have to pay full attention to the car, traffic, etc.
Read more: Tesla
From issue #32
People in the Early Access Program now have a new AP visualization when changing lanes.
The same version allows the driver to zoom in and rotate.
Cool stuff!
From issue #74
Really good tip by wickedsight on Reddit; this is at least how I do it. #79
In this «what does Autopilot see» tweet from Greentheonly, we can see how the neural net tries to detect traffic lights and how it makes predictions about going/stopping.
Hm, I think this is shaping in the right direction now.— green (@greentheonly) February 11, 2020
Also seems the system is somewhat confused by the double-stacked traffic lights, thinks they are further away and the bounding box is all wrong too. pic.twitter.com/fXfBf9VInT
A few issues ago we included videos of how AP2 sees the world. Now, thanks to verygreen and DamianXVI, there is a more up-to-date and better version. It’s pretty awesome to be able to see what the car sees; this is a rare opportunity since it isn’t a marketing video. We recommend watching the videos and reading - at least - the original post since it has a lot of interesting info. A few highlights:
- AP identifies motorcycles, trucks, and pedestrians. Tesla doesn’t show them just yet but it is showing motorcycles in v9
- The car identifies lanes and turns with no actual road markings
- AP predicts what the road beyond a hill crest is going to look like
- While it doesn’t render 3D boxes, it detects vehicles, speed, and direction
- The car is continuously calculating all these, no matter if AP is on or off. Some of this state could be matched from the «triggers» to cause snapshots to be generated and sent back to Tesla. But verygreen doesn’t seem to think it actually compares this model to the actual driving input at this time.
Read more: TMC Forum
From issue #26
Here’s a couple of pretty cool videos showing a visualization of the Tesla-interpreted radar data. The size of the circle indicates how close the object is, while the color represents the type (green - moving, orange - stationary, yellow - stopped). It is neat to be able to see what the radar sees. Great work, verygreen and DamianXVI! #13
If you’ve been following Tesletter or almost any Tesla forum or blog for some time, you probably already know who verygreen is (or greentheonly, depending on the forum) and how much awesome information he has been revealing from his car. This time he is back with a follow-up on what AP v9 sees. He has captured six of the cameras and the information that they detect. As always, this is awesome. Great job, verygreen!
In a second video we can see how AP can’t see a tire in the middle of the road, so please always be careful while using Autopilot! #30
Ok, it is a small one, no traffic, and it just continues straight after the roundabout… but this seems like progress! #74
Tesla-related stuff in this talk starts at minute 15:30. Some interesting bits:
- Since he joined - 11 months ago - neural networks have been taking over the AP code base
- Tesla has created a ton of tooling for the people who tag images so they can be more efficient
- Around minute 20 he shows why labeling something that seems easy, like lane lines, isn’t as easy as it looks
- Min. 22:30 - He shows traffic lights, some are really crazy!
- He talks about how random data collection doesn’t work for Tesla. For instance, if you want to identify when a car changes lanes and you collect images at random, more often than not the blinkers are going to be off
- Min. 25:40 - He mentions auto wipers, how crazy the dataset is, and that it ‘mostly works’
A visualization of the data the eight Autopilot cameras see when driving in an urban environment with cross traffic. Data for the visualization was taken directly from an AP2.5 car. #39
DirtyTesla (hi Chris!) posted this awesome video of his car (Model 3 with HW2.5 running 2019.40.1.1) avoiding a construction barrel that is literally in his lane. The screen doesn’t reflect the cone but as greentheonly tweeted this week, «cones are obviously detected on HW2+ just as a bunch of other stuff like traffic signs, traffic signals and so on, on HW2 none of this information is currently being sent to infotainment for visualization (there it used to do it in v9), on HW3 they send the cones info»
Greentheonly - who else? - just discovered that 2019.8.3 comes with the hidden ability to detect and react to red lights and stop signs, and activated it! If you watch the video, it’s clear that this feature isn’t ready for prime time, but it is exciting to see progress on this front! Look at 0:40 and 1:30 for two successful detections and 4:05 and 5:12 for failures. #52
Several comments made by CEO Elon Musk since the launch of its AP 2.0 hardware suite in all Tesla vehicles made since October 2016 indicate that the company might have to update its onboard computer in order to achieve the fully self-driving capability that it has been promising to customers.
Now it looks like Tesla might have to also offer computer retrofits for AP 2.5 cars.
Read more: Electrek.com
From issue #7
It’s so cool to keep seeing Smart Summon getting better and better.
It is well known that Autopilot behavior is not the same in all countries since Tesla needs to adapt it to the regulations in the different markets. After the last software update 2019.40.2.2, Tesla has «reduced» the following Autopilot capabilities:
- Auto Lane Change on divided roads only, with a minimum wait of 1.5 seconds
- Limited Autosteer
- Summon within 6 meters of your car only
- Reminder to touch steering wheel after 15 seconds
Read more: Electrek
From issue #90
Tesla used to give trials of EAP, but it was unclear what would happen after they redefined the packages as AP and FSD. Well, it seems like they just started giving trials of the FSD features (Auto Lane Change, NoA, and Summon) right now.
Read more: Twitter
From issue #57
Lots of Tesla owners report that AP on 2018.48 is great.
«This is literally the first time I felt like my car can actually drive itself.»
«I feel like it now handles curves perfectly.»
«This update, even with AP2.5 hardware, made me really hopeful and optimistic about FSD. I didn’t really enjoy my NoAP experience too much before.»
We haven’t really had a chance to fully try it yet, but we’re very excited after reading all these comments :D
Read more: Reddit
From issue #39
A source told Electrek that Tesla updated the Model 3 wiring diagrams in its internal service documentation on January 9th to include the new Autopilot 3.0 computer as the new standard ECU in Model 3.
An interesting new piece of information is that in the diagram we can see that the chip has connectors for a second radar, although current models only have a forward-facing radar mounted.
Read more: Electrek
From issue #43
Tesla just changed the definitions going forward but announced that customers who already purchased EAP will still get those features, even though they are now only included in the FSD package. Thanks, Troy Teslike, for the visualization.
See more: Tesla
From issue #49
According to Greentheonly, in version 2019.40.50.1 the B node on the HW3 board runs a full copy of the Autopilot software. This is important because if one node crashes, the car can immediately start using the other one for redundancy.
Read more: Twitter
From issue #92
Interesting stats by Lex Fridman:
- 737,570 Tesla vehicles delivered with Autopilot hardware 2/3.
- Estimated Autopilot miles to-date: 2.2 billion miles
- Estimated miles in all Tesla vehicles: 19.1 billion miles
- He projects that within the year they will nearly double the number of Autopilot miles
Read more: Blog
From issue #93
In analyzing the Tesla subset of the MIT-AVT dataset, Lex Fridman said: «I came across the need to estimate how many Tesla vehicles have been delivered, and how many of them have Autopilot and which version of Autopilot hardware.»
The post was last updated on June 17th, 2018 and includes an FAQ section with questions received. #13
Greentheonly tested what happens when his Tesla encounters a stop sign that wasn’t in the maps that the car has. While the stop sign renders correctly on the screen, the car doesn’t react to it. If the sign is in the maps, the car beeps. My assumption is that this will change since detection seems to be working pretty well in the FSD preview. Greentheonly also discovered that if there’s a stop sign in sight, Autopilot refuses to engage if it was not already active.
Today they show the stop signs and traffic lights but they only act on these if the maps agree there's something. It's trivial to demonstrate: pic.twitter.com/mEZIjHImTh— green (@greentheonly) January 13, 2020
Jimmy_d did an awesome job explaining to us mortals how the new neural networks pushed in 2018.10.04 work. The three main types he observed are called main, fisheye, and repeater. «I believe main is used for both the main and narrow forward facing cameras, that fisheye is used for the wide angle forward facing camera, and that repeater is used for both of the repeater cameras», he says.
Read more: Teslamotorsclub.com
From issue #1
greentheonly (aka verygreen on TMC) came into possession of two Autopilot units from crashed cars and was able to extract the snapshots that the cars stored during the crashes. One of the crashes is an AP1 car that crashed in July 2017; its crash snapshots only included 5 cameras - main, narrow, fisheye, and pillars (2) - at 1 fps. The other crash is from October 2017, and its snapshot data includes 30 fps footage for the narrow and main cams in addition to the data mentioned earlier.
Read more: Reddit
From issue #35
Tesla releases with odd version numbers have historically been betas pushed only to a handful of users (3 to 5, according to TeslaFi) under an NDA. The last time a beta was rolled out was a few weeks before the NN rewrite, so we have high hopes that this one includes something big as well. The lucky ones who have rooted their Teslas are dropping some bits of info on TMC, see below (thanks BigD0g and dennis_d for all the info!):
- Screen showing blind spot messages and detection (read more)
- Improvement on how AP handles road dividers (read more)
- AP2 running 2018.21 showing cars in adjacent lanes (a la AP1) (video)
TMC Forum