Blog posts related to Autopilot
I've had access to a few videos of a Model 3 running 2019.28.3.11, an Early Access-only release. As always, I want to share what I find interesting in these videos with you. They showcase a few features that I expect will be released to the general public in version 10 of the firmware, hopefully within a couple of months.
Past Tesletter articles
The Tesla AP team keeps improving the software with every release. Here's an impressive video of a Tesla taking a 90-degree turn that the car could never handle before.
Multiple people have reported this in Norway, Finland, and the Netherlands.
Read more: TMC Forum
We don’t cover Tesla’s stock, financial results, or other stuff that isn’t their product, but this time, during the Q2 call, they made some really cool product announcements:
- v9 would include on-ramp to off-ramp AP with automatic lane changes when possible
- Hardware 3 will be introduced next year. According to the announcement, the new hardware is a chip designed in-house, explicitly for neural network processing. It will be a drop-in replacement for the Model S, Model X, and Model 3. According to Elon, the current Nvidia hardware can process 200 fps, while the new one can process over 2,000 fps.
- The coast-to-coast trip on AP won’t happen soon. The AP team is focused on “safety features,” like recognizing stop signs and traffic lights in a super reliable way. They don’t want to distract the team or do it with a predefined, hardcoded route, so it’s going to have to wait.
u/tp1996 put together a list of what was discussed:
- NoAP - Option to disable stalk confirmation coming in a few weeks.
- Traffic light and intersection detection - Currently working at about a 98% success rate. The goal is 99.999%+.
- Stop sign detection - Elon said this was trivial, since stop signs can be geo-coded and are very easy to recognize.
- Navigating complex parking lots - Includes parking garages with heavy pedestrian traffic.
- Advanced Summon - Will be the first example of the car navigating complex parking lots. ETA: “Probably next month.”
Read more: Reddit
By verygreen: “What other things have I noticed: the ‘anonymization’ is actually pretty superficial.”
Read more: TMC Forum
Since there is a lot of misinformation about how Tesla’s “shadow mode” works, verygreen went on a tweetstorm to shed some light on what it does and doesn’t do. In a sense, it’s more advanced than people think (lots of rules and triggers), but it is not, as others believed, constantly monitoring what the driver does and communicating discrepancies to Tesla. A pretty informative read, as always, from verygreen.
Read more: Twitter
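To make the "rules and triggers" idea concrete, here is a minimal sketch of what trigger-based data collection could look like: the firmware carries a set of predicates, and when one fires, a snapshot is saved for later upload. Every name and threshold below is invented for illustration; this is not Tesla's actual code.

```python
# Hypothetical trigger-based snapshot collection, sketched from verygreen's
# description. Triggers are predicates over the car's current state; when
# one fires, a snapshot of that state is queued for upload.

def make_triggers():
    # Each trigger maps a name to a predicate over the state dict.
    return {
        "hard_braking": lambda s: s["decel_mps2"] > 4.0,
        "ap_disengaged": lambda s: s["ap_was_on"] and not s["ap_on"],
        "lane_line_lost": lambda s: s["lane_confidence"] < 0.2,
    }

def check_triggers(state, triggers):
    """Return the names of all triggers that fire on this state."""
    return [name for name, pred in triggers.items() if pred(state)]

snapshots = []
state = {"decel_mps2": 5.1, "ap_was_on": True, "ap_on": True,
         "lane_confidence": 0.9}
fired = check_triggers(state, make_triggers())
if fired:
    # Only states matching a trigger are recorded, which fits verygreen's
    # point: the car is not streaming everything back to Tesla.
    snapshots.append({"reasons": fired, "state": state})
```

The key design point this illustrates is that uploads are event-driven, not continuous: a state that matches no trigger is simply discarded.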
The owner ended up with his head in the clouds (his words, not mine) for a moment and didn’t notice the road curving. The emergency lane departure feature saved him from an expensive repair.
BozieBeMe2 says: “The electric steering is nothing more than a servo system, just like in a robot. (…) When you counter the computer’s commands by offering resistance to its movements, the amp loads in the servo go up, which tells the AP computer to cancel the AP sequencing. It’s the same as when a robot in a factory strikes something: the amp loads go up and the robot faults out.”
Read more: TMC Forum
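The quoted explanation boils down to a threshold check on servo motor current. Here is a minimal sketch of that idea; the threshold and current readings are made-up illustrative values, not Tesla's.

```python
# Sketch of the disengagement logic BozieBeMe2 describes: when the driver
# resists the steering servo, motor current rises; if it exceeds a
# threshold, Autopilot cancels. All numbers are assumptions.

OVERRIDE_CURRENT_AMPS = 6.0  # assumed driver-override threshold

def autopilot_engaged(current_samples_amps):
    """Return False as soon as servo current indicates a driver override."""
    for amps in current_samples_amps:
        if amps > OVERRIDE_CURRENT_AMPS:
            return False  # driver torque detected -> cancel AP
    return True

print(autopilot_engaged([1.2, 1.5, 1.4]))  # smooth steering: True, AP stays on
print(autopilot_engaged([1.3, 7.8]))       # driver resists: False, AP cancels
```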
The user Balance reports seeing the following:
- Two Nvidia GP106-510 GPUs. The GP106 is used in the GTX 1060 (GP106-300 for 3 GB and GP106-400 for 6 GB). It looks like a very highly binned version.
- One Nvidia Parker SoC (need a better photo) with four 8-gigabit (1 gigabyte) SK Hynix DRAM chips, for a total of 4 GB of DRAM
- Fairly large Infineon and Marvell chips
- A u-blox NEO-M8L GNSS module (GPS/QZSS, GLONASS, BeiDou, Galileo)
Read more: Reddit.com
In the last earnings call, Elon mentioned that the most common reason to visit a service center is to ask how Autopilot works. Well, Tesla Raj has published a great video describing how to use Autopilot for Tesla beginners.
Andrej Karpathy talks about Autopilot and deep learning. As always, it’s really interesting to hear him talk about this stuff, even for those of us who aren’t experts on the subject by any means.
BTW I was lucky enough to get invited to the Neuralink event on Tuesday, and can you guess who I ran into? I didn’t want to bother him too much, I just thanked him as an owner and Tesla enthusiast.
Tesla just raised the price of AP if purchased after delivery to $8k but the new trial offers a promotional price of $5,500. If you try it and like it, buy it while it’s still discounted!
Disclaimer: We love our AP and think it makes road trips much more relaxed even though you still have to pay full attention to the car, traffic, etc.
Read more: Tesla
People in the Early Access Program now have a new AP visualization when changing lanes.
The same version allows the driver to zoom in and rotate.
A few issues ago we included videos of how AP2 sees the world. Now, thanks to verygreen and DamianXVI, there is a better, more up-to-date version. It’s pretty awesome to be able to see what the car sees; this is a rare opportunity, since it isn’t a marketing video. We recommend watching the videos and reading (at least) the original post, since it has a lot of interesting info. A few highlights:
- AP identifies motorcycles, trucks, and pedestrians. Tesla doesn’t render them all yet, but it is showing motorcycles in v9
- The car identifies lanes and turns with no actual road markings
- AP predicts how the road will continue beyond a hill crest
- While it doesn’t render 3D boxes, it detects vehicles, speed, and direction
- The car continuously computes all of this, whether AP is on or off. Some of this state can be matched by the “triggers” to cause snapshots to be generated and sent back to Tesla. But verygreen doesn’t think it currently compares this model against the actual driving input.
Read more: TMC Forum
Here are a couple of pretty cool videos showing a visualization of Tesla-interpreted radar data. The size of the circle indicates how close the object is, while the color represents its type (green: moving, orange: stationary, yellow: stopped). It is neat to be able to see what the radar sees. Great work, verygreen and DamianXVI!
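The encoding in those videos (bigger circle = closer object, color by type) can be sketched in a few lines. Only the color scheme comes from the videos; the size scaling below is our own assumption for illustration.

```python
# Sketch of the radar visualization mapping described above: each radar
# return becomes a circle whose radius encodes distance (closer = bigger)
# and whose color encodes the object type.

COLORS = {"moving": "green", "stationary": "orange", "stopped": "yellow"}

def radar_marker(distance_m, obj_type, max_radius=50.0):
    """Map a radar return to a (radius, color) pair for drawing."""
    # Inverse-distance scaling, clamped at 1 m to avoid huge circles.
    radius = max_radius / max(distance_m, 1.0)
    return radius, COLORS[obj_type]

print(radar_marker(5.0, "moving"))    # (10.0, 'green')
print(radar_marker(25.0, "stopped"))  # (2.0, 'yellow')
```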
If you’ve been following Tesletter, or almost any Tesla forum or blog, for some time, you probably already know who verygreen is (or greentheonly, depending on the forum) and how much awesome information he has revealed from his car. This time he is back with a follow-up on what AP v9 sees. He has captured six of the cameras and the information they detect. As always, this is awesome. Great job, verygreen!
In a second video, we can see how AP fails to see a tire in the middle of the road, so please always be careful while using Autopilot!
Ok, it is a small one, no traffic and it just continues straight after the roundabout… but this seems like progress!
Tesla-related stuff in this talk starts at minute 15:30. Some interesting bits:
- Since he joined, 11 months ago, neural networks have been taking over the AP code base
- Tesla has created a ton of tooling for the people who tag images so they can be more efficient
- Around minute 20, he shows why labeling something seemingly easy, like lane lines, isn’t as simple as it looks
- Min. 22:30 - He shows traffic lights, some are really crazy!
- He talks about how random data collection doesn’t work for Tesla. For instance, if you want to identify when a car changes lanes and you collect images at random, more often than not the blinkers are going to be off
- Min. 25:40 - He mentions auto wipers, how crazy the dataset is, and that it “mostly works”
Just say “take me to San Francisco” and your Tesla will automatically get on the Bay Bridge, pay the toll, and drive all the way to San Francisco with 0 disengagements.
A visualization of the data eight autopilot cameras see when driving in an urban environment with cross traffic. Data for visualization was taken directly from an AP2.5 car.
greentheonly is back giving us a glimpse of what AP sees. As of 2019.4, stop line detection is enabled in public firmware.
Greentheonly - who else? - just discovered that 2019.8.3 comes with the hidden ability to detect and react to red lights and stop signs, and he activated it! If you watch the video, it’s clear that this feature isn’t ready for prime time, but it is exciting to see progress on this front! Look at 0:40 and 1:30 for two successful detections, and 4:05 and 5:12 for failures.
Several comments made by CEO Elon Musk since the launch of the AP 2.0 hardware suite (in all Tesla vehicles made since October 2016) indicate that the company might have to update its onboard computer to achieve the full self-driving capability it has been promising customers.
Now it looks like Tesla might have to also offer computer retrofits for AP 2.5 cars.
Read more: Electrek.com
Tesla used to give trials of EAP, but it was unclear what would happen after they redefined the packages as AP and FSD. Well, it seems they just started giving trials of the FSD features (Auto Lane Change, NoA, and Summon).
Read more: Twitter
Lots of Tesla owners are reporting that AP on 2018.48 is great.
«This is literally the first time I felt like my car can actually drive itself.»
«I feel like it now handles curves perfectly.»
«This update, even with AP2.5 hardware, made me really hopeful and optimistic about FSD. I didn’t really enjoy my NoAP experience too much before.»
We haven’t really had a chance to fully try it yet, but we’re very excited after reading all these comments :D
Read more: Reddit
A source told Electrek that Tesla updated the Model 3 wiring diagrams in its internal service documentation on January 9th to include the new Autopilot 3.0 computer as the new standard ECU in Model 3.
An interesting new piece of information: in the diagram we can see that the computer has connectors for a second radar, although current models only mount a forward-facing one.
Read more: Electrek
In analyzing the Tesla subset of the MIT-AVT dataset, Lex Fridman said: «I came across the need to estimate how many Tesla vehicles have been delivered, and how many of them have Autopilot and which version of Autopilot hardware.»
The post was last updated on June 17th, 2018 and includes an FAQ section with questions received.
Jimmy_d did an awesome job explaining to us mortals how the new neural networks pushed in 2018.10.04 work. The three main types he observed are called main, fisheye, and repeater. “I believe main is used for both the main and narrow forward facing cameras, that fisheye is used for the wide angle forward facing camera, and that repeater is used for both of the repeater cameras,” he says.
Read more: Teslamotorsclub.com
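Jimmy_d's camera-to-network mapping can be written out as a small table. The network names (main, fisheye, repeater) come from his post; the camera identifiers and the dict form are just our illustration.

```python
# The camera-to-network mapping jimmy_d describes: one network can serve
# several cameras with similar optics and viewpoints.

CAMERA_TO_NETWORK = {
    "main": "main",              # main forward camera
    "narrow": "main",            # narrow forward camera shares the main net
    "fisheye": "fisheye",        # wide-angle forward camera
    "left_repeater": "repeater", # both repeaters share one net
    "right_repeater": "repeater",
}

def network_for(camera):
    """Return which neural network processes the given camera's frames."""
    return CAMERA_TO_NETWORK[camera]

print(network_for("narrow"))         # main
print(network_for("left_repeater"))  # repeater
```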
greentheonly (aka verygreen on TMC) came into possession of two Autopilot units from crashed cars and was able to extract the snapshots the cars stored during their crashes. One is an AP1 unit that crashed in July 2017; its crash snapshots only included 5 cameras (main, narrow, fisheye, and the 2 pillars) at 1 fps. The other crash is from October 2017, and its snapshot data includes 30 fps footage for the narrow and main cams in addition to the data mentioned earlier.
Read more: Reddit
Tesla release versions with odd numbers have historically been betas, pushed only to a handful of users (3-5 according to TeslaFi) under an NDA. The last time a beta rolled out was a few weeks before the NN rewrite, so we have high hopes that this one includes something big as well. The lucky ones who have rooted their Teslas are dropping bits of info on TMC; see below. (Thanks BigD0g and dennis_d for all the info!)