A Tesla hacker has unlocked “Autopilot Augmented Vision,” a new mode in Tesla’s autonomous driving suite that shows in real time what Autopilot can detect.
It’s awesome to watch.
More than two years ago, we shared pictures on Electrek of leaked images from inside a Tesla engineering vehicle testing Autopilot.
The leak showed that Tesla was already testing its “Full Self-Driving” mode, and it also showed that Tesla had something called “Augmented Vision” in its dev mode.
Augmented Vision appeared to show in real time the feed from each Autopilot camera with an overlay of what it was detecting through its computer vision system.
Now Tesla hacker “green,” who recently managed to access the Autopilot dev mode, has activated the Augmented Vision mode.
He shared more than five minutes of driving video with the mode activated, resulting in the best footage of what Tesla Autopilot can see to date:
This is one of the best looks at the inner workings of Tesla Autopilot: what it can detect in real time and how it plans its path on the road.
Green ends the thread by asking why Tesla doesn’t enable consumers to activate this mode.
I can think of a few reasons why Tesla wouldn’t want this mode accessible to everyone, but I think the main one is that Tesla doesn’t want to encourage drivers to be watching the center display when using Autopilot.
Drivers should keep their eyes on the road and be ready to take control at all times.
However, the Augmented Vision mode, like the standard driving visualizations to a lesser degree, is useful for building long-term confidence in the system. It helps drivers better understand what the car is seeing and how it makes decisions.
Right now, confidence in the system is not always a good thing. Tesla admitted that most accidents on Autopilot are due to drivers being overconfident with the driver-assist system.
But in the long term, as Tesla achieves full self-driving, building confidence in the driving system is going to be key.