Are you working on self-driving cars?
If you are, Uber has just made your life a lot easier.
Self-driving cars have loads of sensors to keep track of their environment and to make sure that the car stays on the road:
- Cameras facing forward, backward, and sideways
- LIDAR: a laser rangefinder that scans the environment in 360 degrees
- Acoustic sensors that detect obstacles near the car
- Radar, to look far ahead on the highway
A big challenge is to make sense of all this information and combine all these sensor readings to paint a consistent picture of the environment.
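To make the fusion challenge concrete, here is a toy sketch of the time-alignment part of the problem: each sensor reports at its own rate, so building one consistent picture means picking, for every instant, the most recent reading from each sensor. All names here are hypothetical illustrations, not part of Uber's actual stack.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    sensor: str       # e.g. "camera_front", "lidar", "radar" (hypothetical names)
    timestamp: float  # seconds since the start of the drive
    value: object     # raw measurement payload

def latest_snapshot(readings, at_time):
    """Return the most recent reading per sensor at or before `at_time`."""
    snapshot = {}
    # Walk readings in time order so later readings overwrite earlier ones.
    for r in sorted(readings, key=lambda r: r.timestamp):
        if r.timestamp <= at_time:
            snapshot[r.sensor] = r
    return snapshot

readings = [
    Reading("lidar", 0.00, "scan_a"),
    Reading("camera_front", 0.03, "frame_1"),
    Reading("lidar", 0.10, "scan_b"),
    Reading("radar", 0.05, "ping_1"),
]

# At t=0.08 the second lidar scan (t=0.10) hasn't arrived yet,
# so the snapshot still contains scan_a.
snap = latest_snapshot(readings, at_time=0.08)
```

Real systems go far beyond this (clock synchronization, coordinate transforms, probabilistic filtering), but every visualization tool like AVS starts from some version of this time-aligned snapshot.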
Uber has an in-house system called AVS (Autonomous Visualization System). It runs in a web browser and combines sensor log data in real time to render the car and its surroundings in 3D. Uber uses AVS to review test drives and to troubleshoot its software when a car makes a mistake.
The results are spectacular. You can watch a clip here: https://youtu.be/IL4k9ECHo9c
And to help other self-driving car companies, Uber has just open-sourced its entire AVS stack!
The software comes in two packages: XVIZ, a data layer that serializes and streams sensor data from the server, and streetscape.gl, a client-side library that renders the 3D environment in the browser.
You can download them both here: https://avs.auto/#/
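The split between the two packages is essentially a wire format plus a renderer: the server pushes timestamped frames, and the browser draws them. Here is a sketch of what one such streamed frame might look like: a vehicle pose plus a batch of 3D primitives to draw. The field names below are hypothetical and are not the actual XVIZ schema; see the link above for the real specification.

```python
import json

# A hypothetical streamed frame for a browser-based 3D viewer.
# NOTE: illustrative field names only -- NOT the real XVIZ schema.
frame = {
    "timestamp": 1045.27,                    # seconds into the drive log
    "vehicle_pose": {
        "position": [12.4, -3.1, 0.0],       # meters, map frame (assumed)
        "heading": 1.57,                     # radians
    },
    "primitives": {
        # A handful of lidar points and one tracked object, as an example.
        "/lidar/points": {"points": [[1.0, 2.0, 0.3], [1.1, 2.1, 0.3]]},
        "/tracked_objects": {
            "polygons": [[[5, 5, 0], [6, 5, 0], [6, 6, 0], [5, 6, 0]]]
        },
    },
}

wire = json.dumps(frame)      # what a server might push over a websocket
decoded = json.loads(wire)    # what the client would parse and render
```

The appeal of this design is that the heavy sensor logs stay on the server; the browser only ever receives small, ready-to-draw frames.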
This, combined with Tesla’s earlier pledge to open up its patents, is lowering the barrier to entry in the already lucrative market for self-driving car technology.
Expect to see major improvements in the coming years!