Argo AI is releasing a set of curated data and high-definition maps to researchers for free, making it the latest company in the autonomous vehicle industry to open source some of the information it has captured while developing and testing self-driving cars.
The aim, the Ford Motor-backed company says, is to let academic researchers study the impact that HD maps have on perception and forecasting tasks, such as identifying and tracking objects on the road and predicting where those objects will move seconds into the future. In short, Argo sees this as a way to encourage more research, and hopefully breakthroughs, in autonomous vehicle technology.
Argo has branded this collection of data and maps Argoverse. Argo isn't releasing everything it has; this is a curated dataset, after all. Still, it's a large enough batch to give researchers something to dig into and model.
This Argoverse contains a selection of data, including two HD maps with lane centerlines, traffic direction and ground height, collected on roads in Pittsburgh and Miami.
Argoverse also has a motion forecasting dataset with 3D tracking annotations for 113 scenes and more than 300,000 vehicle trajectories, including unprotected left turns and lane changes, and provides a benchmark to promote testing, teaching and learning, according to the website. There is also an API to connect the map data with sensor information.
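To give a sense of what connecting map data to tracked vehicles looks like in practice, here is a minimal, purely hypothetical sketch. The `LaneSegment` structure and `nearest_lane` function are illustrative stand-ins, not Argo's actual API; the fields mirror the map layers (lane centerlines, traffic direction, ground height) described above.

```python
import math
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]

@dataclass
class LaneSegment:
    """Hypothetical HD-map lane record, modeled on the layers Argo describes."""
    lane_id: int
    centerline: List[Point]   # ordered (x, y) waypoints of the lane center
    traffic_direction: str    # e.g. "one_way"
    ground_height: float      # ground elevation in meters

def nearest_lane(lanes: List[LaneSegment], position: Point) -> LaneSegment:
    """Return the lane whose centerline passes closest to a tracked position."""
    def dist_to_lane(lane: LaneSegment) -> float:
        return min(math.dist(position, p) for p in lane.centerline)
    return min(lanes, key=dist_to_lane)

# Two parallel toy lanes; a tracked vehicle at (4, 4) sits closer to lane 2.
lanes = [
    LaneSegment(1, [(0.0, 0.0), (10.0, 0.0)], "one_way", 0.2),
    LaneSegment(2, [(0.0, 5.0), (10.0, 5.0)], "one_way", 0.3),
]
print(nearest_lane(lanes, (4.0, 4.0)).lane_id)  # prints 2
```

Associating each tracked object with a lane this way is what lets a forecasting model use map context (lane direction, upcoming geometry) rather than raw coordinates alone.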
Argo isn’t the first autonomous vehicle company to open source its data or other tools. But the company says this batch of data is unique because it includes HD maps, which are considered one of the critical components for self-driving vehicles.
Much of the attention in the world of autonomous vehicles is on the “brain” of the car. But these vehicles also need maps, which deliver information that helps them, whether they're operating in a warehouse or on public roads, take the safest and most efficient route possible.
For researchers, access to these kinds of maps makes it possible to develop new prediction and tracking methods.
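One toy illustration of why map access helps prediction: instead of extrapolating a vehicle's velocity in a straight line, a forecaster can advance the prediction along the lane centerline, so the forecast follows the road's curvature. This is a simplified sketch under assumed data, not any method from Argoverse itself.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]

def forecast_along_centerline(history: List[Point],
                              centerline: List[Point],
                              steps: int) -> List[Point]:
    """Toy map-aware forecast: keep the observed per-step speed, but advance
    the predicted position along the lane centerline rather than in a line."""
    speed = math.dist(history[-2], history[-1])  # distance covered per timestep
    # Start from the centerline waypoint nearest the last observed position.
    i = min(range(len(centerline)),
            key=lambda k: math.dist(centerline[k], history[-1]))
    pos, out = centerline[i], []
    for _ in range(steps):
        remaining = speed
        while remaining > 0 and i + 1 < len(centerline):
            seg = math.dist(pos, centerline[i + 1])
            if seg <= remaining:           # consume the whole segment
                remaining -= seg
                i += 1
                pos = centerline[i]
            else:                          # stop partway along the segment
                t = remaining / seg
                pos = (pos[0] + t * (centerline[i + 1][0] - pos[0]),
                       pos[1] + t * (centerline[i + 1][1] - pos[1]))
                remaining = 0.0
        out.append(pos)
    return out

# A centerline that bends right: a straight-line extrapolation would leave
# the road, while the map-aware forecast bends with it.
curve = [(0.0, 0.0), (5.0, 0.0), (8.0, 2.0), (8.0, 7.0)]
print(forecast_along_centerline([(0.0, 0.0), (2.0, 0.0)], curve, 3))
```

Real forecasting models are learned rather than hand-written like this, but the map plays the same role: it supplies the geometric prior that raw sensor data alone does not.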
Earlier this week, Cruise announced it would share a data visualization tool it created called Webviz. The web-based application takes raw data collected from all the sensors on a robot — a category that includes autonomous vehicles — and turns that binary data into visuals. Cruise’s tool lets users configure different layouts of panels, each one displaying information like text logs, 2D charts and 3D depictions of the AV’s environment.
And last year, Aptiv released nuScenes, a large-scale dataset from an autonomous vehicle sensor suite.
This post originally appeared at http://feedproxy.google.com/~r/Techcrunch/~3/hKvVglYWCgE/.