Bird footage to speed up R&D in autonomous vehicles
When building autonomous vehicles, “federated learning” — training an algorithm across multiple decentralized devices or servers without exchanging locally stored data samples — plays a crucial role.
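The core idea above can be sketched in a few lines. The following is a minimal illustration of federated averaging on a toy linear model: each simulated client trains on its own private dataset, and only the resulting model weights are sent to the server for aggregation, never the raw samples. All names, the toy model, and the hyperparameters here are illustrative assumptions, not Zenseact's actual setup.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=20):
    """One client's local training: gradient descent on a linear model.
    Only the updated weights leave the client, not (X, y)."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_average(client_weights, client_sizes):
    """Server step: average client models, weighted by dataset size."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

# Two clients, each holding private data that never leaves the "device".
clients = []
for n in (40, 60):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + 0.01 * rng.normal(size=n)
    clients.append((X, y))

global_w = np.zeros(2)
for _ in range(10):  # communication rounds
    updates = [local_update(global_w, X, y) for X, y in clients]
    global_w = federated_average(updates, [len(y) for _, y in clients])

# global_w now approximates true_w, learned without pooling any raw data.
```

Frameworks such as TensorFlow Federated or Flower implement this pattern at scale, but the division of labor is the same: local training on private data, central aggregation of model parameters only.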
Nonetheless, there are still data challenges even when training models in a federated way. Zenseact has access to autonomous driving data that can’t be shared freely in R&D projects with external partners. Even within Zenseact’s own organization, there are regulatory hurdles to overcome when moving raw data and untrained models across borders.
Proxy data are datasets that share many of the features of the data relevant to the solution you are aiming to build, minus the regulatory constraints attached to that data.
In this specific case, video recordings of seabirds breeding on the cliffs of Stora Karlsö off the coast of Gotland, Sweden, turned out to be a perfect match for Zenseact’s needs.
The knowledge gained from working with the proxy data, often in collaboration with others, speeds things up.
Zenseact is developing a software platform for autonomous driving. A lot of researchers’ time is spent on proof-of-concept projects, i.e. demonstrating that state-of-the-art methods or ideas have practical potential.
Proxy data enables organisations to collaborate and AI Sweden’s Data Factory provides the infrastructure to make it happen. With open datasets shared by AI Sweden’s partners, sometimes functioning as proxy data for various kinds of applications, the Data Factory supports fast innovation.
In this case, the challenges posed by the bird footage are similar to those faced when developing solutions for autonomous driving: for example, the footage contains variations in light conditions, and differences in the birds’ appearance resemble differences between pedestrians.
The result: faster development times, easier collaboration with other organizations, and the possibility to build different solutions based on the same shared dataset.