Planetary Rover Localization in Virtual Reality Environment via Orbital and Surface Imagery Learnt Embeddings

Note: We are unable to review this paper.

PubDate: December 2021

Teams: University of Girona;European Space Agency

Writers: Valerio Franchi; Evridiki Ntagiou

PDF: Planetary Rover Localization in Virtual Reality Environment via Orbital and Surface Imagery Learnt Embeddings

Abstract

The localization system localizes a rover moving on a planetary terrain within a bird's-eye-view map of the environment in which it operates, using image data from the vehicle's front-facing monocular camera. Since the system is intended to operate on a planetary terrain where data is not readily available, an artificial model in the form of Virtual Reality (VR) is adopted, despite several limitations including a lack of features and textures compared to real-world data. A Unity VR simulation environment loaded with artificially built lunar terrains is used to gather the training data, which is augmented using a Generative Adversarial Network (GAN). The localization system consists of a Monte Carlo Localization algorithm that employs visual odometry to propagate particles in the environment and a Siamese Neural Network (SNN) to evaluate the estimate of each particle's location within the map. The GAN was able to generate realistic images with high similarity to the VR lunar environment, while the rover localization system achieved a positional error below 25 metres after a minimum of 4 iterations of the algorithm, inside a 600 metre by 600 metre map. Additionally, the error was observed to decrease consistently with every new time step.
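The particle-filter loop described above can be illustrated with a minimal sketch. This is not the authors' code: `similarity_fn` here is a hypothetical stand-in for the SNN comparison between the camera image embedding and the orbital-map patch embedding at a candidate location, and the odometry input is simplified to a planar displacement.

```python
import numpy as np

def mcl_step(particles, odom, similarity_fn, map_size=600.0, motion_noise=2.0):
    """One Monte Carlo Localization iteration: propagate particles with a
    visual-odometry displacement, weight them by an embedding-similarity
    score, and resample. `particles` is an (N, 2) array of map positions."""
    n = len(particles)
    # Motion update: apply the odometry displacement plus Gaussian noise.
    particles = particles + odom + np.random.normal(0.0, motion_noise, particles.shape)
    particles = np.clip(particles, 0.0, map_size)  # stay inside the 600 m map
    # Measurement update: weight each particle by the similarity between the
    # rover's camera embedding and the map embedding at the particle's location
    # (the role played by the Siamese network in the paper).
    weights = np.array([similarity_fn(p) for p in particles])
    weights = np.maximum(weights, 1e-12)
    weights /= weights.sum()
    # Systematic resampling: low-variance draw proportional to the weights.
    positions = (np.arange(n) + np.random.uniform()) / n
    idx = np.minimum(np.searchsorted(np.cumsum(weights), positions), n - 1)
    return particles[idx]
```

With a similarity function that peaks at the true rover position, repeated `mcl_step` calls concentrate the particle cloud around that position, mirroring the reported behaviour of the error shrinking with each new time step.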
