Calibrating point clouds and velocimetry from an FMCW Lidar sensor against synthetic data generated by our rendering software.
Status (Feb 2024): Lidar simulation validated against hardware. Statistical agreements and disagreements between rendered and real data are reported (results below; paper to follow). Poster presented at the AAS GNC 2024 conference, Breckenridge, CO. Available upon request.
Coming up: File cleanup.
Process: We collect FMCW datasets from real-world scenes using an Aeva 4D Lidar (example). We also collect ground-truth trajectory information (poses, velocities) using a Vicon motion-capture system. 3D assets are generated from a dense reconstruction of the real-world scene. We simulate the physics of the FMCW Lidar using custom ray-tracing software built at the LASR laboratory. Finally, we compare the statistical similarity between the rendered and sensor outputs.
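The final comparison step can be sketched with a simple distributional test. The snippet below is a minimal illustration, not our actual analysis code: it computes a two-sample Kolmogorov-Smirnov statistic between a real and a rendered per-point quantity (here, hypothetical range returns; the same idea applies to Doppler velocities). All variable names and the synthetic data are assumptions for illustration.

```python
import numpy as np

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic between 1-D samples a and b."""
    a, b = np.sort(np.asarray(a, float)), np.sort(np.asarray(b, float))
    grid = np.concatenate([a, b])
    # Empirical CDFs of both samples evaluated on the pooled grid.
    cdf_a = np.searchsorted(a, grid, side="right") / a.size
    cdf_b = np.searchsorted(b, grid, side="right") / b.size
    return float(np.abs(cdf_a - cdf_b).max())

rng = np.random.default_rng(0)
# Hypothetical per-point range measurements (meters): sensor vs. renderer.
real_ranges = rng.normal(100.0, 0.5, 5000)
rendered_ranges = rng.normal(100.0, 0.5, 5000)
print(f"KS statistic: {ks_statistic(real_ranges, rendered_ranges):.4f}")
```

A small statistic indicates the rendered and measured distributions are close; in practice one would run such tests per scan and per channel (range, intensity, Doppler).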
Note: We used two datasets (shown below) to validate our results: an asteroid wall (landing approach) and a spinning satellite (proximity operations). The datasets and rendering software are not public; reach out to me for details.
If you use our software, approach, or analysis in a scientific publication, we would appreciate the following citation:
@article{eapen2022narpa,
  title={NaRPA: Navigation and Rendering Pipeline for Astronautics},
  author={Eapen, Roshan Thomas and Bhaskara, Ramchander Rao and Majji, Manoranjan},
  journal={arXiv preprint arXiv:2211.01566},
  year={2022}
}
This project is licensed under the Attribution-NonCommercial-NoDerivs 3.0 Unported (CC BY-NC-ND 3.0) License.