This is a large-scale, long-term autonomy dataset for robotics research collected on the University of Michigan’s North Campus. The dataset consists of omnidirectional imagery, 3D lidar, planar lidar, GPS, and proprioceptive sensors for odometry, collected using a Segway robot. It was collected to facilitate research on long-term autonomous operation in changing environments.

The dataset comprises 27 sessions spaced approximately biweekly over the course of 15 months. The sessions repeatedly explore the campus, both indoors and outdoors, on varying trajectories and at different times of day across all four seasons. This allows the dataset to capture many challenging elements, including moving obstacles (e.g., pedestrians, bicyclists, and cars), changing lighting, varying viewpoints, seasonal and weather changes (e.g., falling leaves and snow), and long-term structural changes caused by construction projects. To further facilitate research, we also provide ground-truth pose for all sessions in a single frame of reference (see the loading sketch after the citations below).

A detailed description of the dataset and the methods used to generate it is given in the document nclt.pdf. If you use this dataset in your research, please cite:
Carlevaris-Bianco, N., Ushani, A., & Eustice, R. (2021). The University of Michigan North Campus Long-Term Vision and LIDAR Dataset [Data set]. University of Michigan - Deep Blue. https://doi.org/10.7302/7rnm-6a03

Carlevaris-Bianco, N., Ushani, A., & Eustice, R. (2016). University of Michigan North Campus long-term vision and lidar dataset. The International Journal of Robotics Research, 35(9), 1023-1035. https://doi.org/10.1177/0278364915614638
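As a brief illustration of working with the ground-truth pose data, here is a minimal Python sketch of loading one session's poses and computing the planar distance traveled. The filename and column layout are assumptions for illustration (one CSV row per pose: timestamp in microseconds, then x, y, z, roll, pitch, yaw); consult nclt.pdf for the authoritative file format.

import csv
import math

def load_groundtruth(path):
    """Load a session's ground-truth poses from a CSV file.

    Assumed (hypothetical) row layout:
    utime [us], x [m], y [m], z [m], roll [rad], pitch [rad], yaw [rad].
    """
    poses = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            # Skip blank lines and comment rows, if any.
            if not row or row[0].startswith("#"):
                continue
            utime = int(float(row[0]))
            x, y, z, roll, pitch, yaw = (float(v) for v in row[1:7])
            poses.append((utime, x, y, z, roll, pitch, yaw))
    return poses

# Example usage: approximate planar distance traveled in one session.
# The filename below is illustrative, not a guaranteed dataset path.
poses = load_groundtruth("groundtruth_2012-01-08.csv")
dist = sum(
    math.hypot(b[1] - a[1], b[2] - a[2])
    for a, b in zip(poses, poses[1:])
)
print(f"{len(poses)} poses, approx {dist / 1000:.1f} km traveled")

Because all sessions share a single frame of reference, poses loaded this way from different sessions can be compared directly, which is convenient for studying long-term change across the 15-month collection period.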