Search Results
-
- Creator:
- Lin, Brian T. W.
- Description:
- This footage is an output of a USDOT-funded project titled "Development of Machine-Learning Models for Autonomous Vehicle Decisions on Weaving Sections of Freeway Ramps." It showcases an automated weaving maneuver within an augmented reality environment. During the demonstration, Mcity's automated vehicle navigates through a highway weaving section, making a lane change while interacting with a virtual vehicle. In this instance, Mcity's vehicle was operated by automated driving systems, which executed the lane change based on its detection of external environmental factors and parameter inputs received from the virtual vehicle.
- Discipline:
- Engineering
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection created to facilitate research on the use of novel sensors for autonomous vehicle perception. The data collection platform is a Ford Fusion with a roof-mounted novel sensing suite, consisting of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C), all time-synchronized with ground-truth poses from a high-precision navigation system. Further information and resources (such as software tools for converting, managing, and viewing data files) are available on the project website: https://umautobots.github.io/nsavp. CHANGE NOTICE (January 2024): We identified an error in our timestamp post-processing procedure that caused all camera timestamps to be offset by the exposure time of one of the cameras. We corrected the error, reapplied the post-processing, and reuploaded the corrected files; the change affects all camera data files. Prior to the correction, the timestamps between the cameras were synchronized with sub-millisecond accuracy, but the camera and ground-truth pose timestamps were offset by up to 0.4 ms, 3 ms, and 15 ms in the afternoon, sunset, and night sequences, respectively, amounting to up to ~0.25 meters of position error in the night sequences. For consistency, camera calibration was rerun with the corrected calibration sequence files; the calibration results have therefore been updated as well, although they did not change significantly. Finally, we previously downsampled the frame data in the uploaded calibration sequence, but the reupload provides the full frame data.
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://sites.google.com/umich.edu/novelsensors2023, https://github.com/umautobots/nsavp_tools, and https://umautobots.github.io/nsavp
- Discipline:
- Engineering
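The change notice above relates a timestamp offset (up to 15 ms in the night sequences) to a position error (~0.25 m). A rough back-of-envelope check, assuming the interpolated ground-truth position shifts by roughly speed × offset, and assuming a vehicle speed of ~16.7 m/s (60 km/h), which is not stated in the record:

```python
# Back-of-envelope check of the timestamp-offset impact described in the
# change notice: a constant pose-timestamp offset of dt seconds at vehicle
# speed v (m/s) shifts the interpolated ground-truth position by ~v * dt m.

def position_error_m(speed_mps: float, offset_s: float) -> float:
    """Approximate position error from a constant timestamp offset."""
    return speed_mps * offset_s

# Offsets reported per sequence type; 16.7 m/s (60 km/h) is an assumed speed.
for label, offset_ms in [("afternoon", 0.4), ("sunset", 3.0), ("night", 15.0)]:
    err = position_error_m(16.7, offset_ms / 1000.0)
    print(f"{label}: up to {err:.3f} m")
```

At that assumed speed, the 15 ms night-sequence offset yields ~0.25 m, matching the error reported in the notice.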
-
- Creator:
- Sheppard, Anja, Sethuraman, Advaith V, Bagoren, Onur, Pinnow, Christopher, Anderson, Jamey, Havens, Timothy C, and Skinner, Katherine A
- Description:
- The AI4Shipwrecks dataset contains sidescan sonar images of shipwrecks and corresponding binary labels collected during 2022 and 2023 at the NOAA Thunder Bay National Marine Sanctuary in Alpena, MI. The data collection platform was an Iver3 Autonomous Underwater Vehicle (AUV) equipped with an EdgeTech 2205 dual-frequency ultra-high-resolution sidescan sonar and 3D bathymetric system. The labels were compiled from reference labels created by experts in marine archaeology. The intended use of this dataset is to encourage development of semantic segmentation, object detection, and anomaly detection algorithms in computer vision; comparisons of state-of-the-art segmentation networks on the dataset are shown in the paper. The file structure is organized as described in the README.txt file: images in 'images' directories are the waterfall product of sidescan sonar surveys, and images in 'labels' directories are binary representations of the expert labels. Files across the 'images' and 'labels' directories are paired by identical filenames. In a label image, a pixel value of '0' represents the non-shipwreck/other class and '1' represents the shipwreck class for the correspondingly named image (<wreck_name>_<##>.png) in the images directory. The project webpage can be found at: https://umfieldrobotics.github.io/ai4shipwrecks/
- Keyword:
- machine learning, computer vision, field robotics, marine robotics, underwater robotics, sidescan sonar, semantic segmentation, and object detection
- Discipline:
- Engineering
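The description above pairs sidescan images with binary labels by identical filenames across 'images' and 'labels' directories. A minimal sketch of that pairing convention, assuming only the directory layout stated in the record (the helper name `paired_files` is our own):

```python
# Minimal sketch of pairing sidescan survey images with their binary expert
# labels, following the layout described in the dataset record: files in
# 'images' and 'labels' directories are matched by identical filenames.
from pathlib import Path


def paired_files(root: str):
    """Yield (image_path, label_path) pairs matched by identical filenames."""
    images = Path(root) / "images"
    labels = Path(root) / "labels"
    for img in sorted(images.glob("*.png")):
        lbl = labels / img.name
        if lbl.exists():  # skip images with no corresponding label file
            yield img, lbl
```

In each yielded label image, pixel value 0 marks the non-shipwreck/other class and 1 marks the shipwreck class, per the record.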
-
- Creator:
- Hawes, Jason K, Goldstein, Benjamin P. , Newell, Joshua P. , Dorr, Erica , Caputo, Silvio , Fox-Kämper, Runrid , Grard, Baptiste , Ilieva, Rositsa T. , Fargue-Lelièvre, Agnès , Poniży, Lidia , Schoen, Victoria , Specht, Kathrin , and Cohen, Nevin
- Description:
- Urban agriculture (UA) is a widely proposed strategy to make cities and urban food systems more sustainable. However, its carbon footprint remains understudied; in fact, the few existing studies suggest that UA may be worse for the climate than conventional agriculture. This is the first large-scale study to resolve this uncertainty across cities and types of UA, employing citizen science at 73 UA sites in Europe and the United States to compare UA products to food from conventional farms. The results reveal that food from UA is six times as carbon intensive as conventional agriculture (420 g vs. 70 g CO2 equivalent per serving). Some UA crops (e.g., tomatoes) and sites (e.g., 25% of individually managed gardens), however, outperform conventional agriculture. These exceptions suggest that UA practitioners can reduce their climate impacts by cultivating crops that are typically greenhouse-grown or air-freighted, maintaining UA sites for many years, and leveraging waste as inputs. This database contains the necessary reference material to trace the path of our analysis from raw garden data to carbon footprint and nutrient results. It also contains the final results of the analyses in various extended forms not available in the publication. For more information, see the manuscript at the link below. (Introduction partially quoted from Hawes et al., 2023)
- Citation to related publication:
- Hawes, J. K., Goldstein, B. P., Newell, J. P., Dorr, E., Caputo, S., Fox-Kämper, R., Grard, B., Ilieva, R. T., Fargue-Lelièvre, A., Poniży, L., Schoen, V., Specht, K., & Cohen, N. (2024). Comparing the carbon footprints of urban and conventional agriculture. Nature Cities, 1–10. https://doi.org/10.1038/s44284-023-00023-3
- Discipline:
- Engineering
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection created to facilitate research on the use of novel sensors for autonomous vehicle perception. The data collection platform is a Ford Fusion with a roof-mounted novel sensing suite, consisting of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C), all time-synchronized with ground-truth poses from a high-precision navigation system. Further information and resources (such as software tools for converting, managing, and viewing data files) are available on the project website: https://umautobots.github.io/nsavp
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://umautobots.github.io/nsavp, https://github.com/umautobots/nsavp_tools, and https://sites.google.com/umich.edu/novelsensors2023
- Discipline:
- Engineering