Search Results
-
- Creator:
- Agnit Mukhopadhyay, Sanja Panovska, Raven Garvey, Michael Liemohn, Natalia Ganjushkina, Austin Brenner, Ilya Usoskin, Michael Balikhin, and Daniel Welling
- Description:
- In the recent geological past, Earth's magnetic field weakened to about 4% of its modern value and the magnetic poles moved far from the geographic poles during the Laschamps geomagnetic excursion, roughly 41,000 years ago. The excursion lasted about two millennia, with the peak field reduction and dipole tilt confined to a shorter period of roughly 300 years. During this period, the geomagnetic field differed markedly from the modern, nearly axis-aligned dipolar field: non-dipole components produced a field resembling those of the outer planets, but at greatly reduced strength. However, the precise magnetospheric configuration and its electrodynamic coupling with the atmosphere have remained critically understudied. This dataset contains the first space plasma investigation of the geomagnetic conditions in the near-Earth space environment during the excursion. The study contains a full 3D reconstruction and analysis of the geospace system, including the intrinsic geomagnetic field, the magnetospheric system, and the upper atmosphere, linked in sequence through feedback channels for distinct temporal epochs. The reconstruction was conducted using the LSMOD.2 model, the Block Adaptive Tree Solar wind-Roe-Upwind Scheme (BATS-R-US) model, and the MAGnetosphere-Ionosphere-Thermosphere (MAGNIT) auroral precipitation model, all of which are publicly available. The dataset contains the raw data from each of these models, in addition to the images and post-processing results generated using them. Paleomagnetic data produced by LSMOD.2 can be visualized using a combination of linear and contour plotting tools commonly available in visualization software such as Python (e.g. Python/Matplotlib) or MATLAB. Standard tools to read and visualize BATS-R-US and MAGNIT output are publicly available in IDL and Python (see SpacePy/PyBats - https://spacepy.github.io/pybats.html).
For details about the post-processed data, visualization, and analysis, please contact the authors. The anthropological dataset can be visualized using a shapefile reader (e.g. Python/GeoPandas) and a linear plotting tool (e.g. Python/Matplotlib).
- Discipline:
- Engineering and Science
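The "4% of modern values" figure above can be put on an absolute scale with a quick calculation. This is an illustration only: the modern dipole moment used here (~8.0e22 A m²) is a commonly cited approximation, not a number taken from this dataset.

```python
# Rough scale of the Laschamps field minimum relative to the present day.
# The modern dipole moment below is an approximate textbook value, NOT a
# quantity read from this dataset.
MODERN_DIPOLE_MOMENT = 8.0e22  # A m^2, approximate present-day value

def excursion_moment(fraction: float) -> float:
    """Dipole moment at a given fraction of the modern value."""
    return fraction * MODERN_DIPOLE_MOMENT

# At the excursion peak the field fell to roughly 4% of modern strength.
minimum_moment = excursion_moment(0.04)
print(f"{minimum_moment:.1e} A m^2")  # ~3.2e21 A m^2
```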
-
- Creator:
- Klinich, Kathleen D., Hu, Jingwen, Boyle, Kyle J., Manary, Miriam A., and Orton, Nichole R.
- Description:
- As part of a project to develop side impact test procedures for evaluating wheelchairs, wheelchair tiedowns and occupant restraint systems (WTORS), and vehicle-based occupant protection systems for wheelchair seating stations, we created validated finite element (FE) models to support procedure development. Models were constructed using LS-DYNA. Dynamic sled tests were performed to validate the FE models of surrogate fixtures and commercial hardware. Validated FE models were developed for the Surrogate wheelchair base (SWCB), Surrogate wheelchair for side impact (SWCSI), a manual wheelchair (Ki Mobility Catalyst 5), and a power wheelchair (Quantum Rehab Edge 2.0). Additional FE models of a heavy-duty anchor meeting the Universal Docking Interface Geometry (UDIG), surrogate four-point strap tiedowns (SWTORS), a traditional docking station, and the surrogate wall fixture were also developed.
- Keyword:
- finite element, wheelchair, transportation, and tiedown
- Discipline:
- Engineering
-
- Creator:
- Lee, Shih Kuang, Tsai, Sun Ting, and Glotzer, Sharon C.
- Description:
- The trajectory data and code were generated for our work "Classification of complex local environments in systems of particle shapes through shape-symmetry encoded data augmentation" (under peer review). The datasets contain trajectory data in GSD file format for 7 test systems, including cubic structures, two-dimensional and three-dimensional patchy particle shape systems, hexagonal bipyramids with two aspect ratios, and truncated shapes with two degrees of truncation. In addition, the corresponding Python code and Jupyter notebook used to perform data augmentation, MLP classifier training, and MLP classifier testing are included.
- Keyword:
- Machine Learning, Colloids Self-Assembly, Crystallization, and Order Parameter
- Citation to related publication:
- https://doi.org/10.48550/arXiv.2312.11822
- Discipline:
- Other, Science, and Engineering
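As a rough illustration of the shape-symmetry encoded augmentation idea described above — a simplified sketch, not the authors' actual implementation — a local 2D neighbor environment can be replicated under an n-fold rotational symmetry so a classifier sees all symmetry-equivalent poses. All names here are illustrative.

```python
import math

def augment_by_symmetry(neighbors, n_fold):
    """Return n_fold rotated copies of a 2D neighbor environment.

    Sketch of symmetry-encoded data augmentation: each copy rotates every
    neighbor vector by a multiple of 2*pi/n_fold, producing the
    symmetry-equivalent views of the same local environment.
    """
    copies = []
    for k in range(n_fold):
        theta = 2.0 * math.pi * k / n_fold
        c, s = math.cos(theta), math.sin(theta)
        copies.append([(c * x - s * y, s * x + c * y) for x, y in neighbors])
    return copies

# A square-like environment of two neighbor vectors, augmented 4-fold.
env = [(1.0, 0.0), (0.0, 1.0)]
copies = augment_by_symmetry(env, 4)
print(len(copies))  # 4
```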
-
- Creator:
- Lin, Brian T. W.
- Description:
- This footage is an output of a USDOT-funded project titled "Development of Machine-Learning Models for Autonomous Vehicle Decisions on Weaving Sections of Freeway Ramps." It showcases an automated weaving maneuver within an augmented reality environment. During the demonstration, Mcity's automated vehicle navigates through a highway weaving section, making a lane change while interacting with a virtual vehicle. In this instance, Mcity's vehicle was operated by automated driving systems, which executed the lane change based on the detection of external environmental factors and parameter inputs received from the virtual vehicle.
- Discipline:
- Engineering
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection created to facilitate research in the use of novel sensors for autonomous vehicle perception. The data collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, consisting of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C), time-synchronized with ground truth poses from a high-precision navigation system. Further information and resources (such as software tools for converting, managing, and viewing data files) are available on the project website: https://umautobots.github.io/nsavp. CHANGE NOTICE (January 2024): We identified an error in our timestamp post-processing procedure that caused all camera timestamps to be offset by the exposure time of one of the cameras. We corrected the error, applied the corrected post-processing, and reuploaded the corrected files. The change impacts all camera data files. Prior to the change, the timestamps between the cameras were synchronized with submillisecond accuracy, but the camera and ground truth pose timestamps were offset by up to 0.4 ms, 3 ms, and 15 ms in the afternoon, sunset, and night sequences, respectively. This amounted to up to ~0.25 meters of position error in the night sequences. For consistency, camera calibration was rerun with the corrected calibration sequence files. The camera calibration results have therefore been updated as well, although they have not changed significantly. Finally, we previously downsampled the frame data in the uploaded calibration sequence, but we decided to provide the full frame data in the reupload.
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://sites.google.com/umich.edu/novelsensors2023, https://github.com/umautobots/nsavp_tools, and https://umautobots.github.io/nsavp
- Discipline:
- Engineering
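The error figures in the change notice can be related by a back-of-envelope calculation — an illustration only, not part of the dataset: a timing offset between the camera and pose streams maps to position error in proportion to vehicle speed. The ~16.7 m/s speed below is an assumed value chosen to match the quoted numbers.

```python
def position_error(offset_s: float, speed_mps: float) -> float:
    """Worst-case position error induced by a timestamp offset at a given speed."""
    return offset_s * speed_mps

# The 15 ms night-sequence offset at an assumed ~16.7 m/s (~60 km/h)
# yields roughly the ~0.25 m error quoted in the change notice.
err = position_error(0.015, 16.7)
print(f"{err:.2f} m")  # ≈ 0.25 m
```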