Search Results
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection created to facilitate research in the use of novel sensors for autonomous vehicle perception. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C), time-synchronized with ground truth poses from a high-precision navigation system. Further information and resources (such as software tools for converting, managing, and viewing data files) are available on the project website: https://umautobots.github.io/nsavp. CHANGE NOTICE (January 2024): We identified an error in our timestamp post-processing procedure that caused all camera timestamps to be offset by the exposure time of one of the cameras. We corrected the error, applied the corrected post-processing, and reuploaded the corrected files. The change impacts all camera data files. Prior to the change, the timestamps between the cameras were synchronized with submillisecond accuracy, but the camera and ground truth pose timestamps were offset by up to 0.4 ms, 3 ms, and 15 ms in the afternoon, sunset, and night sequences, respectively. This amounted to up to ~0.25 meters of position error in the night sequences. For consistency, camera calibration was rerun with the corrected calibration sequence files. The camera calibration results have therefore been updated as well, although they have not changed significantly. Finally, we previously downsampled the frame data in the uploaded calibration sequence, but we decided to provide the full frame data in the reupload.
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://sites.google.com/umich.edu/novelsensors2023, https://github.com/umautobots/nsavp_tools, and https://umautobots.github.io/nsavp
- Discipline:
- Engineering
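The change notice above describes a constant-offset error: every camera timestamp was shifted by one camera's exposure time. As an illustration only (the function name and the numbers are hypothetical, not taken from the nsavp_tools repository), the correction amounts to subtracting a fixed offset from each timestamp:

```python
# Hypothetical sketch of a constant-offset timestamp correction like the
# one described in the change notice; names and values are illustrative.

def correct_timestamps(timestamps_ns, offset_ns):
    """Subtract a constant exposure-time offset from camera timestamps."""
    return [t - offset_ns for t in timestamps_ns]

# Example: a 3 ms offset, the sunset-sequence worst case cited above.
raw = [1_000_000_000, 1_050_000_000, 1_100_000_000]
corrected = correct_timestamps(raw, 3_000_000)
```

Because the offset is identical for all frames, inter-camera synchronization was unaffected; only the alignment against the ground truth pose timestamps drifted, which is why the position error scales with vehicle speed at night.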
-
- Creator:
- Yining Shi
- Description:
- Statistical study of residuals larger than 300 nT between Swarm observations and the IGRF-13 geomagnetic field model, in the northern and southern hemispheres. Data analysis was done using the VirES client (https://viresclient.readthedocs.io/en/latest/). These data were generated to conduct a statistical study of the locations of large residuals in the two hemispheres, for a better understanding of the potential error in satellite aviation applications when Earth magnetic field models like IGRF are used as references, as well as of the energy transfer in magnetosphere-ionosphere-thermosphere coupling. Interhemispheric asymmetries are found in the locations of the large residuals due to the difference in geographic pole locations.
- Discipline:
- Engineering
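The selection described above (observation-minus-model residuals exceeding 300 nT, split by hemisphere) can be sketched as follows. The (latitude, observation, model) record layout is an assumption for illustration, not the VirES client API:

```python
# Illustrative sketch of the selection described above: keep samples
# whose residual against the model exceeds 300 nT, split by hemisphere.
# The (lat, b_obs, b_model) record layout is an assumption.

THRESHOLD_NT = 300.0

def split_large_residuals(samples):
    """Return (northern, southern) lists of (latitude, residual) pairs."""
    north, south = [], []
    for lat, b_obs, b_model in samples:
        residual = b_obs - b_model
        if abs(residual) > THRESHOLD_NT:
            (north if lat >= 0.0 else south).append((lat, residual))
    return north, south

north, south = split_large_residuals([
    (70.0, 45400.0, 45000.0),   # +400 nT residual, northern hemisphere
    (-75.0, 48000.0, 48500.0),  # -500 nT residual, southern hemisphere
    (10.0, 31000.0, 31100.0),   # -100 nT, below threshold, dropped
])
```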
-
- Creator:
- Rivera-Rivera, Luis Y., Moore, Timothy C., and Glotzer, Sharon C.
- Description:
- The dataset is organized as follows: the data for each of the three target structures are contained within a directory bearing the structure's name (kagome, pyrochlore, and snub-square). Within each structure directory, data obtained from alchemical and self-assembly simulations are separated into alchem and self-assembly directories, respectively. An additional suboptimal-self-assembly directory is present only for the snub-square structure and contains the data for the pattern registration analysis discussed in the SI. For a detailed description of each file contained within each directory, please refer to the README file.
- Keyword:
- inverse design, self-assembly, triblock Janus particles, crystallization slot, and digital alchemy
- Citation to related publication:
- Rivera-Rivera, L.Y., Moore, T.C., & Glotzer, S.C. Inverse design of triblock Janus spheres for self-assembly of complex structures in the crystallization slot via digital alchemy. Soft Matter, 2023, 19, 2726-2736. doi: 10.1039/d2sm01593e
- Discipline:
- Engineering
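The directory layout described above can be traversed programmatically. This sketch uses the structure and subdirectory names from the description; the local root path is an assumption:

```python
# Hypothetical walker over the directory layout described above.
# Structure names and subdirectory names come from the description;
# the root path "dataset" is an assumption.
from pathlib import Path

STRUCTURES = ["kagome", "pyrochlore", "snub-square"]
SUBDIRS = ["alchem", "self-assembly"]

def data_dirs(root):
    """Yield (structure, kind, path) for each expected data directory."""
    root = Path(root)
    for structure in STRUCTURES:
        subdirs = list(SUBDIRS)
        if structure == "snub-square":
            # only snub-square has the pattern-registration data
            subdirs.append("suboptimal-self-assembly")
        for kind in subdirs:
            yield structure, kind, root / structure / kind

dirs = list(data_dirs("dataset"))
```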
-
- Creator:
- Thompson, Ellen P. and Ellis, Brian R.
- Description:
- Accurate prediction of physical alterations in carbonate reservoirs under dissolution is critical for the development of subsurface energy technologies. The impact of mineral dissolution on flow characteristics depends on the connectivity and tortuosity of the pore network. Persistent homology is a tool from algebraic topology that describes the size and connectivity of topological features. When applied to 3D X-ray computed tomography (XCT) imagery of rock cores, it provides a novel metric of pore network heterogeneity. Prior works have demonstrated the efficacy of persistent homology in predicting flow properties in numerical simulations of flow through porous media. Its ability to combine size, spatial distribution, and connectivity information makes it a promising tool for understanding reactive transport in complex pore networks, yet limited work has been done to apply persistence analysis to experimental studies on natural rocks. In this study, three limestone cores were imaged by XCT before and after acid-driven dissolution flow-through experiments. Each XCT scan was analyzed using persistent homology. In all three rocks, the permeability increase was driven by the growth of large, connected pore bodies. The two most homogeneous samples saw an increased effect nearer to the flow inlet, suggesting emerging preferential flow paths as the reaction front progresses. The most heterogeneous sample showed an increase in along-core homogeneity during reaction. Variability of persistence showed a moderate positive correlation with pore body size increase. Persistence heterogeneity analysis could be used to anticipate where the greatest pore size evolution may occur in a reservoir targeted for subsurface development, improving confidence in project viability.
- Keyword:
- Carbonate dissolution, X-ray computed tomography, Porous media, Topology, and Persistent homology
- Citation to related publication:
- Thompson, E.P.; Ellis, B.R. (2023) Persistent Homology as a Heterogeneity Metric for Predicting Pore Size Change in Dissolving Carbonates. In Review.
- Discipline:
- Science and Engineering
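The study above applies persistent homology to 3D cubical complexes built from XCT imagery. As a much smaller illustration of the (birth, death) pairs that persistence produces, here is 0-dimensional sublevel-set persistence of a 1D signal, implemented with a union-find and the elder rule. This is a generic textbook sketch, not the study's analysis code:

```python
# Toy 0-dimensional persistence of a 1D signal: sweep values from low
# to high; each local minimum births a component, and when two
# components merge, the younger one dies (elder rule).

def persistence_0d(values):
    """Return sorted (birth, death) persistence pairs of a 1D signal."""
    n = len(values)
    parent = [None] * n  # union-find parent; None means not yet born

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    pairs = []
    for i in sorted(range(n), key=lambda k: values[k]):
        parent[i] = i
        roots = {find(j) for j in (i - 1, i + 1)
                 if 0 <= j < n and parent[j] is not None}
        if roots:
            # elder rule: the component born earliest survives the merge
            eldest = min(roots, key=lambda r: values[r])
            for r in roots - {eldest}:
                pairs.append((values[r], values[i]))  # (birth, death)
                parent[r] = eldest
            parent[i] = eldest
    pairs.append((min(values), float("inf")))  # global minimum never dies
    return sorted(pairs)

# Signal with two basins: the shallower one (born at 1.0) dies at 2.0.
diagram = persistence_0d([0.0, 2.0, 1.0, 3.0])
```

Long-lived pairs correspond to large, well-separated pore bodies; the spread of pair lifetimes is the kind of heterogeneity metric the abstract refers to.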
-
Novel Sensors for Autonomous Vehicle Perception
User Collection - Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Buchan, Austin D., and Carmichael, Spencer
- Description:
- The Novel Sensors for Autonomous Vehicle Perception collection comprises dataset sequences collected with an autonomous vehicle platform equipped with novel sensors. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C), time-synchronized with ground truth poses from a high-precision navigation system. Sequences include ~8 km routes, driven repeatedly under varying lighting conditions and/or opposing viewpoints. Further information and resources are available on the project website: https://umautobots.github.io/nsavp
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://umautobots.github.io/nsavp, https://github.com/umautobots/nsavp_tools, and https://sites.google.com/umich.edu/novelsensors2023
- Discipline:
- Engineering
12 Works -
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection released in support of an IROS 2023 workshop publication, with a supporting website (https://sites.google.com/umich.edu/novelsensors2023). To enable new research in the area of novel sensors for autonomous vehicles, these datasets are designed for the task of place recognition with novel sensors. To our knowledge, this new dataset is the first to include stereo thermal cameras together with stereo event cameras and stereo monochrome cameras, which perform better in low light than RGB cameras. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted sensing suite, which consists of forward-facing stereo uncooled thermal cameras (FLIR Boson 640+ ADK), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C) aligned with ground truth positions from a high-precision navigation system. Sequences include ~10 km routes, which may be driven repeatedly under varying lighting conditions and feature instances of direct sunlight and low light that challenge conventional cameras. A software toolkit is provided to facilitate efficient use of the dataset, including dataset download, application of calibration parameters, and evaluation of place recognition results based on standard metrics (e.g., maximum recall at 100% precision). These software tools for converting, managing, and viewing data files can be found at the associated GitHub repository (https://github.com/umautobots/nsavp_tools).
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://sites.google.com/umich.edu/novelsensors2023 and https://github.com/umautobots/nsavp_tools
- Discipline:
- Engineering
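The evaluation metric named above, maximum recall at 100% precision, sweeps an acceptance threshold over match scores and reports the highest recall reached before the first false positive is admitted. A minimal generic sketch (not the toolkit's implementation):

```python
# Minimal sketch of maximum recall at 100% precision for place
# recognition; generic implementation, not from nsavp_tools.

def max_recall_at_full_precision(scored_matches, num_true):
    """scored_matches: (score, is_correct) pairs for retrieved matches;
    num_true: total number of true matches in the ground truth."""
    best_tp, tp, fp = 0, 0, 0
    # Sweep the acceptance threshold from the highest score downward.
    for score, correct in sorted(scored_matches, reverse=True):
        if correct:
            tp += 1
        else:
            fp += 1
        if fp == 0:  # precision is still 100% at this threshold
            best_tp = tp
    return best_tp / num_true

# Two correct matches outrank the first false positive, so recall = 2/3.
recall = max_recall_at_full_precision(
    [(0.9, True), (0.8, True), (0.7, False), (0.6, True)], num_true=3)
```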
-
- Creator:
- Wu, Ziyou and Revzen, Shai
- Description:
- At the time of its making, the data in this repository formed a nearly unique dataset: precise measurements of all contact forces of a six-legged robot during multi-legged slipping motions and regular walking. These data were collected to establish the validity of the observation presented in this article: Zhao et al. Walking is like slithering: A unifying, data-driven view of locomotion. (2022) PNAS 119(37): e2113222119. DOI: https://doi.org/10.1073/pnas.2113222119
- Keyword:
- robot, locomotion, and multilegged
- Citation to related publication:
- Science Robotics paper being submitted
- Discipline:
- Engineering
-
- Creator:
- Gill, Tate M.
- Description:
- Data are included in raw format, along with the MATLAB scripts used to process them into the final results. If there are issues or confusion regarding this data or the code, feel free to contact me at tategill@umich.edu.
- Keyword:
- Electric Propulsion
- Discipline:
- Engineering
-
- Creator:
- Hepner, Shadrach T.
- Description:
- This dataset provides evidence of the presence of a lower hybrid drift instability in a magnetic nozzle. It was used in DOI: 10.1063/5.0012668 to estimate the effective electron collision frequency induced by the instability, in the context of cross-field electron transport. It is also used in an upcoming work to determine the effective reduction in heat flux resulting from propagation along magnetic field lines.
- Keyword:
- Magnetic nozzle, heat flux, and plasma instabilities
- Citation to related publication:
- Hepner, S., Jorns, B. (2020). Wave-driven non-classical electron transport in a low temperature magnetically expanding plasma. Appl. Phys. Lett., 116, 263502. https://doi.org/10.1063/5.0012668
- Discipline:
- Engineering
-
- Creator:
- Jones, Kaylin and Cotel, Aline J
- Description:
- To enhance environmental turbulence measurements, we have designed and constructed a novel Particle Image Velocimetry (PIV) instrument intended for field use. The data contained here were either used to validate the instrument or produced by the instrument in proof-of-concept field testing.
- Keyword:
- particle image velocimetry, environmental turbulence, and field instrumentation
- Citation to related publication:
- Jones, K., and Cotel, A.J. 2023. Low-cost field particle image velocimetry for quantifying environmental turbulence. Journal of Ecohydraulics.
- Discipline:
- Engineering
-
Resources for Training Machine Learning Algorithms Using CAM6 Simple Physics Packages
User Collection- Creator:
- Limon, Garrett
- Description:
- The collection contains the code and the data used to train machine learning algorithms to emulate simplified physical parameterizations within the Community Atmosphere Model (CAM6). CAM6 is the atmospheric general circulation model (GCM) within the Community Earth System Model (CESM) framework, developed by the National Center for Atmospheric Research (NCAR). GCMs are made up of a dynamical core, responsible for the geophysical fluid flow calculations, and physical parameterization schemes, which estimate various unresolved processes. Simple physics schemes were used to train both random forests and neural networks, to explore the feasibility of using machine learning techniques in conjunction with the dynamical core to improve the efficiency of future climate and weather models. The results of the research show that various physical forcing tendencies and precipitation rates can be effectively emulated by the machine learning models.
- Keyword:
- Machine Learning, Climate Modeling, and Physics Emulators
- Discipline:
- Science and Engineering
2 Works -
- Creator:
- Lin, Austin J, Lei, Shunbo, Keskar, Aditya, Hiskens, Ian A, Johnson, Jeremiah X , Mathieu, Johanna L, Kennedy, Tim, DeMink, Scott, Morgan, Kevin, Flynn, Connor, Giessner, Paul, Anderson, David, Dongmo, Jordan, Afshari, Sina, Li, Han, and Ceilsinki, Andrew
- Description:
- This is a subset of the SHIFDR dataset collection containing data from 14 buildings in Southeast Michigan. The full dataset collection can be found at https://deepblue.lib.umich.edu/data/collections/vh53ww273?locale=en. Organization: We include a subfolder for each building, identified by name. All buildings have been renamed after lakes to protect their identity. Within each building subfolder, there are fan power (i.e., current measurements from which fan power can be computed), building automation system (BAS), whole building electrical load (WBEL), and voltage data collected over the course of our experimentation from 2017 to 2021. All experiments were conducted in the summer months, and a full schedule of Demand Response (DR) events is included with each building in the ‘Event_Schedule.csv’ file. The building information file contains general information about the buildings pertinent to the experiments we conducted. There is also a folder labeled ‘2021 Preprocessed data’ which contains combined BAS and fan power data from the summer of 2021. This data has been lightly processed to calculate fan power from current measurements and to interpolate BAS data to 1-minute intervals. These files act as an easy-to-use starting point for data analysis.
- Citation to related publication:
- A.J. Lin, S. Lei, A. Keskar, I.A. Hiskens, J.X. Johnson, and J.L. Mathieu. “The Sub-metered HVAC Implemented For Demand Response (SHIFDR) Dataset,” Submitted, 2023.
- Discipline:
- Engineering
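The preprocessing described above includes interpolating irregularly sampled BAS readings onto a 1-minute grid. A minimal sketch of that step, assuming at least two (time, value) samples; this is illustrative only, not the SHIFDR processing code:

```python
# Minimal sketch of resampling irregular samples onto a regular grid
# by linear interpolation, as in the '2021 Preprocessed data' folder
# described above. Assumes times_s is sorted with >= 2 samples.

def interpolate_to_grid(times_s, values, grid_step_s=60):
    """Linearly interpolate (time, value) samples onto a regular grid."""
    grid, out = [], []
    t = times_s[0]
    i = 0
    while t <= times_s[-1]:
        # advance to the sample interval containing t
        while times_s[i + 1] < t:
            i += 1
        t0, t1 = times_s[i], times_s[i + 1]
        v0, v1 = values[i], values[i + 1]
        frac = (t - t0) / (t1 - t0)
        grid.append(t)
        out.append(v0 + frac * (v1 - v0))
        t += grid_step_s
    return grid, out

# Samples every 90 s resampled to a 60 s grid.
grid, vals = interpolate_to_grid([0, 90, 180], [10.0, 13.0, 10.0])
```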
-
- Creator:
- Hoffmann, Alex P.
- Description:
- Research Overview: In situ magnetic field measurements are often difficult to obtain due to the presence of stray magnetic fields generated by spacecraft electrical subsystems. The conventional solution is to implement strict magnetic cleanliness requirements and place magnetometers on a deployable boom. However, this method is not always feasible on low-cost platforms due to factors such as increased design complexity, increased cost, and volume limitations. To overcome this problem, we propose using the Quad-Mag CubeSat magnetometer with an improved Underdetermined Blind Source Separation (UBSS) noise removal algorithm. The Quad-Mag consists of four magnetometer sensors in a single CubeSat form-factor card that allows distributed measurements of stray magnetic fields. The UBSS algorithm can remove stray magnetic fields without prior knowledge of the magnitude, orientation, or number of noise sources. UBSS is a two-stage algorithm that identifies signals through cluster analysis and separates them through compressive sensing. We use UBSS with single source point (SSP) detection to improve the identification of noise signals and iteratively-weighted compressed sensing to separate noise signals from the ambient magnetic field. Using a mock CubeSat, we demonstrate in the lab that UBSS reduces four noise signals producing more than 100 nT of noise at each magnetometer to below the expected instrument resolution (near 5 nT). Additionally, we show that the integrated Quad-Mag and improved UBSS system works well for 1U, 2U, 3U, and 6U CubeSats in simulation. Our results show that the Quad-Mag and UBSS noise cancellation package enables high-fidelity magnetic field measurements from a CubeSat without a boom.
- Keyword:
- source separation, demixing, magnetometers, stray magnetic fields, noise removal, and cubesat
- Citation to related publication:
- Hoffmann, A. P., Moldwin, M. B., Strabel, B. P., & Ojeda, L. V. (2023). Enabling Boomless CubeSat Magnetic Field Measurements with the Quad-Mag Magnetometer and an Improved Underdetermined Blind Source Separation Algorithm. Journal of Geophysical Research: Space Physics, 128, e2023JA031662. https://doi.org/10.1029/2023JA031662
- Discipline:
- Engineering
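The Quad-Mag approach exploits the fact that stray (near-field) sources fall off with distance much faster than the ambient field. As a deliberately simpler stand-in for the UBSS algorithm described above, this two-sensor gradiometry sketch solves for the ambient field given a single dipole-like source; it is a generic illustration, not the paper's method:

```python
# Two-sensor gradiometry sketch: a near-field source decays as 1/r**3,
# so measurements at two distances determine both the ambient field and
# the source amplitude. Illustrative only; NOT the UBSS algorithm,
# which handles unknown numbers of sources via clustering and
# compressive sensing.

def gradiometer_ambient(b1, b2, r1, r2):
    """Solve b_i = b_ambient + s / r_i**3 for the ambient field (nT)."""
    k1, k2 = 1.0 / r1**3, 1.0 / r2**3
    s = (b1 - b2) / (k1 - k2)   # source amplitude
    return b1 - s * k1          # ambient field

# Ambient 45000 nT plus a source adding 100 nT at 0.1 m (12.5 nT at 0.2 m).
amb = gradiometer_ambient(45100.0, 45012.5, 0.1, 0.2)
```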