Search Results
-
- Creator:
- Gill, Tate M.
- Description:
- Data are included in raw format, along with the MATLAB scripts used to process them into the final results. If there are issues or confusion regarding the data or the code, feel free to contact me at tategill@umich.edu.
- Keyword:
- Electric Propulsion
- Discipline:
- Engineering
-
- Creator:
- Wu, Ziyou and Revzen, Shai
- Description:
- This repository contains a nearly unique dataset at the time of its making: precise measurements of all contact forces of a six-legged robot during multi-legged slipping motions and regular walking. These data were collected to establish the validity of the observations presented in this article: Zhao et al. Walking is like slithering: A unifying, data-driven view of locomotion. (2022) PNAS 119(37): e2113222119. DOI: https://doi.org/10.1073/pnas.2113222119
- Keyword:
- robot, locomotion, and multilegged
- Citation to related publication:
- Science Robotics paper being submitted
- Discipline:
- Engineering
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection released in support of an IROS 2023 workshop publication, with a supporting website (https://sites.google.com/umich.edu/novelsensors2023). To enable new research in the area of novel sensors for autonomous vehicles, these datasets are designed for the task of place recognition with novel sensors. To our knowledge, this is the first dataset to include stereo thermal cameras together with stereo event cameras and stereo monochrome cameras, which perform better in low light than RGB cameras. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted sensing suite, which consists of forward-facing stereo uncooled thermal cameras (FLIR Boson 640+ ADK), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C) aligned with ground-truth position from a high-precision navigation system. Sequences include ~10 km routes, which may be driven repeatedly under varying lighting conditions and feature instances of direct sunlight and low light that challenge conventional cameras. A software toolkit facilitates efficient use of the dataset, including dataset download, application of calibration parameters, and evaluation of place recognition results based on standard metrics (e.g., maximum recall at 100% precision). These software tools for converting, managing, and viewing data files can be found at the associated GitHub repository (https://github.com/umautobots/nsavp_tools).
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://sites.google.com/umich.edu/novelsensors2023 and https://github.com/umautobots/nsavp_tools
- Discipline:
- Engineering
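The standard metric named in the description above (maximum recall at 100% precision) can be sketched as follows. This is a minimal illustration, not part of the released toolkit; the function name and score/label inputs are hypothetical:

```python
def max_recall_at_full_precision(scores, labels):
    """Maximum recall achievable while precision remains 100%.

    scores: higher = more confident that a query/reference pair is a match
    labels: 1 for a true place match, 0 otherwise
    """
    # Consider candidate matches in order of descending confidence.
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    total_pos = sum(labels)
    tp = 0
    best_recall = 0.0
    for i in order:
        if labels[i] == 1:
            tp += 1
            best_recall = tp / total_pos  # precision is still 1.0 up to here
        else:
            break  # the first false positive ends the 100%-precision regime
    return best_recall
```

Walking the ranked list until the first false positive is what makes this a strict metric: a single confident wrong match caps the score.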
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection released in support of an IROS 2023 workshop publication, with a supporting website (https://sites.google.com/umich.edu/novelsensors2023). To enable new research in the area of novel sensors for autonomous vehicles, these datasets are designed for the task of place recognition with novel sensors. To our knowledge, this is the first dataset to include stereo thermal cameras together with stereo event cameras and stereo monochrome cameras, which perform better in low light than RGB cameras. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted sensing suite, which consists of forward-facing stereo uncooled thermal cameras (FLIR Boson 640+ ADK), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C) aligned with ground-truth position from a high-precision navigation system. Sequences include ~10 km routes, which may be driven repeatedly under varying lighting conditions and feature instances of direct sunlight and low light that challenge conventional cameras. A software toolkit facilitates efficient use of the dataset, including dataset download, application of calibration parameters, and evaluation of place recognition results based on standard metrics (e.g., maximum recall at 100% precision). These software tools for converting, managing, and viewing data files can be found at the associated GitHub repository (https://github.com/umautobots/nsavp_tools).
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://sites.google.com/umich.edu/novelsensors2023 and https://github.com/umautobots/nsavp_tools
- Discipline:
- Engineering
-
Novel Sensors for Autonomous Vehicle Perception
User Collection
- Creator:
- Skinner, Katherine A, Vasudevan, Ram, Ramanagopal, Manikandasriram S, Ravi, Radhika, Buchan, Austin D, and Carmichael, Spencer
- Description:
- The Novel Sensors for Autonomous Vehicle Perception collection comprises sequences collected with an autonomous vehicle platform, including data from novel sensors. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C), time-synchronized with ground-truth poses from a high-precision navigation system. Sequences include ~8 km routes, driven repeatedly under varying lighting conditions and/or opposing viewpoints. Further information and resources are available on the project website: https://umautobots.github.io/nsavp
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://umautobots.github.io/nsavp, https://github.com/umautobots/nsavp_tools, and https://sites.google.com/umich.edu/novelsensors2023
- Discipline:
- Engineering
12 Works
-
- Creator:
- Thompson, Ellen P. and Ellis, Brian R.
- Description:
- Accurate prediction of physical alterations in carbonate reservoirs under dissolution is critical for development of subsurface energy technologies. The impact of mineral dissolution on flow characteristics depends on the connectivity and tortuosity of the pore network. Persistent homology is a tool from algebraic topology that describes the size and connectivity of topological features. When applied to 3D X-ray computed tomography (XCT) imagery of rock cores, it provides a novel metric of pore network heterogeneity. Prior works have demonstrated the efficacy of persistent homology in predicting flow properties in numerical simulations of flow through porous media. Its ability to combine size, spatial distribution, and connectivity information makes it a promising tool for understanding reactive transport in complex pore networks, yet limited work has been done to apply persistence analysis to experimental studies on natural rocks. In this study, three limestone cores were imaged by XCT before and after acid-driven dissolution flow-through experiments. Each XCT scan was analyzed using persistent homology. In all three rocks, permeability increase was driven by the growth of large, connected pore bodies. The two most homogeneous samples saw an increased effect nearer to the flow inlet, suggesting emerging preferential flow paths as the reaction front progresses. The most heterogeneous sample showed an increase in along-core homogeneity during reaction. Variability of persistence showed moderate positive correlation with pore body size increase. Persistence heterogeneity analysis could be used to anticipate where the greatest pore size evolution may occur in a reservoir targeted for subsurface development, improving confidence in project viability.
- Keyword:
- Carbonate dissolution, X-ray computed tomography, Porous media, Topology, and Persistent homology
- Citation to related publication:
- Thompson, E.P.; Ellis, B.R. (2023) Persistent Homology as a Heterogeneity Metric for Predicting Pore Size Change in Dissolving Carbonates. In Review.
- Discipline:
- Science and Engineering
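As a minimal illustration of what the persistent homology analysis described above computes (not the authors' pipeline, which runs on 3-D XCT volumes, typically with dedicated libraries such as GUDHI or Ripser): 0-dimensional persistence tracks birth/death pairs of connected components under a sublevel-set filtration, sketched here on a 1-D intensity profile:

```python
def h0_persistence(values):
    """Birth/death pairs of connected components (0-dim persistent homology)
    for a sublevel-set filtration of a 1-D sequence of values.

    Process samples in increasing order of value: a sample with no live
    neighbor births a new component; a sample joining two components kills
    the younger one (elder rule), recording a (birth, death) pair.
    """
    n = len(values)
    parent = list(range(n))  # union-find forest over sample indices

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    order = sorted(range(n), key=lambda i: values[i])
    alive = [False] * n
    birth = {}   # component root -> birth value
    pairs = []
    for idx in order:
        v = values[idx]
        alive[idx] = True
        roots = set()
        for nb in (idx - 1, idx + 1):
            if 0 <= nb < n and alive[nb]:
                roots.add(find(nb))
        if not roots:
            birth[idx] = v           # new component born at value v
        else:
            roots = sorted(roots, key=lambda r: birth[r])
            for r in roots[1:]:      # younger components die at the merge
                pairs.append((birth[r], v))
                parent[r] = roots[0]
            parent[idx] = roots[0]
    # the oldest component never dies (essential class)
    pairs.append((min(birth[find(i)] for i in range(n) if alive[i]), float("inf")))
    return pairs
```

For example, the profile [0, 2, 1, 3] has a component born at 1 that merges (dies) at 2, plus the essential component born at 0.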
-
- Creator:
- Rivera-Rivera, Luis Y., Moore, Timothy C., and Glotzer, Sharon C.
- Description:
- The dataset is organized as follows: the data for each of the three target structures are contained within a directory with the structure name (e.g., kagome, pyrochlore, and snub-square). Within each structure directory, data obtained from alchemical and self-assembly simulations are separated into alchem and self-assembly directories, respectively. An additional suboptimal-self-assembly directory is only present for the snub-square structure and contains the data for the pattern registration analysis discussed in the SI. For a detailed description of each file contained within each directory, please refer to the README file.
- Keyword:
- inverse design, self-assembly, triblock Janus particles, crystallization slot, and digital alchemy
- Citation to related publication:
- Rivera-Rivera, L.Y., Moore, T.C., & Glotzer, S.C. Inverse design of triblock Janus spheres for self-assembly of complex structures in the crystallization slot via digital alchemy. Soft Matter, 2023, 19, 2726-2736. DOI: 10.1039/d2sm01593e
- Discipline:
- Engineering
-
- Creator:
- Shi, Yining
- Description:
- Statistical study of residuals larger than 300 nT between Swarm observations and the IGRF-13 geomagnetic field model in the northern and southern hemispheres. Data analysis was done with the VirES client (https://viresclient.readthedocs.io/en/latest/). These data were generated to conduct a statistical study of the locations of large residuals in the two hemispheres, both for a better understanding of the potential error in satellite aviation applications that use Earth magnetic field models like IGRF as references, and for insight into the energy transfer in magnetosphere-ionosphere-thermosphere coupling. Interhemispheric asymmetries are found in the locations of the large residuals due to the difference in geographic pole locations.
- Discipline:
- Engineering
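The core statistic described above can be sketched as follows. This is an assumed formulation (the actual analysis used the VirES client); array names and the vector-magnitude residual definition are illustrative:

```python
import numpy as np

def large_residual_stats(b_obs, b_model, lat, threshold_nt=300.0):
    """Count large residuals between observed and modeled fields per hemisphere.

    b_obs, b_model: (N, 3) magnetic field vectors in nT (e.g., NEC frame)
    lat: (N,) geographic latitude in degrees
    threshold_nt: residual magnitude threshold, 300 nT as in the study
    """
    residual = np.linalg.norm(b_obs - b_model, axis=1)  # |B_obs - B_model|
    large = residual > threshold_nt
    return {
        "north": int(np.sum(large & (lat >= 0))),
        "south": int(np.sum(large & (lat < 0))),
    }
```

Comparing the two counts (and the locations of the flagged samples) is what reveals the interhemispheric asymmetry noted in the description.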
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection created to facilitate research in the use of novel sensors for autonomous vehicle perception. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C), time-synchronized with ground-truth poses from a high-precision navigation system. Further information and resources (such as software tools for converting, managing, and viewing data files) are available on the project website: https://umautobots.github.io/nsavp. CHANGE NOTICE (January 2024): We identified an error in our timestamp post-processing procedure that caused all camera timestamps to be offset by the exposure time of one of the cameras. We corrected the error, applied the corrected post-processing, and reuploaded the corrected files. The change impacts all camera data files. Prior to the change, the timestamps between the cameras were synchronized with submillisecond accuracy, but the camera and ground-truth pose timestamps were offset by up to 0.4 ms, 3 ms, and 15 ms in the afternoon, sunset, and night sequences, respectively. This amounted to up to ~0.25 meters of position error in the night sequences. For consistency, camera calibration was rerun with the corrected calibration sequence files. The camera calibration results have therefore been updated as well, although they have not changed significantly. Finally, we previously downsampled the frame data in the uploaded calibration sequence, but we decided to provide the full frame data in the reupload.
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://sites.google.com/umich.edu/novelsensors2023, https://github.com/umautobots/nsavp_tools, and https://umautobots.github.io/nsavp
- Discipline:
- Engineering
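As a rough sanity check on the figures in the change notice (a hypothetical helper; the vehicle speed is an assumed value, not stated in the dataset): a constant timestamp offset maps to position error proportional to speed, so a 15 ms offset at about 16.7 m/s (roughly 60 km/h) gives approximately 0.25 m:

```python
def position_error_from_offset(speed_mps, offset_s):
    """Worst-case position error induced by a timestamp offset,
    assuming straight-line motion at the given speed."""
    return speed_mps * offset_s

# e.g., the night-sequence numbers from the change notice:
night_error_m = position_error_from_offset(16.7, 0.015)
```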
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection created to facilitate research in the use of novel sensors for autonomous vehicle perception. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C), time-synchronized with ground-truth poses from a high-precision navigation system. Further information and resources (such as software tools for converting, managing, and viewing data files) are available on the project website: https://umautobots.github.io/nsavp. CHANGE NOTICE (January 2024): We identified an error in our timestamp post-processing procedure that caused all camera timestamps to be offset by the exposure time of one of the cameras. We corrected the error, applied the corrected post-processing, and reuploaded the corrected files. The change impacts all camera data files. Prior to the change, the timestamps between the cameras were synchronized with submillisecond accuracy, but the camera and ground-truth pose timestamps were offset by up to 0.4 ms, 3 ms, and 15 ms in the afternoon, sunset, and night sequences, respectively. This amounted to up to ~0.25 meters of position error in the night sequences. For consistency, camera calibration was rerun with the corrected calibration sequence files. The camera calibration results have therefore been updated as well, although they have not changed significantly. Finally, we previously downsampled the frame data in the uploaded calibration sequence, but we decided to provide the full frame data in the reupload.
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal imaging, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://sites.google.com/umich.edu/novelsensors2023, https://github.com/umautobots/nsavp_tools, and https://umautobots.github.io/nsavp
- Discipline:
- Engineering
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection created to facilitate research in the use of novel sensors for autonomous vehicle perception. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C), time-synchronized with ground-truth poses from a high-precision navigation system. Further information and resources (such as software tools for converting, managing, and viewing data files) are available on the project website: https://umautobots.github.io/nsavp. CHANGE NOTICE (January 2024): We identified an error in our timestamp post-processing procedure that caused all camera timestamps to be offset by the exposure time of one of the cameras. We corrected the error, applied the corrected post-processing, and reuploaded the corrected files. The change impacts all camera data files. Prior to the change, the timestamps between the cameras were synchronized with submillisecond accuracy, but the camera and ground-truth pose timestamps were offset by up to 0.4 ms, 3 ms, and 15 ms in the afternoon, sunset, and night sequences, respectively. This amounted to up to ~0.25 meters of position error in the night sequences. For consistency, camera calibration was rerun with the corrected calibration sequence files. The camera calibration results have therefore been updated as well, although they have not changed significantly. Finally, we previously downsampled the frame data in the uploaded calibration sequence, but we decided to provide the full frame data in the reupload.
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://sites.google.com/umich.edu/novelsensors2023, https://github.com/umautobots/nsavp_tools, and https://umautobots.github.io/nsavp
- Discipline:
- Engineering
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection created to facilitate research in the use of novel sensors for autonomous vehicle perception. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C), time-synchronized with ground-truth poses from a high-precision navigation system. Further information and resources (such as software tools for converting, managing, and viewing data files) are available on the project website: https://umautobots.github.io/nsavp. CHANGE NOTICE (January 2024): We identified an error in our timestamp post-processing procedure that caused all camera timestamps to be offset by the exposure time of one of the cameras. We corrected the error, applied the corrected post-processing, and reuploaded the corrected files. The change impacts all camera data files. Prior to the change, the timestamps between the cameras were synchronized with submillisecond accuracy, but the camera and ground-truth pose timestamps were offset by up to 0.4 ms, 3 ms, and 15 ms in the afternoon, sunset, and night sequences, respectively. This amounted to up to ~0.25 meters of position error in the night sequences. For consistency, camera calibration was rerun with the corrected calibration sequence files. The camera calibration results have therefore been updated as well, although they have not changed significantly. Finally, we previously downsampled the frame data in the uploaded calibration sequence, but we decided to provide the full frame data in the reupload.
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://sites.google.com/umich.edu/novelsensors2023, https://github.com/umautobots/nsavp_tools, and https://umautobots.github.io/nsavp
- Discipline:
- Engineering
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection created to facilitate research in the use of novel sensors for autonomous vehicle perception. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C), time-synchronized with ground-truth poses from a high-precision navigation system. Further information and resources (such as software tools for converting, managing, and viewing data files) are available on the project website: https://umautobots.github.io/nsavp. CHANGE NOTICE (January 2024): We identified an error in our timestamp post-processing procedure that caused all camera timestamps to be offset by the exposure time of one of the cameras. We corrected the error, applied the corrected post-processing, and reuploaded the corrected files. The change impacts all camera data files. Prior to the change, the timestamps between the cameras were synchronized with submillisecond accuracy, but the camera and ground-truth pose timestamps were offset by up to 0.4 ms, 3 ms, and 15 ms in the afternoon, sunset, and night sequences, respectively. This amounted to up to ~0.25 meters of position error in the night sequences. For consistency, camera calibration was rerun with the corrected calibration sequence files. The camera calibration results have therefore been updated as well, although they have not changed significantly. Finally, we previously downsampled the frame data in the uploaded calibration sequence, but we decided to provide the full frame data in the reupload.
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://sites.google.com/umich.edu/novelsensors2023, https://github.com/umautobots/nsavp_tools, and https://umautobots.github.io/nsavp
- Discipline:
- Engineering
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection created to facilitate research in the use of novel sensors for autonomous vehicle perception. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C), time-synchronized with ground-truth poses from a high-precision navigation system. Further information and resources (such as software tools for converting, managing, and viewing data files) are available on the project website: https://umautobots.github.io/nsavp. CHANGE NOTICE (January 2024): We identified an error in our timestamp post-processing procedure that caused all camera timestamps to be offset by the exposure time of one of the cameras. We corrected the error, applied the corrected post-processing, and reuploaded the corrected files. The change impacts all camera data files. Prior to the change, the timestamps between the cameras were synchronized with submillisecond accuracy, but the camera and ground-truth pose timestamps were offset by up to 0.4 ms, 3 ms, and 15 ms in the afternoon, sunset, and night sequences, respectively. This amounted to up to ~0.25 meters of position error in the night sequences. For consistency, camera calibration was rerun with the corrected calibration sequence files. The camera calibration results have therefore been updated as well, although they have not changed significantly. Finally, we previously downsampled the frame data in the uploaded calibration sequence, but we decided to provide the full frame data in the reupload.
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://sites.google.com/umich.edu/novelsensors2023, https://github.com/umautobots/nsavp_tools, and https://umautobots.github.io/nsavp
- Discipline:
- Engineering
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection created to facilitate research in the use of novel sensors for autonomous vehicle perception. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C), time-synchronized with ground-truth poses from a high-precision navigation system. Further information and resources (such as software tools for converting, managing, and viewing data files) are available on the project website: https://umautobots.github.io/nsavp. CHANGE NOTICE (January 2024): We identified an error in our timestamp post-processing procedure that caused all camera timestamps to be offset by the exposure time of one of the cameras. We corrected the error, applied the corrected post-processing, and reuploaded the corrected files. The change impacts all camera data files. Prior to the change, the timestamps between the cameras were synchronized with submillisecond accuracy, but the camera and ground-truth pose timestamps were offset by up to 0.4 ms, 3 ms, and 15 ms in the afternoon, sunset, and night sequences, respectively. This amounted to up to ~0.25 meters of position error in the night sequences. For consistency, camera calibration was rerun with the corrected calibration sequence files. The camera calibration results have therefore been updated as well, although they have not changed significantly. Finally, we previously downsampled the frame data in the uploaded calibration sequence, but we decided to provide the full frame data in the reupload.
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://sites.google.com/umich.edu/novelsensors2023, https://github.com/umautobots/nsavp_tools, and https://umautobots.github.io/nsavp
- Discipline:
- Engineering
-
- Creator:
- Klinich, Kathleen D, Lin, Brian, and Moore, Jamie L.
- Description:
- This dataset allows comparison of the different strategies vehicle manufacturers use to communicate with drivers. Spreadsheets were created in MS Excel to summarize data for each vehicle, and include page numbers in each vehicle owner's manual for reference. The photos taken of each vehicle control panel allow detailed inspection of the displays and controls.
- Keyword:
- vehicle, controls, displays, and FMVSS 101
- Discipline:
- Engineering
-
- Creator:
- Klinich, Kathleen D, Hu, Jingwen, Boyle, Kyle J, Manary, Miriam A., and Orton, Nichole R
- Description:
- As part of a project to develop side impact test procedures for evaluating wheelchairs, wheelchair tiedowns and occupant restraint systems (WTORS), and vehicle-based occupant protection systems for wheelchair seating stations, we created validated finite element (FE) models to support procedure development. Models were constructed using LS-DYNA. Dynamic sled tests were performed to validate the FE models of surrogate fixtures and commercial hardware. Validated FE models were developed for the Surrogate wheelchair base (SWCB), Surrogate wheelchair for side impact (SWCSI), a manual wheelchair (Ki Mobility Catalyst 5), and a power wheelchair (Quantum Rehab Edge 2.0). Additional FE models of a heavy-duty anchor meeting the Universal Docking Interface Geometry (UDIG), surrogate four-point strap tiedowns (SWTORS), a traditional docking station, and the surrogate wall fixture were also developed.
- Keyword:
- finite element, wheelchair, transportation, and tiedown
- Discipline:
- Engineering
-
- Creator:
- Luyet, Chloe, Elvati, Paolo, Vinh, Jordan, and Violi, Angela
- Description:
- A growing body of work has linked key biological activities to the mechanical properties of cellular membranes, and has suggested these properties as a means of identification. Here, we present a computational approach to simulate and compare vibrational spectra in the low-THz region for mammalian and bacterial membranes, investigating the effect of membrane asymmetry and composition, as well as the conserved frequencies of a specific cell. We find that asymmetry does not impact the vibrational spectra, and that the impact of sterols depends on the mobility of the components of the membrane. We demonstrate that vibrational spectra can be used to distinguish between membranes and, therefore, could be used to identify different organisms. The method presented here can be immediately extended to other biological structures (e.g., amyloid fibers, polysaccharides, and protein-ligand structures) in order to fingerprint and understand the vibrations of numerous biologically relevant nanoscale structures.
- Keyword:
- molecular dynamics, membranes, mechanical vibration, bacterial identification, and Staphylococcus aureus
- Citation to related publication:
- Luyet C, Elvati P, Vinh J, Violi A. Low-THz Vibrations of Biological Membranes. Membranes. 2023; 13(2):139. https://doi.org/10.3390/membranes13020139
- Discipline:
- Engineering
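A common way to obtain vibrational spectra from molecular dynamics trajectories, shown here as a sketch rather than the authors' exact method, is the Fourier transform of the velocity autocorrelation function (array shapes and names are assumptions):

```python
import numpy as np

def vibrational_spectrum(velocities, dt):
    """Vibrational density of states from an MD velocity trajectory.

    velocities: (n_steps, n_atoms, 3) array of atomic velocities
    dt: timestep in seconds
    Returns (frequencies_hz, spectrum): the Fourier transform of the
    normalized velocity autocorrelation function (VACF).
    """
    n_steps = velocities.shape[0]
    v = velocities.reshape(n_steps, -1)
    # VACF via the Wiener-Khinchin theorem: zero-pad, FFT, |.|^2, inverse FFT
    vf = np.fft.rfft(v, n=2 * n_steps, axis=0)
    acf = np.fft.irfft(np.abs(vf) ** 2, axis=0)[:n_steps].sum(axis=1)
    acf /= acf[0]  # normalize so the VACF starts at 1
    spectrum = np.abs(np.fft.rfft(acf))
    freqs = np.fft.rfftfreq(n_steps, d=dt)
    return freqs, spectrum
```

With a 1 fs timestep, a peak near 1 THz corresponds to the low-THz window discussed above; a pure sinusoidal velocity signal produces a single peak at its own frequency, which is a convenient self-test of the pipeline.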
-
- Creator:
- Elvati, Paolo, Luyet, Chloe, Wang, Yichun, Liu, Changjiang, VanEpps, J. Scott, Kotov, Nicholas A., and Violi, Angela
- Description:
- Amyloid nanofibers are abundant in microorganisms and are integral components of many biofilms, serving various purposes, from virulent to structural. Nonetheless, the precise characterization of bacterial amyloid nanofibers has been elusive, with incomplete and contradictory results. The present work focuses on the molecular details and characteristics of PSMa1-derived functional amyloids present in Staphylococcus aureus biofilms, using a combination of computational and experimental techniques, to develop a model that can aid the design of compounds to control amyloid formation. Results from molecular dynamics simulations, guided and supported by spectroscopy and microscopy, show that PSMa1 amyloid nanofibers present a helical structure formed by two protofilaments, have an average diameter of about 12 nm, and adopt a left-handed helicity with a periodicity of approximately 72 nm. The chirality of the self-assembled nanofibers, an intrinsic geometric property of its constituent peptides, is central to determining the fibers' lateral growth.
- Keyword:
- molecular self-assembly, computational nanotechnology, nanobiotechnology, and structural properties
- Citation to related publication:
- Elvati, P., Luyet, C., Wang, Y., Liu, C., VanEpps, J.S., Kotov, N.A., & Violi, A. ACS Applied Nano Materials, 2023, 6(8), 6594-6604. DOI: 10.1021/acsanm.3c00174
- Discipline:
- Engineering and Science
-
- Creator:
- Chen, Brian
- Description:
- The procedure followed while creating this data is summarized in Section II of Chen, Brian, et al. "Behavioral cloning in atari games using a combined variational autoencoder and predictor model." 2021 IEEE Congress on Evolutionary Computation (CEC). IEEE, 2021. This data is not itself a research result but an intermediate product used in research. The dataset was generated to train a behavioral cloning framework on gameplay screen captures and keystrokes of an "expert" player. An RL agent trained using the RL Baselines Zoo package acts as the "expert" player, whose decision-making process we desire to learn. In addition to behavioral cloning experiments, this dataset is further used to demonstrate the efficacy of a novel incremental tensor decomposition algorithm on image-based data streams.
- Keyword:
- Imitation Learning, Behavioral Cloning, Reinforcement Learning, Machine Learning, and Gameplay Data
- Citation to related publication:
- Chen, Brian, et al. "Behavioral cloning in atari games using a combined variational autoencoder and predictor model." 2021 IEEE Congress on Evolutionary Computation (CEC). IEEE, 2021., Aksoy, Doruk, et al. "An Incremental Tensor Train Decomposition Algorithm." arXiv preprint arXiv:2211.12487 (2022)., and Chen, Brian, et al. "Low-Rank Tensor-Network Encodings for Video-to-Action Behavioral Cloning", forthcoming
- Discipline:
- Engineering and Science
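The data-generation loop described above can be sketched as follows. This is a minimal illustration, not the authors' code: `expert_policy` and `env_step` are hypothetical stand-ins for the RL Baselines Zoo agent and the Atari environment:

```python
import numpy as np

def record_demonstrations(expert_policy, env_step, initial_frame, n_steps):
    """Record (frame, action) pairs from an expert policy for behavioral cloning.

    expert_policy: maps a frame to a discrete action (the "expert" agent)
    env_step: maps (frame, action) to the next frame (the environment)
    initial_frame: starting observation, e.g., a preprocessed screen capture
    """
    frames, actions = [], []
    frame = initial_frame
    for _ in range(n_steps):
        action = expert_policy(frame)   # expert decides from the current frame
        frames.append(frame)            # store the observation...
        actions.append(action)          # ...paired with the expert's action
        frame = env_step(frame, action)
    return np.stack(frames), np.array(actions)
```

A behavioral cloning model is then fit to predict `actions` from `frames`, imitating the expert's decision-making without access to its internals.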