Search Results
-
- Creator:
- Klinich, Kathleen D, Hu, Jingwen, Boyle, Kyle J, Manary, Miriam A., and Orton, Nichole R
- Description:
- As part of a project to develop side impact test procedures for evaluating wheelchairs, wheelchair tiedowns and occupant restraint systems (WTORS), and vehicle-based occupant protection systems for wheelchair seating stations, we created validated finite element (FE) models to support procedure development. Models were constructed using LS-DYNA. Dynamic sled tests were performed to validate the FE models of surrogate fixtures and commercial hardware. Validated FE models were developed for the Surrogate wheelchair base (SWCB), Surrogate wheelchair for side impact (SWCSI), a manual wheelchair (Ki Mobility Catalyst 5), and a power wheelchair (Quantum Rehab Edge 2.0). Additional FE models of a heavy-duty anchor meeting the Universal Docking Interface Geometry (UDIG), surrogate four-point strap tiedowns (SWTORS), a traditional docking station, and the surrogate wall fixture were also developed.
- Keyword:
- finite element, wheelchair, transportation, and tiedown
- Discipline:
- Engineering
-
- Creator:
- Lee, Shih Kuang, Tsai, Sun Ting, and Glotzer, Sharon C.
- Description:
- The trajectory data and codes were generated for our work "Classification of complex local environments in systems of particle shapes through shape-symmetry encoded data augmentation" (currently under peer review). The data sets contain trajectory data in GSD file format for 7 test systems, including cubic structures, two-dimensional and three-dimensional patchy particle shape systems, hexagonal bipyramids with two aspect ratios, and truncated shapes with two degrees of truncation. In addition, the corresponding Python code and Jupyter notebook used to perform data augmentation, MLP classifier training, and MLP classifier testing are included.
- Keyword:
- Machine Learning, Colloids Self-Assembly, Crystallization, and Order Parameter
- Citation to related publication:
- https://doi.org/10.48550/arXiv.2312.11822
- Discipline:
- Other, Science, and Engineering
-
- Creator:
- Lin, Brian T. W.
- Description:
- This footage is an output of a USDOT-funded project titled "Development of Machine-Learning Models for Autonomous Vehicle Decisions on Weaving Sections of Freeway Ramps." It showcases an automated weaving maneuver within an augmented reality environment. During the demonstration, Mcity's automated vehicle navigates through a highway weaving section, making a lane change while interacting with a virtual vehicle. In this instance, Mcity's vehicle was operated by automated driving systems, which executed the lane change based on the detection of external environmental factors and parameter inputs received from the virtual vehicle.
- Discipline:
- Engineering
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection created to facilitate research in the use of novel sensors for autonomous vehicle perception. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C) time synchronized with ground truth poses from a high-precision navigation system. Further information and resources (such as software tools for converting, managing, and viewing data files) are available on the project website: https://umautobots.github.io/nsavp. CHANGE NOTICE (January 2024): We identified an error in our timestamp post-processing procedure that caused all camera timestamps to be offset by the exposure time of one of the cameras. We corrected the error, applied the corrected post-processing, and reuploaded the corrected files. The change impacts all camera data files. Prior to the change, the timestamps between the cameras were synchronized with submillisecond accuracy, but the camera and ground truth pose timestamps were offset by up to 0.4 ms, 3 ms, and 15 ms in the afternoon, sunset, and night sequences, respectively. This amounted to up to ~0.25 meters of position error in the night sequences. For consistency, camera calibration was rerun with the corrected calibration sequence files. The camera calibration results have therefore been updated as well, although they have not changed significantly. Finally, we previously downsampled the frame data in the uploaded calibration sequence, but we decided to provide the full frame data in the reupload.
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://sites.google.com/umich.edu/novelsensors2023, https://github.com/umautobots/nsavp_tools, and https://umautobots.github.io/nsavp
- Discipline:
- Engineering
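The change notice above relates millisecond-scale timestamp offsets to meter-scale position error. As a rough sanity check, position error is approximately speed times clock offset; a minimal sketch, assuming an illustrative driving speed (the record does not state vehicle speed):

```python
# Rough sanity check of the change notice's numbers: position error
# caused by a timestamp offset is approximately speed * offset.
# The speed below is an assumed illustrative value, not from the record.

def position_error_m(speed_mps: float, offset_s: float) -> float:
    """Worst-case position error from a clock offset at constant speed."""
    return speed_mps * offset_s

# An assumed ~17 m/s (~60 km/h) with the reported 15 ms night offset
# lands on the order of the ~0.25 m error quoted in the notice.
err = position_error_m(17.0, 0.015)
print(f"{err:.3f} m")  # 0.255 m
```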
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection created to facilitate research in the use of novel sensors for autonomous vehicle perception. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C) time synchronized with ground truth poses from a high-precision navigation system. Further information and resources (such as software tools for converting, managing, and viewing data files) are available on the project website: https://umautobots.github.io/nsavp. CHANGE NOTICE (January 2024): We identified an error in our timestamp post-processing procedure that caused all camera timestamps to be offset by the exposure time of one of the cameras. We corrected the error, applied the corrected post-processing, and reuploaded the corrected files. The change impacts all camera data files. Prior to the change, the timestamps between the cameras were synchronized with submillisecond accuracy, but the camera and ground truth pose timestamps were offset by up to 0.4 ms, 3 ms, and 15 ms in the afternoon, sunset, and night sequences, respectively. This amounted to up to ~0.25 meters of position error in the night sequences. For consistency, camera calibration was rerun with the corrected calibration sequence files. The camera calibration results have therefore been updated as well, although they have not changed significantly. Finally, we previously downsampled the frame data in the uploaded calibration sequence, but we decided to provide the full frame data in the reupload.
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://sites.google.com/umich.edu/novelsensors2023, https://github.com/umautobots/nsavp_tools, and https://umautobots.github.io/nsavp
- Discipline:
- Engineering
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection created to facilitate research in the use of novel sensors for autonomous vehicle perception. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C) time synchronized with ground truth poses from a high-precision navigation system. Further information and resources (such as software tools for converting, managing, and viewing data files) are available on the project website: https://umautobots.github.io/nsavp. CHANGE NOTICE (January 2024): We identified an error in our timestamp post-processing procedure that caused all camera timestamps to be offset by the exposure time of one of the cameras. We corrected the error, applied the corrected post-processing, and reuploaded the corrected files. The change impacts all camera data files. Prior to the change, the timestamps between the cameras were synchronized with submillisecond accuracy, but the camera and ground truth pose timestamps were offset by up to 0.4 ms, 3 ms, and 15 ms in the afternoon, sunset, and night sequences, respectively. This amounted to up to ~0.25 meters of position error in the night sequences. For consistency, camera calibration was rerun with the corrected calibration sequence files. The camera calibration results have therefore been updated as well, although they have not changed significantly. Finally, we previously downsampled the frame data in the uploaded calibration sequence, but we decided to provide the full frame data in the reupload.
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://sites.google.com/umich.edu/novelsensors2023, https://github.com/umautobots/nsavp_tools, and https://umautobots.github.io/nsavp
- Discipline:
- Engineering
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection created to facilitate research in the use of novel sensors for autonomous vehicle perception. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C) time synchronized with ground truth poses from a high-precision navigation system. Further information and resources (such as software tools for converting, managing, and viewing data files) are available on the project website: https://umautobots.github.io/nsavp. CHANGE NOTICE (January 2024): We identified an error in our timestamp post-processing procedure that caused all camera timestamps to be offset by the exposure time of one of the cameras. We corrected the error, applied the corrected post-processing, and reuploaded the corrected files. The change impacts all camera data files. Prior to the change, the timestamps between the cameras were synchronized with submillisecond accuracy, but the camera and ground truth pose timestamps were offset by up to 0.4 ms, 3 ms, and 15 ms in the afternoon, sunset, and night sequences, respectively. This amounted to up to ~0.25 meters of position error in the night sequences. For consistency, camera calibration was rerun with the corrected calibration sequence files. The camera calibration results have therefore been updated as well, although they have not changed significantly. Finally, we previously downsampled the frame data in the uploaded calibration sequence, but we decided to provide the full frame data in the reupload.
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://sites.google.com/umich.edu/novelsensors2023, https://github.com/umautobots/nsavp_tools, and https://umautobots.github.io/nsavp
- Discipline:
- Engineering
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection created to facilitate research in the use of novel sensors for autonomous vehicle perception. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C) time synchronized with ground truth poses from a high-precision navigation system. Further information and resources (such as software tools for converting, managing, and viewing data files) are available on the project website: https://umautobots.github.io/nsavp. CHANGE NOTICE (January 2024): We identified an error in our timestamp post-processing procedure that caused all camera timestamps to be offset by the exposure time of one of the cameras. We corrected the error, applied the corrected post-processing, and reuploaded the corrected files. The change impacts all camera data files. Prior to the change, the timestamps between the cameras were synchronized with submillisecond accuracy, but the camera and ground truth pose timestamps were offset by up to 0.4 ms, 3 ms, and 15 ms in the afternoon, sunset, and night sequences, respectively. This amounted to up to ~0.25 meters of position error in the night sequences. For consistency, camera calibration was rerun with the corrected calibration sequence files. The camera calibration results have therefore been updated as well, although they have not changed significantly. Finally, we previously downsampled the frame data in the uploaded calibration sequence, but we decided to provide the full frame data in the reupload.
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal imaging, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://sites.google.com/umich.edu/novelsensors2023, https://github.com/umautobots/nsavp_tools, and https://umautobots.github.io/nsavp
- Discipline:
- Engineering
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection created to facilitate research in the use of novel sensors for autonomous vehicle perception. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C) time synchronized with ground truth poses from a high-precision navigation system. Further information and resources (such as software tools for converting, managing, and viewing data files) are available on the project website: https://umautobots.github.io/nsavp. CHANGE NOTICE (January 2024): We identified an error in our timestamp post-processing procedure that caused all camera timestamps to be offset by the exposure time of one of the cameras. We corrected the error, applied the corrected post-processing, and reuploaded the corrected files. The change impacts all camera data files. Prior to the change, the timestamps between the cameras were synchronized with submillisecond accuracy, but the camera and ground truth pose timestamps were offset by up to 0.4 ms, 3 ms, and 15 ms in the afternoon, sunset, and night sequences, respectively. This amounted to up to ~0.25 meters of position error in the night sequences. For consistency, camera calibration was rerun with the corrected calibration sequence files. The camera calibration results have therefore been updated as well, although they have not changed significantly. Finally, we previously downsampled the frame data in the uploaded calibration sequence, but we decided to provide the full frame data in the reupload.
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://sites.google.com/umich.edu/novelsensors2023, https://github.com/umautobots/nsavp_tools, and https://umautobots.github.io/nsavp
- Discipline:
- Engineering
-
- Creator:
- Sheppard, Anja, Sethuraman, Advaith V, Bagoren, Onur, Pinnow, Christopher, Anderson, Jamey, Havens, Timothy C, and Skinner, Katherine A
- Description:
- The AI4Shipwrecks dataset contains sidescan sonar images of shipwrecks and corresponding binary labels collected during 2022 and 2023 at the NOAA Thunder Bay National Marine Sanctuary in Alpena, MI. The data collection platform was an Iver3 Autonomous Underwater Vehicle (AUV) equipped with an EdgeTech 2205 dual-frequency ultra-high resolution sidescan sonar and 3D bathymetric system. The labels were compiled from reference labels created by experts in marine archaeology. The intended use of this dataset is to encourage development of semantic segmentation, object detection, or anomaly detection algorithms in the computer vision field. Comparisons of state-of-the-art segmentation networks on our dataset are shown in the paper. The file structure is organized as described in the README.txt file, where images in 'images' directories are the waterfall product of sidescan sonar surveys, and images in 'labels' directories are binary representations of expert labels. Images across the 'images' and 'labels' directories are correlated by having identical filenames. In the label images, a pixel value of '0' represents the non-shipwreck/other class and '1' represents the shipwreck class for the correspondingly named image (<wreck_name>_<##>.png) in the images directory. The project webpage can be found at: https://umfieldrobotics.github.io/ai4shipwrecks/
- Keyword:
- machine learning, computer vision, field robotics, marine robotics, underwater robotics, sidescan sonar, semantic segmentation, and object detection
- Discipline:
- Engineering
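The record above spells out the label convention: label PNGs share filenames with their sonar images, and pixels are 0 (non-shipwreck/other) or 1 (shipwreck). A minimal sketch of consuming that convention, using a tiny synthetic array as a stand-in for a decoded label image (the function name is ours, not part of the dataset tooling):

```python
import numpy as np

# A label image pairs with its sonar image by identical filename;
# pixel value 0 = non-shipwreck/other, 1 = shipwreck.
# The array below is a synthetic stand-in for a decoded label PNG.

def shipwreck_fraction(label: np.ndarray) -> float:
    """Fraction of pixels labeled as shipwreck (value 1)."""
    return float((label == 1).mean())

label = np.array([[0, 0, 1],
                  [0, 1, 1],
                  [0, 0, 0]])
print(shipwreck_fraction(label))  # 3 of 9 pixels -> 0.3333...
```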
-
- Creator:
- Hawes, Jason K, Goldstein, Benjamin P., Newell, Joshua P., Dorr, Erica, Caputo, Silvio, Fox-Kämper, Runrid, Grard, Baptiste, Ilieva, Rositsa T., Fargue-Lelièvre, Agnès, Poniży, Lidia, Schoen, Victoria, Specht, Kathrin, and Cohen, Nevin
- Description:
- Urban agriculture (UA) is a widely proposed strategy to make cities and urban food systems more sustainable. However, its carbon footprint remains understudied. In fact, the few existing studies suggest that UA may be worse for the climate than conventional agriculture. This is the first large-scale study to resolve this uncertainty across cities and types of UA, employing citizen science at 73 UA sites in Europe and the United States to compare UA products to food from conventional farms. The results reveal that food from UA is six times as carbon intensive as conventional agriculture (420 g vs. 70 g CO2 equivalent per serving). Some UA crops (e.g., tomatoes) and sites (e.g., 25% of individually-managed gardens), however, outperform conventional agriculture. These exceptions suggest that UA practitioners can reduce their climate impacts by cultivating crops that are typically greenhouse grown or air-freighted, maintaining UA sites for many years, and leveraging waste as inputs. This database contains the necessary reference material to trace the path of our analysis from raw garden data to carbon footprint and nutrient results. It also contains the final results of the analyses in various extended forms not available in the publication. For more information, see manuscript at link below. (Introduction partially quoted from Hawes et al., 2023)
- Citation to related publication:
- Hawes, J. K., Goldstein, B. P., Newell, J. P., Dorr, E., Caputo, S., Fox-Kämper, R., Grard, B., Ilieva, R. T., Fargue-Lelièvre, A., Poniży, L., Schoen, V., Specht, K., & Cohen, N. (2024). Comparing the carbon footprints of urban and conventional agriculture. Nature Cities, 1–10. https://doi.org/10.1038/s44284-023-00023-3
- Discipline:
- Engineering
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection created to facilitate research in the use of novel sensors for autonomous vehicle perception. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C) time synchronized with ground truth poses from a high-precision navigation system. Further information and resources (such as software tools for converting, managing, and viewing data files) are available on the project website: https://umautobots.github.io/nsavp
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://umautobots.github.io/nsavp, https://github.com/umautobots/nsavp_tools, and https://sites.google.com/umich.edu/novelsensors2023
- Discipline:
- Engineering
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection created to facilitate research in the use of novel sensors for autonomous vehicle perception. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C) time synchronized with ground truth poses from a high-precision navigation system. Further information and resources (such as software tools for converting, managing, and viewing data files) are available on the project website: https://umautobots.github.io/nsavp
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://umautobots.github.io/nsavp, https://github.com/umautobots/nsavp_tools, and https://sites.google.com/umich.edu/novelsensors2023
- Discipline:
- Engineering
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection created to facilitate research in the use of novel sensors for autonomous vehicle perception. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C) time synchronized with ground truth poses from a high-precision navigation system. Further information and resources (such as software tools for converting, managing, and viewing data files) are available on the project website: https://umautobots.github.io/nsavp
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://umautobots.github.io/nsavp, https://github.com/umautobots/nsavp_tools, and https://sites.google.com/umich.edu/novelsensors2023
- Discipline:
- Engineering
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection created to facilitate research in the use of novel sensors for autonomous vehicle perception. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C) time synchronized with ground truth poses from a high-precision navigation system. Further information and resources (such as software tools for converting, managing, and viewing data files) are available on the project website: https://umautobots.github.io/nsavp. CHANGE NOTICE (January 2024): We identified an error in our timestamp post-processing procedure that caused all camera timestamps to be offset by the exposure time of one of the cameras. We corrected the error, applied the corrected post-processing, and reuploaded the corrected files. The change impacts all camera data files. Prior to the change, the timestamps between the cameras were synchronized with submillisecond accuracy, but the camera and ground truth pose timestamps were offset by up to 0.4 ms, 3 ms, and 15 ms in the afternoon, sunset, and night sequences, respectively. This amounted to up to ~0.25 meters of position error in the night sequences. For consistency, camera calibration was rerun with the corrected calibration sequence files. The camera calibration results have therefore been updated as well, although they have not changed significantly. Finally, we previously downsampled the frame data in the uploaded calibration sequence, but we decided to provide the full frame data in the reupload.
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://sites.google.com/umich.edu/novelsensors2023, https://github.com/umautobots/nsavp_tools, and https://umautobots.github.io/nsavp
- Discipline:
- Engineering
-
Novel Sensors for Autonomous Vehicle Perception
User Collection
- Creator:
- Skinner, Katherine A, Vasudevan, Ram, Ramanagopal, Manikandasriram S, Ravi, Radhika, Buchan, Austin D, and Carmichael, Spencer
- Description:
- The Novel Sensors for Autonomous Vehicle Perception collection comprises dataset sequences collected with an autonomous vehicle platform, including data from novel sensors. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C) time synchronized with ground truth poses from a high-precision navigation system. Sequences include ~8 km routes, driven repeatedly under varying lighting conditions and/or opposing viewpoints. Further information and resources are available on the project website: https://umautobots.github.io/nsavp
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://umautobots.github.io/nsavp, https://github.com/umautobots/nsavp_tools, and https://sites.google.com/umich.edu/novelsensors2023
- Discipline:
- Engineering
12 Works
-
- Creator:
- Luyet, Chloe, Elvati, Paolo, Vinh, Jordan, and Violi, Angela
- Description:
- A growing body of work has linked key biological activities to the mechanical properties of cellular membranes and has proposed these properties as a means of identification. Here, we present a computational approach to simulate and compare the vibrational spectra in the low-THz region for mammalian and bacterial membranes, investigating the effect of membrane asymmetry and composition, as well as the conserved frequencies of a specific cell. We find that asymmetry does not impact the vibrational spectra, and the impact of sterols depends on the mobility of the components of the membrane. We demonstrate that vibrational spectra can be used to distinguish between membranes and, therefore, could be used in identification of different organisms. The method presented here can be immediately extended to other biological structures (e.g., amyloid fibers, polysaccharides, and protein-ligand structures) in order to fingerprint and understand vibrations of numerous biologically relevant nanoscale structures.
- Keyword:
- molecular dynamics, membranes, mechanical vibration, bacterial identification, and Staphylococcus aureus
- Citation to related publication:
- Luyet C, Elvati P, Vinh J, Violi A. Low-THz Vibrations of Biological Membranes. Membranes. 2023; 13(2):139. https://doi.org/10.3390/membranes13020139
- Discipline:
- Engineering
-
- Creator:
- Elvati, Paolo, Luyet, Chloe, Wang, Yichun, Liu, Changjiang, VanEpps, J. Scott, Kotov, Nicholas A., and Violi, Angela
- Description:
- Amyloid nanofibers are abundant in microorganisms and are integral components of many biofilms, serving various purposes, from virulent to structural. Nonetheless, the precise characterization of bacterial amyloid nanofibers has been elusive, with incomplete and contradictory results. The present work focuses on the molecular details and characteristics of PSMa1-derived functional amyloids present in Staphylococcus aureus biofilms, using a combination of computational and experimental techniques, to develop a model that can aid the design of compounds to control amyloid formation. Results from molecular dynamics simulations, guided and supported by spectroscopy and microscopy, show that PSMa1 amyloid nanofibers present a helical structure formed by two protofilaments, have an average diameter of about 12 nm, and adopt a left-handed helicity with a periodicity of approximately 72 nm. The chirality of the self-assembled nanofibers, an intrinsic geometric property of its constituent peptides, is central to determining the fibers' lateral growth.
- Keyword:
- molecular self-assembly, computational nanotechnology, nanobiotechnology, and structural properties
- Citation to related publication:
- Elvati, P., Luyet, C., Wang, Y., Liu, C., VanEpps, J. S., Kotov, N. A., & Violi, A. (2023). ACS Applied Nano Materials, 6(8), 6594–6604. https://doi.org/10.1021/acsanm.3c00174
- Discipline:
- Engineering and Science
-
- Creator:
- Lee, Sophie Y., Schönhöfer Philipp W.A., and Glotzer, Sharon C.
- Description:
- This dataset was generated for our work: "Complex motion of steerable vesicular robots filled with active colloidal rods". In this project, we used Brownian molecular dynamics simulations to study the rich dynamical behavior of rigid kinked vesicles that contain self-propelling rod-shaped particles. We identified that kinks in the vesicle membrane bias the emergent clustering and alignment of the active agents. Based on the system's geometrical and material properties, we were able to design multiple types of directed motion of the vesicle superstructure. This dataset includes simulation data for two-dimensional systems of self-propelling rod particles confined by teardrop-shaped coarse-grained vesicles. The trajectory of each simulation is saved in a GSD format file with parameter metadata in a JSON file. Due to the large number of replicas for each pair of parameters, simulation data were grouped into 5 different folders. Collective quantitative analysis of simulated trajectories was performed with Jupyter Notebook. Workspaces_simulations.zip contains all the simulation workspaces. Each folder has subfolders called 'dimer' and 'trimer', depending on the length of the self-propelling rod particles used in the simulation (except for the folder 'number-density_16', which has only 'dimer'). In the subfolders, we include the Python scripts used in this work for simulation and trajectory analysis of individual trajectory data. The parameter space of each folder is noted in init.py. Analysis_jupyter_notebooks.zip includes Jupyter notebooks that can reproduce the collective analysis performed for this work.
- Discipline:
- Engineering
-
- Creator:
- Wallace, Dylan M, Benyamini, Miri, Nason-Tomaszewski, Samuel R, Costello, Joseph T, Cubillos, Luis H, Mender, Matthew J, Temmar, Hisham, Willsey, Matthew S, Patil, Parag P, Chestek, Cynthia A, and Zacksenhouse, Miriam
- Description:
- This is data from Wallace, Benyamini et al., 2023, Journal of Neural Engineering. There are two sets of data included: 1. Neural features and error labels used to train error classifiers for each day used in the study 2. Trial data from an example experiment day (Monkey N, Day 6), with runs for offline calibration, online brain control, error monitoring, and error correction. The purpose of this study was to investigate the use of error signals in motor cortex to improve brain-machine interface (BMI) performance for control of two finger groups. All data is contained in .mat files, which can be opened using MATLAB or the Python SciPy library.
- Keyword:
- Brain-machine interface (BMI), Error detection, and Neural recording
- Citation to related publication:
- Wallace, D. M., Benyamini, M., Nason-Tomaszewski, S. R., Costello, J. T., Cubillos, L. H., Mender, M. J., Temmar, H., Willsey, M. S., Patil, P. G., Chestek, C. A., & Zacksenhouse, M. (2023). Error detection and correction in intracortical brain–machine interfaces controlling two finger groups. Journal of Neural Engineering, 20(4), 046037. https://doi.org/10.1088/1741-2552/acef95
- Discipline:
- Engineering, Science, and Health Sciences
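The record above notes that the .mat files can be opened with MATLAB or the Python SciPy library. A minimal sketch of the SciPy access pattern, round-tripping a small synthetic structure (the variable names here are illustrative, not the dataset's actual field names):

```python
import os
import tempfile

import numpy as np
from scipy.io import loadmat, savemat

# Round-trip a small synthetic .mat file to show the access pattern.
# "neural_features" and "error_labels" are illustrative names only.
with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "example_day.mat")
    savemat(path, {"neural_features": np.zeros((10, 96)),
                   "error_labels": np.array([0, 1, 0])})
    data = loadmat(path)
    # loadmat returns a dict; keys starting with '__' are file metadata.
    keys = [k for k in data if not k.startswith("__")]
    print(sorted(keys))  # ['error_labels', 'neural_features']
```

Note that `loadmat` returns arrays as at-least-2-D (e.g., a 1-D vector comes back with shape (1, 3)), which is worth checking before indexing.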