Search Results
-
- Creator:
- Hawes, Jason K., Goldstein, Benjamin P., Newell, Joshua P., Dorr, Erica, Caputo, Silvio, Fox-Kämper, Runrid, Grard, Baptiste, Ilieva, Rositsa T., Fargue-Lelièvre, Agnès, Poniży, Lidia, Schoen, Victoria, Specht, Kathrin, and Cohen, Nevin
- Description:
- Urban agriculture (UA) is a widely proposed strategy to make cities and urban food systems more sustainable. However, its carbon footprint remains understudied. In fact, the few existing studies suggest that UA may be worse for the climate than conventional agriculture. This is the first large-scale study to resolve this uncertainty across cities and types of UA, employing citizen science at 73 UA sites in Europe and the United States to compare UA products to food from conventional farms. The results reveal that food from UA is six times as carbon intensive as conventional agriculture (420 g vs. 70 g CO2-equivalent per serving). Some UA crops (e.g., tomatoes) and sites (e.g., 25% of individually managed gardens), however, outperform conventional agriculture. These exceptions suggest that UA practitioners can reduce their climate impacts by cultivating crops that are typically greenhouse-grown or air-freighted, maintaining UA sites for many years, and leveraging waste as inputs. This database contains the reference material necessary to trace the path of our analysis from raw garden data to carbon footprint and nutrient results. It also contains the final results of the analyses in various extended forms not available in the publication. For more information, see the manuscript at the link below. (Introduction partially quoted from Hawes et al., 2023)
- Citation to related publication:
- Hawes, J. K., Goldstein, B. P., Newell, J. P., Dorr, E., Caputo, S., Fox-Kämper, R., Grard, B., Ilieva, R. T., Fargue-Lelièvre, A., Poniży, L., Schoen, V., Specht, K., & Cohen, N. (2024). Comparing the carbon footprints of urban and conventional agriculture. Nature Cities, 1–10. https://doi.org/10.1038/s44284-023-00023-3
- Discipline:
- Engineering
-
- Creator:
- Srodawa, Kristy, Cerda, Peter A, Davis Rabosky, Alison R, and Crowe-Riddell, Jenna M
- Description:
- Snake venom research has historically focused on front-fanged species (Viperidae and Elapidae), limiting our knowledge of venom evolution across the ecologically diverse phylogeny of rear-fanged snakes. Three-finger toxins (3FTxs) are a known neurotoxic component in the venoms of some rear-fanged snakes (Colubrinae, Colubridae), but it is unclear how prevalent 3FTxs are, both in expression within venom glands and more broadly among colubrine species. Here, we used a transcriptomic approach to characterize the venom expression profiles of four species of Neotropical colubrine snakes (in the genera Chironius, Oxybelis, Rhinobothryum, and Spilotes), which were dominated by 3FTx expression, and reconstructed the gene trees of 3FTxs. Overall, our results highlight the importance of exploring the venoms of understudied species in reconstructing the full evolutionary history of toxins across the tree of life.
- Keyword:
- snake venom, neurotoxin, molecular evolution, gene families, and opisthoglyphous
- Citation to related publication:
- Srodawa, K., Cerda, P.A., Davis Rabosky, A.R., Crowe-Riddell, J.M. Evolution of Three Finger Toxin Genes in Neotropical Colubrine Snakes (Colubridae). Toxins 2023, 15(9), 523; https://doi.org/10.3390/toxins15090523
- Discipline:
- Science
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection created to facilitate research in the use of novel sensors for autonomous vehicle perception. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C), time-synchronized with ground-truth poses from a high-precision navigation system. Further information and resources (such as software tools for converting, managing, and viewing data files) are available on the project website: https://umautobots.github.io/nsavp
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://umautobots.github.io/nsavp, https://github.com/umautobots/nsavp_tools, and https://sites.google.com/umich.edu/novelsensors2023
- Discipline:
- Engineering
-
- Creator:
- Skinner, Katherine A., Vasudevan, Ram, Ramanagopal, Manikandasriram S., Ravi, Radhika, Carmichael, Spencer, and Buchan, Austin D.
- Description:
- This dataset is part of a collection created to facilitate research in the use of novel sensors for autonomous vehicle perception. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C), time-synchronized with ground-truth poses from a high-precision navigation system. Further information and resources (such as software tools for converting, managing, and viewing data files) are available on the project website: https://umautobots.github.io/nsavp
CHANGE NOTICE (January 2024): We identified an error in our timestamp post-processing procedure that caused all camera timestamps to be offset by the exposure time of one of the cameras. We corrected the error, applied the corrected post-processing, and reuploaded the corrected files. The change impacts all camera data files. Prior to the change, the timestamps between the cameras were synchronized with submillisecond accuracy, but the camera and ground-truth pose timestamps were offset by up to 0.4 ms, 3 ms, and 15 ms in the afternoon, sunset, and night sequences, respectively. This amounted to up to ~0.25 meters of position error in the night sequences. For consistency, camera calibration was rerun with the corrected calibration sequence files. The camera calibration results have therefore been updated as well, although they have not changed significantly. Finally, we previously downsampled the frame data in the uploaded calibration sequence, but we decided to provide the full frame data in the reupload.
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://sites.google.com/umich.edu/novelsensors2023, https://github.com/umautobots/nsavp_tools, and https://umautobots.github.io/nsavp
- Discipline:
- Engineering
-
- Creator:
- Zhang, Yingxiao MI
- Description:
- We developed a new model framework based on WRF-Chem that simulates primary biological aerosol particle emissions and their interaction with clouds. We designed sensitivity tests to separately evaluate the effects of whole pollen grains and sub-pollen particles (SPPs). Our results show that SPPs have a larger effect on cloud microphysics and precipitation than whole pollen grains.
- Keyword:
- Aerosol-cloud interactions, Primary biological aerosol particles, Ice nucleating particles, Microphysics scheme, and Pollen
- Discipline:
- Science
-
Novel Sensors for Autonomous Vehicle Perception
User Collection
- Creator:
- Skinner, Katherine A, Vasudevan, Ram, Ramanagopal, Manikandasriram S, Ravi, Radhika, Buchan, Austin D, and Carmichael, Spencer
- Description:
- The Novel Sensors for Autonomous Vehicle Perception collection comprises dataset sequences collected with an autonomous vehicle platform that includes data from novel sensors. The dataset collection platform is a Ford Fusion vehicle with a roof-mounted novel sensing suite, which specifically consists of forward-facing stereo uncooled thermal cameras (FLIR 40640U050-6PAAX), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C), time-synchronized with ground-truth poses from a high-precision navigation system. Sequences cover ~8 km routes driven repeatedly under varying lighting conditions and/or opposing viewpoints. Further information and resources are available on the project website: https://umautobots.github.io/nsavp
- Keyword:
- novel sensing, perception, autonomous vehicles, thermal sensing, neuromorphic imaging, and event cameras
- Citation to related publication:
- https://umautobots.github.io/nsavp, https://github.com/umautobots/nsavp_tools, and https://sites.google.com/umich.edu/novelsensors2023
- Discipline:
- Engineering
12 Works
-
- Creator:
- Cevidanes, Lucia
- Description:
- Image Pre-Processing: To allow reliable detection and comparison of changes between several individuals, or within the same individual at different time points, all hr-CBCT scans were pre-processed using validated protocols before extraction of the quantitative bone texture/morphometry features.
Extraction of Trabecular Bone Texture-based and Morphometry Imaging Features: Using the "crop-volume" tool in 3D Slicer, a rectangular volume of interest (VOI) was cropped from the trabecular bone in the mandibular condyles and the articular fossa. Then, using the average minimum and maximum intensity values of all VOIs, we standardized the grey-level intensities of the VOIs to eliminate inaccuracies in textural feature calculation and possible dependency on the global characteristics of the images. Lastly, imaging markers were extracted from the standardized VOIs using the "BoneTexture" module in 3D Slicer.
Measurement of the 3D Articular Joint Space: To assess the progression/improvement of osteoarthritic changes in the affected individuals, we measured the 3D superior joint space. We pre-labelled two landmarks in the sagittal view of the oriented CBCT scans: one on the most superior point of the condyle and one on the opposing surface of the articular fossa. To avoid biasing the landmarks' placements, pre-labelling was performed simultaneously on T1 and T2 scans, using two independent windows in ITK-SNAP. After the volumetric reconstruction of the identified landmarks, linear measurements were obtained in millimeters using the Q3DC tool in 3D Slicer.
Three-dimensional Shape Analyses and Quantification of Remodeling in the Condyles: SPHARM-PDM software was used to compute the correspondence across 4002 surface points among all condyles. The output point-based models displayed color-coded maps that enabled visual evaluation of consistent parametrization of all condyles. An average condyle shape for the TMJ OA and control groups was calculated through propagation of the original surface point correspondences across all stages of deformations and averaging of the condyle surface meshes. For visualization of the 3D qualitative changes of the average models within the same group at different time points, or among different groups, semi-transparent overlays were created using 3D Slicer. The vector differences were presented on the condyle surfaces, scaled according to the magnitude of difference and pointing in the direction of bone change. For quantification of remodeling in the condyles, calculation of signed distances across condyle surface meshes reflected the quantitative bone changes in the TMJ OA and control samples. To quantify regional bone changes across the lateral and anterior surfaces of the condyles, we used the Pick 'n Paint tool in 3D Slicer to propagate regional surface points to the corresponding regions of shapes across all subjects and time points.
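The grey-level standardization step described above (rescaling every VOI to the average minimum and maximum intensity computed across all VOIs) can be sketched as follows; the function name and the flat voxel-intensity lists are illustrative assumptions, not the authors' actual 3D Slicer implementation.

```python
def standardize_vois(vois):
    """Rescale each volume of interest (VOI) so that all VOIs share a common
    grey-level range: the average minimum and the average maximum intensity
    across all VOIs (a sketch of the standardization step described above).

    `vois` is a list of voxel-intensity lists, one list per cropped VOI.
    """
    avg_min = sum(min(v) for v in vois) / len(vois)
    avg_max = sum(max(v) for v in vois) / len(vois)
    standardized = []
    for v in vois:
        lo, hi = min(v), max(v)
        # Map this VOI's own [lo, hi] range onto the common [avg_min, avg_max]
        standardized.append([(x - lo) / (hi - lo) * (avg_max - avg_min) + avg_min
                             for x in v])
    return standardized
```

After standardization every VOI spans the same intensity range, so texture features computed from grey-level statistics no longer depend on each scan's global intensity characteristics.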
- Keyword:
- Degenerative joint disease, Temporomandibular joint osteoarthritis, TMJ OA, Machine learning, and Prognosis
- Discipline:
- Health Sciences
-
- Creator:
- Lori, Jody R., Moyer, Cheryl, Lockhart, Nancy, Zielinski, Ruth E., Kukula, Vida, Apetorgbor, Veronica, Awini, Elizabeth, Badu-Gyan, Georgina, and Williams, John
- Description:
- GRAND is a five-year, cluster randomized controlled trial registered on ClinicalTrials.gov [ID#: NCT04033003] and a collaboration between the University of Michigan in the United States and the Dodowa Health Research Center in Ghana. The study setting is four districts (Akwapim North, Yilo Krobo, Nsawam-Adoagyiri, and Lower Manya Krobo) within the Eastern Region of Ghana, where the trial was conducted in 14 health facilities. Facilities were selected based on the number of ANC registrants per month and the average gestational age of women at registration in each facility, then matched into pairs based on facility type, district, and number of monthly ANC registrants; each pair was similar in the number of deliveries and the average gestational age of women at enrollment in antenatal care. The locations of the facilities were far enough apart to avoid cross-group contamination. In each pair, one facility was randomly assigned to the intervention (G-ANC) and the other to the control (I-ANC). Recruitment began in July 2019 and ended when enrollment targets were met; data collection ended in July 2023.
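The matched-pairs randomization described above, in which one facility in each matched pair is assigned to group antenatal care (G-ANC) and the other to individual antenatal care (I-ANC), can be sketched as follows; the facility names are hypothetical placeholders, not the study's actual sites.

```python
import random

def randomize_matched_pairs(pairs, seed=None):
    """For each matched pair of facilities, randomly assign one to the
    intervention (G-ANC) and the other to the control (I-ANC)."""
    rng = random.Random(seed)
    assignment = {}
    for facility_a, facility_b in pairs:
        # rng.sample returns the two facilities in random order
        intervention, control = rng.sample([facility_a, facility_b], 2)
        assignment[intervention] = "G-ANC"
        assignment[control] = "I-ANC"
    return assignment

# Hypothetical matched pairs (the trial used 7 pairs across 14 facilities):
pairs = [("Facility 1", "Facility 2"), ("Facility 3", "Facility 4")]
assignment = randomize_matched_pairs(pairs, seed=42)
```

Randomizing within pairs preserves the balance achieved by matching: every pair contributes exactly one intervention and one control facility.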
- Keyword:
- Antenatal care, Ghana, and Maternal health
- Citation to related publication:
- Lori, J., Kukula, V., Liu, L. et al. Improving health literacy through group antenatal care: results from a cluster randomized controlled trial in Ghana. BMC Pregnancy Childbirth 24, 37 (2024). https://doi.org/10.1186/s12884-023-06224-x
- Discipline:
- International Studies and Health Sciences