Search Constraints
Filtering by: Keyword: Machine Learning; Resource type: Dataset
1 - 6 of 6
Search Results
-
- Creator:
- Geng, Yina, Van Anders, Greg, and Glotzer, Sharon C.
- Description:
- The data are the 13 target structures used in developing our model for predicting colloidal crystal structures from the geometries of particular shapes. The target structures are: simple cubic (SC), body-centered cubic (BCC), face-centered cubic (FCC), simple chiral cubic (SCC), hexagonal (HEX-1-0.6), diamond (D), graphite (G), honeycomb (H), body-centered tetragonal (BCT-1-1-2.4), high-pressure Lithium (Li), Manganese (beta-Mn), Uranium (beta-U), Tungsten (beta-W). At least nine simulations were run on each of the target structures. All of the data are formatted as .pos files.
- Keyword:
- Inverse Design and Machine Learning
- Discipline:
- Science
-
- Creator:
- Smith, Joseph P., Gronewold, Andrew D., Read, Laura, Crooks, James L., School for Environment and Sustainability, University of Michigan, Department of Civil and Environmental Engineering, University of Michigan, and Cooperative Institute for Great Lakes Research, University of Michigan
- Description:
- Using the statistical programming package R (https://cran.r-project.org/) and JAGS (Just Another Gibbs Sampler, http://mcmc-jags.sourceforge.net/), we processed multiple estimates of the Laurentian Great Lakes water balance components -- over-lake precipitation, evaporation, lateral tributary runoff, connecting channel flows, and diversions -- feeding them into prior distributions (using data from 1950 through 1979) and likelihood functions. The Bayesian network is coded in the BUGS language. Water balance computations assume that the monthly change in storage for a given lake is the difference between the beginning-of-month water levels surrounding that month; for example, the change in storage for June 2015 is the beginning-of-month water level for July 2015 minus that for June 2015. More details on the model can be found in the following summary report for the International Watersheds Initiative of the International Joint Commission, where the model was used to generate a new water balance historical record for 1950 through 2015: https://www.glerl.noaa.gov/pubs/fulltext/2018/20180021.pdf. Large Lake Statistical Water Balance Model (L2SWBM): https://www.glerl.noaa.gov/data/WaterBalanceModel/. This data set has a shorter timespan to accommodate a prior which uses data not used in the likelihood functions.
- Keyword:
- Water Balance, Great Lakes, Laurentian, Machine Learning, Lakes, and Bayesian Network
- Citation to related publication:
- Smith, J., Gronewold, A., et al. Summary Report: Development of the Large Lake Statistical Water Balance Model for Constructing a New Historical Record of the Great Lakes Water Balance. Submitted to: The International Watersheds Initiative of the International Joint Commission. Accessible at https://www.glerl.noaa.gov/pubs/fulltext/2018/20180021.pdf; Large Lake Statistical Water Balance Model (L2SWBM), https://www.glerl.noaa.gov/data/WaterBalanceModel/; and Gronewold, A.D., Smith, J.P., Read, L., and Crooks, J.L., 2020. Reconciling the water balance of large lake systems. Advances in Water Resources, p.103505.
- Discipline:
- Science and Engineering
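The change-in-storage convention described in this record can be sketched in a few lines of Python; the water levels below are made-up placeholder values, not data from the dataset itself:

```python
# Monthly change in storage = beginning-of-month water level of the next
# month minus that of the month in question (levels here are hypothetical, meters).
levels = {
    "2015-06": 183.50,  # beginning-of-month level, June 2015 (placeholder)
    "2015-07": 183.62,  # beginning-of-month level, July 2015 (placeholder)
}

def change_in_storage(levels, month, next_month):
    """Storage change for `month`: next month's starting level minus this month's."""
    return levels[next_month] - levels[month]

print(round(change_in_storage(levels, "2015-06", "2015-07"), 2))  # 0.12
```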
-
- Creator:
- Jiao, Zhenbang, Chen, Yang, and Manchester, Ward
- Description:
- GOES_flare_list: contains a list of more than 12,013 flare events. The list has 6 columns: flare classification, active region number, date, start time, end time, and emission peak time. The SHARP_data.hdf5 files contain time series of 20 physical variables derived from the SDO/HMI SHARP data files. These data are saved at a 12-minute cadence and are used to train the LSTM model.
- Keyword:
- Solar Flare Prediction and Machine Learning
- Citation to related publication:
- Jiao, Z., Sun, H., Wang, X., Manchester, W., Gombosi, T., Hero, A., & Chen, Y. (2020). Solar Flare Intensity Prediction With Machine Learning Models. Space Weather, 18(7), e2020SW002440. https://doi.org/10.1029/2020SW002440 and Chen, Y., & Manchester, W. (2019). Data and Data products for machine learning applied to solar flares [Data set], University of Michigan - Deep Blue. https://doi.org/10.7302/qnsq-cs38
- Discipline:
- Engineering and Science
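This record trains an LSTM on 20 SHARP variables sampled at a 12-minute cadence. A common preprocessing step for such models, sketched here with placeholder data and an assumed window length (neither taken from the record), is slicing the series into fixed-length overlapping windows:

```python
import numpy as np

def make_windows(series, window):
    """Stack all overlapping windows of shape (window, n_features)
    from a (timesteps, n_features) array."""
    return np.stack([series[i:i + window] for i in range(len(series) - window + 1)])

series = np.zeros((10, 20))        # 10 time steps x 20 variables (placeholder)
windows = make_windows(series, 5)  # shape (6, 5, 20): 6 windows of 5 steps each
```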
-
- Creator:
- Limon, Garrett C.
- Description:
- The data represent weekly output from three 60-year CAM6 model runs. The output includes state (.h0. files) and tendency (.h1. files) fields for three different model configurations of increasing complexity. State fields include temperature, surface pressure, and specific humidity, among others, while tendencies include temperature tendencies, specific humidity tendencies, and precipitation rates. Using the state variables at a given time step, machine learning techniques can be trained to predict the following tendency field, which can then be applied to the state variables to produce the state at the next physics time step of the model.
- Keyword:
- Machine Learning, Climate Modeling, and Physics Emulation
- Citation to related publication:
- Limon, G. C., Jablonowski, C. (2022) Probing the Skill of Random Forest Emulators for Physical Parameterizations via a Hierarchy of Simple CAM6 Configurations [Preprint]. ESSOAr. https://doi.org/10.1002/essoar.10512353.1
- Discipline:
- Engineering and Science
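The predict-then-advance loop this record describes (a trained model maps the current state to a tendency, which is added back to advance the state) can be sketched as follows; the linear "model" is a stand-in assumption, not the random forest or neural network from the actual work:

```python
import numpy as np

def step(state, model, dt):
    """Advance the state one physics time step using the model's predicted tendency."""
    tendency = model(state)  # emulated physics tendency for this state
    return state + tendency * dt

model = lambda s: -0.1 * s       # placeholder emulator: simple decay tendency
state = np.array([300.0, 0.01])  # e.g. temperature (K) and specific humidity (kg/kg)
state = step(state, model, dt=1.0)  # approximately [270.0, 0.009]
```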
-
- Creator:
- Limon, Garrett C.
- Description:
- The work guides the processing of CAM6 data for use in machine learning applications. We also provide workflow scripts for training both random forests and neural networks to emulate physics schemes from the data, as well as analysis scripts written in both Python and NCL to process our results.
- Keyword:
- Machine Learning, Climate Modeling, and Physics Emulation
- Citation to related publication:
- Limon, G. C., Jablonowski, C. (2022) Probing the Skill of Random Forest Emulators for Physical Parameterizations via a Hierarchy of Simple CAM6 Configurations [Preprint]. ESSOAr. https://doi.org/10.1002/essoar.10512353.1
- Discipline:
- Engineering and Science
-
- Creator:
- Chen, Brian
- Description:
- The procedure followed while creating this data is summarized in Section II of Chen, Brian, et al., "Behavioral cloning in atari games using a combined variational autoencoder and predictor model," 2021 IEEE Congress on Evolutionary Computation (CEC), IEEE, 2021. This data is not a research result itself but an intermediate product used in research. The dataset was generated to train a behavioral cloning framework from gameplay screen captures and keystrokes of an "expert" player. The RL agent trained using the RL Baselines Zoo package acts as the "expert" player, whose decision-making process we wish to learn. In addition to behavioral cloning experiments, this dataset is also used to demonstrate the efficacy of a novel incremental tensor decomposition algorithm on image-based data streams.
- Keyword:
- Imitation Learning, Behavioral Cloning, Reinforcement Learning, Machine Learning, and Gameplay Data
- Citation to related publication:
- Chen, Brian, et al. "Behavioral cloning in atari games using a combined variational autoencoder and predictor model." 2021 IEEE Congress on Evolutionary Computation (CEC). IEEE, 2021; Aksoy, Doruk, et al. "An Incremental Tensor Train Decomposition Algorithm." arXiv preprint arXiv:2211.12487 (2022); and Chen, Brian, et al. "Low-Rank Tensor-Network Encodings for Video-to-Action Behavioral Cloning," forthcoming
- Discipline:
- Engineering and Science
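The data-generation step this record describes (an "expert" policy paired with the frames it observes) can be sketched as follows; the frames and the policy here are placeholder assumptions, not the Atari environment or RL Baselines Zoo agent from the record:

```python
import numpy as np

def collect(expert, frames):
    """Pair each observed frame with the expert's chosen action."""
    return [(frame, expert(frame)) for frame in frames]

expert = lambda frame: int(frame.sum()) % 3      # placeholder policy over 3 actions
frames = [np.full((2, 2), i) for i in range(3)]  # tiny stand-in "screen captures"
dataset = collect(expert, frames)                # list of (frame, action) pairs
```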