Work Description

Title: A Simple Method for Correcting Empirical Model Densities during Geomagnetic Storms Using Satellite Orbit Data (Open Access)

Methodology
  • The Multifaceted Optimization Algorithm (MOA) was run for the first 10 CubeSats of the Flock 3P constellation between 2017-05-23 and 2017-06-02, using TLEs from those satellites covering 2017-05-02 through 2017-06-02. MOA collected TLEs for the following Flock 3P satellites: FLOCK3P1, FLOCK3P2, FLOCK3P3, FLOCK3P4, FLOCK3P5, FLOCK3P6, FLOCK3P7, FLOCK3P8, FLOCK3P9, and FLOCK3P10. Those TLEs were used to initialize three subprocesses that computed the contributions to drag from satellite orientation, solar radio flux, and geomagnetic activity. MOA computed distributions of the CubeSats' cross-sectional areas during the time of interest, along with modifications to the 10.7 cm solar radio flux (F10.7) and the terrestrial geomagnetic activity (ap) that minimized orbit error; the median values of those corrections were then applied to the NRLMSISE-00 empirical atmospheric model along the orbits of the Swarm spacecraft in order to generate new thermospheric densities.
Description
  • The Multifaceted Optimization Algorithm (MOA) is a tool for generating corrected empirical model thermospheric densities during geomagnetic storms. It consists of a suite of Python functions built around the Spacecraft Orbit Characterization Kit (SpOCK), an orbital propagator developed by Charles D. Bussy-Virat, PhD, Joel Getchius, and Aaron J. Ridley, PhD at the University of Michigan, and it estimates new densities for the NRLMSISE-00 atmospheric model. MOA generates new model densities by estimating modifications to the inputs of the NRLMSISE-00 model that minimize the orbit error between spacecraft modeled in SpOCK and their actual altitudes as described in publicly available Two-Line Element sets (TLEs), made available online via Space-track.org. MOA consists of three sub-processes: (1) the Area Optimization Algorithm (AROPT), (2) the F10.7 Optimization Algorithm (FOPT), and (3) the Ap Optimization Algorithm (APOPT). AROPT computes the contribution to the drag of the modeled spacecraft due to their varying projected area. FOPT estimates modifications to the 10.7 cm solar radio flux in NRLMSISE-00, and APOPT estimates modifications to the Earth's magnetic activity in NRLMSISE-00. MOA finds these modifications across many spacecraft, and the medians of those modifications are then applied in NRLMSISE-00 along the orbit of another satellite to generate new densities for verification. In this instance, the modifications are applied along the orbits of the Swarm spacecraft and compared to Swarm GPS-derived densities.
Creator
  • Brandt, Daniel A.
  • Bussy-Virat, Charles D.
  • Ridley, Aaron J.
Depositor
  • branddan@umich.edu
Contact information
  • Daniel Brandt, branddan@umich.edu
Discipline
Funding agency
  • National Aeronautics and Space Administration (NASA)
  • Other Funding Agency
Other Funding agency
  • Naval Research Laboratory
Keyword
Citations to related material
  • Brandt, D. A., Bussy-Virat, C. D., & Ridley, A. J. (2020). A Simple Method for Correcting Empirical Model Densities During Geomagnetic Storms Using Satellite Orbit Data. Space Weather, 18(12), e2020SW002565. https://doi.org/10.1029/2020SW002565
Resource type
Last modified
  • 11/18/2022
Published
  • 10/07/2020
Language
DOI
  • https://doi.org/10.7302/pjng-ef66
License
  • Attribution-NonCommercial 4.0 International (CC BY-NC 4.0)
To Cite this Work:
Brandt, D., Bussy-Virat, C., Ridley, A. (2020). A Simple Method for Correcting Empirical Model Densities during Geomagnetic Storms Using Satellite Orbit Data [Data set], University of Michigan - Deep Blue Data. https://doi.org/10.7302/pjng-ef66

Relationships

This work is not a member of any user collections.

Files (Count: 20; Size: 13.4 GB)

Date: 22 September, 2020

Dataset Title: Multifaceted Optimization Algorithm Version Dataset

Dataset Creators: D. A. Brandt, C. D. Bussy-Virat, A. J. Ridley

Dataset Contact: Daniel Brandt, branddan@umich.edu

Funding: Rackham Merit Fellowship, NRL Grant Number N00173-18-1-G00G, and NASA Grant #NNX15AJ20H (Michigan Space Grant)

Key Points:
- Empirical atmospheric models exhibit significant density underestimation during geomagnetic storms.
- A simple algorithm provides a new way of obtaining improved model density estimates from two-line element sets describing satellite orbits.
- The technique is validated against Swarm GPS-derived densities during geomagnetic quiet and active times.

Research Overview:
Empirical models of the thermospheric density are routinely used to perform orbit maintenance and satellite collision avoidance, and to estimate the time and location of spacecraft re-entry. These models have characteristic thermospheric density errors below 10% during geomagnetically quiet times, but are unable to reproduce the significant increase and subsequent recovery in the density observed during geomagnetic storms. Underestimation of the density during these conditions translates to errors in orbit propagation that reduce the accuracy of any resulting orbit predictions. These drawbacks risk the safety of astronauts and orbiting spacecraft, and also limit understanding of the physics of thermospheric density enhancements. Numerous CubeSats with publicly available ephemerides in the form of two-line element sets (TLEs) orbit in this region. We present the Multifaceted Optimization Algorithm (MOA), a method to estimate the thermospheric density by minimizing the error between a modeled trajectory and a set of TLEs. The algorithm first estimates a representative cross-sectional area for several reference CubeSats during the quiet time three weeks prior to the storm, and then estimates modifications to the inputs of the NRLMSISE-00 empirical density model in order to minimize the difference between the modeled and TLE-provided semi-major axes of the CubeSats. For validation, the median values of the modifications across all CubeSats are applied along the Swarm spacecraft orbits. This results in orbit-averaged empirical densities with errors below 10% in magnitude during a geomagnetic storm, compared to errors in excess of 25% for uncalibrated NRLMSISE-00 when compared against Swarm GPS-derived densities.
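For context on why the semi-major axis (SMA) is the quantity MOA fits, the sketch below uses the standard first-order drag relation for a near-circular orbit, da/dt = -(Cd*A/m)*rho*sqrt(mu*a), together with the conversion from a TLE's mean motion to semi-major axis, a = (mu/n^2)^(1/3). This is textbook astrodynamics rather than MOA's exact formulation, and the CubeSat numbers are illustrative only:

    import numpy as np

    MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2

    def sma_from_mean_motion(n_rev_per_day):
        """Semi-major axis (m) from a TLE mean motion given in revolutions/day."""
        n = n_rev_per_day * 2.0 * np.pi / 86400.0  # convert to rad/s
        return (MU_EARTH / n**2) ** (1.0 / 3.0)

    def dsma_dt(rho, cd, area, mass, a):
        """First-order decay rate da/dt (m/s) of a near-circular orbit under drag.

        rho:  atmospheric mass density (kg/m^3)
        cd:   drag coefficient (dimensionless)
        area: cross-sectional area (m^2)
        mass: spacecraft mass (kg)
        a:    semi-major axis (m)
        """
        return -(cd * area / mass) * rho * np.sqrt(MU_EARTH * a)

    # Rough, illustrative numbers for a 3U CubeSat near 500 km altitude.
    a = sma_from_mean_motion(15.2)  # about 6.9e6 m, i.e. roughly 500 km altitude
    rate = dsma_dt(rho=5e-13, cd=2.2, area=0.03, mass=5.0, a=a)
    print(f"a = {a:.0f} m, da/dt = {rate:.2e} m/s")

A larger storm-time density makes da/dt more negative, so matching the SMA decay seen in the TLEs constrains the density (or, equivalently, the model drivers) along the orbit.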

Methodology:
The data are outputs from the Multifaceted Optimization Algorithm (MOA), along with the most recent version of the files that constitute the MOA source code.

Instrument and/or Software Specifications: N/A

Files contained here:
There are four .tar.gz files that must be uncompressed with the 'tar -xvf' command (a Python alternative using the standard library is sketched after this list):
- Code.tar.gz contains various Python modules called upon by the rest of the MOA code
- MOA_OUT_1_sats_2017-05-23_2017-06-02_25.tar.gz contains raw MOA outputs for MOA runs that used the 25th percentile of the distribution of calibration satellite cross-sectional areas
- MOA_OUT_1_sats_2017-05-23_2017-06-02_50.tar.gz contains raw MOA outputs for MOA runs that used the 50th percentile of the distribution of calibration satellite cross-sectional areas
- MOA_OUT_1_sats_2017-05-23_2017-06-02_75.tar.gz contains raw MOA outputs for MOA runs that used the 75th percentile of the distribution of calibration satellite cross-sectional areas
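If the tar utility is unavailable, the same archives can be unpacked from Python using only the standard library (a minimal sketch; the file names are those listed above):

    import tarfile

    archives = [
        "Code.tar.gz",
        "MOA_OUT_1_sats_2017-05-23_2017-06-02_25.tar.gz",
        "MOA_OUT_1_sats_2017-05-23_2017-06-02_50.tar.gz",
        "MOA_OUT_1_sats_2017-05-23_2017-06-02_75.tar.gz",
    ]

    for name in archives:
        # "r:gz" opens the gzip-compressed tarball for reading.
        with tarfile.open(name, "r:gz") as tar:
            tar.extractall()  # unpack into the current working directory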

The following Python scripts are required for generating figures that display the outputs from MOA, which include spacecraft cross-sectional areas, modified 10.7 cm solar radio flux (F10.7), modified geomagnetic activity (ap), and new model densities.

- MOA_comparator.py (computes median and mean values of modified F10.7 and ap across all calibration satellites; a minimal sketch of this median step follows after this list)
- MOA_comparator_hist_lines.py (computes distribution and time-varying plots of calibration satellite cross-sectional areas)
- MOA_percentile_f107_compare.py (compares the modified F10.7 values calculated after MOA's use of the 25th, 50th, and 75th percentile of the cross-sectional areas)
- MOA_verify.py (computes new model densities)
- MOA_validate.py (called by MOA_verify.py)
- OMNIWeb_SWPC_plotter.py (downloads unaltered F10.7 and ap from NASA)
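To make the median step concrete, here is a minimal Python sketch of the idea behind MOA_comparator.py's median computation and its application to the model drivers. It is not taken from the deposited code: the numbers are placeholders, and the modifications are treated as additive offsets purely for illustration (MOA's actual parameterization and file formats may differ).

    import numpy as np

    # Hypothetical per-satellite driver modifications from FOPT and APOPT, one
    # value per epoch of the storm interval (placeholder numbers, not MOA output).
    f107_offsets = {
        "FLOCK3P1": np.array([4.2, 5.1, 6.0]),
        "FLOCK3P2": np.array([3.8, 4.9, 5.7]),
        "FLOCK3P3": np.array([4.5, 5.4, 6.3]),
    }
    ap_offsets = {
        "FLOCK3P1": np.array([12.0, 30.0, 8.0]),
        "FLOCK3P2": np.array([10.0, 27.0, 9.0]),
        "FLOCK3P3": np.array([11.0, 33.0, 7.0]),
    }

    # Median modification across the calibration satellites at each epoch.
    f107_corr = np.median(np.vstack(list(f107_offsets.values())), axis=0)
    ap_corr = np.median(np.vstack(list(ap_offsets.values())), axis=0)

    # Observed (unaltered) drivers for the same epochs, placeholders again.
    f107_obs = np.array([75.0, 76.0, 74.0])
    ap_obs = np.array([15.0, 56.0, 22.0])

    # These corrected drivers would then replace the observed values when
    # NRLMSISE-00 is evaluated along the Swarm orbits to produce new densities.
    f107_corrected = f107_obs + f107_corr
    ap_corrected = ap_obs + ap_corr
    print(f107_corrected, ap_corrected)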

Several prerequisites are necessary before using any of the scripts:
- GNU Emacs
- Python 3.0+, Numpy 1.16
- spacepy (to work with NETCDF files, which requires a proper local build of HDF5)
- python wrapper for msise00 (courtesy of Dr. Michael Hirsch: https://pypi.org/project/msise00/, if on Linux, simply 'pip install msise00')
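A quick way to confirm that the msise00 wrapper is installed correctly is to evaluate the model at a single point. The call below follows the package's documented run() interface, but argument and variable names can differ between versions, so treat it as a sketch rather than a guaranteed recipe:

    from datetime import datetime

    import msise00

    # One model evaluation during the storm interval studied here (28 May 2017),
    # at 450 km altitude over the equator and prime meridian.
    atmos = msise00.run(
        time=datetime(2017, 5, 28, 12, 0, 0),
        altkm=450.0,
        glat=0.0,
        glon=0.0,
    )

    # The returned xarray Dataset includes the total mass density (kg/m^3);
    # the variable is named "Total" in recent versions of the package.
    print(float(atmos["Total"].squeeze()))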

## GENERATING FIGURES ##

Unaltered F10.7 and ap as collected by NASA:
-----------------
(SWPC and OMNIWeb Geomagnetic Indices)
1. Open a terminal
2. type 'ipython' and hit 'enter' on the keyboard
3. type 'run OMNIWeb_SWPC_plotter.py' and hit 'enter' on the keyboard
4. type 'exit' and hit 'enter' on the keyboard to leave 'ipython'.
5. Figures 9a and 9b are saved locally to 'OMNIWeb_SWPC_Geomagnetic_Indices_2017-05-23-2017-06-02.pdf'.
NOTE: Lines 38 and 39 are set to the dates '2017-05-23' and '2017-06-02', respectively, and must not be changed in order to be consistent with the figure in the paper.

MOA Outputs 1:
----------
(Histograms of overlapping optimized areas)
1. Open a terminal
2. type 'ipython' and hit 'enter' on the keyboard
3. type 'run MOA_comparator_hist_lines.py' and hit 'enter' on the keyboard
4. type 'exit' and hit 'enter' on the keyboard to leave 'ipython'.
5. Figure 10 is saved to 'Optimized_Area_Histograms_10_sats_2017-05-09-2017-05-24_OMNIWEB.png' in the directory here: /MOA_OUT_1_sats_2017-05-23_2017-06-02_75/MOA_GRIDS_10_2017-05-23_2017-06-02_OMNIWEB
NOTE: No lines of this script ought to be changed.

MOA Outputs 2:
----------
(Linearly-interpolated F10.7 corrections for each percentile of optimized area)
1. Open a terminal
2. type 'ipython' and hit 'enter' on the keyboard
3. type 'run MOA_percentile_f107_compare.py' and hit 'enter' on the keyboard
4. Figure 12 is saved to 'MOA_percentiles_f107_compare_2017-05-23_to_2017-06-02_OMNIWEB.png' in the local working directory.
NOTE: As is the case for Figure 10, no lines of the script run for this figure should be changed.

MOA Outputs 3:
-------------------
(dSMA/dt across all Flock 3P satellites, static and linearly-interpolated F10.7 corrections, and F10.7 and ap corrections along with adjusted F10.7 and ap)
1. Open a terminal
2. type 'ipython' and hit 'enter' on the keyboard
3. type 'run MOA_comparator.py' and hit 'enter' on the keyboard
4. All figures are saved in the directory: 'MOA_OUT_1_sats_2017-05-23_2017-06-02_75/MOA_GRIDS_10_2017-05-23_2017-06-02_OMNIWEB'
5. Figure 9c is titled: 'MOA_comparisons_DSMA_10_2017-05-23-2017-06-02_OMNIWEB_75.png'
6. Figure 11 is titled: 'ALL_CORRECTIONS_2017-05-23_2017-06-02_OMNIWEB.png'
7. Figure 13 is titled: 'BIG_STACK_2017-05-23_2017-06-02_OMNIWEB.png'

New Model Densities:
-------------
(Orbit-averaged densities by percentile along SWARM-A and SWARM-B, and Orbit-averaged densities by MOA, MSISE00, and SWARM along SWARM-A and SWARM-B)
NOTE: All figures can be made by opening the terminal, starting ipython, and typing and entering 'run MOA_verify.py'. The only difference between the figures is the set of arguments on lines 32, 33, and 41 of the script (a helper sketch for scripting these edits appears after the figure configurations below).

Figure 14a:
Line 32 = 'SWARM-A'
Line 33 = '39452'
Line 41 = 25
Figure saved locally to: 'VAL_SWARM-A_MOA_verify_ALL_SPLINES_COMP_2017-05-23-2017-2017-06-02_OMNIWEB_interp.png'

Figure 14b:
Line 32 = 'SWARM-B'
Line 33 = '39451'
Line 41 = 25
Figure saved locally to: 'VAL_SWARM-B_MOA_verify_ALL_SPLINES_COMP_2017-05-23-2017-2017-06-02_OMNIWEB_interp.png'

Figure 15a:
Line 32 = 'SWARM-A'
Line 33 = '39452'
Line 41 = 75
Figure saved locally to: 'VAL_SWARM-A_MOA_verify_SPLINES_2017-05-23-2017-2017-06-02_OMNIWEB_75_interp.png'

Figure 15b:
Line 32 = 'SWARM-B'
Line 33 = '39451'
Line 41 = 75
Figure saved locally to: 'VAL_SWARM-B_MOA_verify_SPLINES_2017-05-23-2017-2017-06-02_OMNIWEB_75_interp.png'

NOTE: Do not change any other lines in MOA_verify.py
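
Since each figure differs only in the values on lines 32, 33, and 41 of MOA_verify.py, those edits can be scripted. The helper below is a hypothetical convenience, not part of the deposited code: the variable names satellite, norad_id, and percentile are assumptions about what those lines contain, and would need to be checked against the actual script before use.

    import runpy
    from pathlib import Path

    SCRIPT = Path("MOA_verify.py")

    # (spacecraft, NORAD ID, percentile) for Figures 14a, 14b, 15a, and 15b.
    CONFIGS = [
        ("SWARM-A", "39452", 25),
        ("SWARM-B", "39451", 25),
        ("SWARM-A", "39452", 75),
        ("SWARM-B", "39451", 75),
    ]

    def run_config(name, norad, percentile):
        lines = SCRIPT.read_text().splitlines(keepends=True)
        # Lines 32, 33, and 41 are 1-indexed as in the instructions above; the
        # assignment syntax below is assumed and may need adjusting to the script.
        lines[31] = f"satellite = '{name}'\n"
        lines[32] = f"norad_id = '{norad}'\n"
        lines[40] = f"percentile = {percentile}\n"
        SCRIPT.write_text("".join(lines))
        runpy.run_path(str(SCRIPT), run_name="__main__")

    for cfg in CONFIGS:
        run_config(*cfg)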

Related Publication(s):
Brandt, D. A., et al. (2020). A Simple Method for Correcting Empirical Model Densities during Geomagnetic Storms Using Satellite Orbit Data. [Under review]

Use and Access:
This data set is made available under an Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) license.

To Cite Data:
Brandt, D. A., Bussy-Virat, C. D., and Ridley, A. J. (2020), A Simple Method for Correcting Empirical Model Densities during Geomagnetic Storms Using Satellite Orbit Data [Data set]. University of Michigan - Deep Blue.

Download: The total work file size of 13.4 GB is too large to download directly; individual files can be downloaded from the Files panel, and the full data set is best retrieved via Globus, the platform Deep Blue Data uses to make large data sets available.
