Supporting Information for the research article "Life cycle comparison of environmental emissions from three disposal options for unused pharmaceuticals". This spreadsheet provides the calculations and values used for this study; please refer to the manuscript and supporting information (as text) available at http://dx.doi.org/10.1021/es203987b for details about how to use this spreadsheet. We use life cycle assessment methodology to compare three disposal options for unused pharmaceuticals: (i) incineration after take-back to a pharmacy, (ii) wastewater treatment after toilet disposal, and (iii) landfilling or incineration after trash disposal. For each option, emissions of active pharmaceutical ingredients to the environment (API emissions) are estimated along with nine other types of emissions to air and water (non-API emissions). Under a scenario with 50% take-back to a pharmacy and 50% trash disposal, current API emissions are expected to be reduced by 93%. This is within 6% of a 100% trash disposal scenario, which achieves an 88% reduction. The 50% take-back scenario thus achieves only a modest reduction in API emissions over a 100% trash scenario while increasing most non-API emissions by over 300%. If the 50% of unused pharmaceuticals not taken back are toileted instead of trashed, all emissions increase relative to 100% trash disposal. Evidence suggests that 50% participation in take-back programs could be an upper bound. As a result, we recommend trash disposal for unused pharmaceuticals. A 100% trash disposal program would have API emissions similar to those of a take-back program with 50% participation, while also having significantly lower non-API emissions, lower financial costs, higher convenience, and higher compliance rates.
The Evans Old Field Plant Database contains FileMaker and Excel files of data collected by Dr. Francis C. Evans during a 50-year study on successional change on Evans Old Field on the Edwin S. George Reserve. Data include plant phenology, location, and abundances observed from 1948 to 1997.
As discussion and debate on the digital humanities continue among scholars, so too does discussion of how academic libraries can and should support this scholarship. Through interviews with digital humanities scholars and academic librarians within the Committee on Institutional Cooperation, this study explores points of common perspective and underlying tensions in research relationships. Qualitative interviews revealed that, while both groups are enthusiastic about the future of faculty-librarian collaboration on digital scholarship, certain tensions remain about the role of the library and the librarian. Scholars appreciate the specialized expertise of librarians, especially in metadata and special collections, but could take a more active role in utilizing current library resources and in vocalizing their needs for other resources. This expertise and these services can be leveraged to make the library an active and equal partner in research. Additionally, libraries should address internal issues, such as training and re-skilling librarians as necessary; better-coordinated outreach to academic departments is also needed.
The files include an Excel file with the x-, y-, and z-coordinates of the nodes for a surface model of a small (5th percentile) female pelvis geometry, the finite element model (.k file) built from those nodal coordinates, and two surface files that represent the geometry (.obj and .ply).
Raw data and analysis files for the figures in the manuscript submission entitled "CCL2 enhances macrophage inflammatory responses via miR-9 mediated downregulation of the ERK1/2 phosphatase Dusp6".
Rapid increases in solar wind dynamic pressure, termed sudden impulses (SIs), compress Earth's dayside magnetosphere and strongly perturb the coupled magnetosphere-ionosphere (M-I) system. The compression of the dayside magnetosphere launches magnetohydrodynamic (MHD) waves, which propagate down to the ionosphere, changing the auroral field-aligned currents (FACs), and into the nightside magnetosphere. The global response to the compression front sweeping through the coupled system is not yet fully understood because of the sparseness of measurements, especially measurements with the time resolution necessary to resolve the propagating disturbances; this motivates a study that combines observations with modeling. On 15 August 2015 at 7.44 UT, the Advanced Composition Explorer measured a sudden increase in solar wind dynamic pressure from 1.11 nPa to 2.55 nPa, as shown in Figure 1.
We use magnetospheric spacecraft in the equatorial magnetosphere to identify signatures of the magnetosphere's response to this SI event and to examine the interaction of the propagating disturbances with the M-I system. With the increased time resolution of the Active Magnetosphere and Planetary Electrodynamics Response Experiment (AMPERE), the changes in FAC pattern and intensity due to the SI can also be studied in more depth. We further use measurements from ground-based magnetometer stations to improve our ability to track the disturbances in the ionosphere and our understanding of their propagation characteristics. This is a first step toward a comprehensive investigation, based on multi-point observations and a global magnetohydrodynamic simulation, of the response of the coupled M-I system to sudden impulses.
Many data sets come as point patterns of the form (longitude, latitude, time, magnitude); examples include tornado events, origins/destinations of internet flows, earthquakes, and terrorist attacks. Such data are difficult to visualize with simple plotting. This research project studies and implements non-parametric kernel smoothing in Python as a way of visualizing the intensity of point patterns in space and time. A two-dimensional grid M of size mx × my stores the kernel-smoothing estimate at each grid point, and a heat-map in Python then plots the grid on a map, with resolution determined by mx and my. The resulting images also depend on spatial and temporal smoothing parameters, which control the resolution (smoothness) of the figure. The Python code is applied to visualize over 56,000 tornado landings in the continental U.S. from 1950 to 2014; tornado magnitudes are given on the Fujita scale.
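The space-time kernel smoothing described above can be sketched as follows. This is a minimal illustrative implementation, not the project's actual code: the function name, the choice of Gaussian kernels, and the parameter names (h_space, h_time) are assumptions for the sake of the example.

```python
import numpy as np

def kernel_intensity(lons, lats, times, t0, grid_x, grid_y, h_space, h_time):
    """Estimate the spatial intensity of a point pattern at time t0.

    Each event (lon, lat, time) is weighted by a Gaussian temporal kernel
    centered at t0, then spread onto the (grid_x, grid_y) mesh with a
    Gaussian spatial kernel. h_space and h_time are the smoothing
    parameters controlling the smoothness of the resulting image.
    """
    lons, lats, times = map(np.asarray, (lons, lats, times))
    # Temporal weights: events near t0 contribute more.
    w = np.exp(-0.5 * ((times - t0) / h_time) ** 2)
    # Grid of evaluation points; shapes are (my, mx).
    gx, gy = np.meshgrid(grid_x, grid_y)
    intensity = np.zeros_like(gx, dtype=float)
    for lon, lat, wi in zip(lons, lats, w):
        d2 = (gx - lon) ** 2 + (gy - lat) ** 2
        intensity += wi * np.exp(-0.5 * d2 / h_space ** 2)
    # Normalize by the kernel constants so values behave like a density.
    intensity /= 2 * np.pi * h_space ** 2 * h_time * np.sqrt(2 * np.pi)
    return intensity
```

The returned mx × my array can then be passed to any heat-map routine (e.g. a map overlay) to produce the images; increasing h_space or h_time smooths the figure, while finer grids raise the plotting resolution at higher computational cost.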