COMPUTATIONAL RESEARCH FOR THE ENVIRONMENT
Media Contact: Linda Vu, email@example.com, (510) 495-2402
Environmental researchers rely on computers to expose the fundamental workings of our planet and atmosphere, to bring complicated questions to light, and to find solutions. By developing and using computational tools to model the Earth's evolution, to simulate climate, and to invent new technologies, scientists in Lawrence Berkeley National Laboratory's Computational Research, Earth Sciences, and Environmental Energy Technologies divisions are helping humanity adapt to a changing world.
Here are some of the projects that we are working on:
- Building Environmental Databases for the 21st Century
- Improving the Accuracy of Ice Sheet Models
- Carbon Goes Underground, Then What?
- Extreme Weather and Climate in a Changing World
- Speeding up Deployment of Carbon Capture Research and Development
- Experiments and Simulations Join Forces to Engineer a New Type of Clean, Efficient Burner
- Developing a Carbon-Climate Data Assimilation System
It takes a global village to guard water supplies, protect endangered species, and monitor the Earth's carbon cycle. Although hundreds of science groups in multiple countries have been monitoring these trends in their local ecosystems, until recently there was no way to easily combine and navigate these disparate datasets to accurately track trends on local, regional, and global scales.
That's where advanced data servers developed by Berkeley Lab and Microsoft Research computer scientists, in collaboration with environmental scientists, come in. These databases include domain-specific tools that automatically extract important aspects of incoming data; a database and schema to organize and archive information; data cubes that allow researchers to look at the data from multiple perspectives; and tools that automatically convert multiple data versions into one format. This architecture has been applied to research ranging from fish recovery planning for the Russian River and Pajaro watersheds in California to the global FLUXNET network of carbon dioxide flux measurement systems.
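The "data cube" idea is that one archived table can be aggregated along different dimensions to give different scientific views of the same measurements. The sketch below illustrates this with Python's built-in SQLite; the table name, columns, site codes, and flux values are all illustrative stand-ins, not the actual FLUXNET schema.

```python
import sqlite3

# Hypothetical flux records: (site, year, month, co2_flux).
# Names and numbers are made up for illustration only.
rows = [
    ("US-Ha1", 2009, 6, -4.2),
    ("US-Ha1", 2009, 7, -5.1),
    ("US-Ha1", 2010, 6, -3.9),
    ("DE-Tha", 2009, 6, -2.8),
    ("DE-Tha", 2010, 6, -3.0),
]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE flux (site TEXT, year INT, month INT, co2_flux REAL)")
con.executemany("INSERT INTO flux VALUES (?, ?, ?, ?)", rows)

# One archive, several "perspectives": the same table aggregated by site
# (a spatial view) and by year (a temporal view).
by_site = con.execute(
    "SELECT site, ROUND(AVG(co2_flux), 2) FROM flux GROUP BY site ORDER BY site"
).fetchall()
by_year = con.execute(
    "SELECT year, ROUND(AVG(co2_flux), 2) FROM flux GROUP BY year ORDER BY year"
).fetchall()

print(by_site)  # mean flux per site
print(by_year)  # mean flux per year
```

The real systems add schema design, versioning, and domain-specific ingest on top, but the multiple-perspectives pattern is the same.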
One of the most-cited examples of global climate change is the retreat of ice sheets in Antarctica and Greenland. But the details of how they are melting are a mystery that may be solved with a new generation of computer simulations.
Until recently, most computer models could only provide very crude representations of important physical processes like glacier surges, iceberg calving and grounding-line migration. But all this will change with the help of researchers from Berkeley Lab's Computational Research Division and Los Alamos National Laboratory who are applying parallel adaptive mesh refinement (AMR) techniques to the Community Ice Sheet Modeling code known as GLIMMER-CISM. These algorithms allow researchers to model points of interest, like the retreating edges of ice sheets, at unprecedented resolution. With these higher-resolution models, they will be able to make more accurate predictions about how ice sheets melt and how that melting contributes to other phenomena like the rise in global sea level.
B-ISICLES Project to Improve Accuracy of Ice Sheet Models
Image courtesy of George Pau
This simulation shows CO2-rich "fingers" forming in a saline aquifer where carbon dioxide is being sequestered. The grids represent four different levels of resolution, with smaller grids representing finer resolution. This technique concentrates computing power on the areas that are changing most (the fingers) while paying less attention to areas with little or no activity (the brine below the fingers). The method is called adaptive mesh refinement, or AMR.
Researchers at Berkeley Lab have performed the first-ever three-dimensional simulation to explore the diffusive mixing of CO2 and brine that occurs when carbon dioxide is sequestered underground. These simulations, unprecedented in detail and run in both 2-D and 3-D, will help scientists understand the long-term security of CO2 injected into saline aquifers deep underground. They will also help researchers better understand just how much CO2 a given site can sequester.
The new code, PMAMR, developed by the Center for Computational Sciences and Engineering at Berkeley Lab, combines an advanced computational science technique, called adaptive mesh refinement (AMR), with high-performance parallel computing to model subsurface flows. AMR concentrates computing power on the more active areas of the simulation by breaking them into finer parts (higher resolution), while calculating less active, less critical portions in coarser parts (lower resolution). The size of the parts, and thus the resolution, automatically shifts as the model changes, ensuring the most active, critical processes are also the most highly resolved. This ability to shift focus makes AMR a powerful tool for modeling phenomena that involve both small-scale (at the millimeter level) and large-scale (meters or kilometers) processes.
In the case of underground carbon sequestration, it's critical to understand processes at every scale, from the chemical and physical interactions involved in the dissolution of CO2 in brine up to the migration of kilometers-long CO2 plumes underground. This code may also help geologists track and predict the migration of hazardous wastes or recover more oil from existing wells.
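The core AMR idea, splitting cells only where the solution changes rapidly, can be sketched in a few lines. This is a toy one-dimensional version for illustration; the refinement criterion, the tanh "front" standing in for a CO2 finger boundary, and all thresholds are assumptions, not the actual PMAMR algorithm.

```python
import math

def refine(cells, field, threshold, max_level=4):
    """One refinement pass over a 1-D grid.

    cells : list of (x_left, x_right, level) intervals
    field : callable giving the quantity of interest at a point
    A cell is split in two wherever the field jumps sharply across it,
    so resolution concentrates where the solution is most active.
    """
    out = []
    for (a, b, lev) in cells:
        jump = abs(field(b) - field(a))
        if jump > threshold and lev < max_level:
            mid = 0.5 * (a + b)
            out.append((a, mid, lev + 1))  # split into two finer cells
            out.append((mid, b, lev + 1))
        else:
            out.append((a, b, lev))        # keep coarse where quiet
    return out

# A steep front near x = 0.5 (a stand-in for an active plume boundary).
front = lambda x: math.tanh(40.0 * (x - 0.5))

# Start from 8 uniform cells and refine repeatedly; only cells straddling
# the front subdivide, while the quiescent regions stay coarse.
cells = [(i / 8, (i + 1) / 8, 0) for i in range(8)]
for _ in range(4):
    cells = refine(cells, front, threshold=0.2)

finest = max(lev for (_, _, lev) in cells)
print(len(cells), "cells; finest level", finest)
```

After a few passes, the finest cells cluster around the front while cells far from it remain at the coarsest level, which is exactly the computational saving the article describes.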
"Geologic carbon sequestration is a key tool in reducing atmospheric CO2. This research improves our understanding of the processes involved in geologic carbon sequestration. By providing better characterizations of the processes, we can more accurately predict the performance of carbon sequestration projects, such as the storage capacity and long-term security of a potential site."
-George Pau, Berkeley Lab’s Center for Computational Sciences and Engineering.
"If you look how much CO2 is made by a large coal-fired power plant, and you want to inject that all for 30 years, you end up with a CO2 plume on the order of 10 km in dimension. We are very interested in the long-term fate of this thing.... So, we have to look carefully at all aspects of proposed injection sites and look carefully at very long term behaviors that will be induced by CO2. These models will help us understand that behavior."
-Karsten Pruess, Berkeley Lab’s Earth Sciences Division
Image courtesy of Michael Wehner
Michael Wehner, MFWehner@lbl.gov
Extreme weather and climate events can have serious impacts on human and ecological systems. Changes in the magnitude and frequency of extreme weather, such as severe heat waves, hurricanes, droughts, and intense precipitation, that accompany changes in the average climate are likely the most serious consequence of human-induced global warming. Understanding what the future portends is vital if society hopes to adapt to a very different world.
"... recent advances in high performance computing are rapidly improving climate models' ability to simulate extreme event statistics. As a result, scientific understanding and confidence in projections of future extreme weather are also rapidly increasing,"
—Michael Wehner, Berkeley Lab's Scientific Computing Group and Lead Author for IPCC Fifth Assessment Report's "long term projections" chapter.
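One standard way climate scientists summarize extreme event statistics is the block-maxima approach: take the largest value in each year of data, then read off return levels from the distribution of those annual maxima. The toy sketch below does this with synthetic Gaussian "daily temperature anomalies"; the numbers and the simple empirical-quantile estimate are illustrative assumptions, not output from any actual climate model.

```python
import random
import statistics

random.seed(0)  # deterministic toy data

# Stand-in for model output: 50 "years" of 365 daily temperature anomalies.
# The Gaussian daily statistics are made up for illustration.
annual_maxima = [
    max(random.gauss(0.0, 3.0) for _ in range(365)) for _ in range(50)
]

# Block-maxima view of extremes: the empirical 95th percentile of annual
# maxima approximates the "1-in-20-year" heat event for this toy climate.
annual_maxima.sort()
one_in_20 = statistics.quantiles(annual_maxima, n=20)[-1]

print(f"mean annual max: {statistics.mean(annual_maxima):.2f}")
print(f"approx. 20-year return level: {one_in_20:.2f}")
```

In practice such analyses fit extreme-value distributions to the maxima rather than reading raw quantiles, and the point of higher-resolution models is to make the simulated maxima themselves more realistic.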
Accelerating Deployment of Carbon Capture Research and Development with Computer Simulations
Many researchers believe that global warming could be greatly mitigated if "carbon capture" technologies, which capture and separate CO2 gas before it reaches the atmosphere, were incorporated into the design of new fossil fuel burning power plants. Although these technologies exist today, they currently consume too much energy to be cost effective. And experts predict that in the existing research and development workflow, it will take at least another 10 to 30 years for novel solutions to be developed, commercialized and widely deployed.
That's why researchers in Berkeley Lab's Computational Research and Materials Science Divisions are teaming up with experts at four other Department of Energy national laboratories as well as partners in academia and industry to build a software framework that enables communication between existing carbon capture modeling tools and provides software development support. Advanced modeling and simulation capabilities have the potential to significantly reduce the time, capital, and operational costs required for the development and deployment of novel carbon capture technologies. Using computer simulations, researchers can watch the molecular-level dynamics of carbon capturing materials, then scale up to explore how a material functions at the device level and how the device functions in a production power plant.
Catching Carbon with Computer Simulations
Experiments and Simulations Join Forces to Engineer a New Type of Clean, Efficient Burner
Image courtesy of John Bell and Marc Day
One promising strategy for reducing U.S. dependence on petroleum is to develop new combustion technologies for burning hydrogen or hydrogen-rich syngas (a combination of hydrogen and carbon monoxide) fuels obtained from the gasification of coal and biomass. Thanks to progress in both experiments and numerical simulations at Berkeley Lab, low-swirl combustion is quickly becoming a frontrunner for this application.
Compared to conventional burners and engines, low-swirl combustion technology uses a lower fuel-to-oxygen ratio, and ensures fuel and oxygen are mixed before they enter the combustion chamber. Together, these features result in a lower flame temperature, near zero emissions, and maximum fuel efficiency. However, premixed flames also require device-scale stabilization to ensure safety and reliability, as flames from fuels with high hydrogen content are much more reactive than natural gas flames.
"Even with the best available laser diagnostic techniques, it would be extremely challenging to obtain high fidelity information on the chaotic and highly turbulent flame processes. Numerical simulations give us a rare window into these 3D time-dependent complex processes to help us gain important insights on the effects of hydrogen on turbulent flame speed and pollution formation,"
-Robert Cheng, Berkeley Lab's Environmental Energy Technologies Division and inventor of the Low-Swirl Burner.
"Theory provides a foundation for basic flame physics but can’t address the complexity of realistic flames. Laboratory measurements are difficult to make and limited in the detail they provide. Computation, with its ability to deal with complexity and unlimited access to data, has the potential for closing the gap between theory and experiment and enabling dramatic progress in combustion science,"
-John Bell, Berkeley Lab's Center for Computational Sciences and Engineering.
Developing a Carbon-Climate Data Assimilation System
Approximately half the CO2 emitted by fossil fuel combustion has remained in the atmosphere, driving the current increase in atmospheric CO2. The land and oceans have acted as repositories (sinks) for the remainder of the fossil fuel CO2, plus the CO2 released as a result of land use modification. The key to predicting future levels of atmospheric CO2 and the timing and magnitude of climate change is predicting not only the anthropogenic carbon sources, but also the biogeochemical processes that determine the changing magnitudes and locations of the carbon sinks. These processes determine the rate of carbon exchange between the atmosphere, land, and oceans, as well as the stability and longevity of carbon storage in each of these reservoirs in a changing environment.
To predict these biogeochemical processes more accurately, researchers at the University of California at Berkeley, Lawrence Berkeley National Laboratory, the University of Maryland, and the National Center for Atmospheric Research have undertaken the development of a carbon-climate data assimilation system that will improve the representation of terrestrial carbon processes in coupled carbon-climate models. This framework will function like a global weather prediction system but for carbon sources and sinks.
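Data assimilation, the technique behind weather prediction systems, corrects a model forecast with observations, weighting each by its uncertainty. The scalar sketch below shows that core update step; the function, the carbon-sink numbers, and their uncertainties are illustrative assumptions, not part of the actual carbon-climate system under development.

```python
def assimilate(forecast, forecast_var, obs, obs_var):
    """Scalar Kalman-style update: returns (analysis, analysis_var).

    The gain weights the observation by how uncertain the forecast is
    relative to the observation; the analysis variance is always smaller
    than the forecast variance, reflecting the information gained.
    """
    gain = forecast_var / (forecast_var + obs_var)  # trust in the observation
    analysis = forecast + gain * (obs - forecast)   # corrected estimate
    analysis_var = (1.0 - gain) * forecast_var      # reduced uncertainty
    return analysis, analysis_var

# Toy numbers: a model forecasts that a regional land sink took up
# 2.0 PgC this year (quite uncertain), while flux measurements suggest
# 2.6 PgC (less uncertain). The analysis lands between them, nearer
# the more certain value.
state, var = 2.0, 0.5
state, var = assimilate(state, var, obs=2.6, obs_var=0.1)

print(f"analysis: {state:.2f} PgC, variance: {var:.3f}")
```

A real assimilation system applies this kind of update to millions of model variables at once, cycling forecast and correction steps just as a weather prediction system does, which is the analogy the researchers draw.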
Carbon-Climate Interactions Group