
Summer Student Program 2013 - Presentation Abstracts

Talks/Events:

High-Order Discontinuous Galerkin Methods for Conservation Laws

Who: Per-Olof Persson
When: June 06, 12:00-1:00 PM
Where: 70-191

It is widely believed that high-order accurate numerical methods, for example discontinuous Galerkin (DG) methods, will eventually replace the traditional low-order methods in the solution of many problems, including fluid flow, solid dynamics, and wave propagation. I will explain what DG methods are and demonstrate what these methods can do in real-world problems, such as the analysis of flapping flight (e.g. the flight of birds and bats), microelectromechanical systems (MEMS), and wind turbines.
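
For readers unfamiliar with the method, here is a minimal sketch (mine, not from the talk) of how DG discretizes a one-dimensional scalar conservation law: multiply by a test function, integrate by parts over each element, and couple neighboring elements through a numerical flux.

    \frac{\partial u}{\partial t} + \frac{\partial f(u)}{\partial x} = 0
    \;\Longrightarrow\;
    \int_K \frac{\partial u_h}{\partial t}\, v\, dx
      - \int_K f(u_h)\, \frac{\partial v}{\partial x}\, dx
      + \left[ \hat{f}(u_h^-, u_h^+)\, v \right]_{\partial K} = 0
    \quad \text{for all test polynomials } v \text{ on each element } K

Here u_h is polynomial within each element but discontinuous across element boundaries, and the numerical flux \hat{f} (e.g., an upwind or Lax-Friedrichs flux) couples neighbors; high order comes from raising the polynomial degree within each element rather than refining the mesh.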

Per-Olof Persson is an Assistant Professor in the Department of Mathematics at the University of California, Berkeley, and a Mathematician Faculty Scientist/Engineer at Lawrence Berkeley National Laboratory. He was previously an Instructor of Applied Mathematics at the Massachusetts Institute of Technology, where he also received his Ph.D. in 2005 under the supervision of Gilbert Strang and Alan Edelman. He received the Air Force Office of Scientific Research Young Investigator Award in 2010 and an Alfred P. Sloan Research Fellowship in 2011. His current research interests are in high-order discontinuous Galerkin methods for computational fluid and solid mechanics, with applications in aerodynamics, aeroacoustics, and flapping flight.


Introduction to HPC and NERSC Tour

Who: Richard Gerber
When: June 10, 09:00 AM-12:00 PM
Where: Berkeley Lab's OSF

What is High Performance Computing (and storage!), or "HPC"? Who uses it and why? We'll talk about these questions as well as what makes a "Supercomputer" so super and what's so big about scientific "Big Data". Finally, we'll discuss the challenges facing system designers and application scientists as we move into the "many-core" era of HPC.


Big Bang, Big Data, Big Iron

Who: Julian Borrill
When: June 13, 12:00-1:00 PM
Where: 70-191

On March 21, 2013, the European Space Agency announced the first cosmology results from its billion-dollar Planck satellite mission. The culmination of 20 years of work, Planck’s observations of the Cosmic Microwave Background – the faint echo of the Big Bang itself – provide profound insights into the foundations of cosmology and fundamental physics.

Planck has been making 10,000 observations of the CMB every second since mid-2009, providing a dataset of unprecedented richness and precision; however, the analysis of these data is an equally unprecedented computational challenge. For the last decade we have been developing the high performance computing tools needed to make this analysis tractable, and deploying them at supercomputing centers in the US and Europe.
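
A quick back-of-the-envelope calculation (my numbers, not the speaker's) shows the scale: 10,000 samples per second sustained from mid-2009 to the 2013 release is on the order of a trillion observations.

    # Rough scale of the Planck time-ordered data implied by the abstract.
    # Assumes ~3.75 years of continuous scanning (mid-2009 to early 2013);
    # the true duty cycle and detector count are not given here.
    samples_per_second = 10_000
    seconds_per_year = 365.25 * 24 * 3600
    years = 3.75
    total_samples = samples_per_second * seconds_per_year * years
    print(f"{total_samples:.2e} samples")  # ~1.18e+12, i.e. about a trillion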

This first Planck data release required tens of millions of CPU-hours on the NERSC supercomputers. This included generating the largest Monte Carlo simulation set ever fielded in support of a CMB experiment, comprising 1,000 realizations of the mission reduced to 250,000 maps of the Planck sky. However, our work is far from done; future Planck data releases will require ten times as many simulations, and next-generation CMB experiments will gather up to a thousand times as much data as Planck.


How to Write a Bad Proposal

Who: Kathy Yelick
When: June 18, 12:00-1:00 PM
Where: 50A-5132

Whether you spend most of your career in a university, a national laboratory, or an industrial research lab, writing proposals is likely to be an important part of it. In any setting, getting funding (or management buy-in) is important to building a large team effort and, in some cases, to keeping yourself and your students and staff employed. This talk will describe some of the common pitfalls in proposal writing and, in doing so, give you some ideas about how to be successful. The talk draws on many experiences, successful and not, over the years.


Using Math and Computing to Model Supernovae

Who: Andy Nonaka
When: June 20, 12:00-1:00 PM
Where: 70-191

We describe our recent efforts to understand the physics of Type Ia supernovae - the largest thermonuclear explosions in the universe - by simulating multiple stages of stellar evolution using two massively parallel code frameworks developed at Berkeley Lab. Each code framework is built around mathematical models well suited to the character of the burning, whether it be the pre-ignition deflagration phase or the post-ignition explosion phase. Our results will help scientists form models that describe the ultimate fate of our galaxy.
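
As a generic illustration of the kind of model involved (the abstract does not spell out the frameworks' actual equations), the explosion phase is naturally described by compressible reacting-flow equations such as:

    \begin{aligned}
    \partial_t \rho + \nabla\cdot(\rho\mathbf{u}) &= 0\\
    \partial_t(\rho\mathbf{u}) + \nabla\cdot(\rho\mathbf{u}\otimes\mathbf{u}) + \nabla p &= \rho\mathbf{g}\\
    \partial_t(\rho E) + \nabla\cdot\left[(\rho E + p)\,\mathbf{u}\right] &= \rho\,\mathbf{u}\cdot\mathbf{g} + \rho\,\dot{q}_{\mathrm{nuc}}\\
    \partial_t(\rho X_k) + \nabla\cdot(\rho X_k \mathbf{u}) &= \rho\,\dot{\omega}_k
    \end{aligned}

with density \rho, velocity \mathbf{u}, pressure p, specific total energy E, species mass fractions X_k, gravity \mathbf{g}, and nuclear source terms \dot{q}_{\mathrm{nuc}} and \dot{\omega}_k. For slow pre-ignition burning, low-Mach-number reformulations that filter sound waves are a common way to take much larger time steps.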


An Overview of ESnet's Advanced Network Services

Who: Brian Tierney
When: June 25, 12:00-1:00 PM
Where: 70-191

ESnet provides a 100Gbps backbone network connecting the Department of Energy laboratories across the USA. ESnet is considered a world leader in advanced network services. This talk will give a brief overview of these services, including our dynamic virtual circuit service called OSCARS, a performance monitoring and troubleshooting service called perfSONAR, and a 100Gbps Testbed that is available to any researcher for high-speed network experiments.
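
For a sense of what 100 Gbps means in practice, here is an illustrative, idealized calculation (not from the talk) of bulk-transfer times, ignoring protocol overhead and storage bottlenecks:

    # Idealized bulk-transfer time at a given line rate (no overhead modeled).
    def transfer_hours(num_bytes, gbps):
        return num_bytes * 8 / (gbps * 1e9) / 3600

    petabyte = 1e15  # bytes
    for gbps in (10, 100):
        print(f"{gbps:>3} Gbps: {transfer_hours(petabyte, gbps):.1f} hours per petabyte")
    # 10 Gbps: ~222 hours; 100 Gbps: ~22 hours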


Computing Challenges in Bioinformatics

Who: Sarah Richardson, Alicia Clum and Alex Boyd
When: June 27, 1:30-3:00 PM
Where: Berkeley Lab's OSF

We have three panelists from the Joint Genome Institute who will discuss the computational challenges they tackle in the bioinformatics space.

Sarah Richardson is a distinguished postdoctoral fellow in the synthetic biology group. She designs and builds the custom bacteria needed to help us understand the connection between genes and their functions. She's written software called GeneDesign that is used by researchers around the world to design DNA sequences that can be constructed in the lab.
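
As a toy illustration of the underlying task (my sketch; GeneDesign's real algorithms and interface are far more sophisticated), designing DNA for a protein means choosing a codon for each amino acid:

    # Toy reverse translation: pick one codon per amino acid.
    # Real tools optimize codon usage, restriction sites, GC content, etc.
    codon_for = {"M": "ATG", "K": "AAA", "V": "GTT", "*": "TAA"}

    def reverse_translate(protein):
        return "".join(codon_for[aa] for aa in protein)

    print(reverse_translate("MKV*"))  # ATGAAAGTTTAA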

Alicia Clum is a puzzle master. She's an analyst in the Genome Assembly Group, and she takes the thousands of short reads created by next-generation sequencers (the puzzle pieces) and finds ways to put them back together in order to reconstruct microbial genomes. She's recently been working on metagenome assembly - an even trickier problem where you have 10-100 slightly different puzzles all jumbled up. The goal is similar, but you are now trying to reconstruct 10-100 different genomes and figure out how the microbes may work together to perform tasks like carbon sequestration.
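
To make the puzzle analogy concrete, here is a toy sketch (mine, not JGI's production pipeline) of one classic assembly idea: chop reads into k-mers, build a de Bruijn graph, and walk it to recover the sequence:

    # Toy de Bruijn assembly: build a (k-1)-mer graph from reads, then walk
    # unambiguous paths. Real assemblers handle errors, repeats, and coverage.
    from collections import defaultdict

    def debruijn(reads, k=4):
        graph = defaultdict(list)
        for read in reads:
            for i in range(len(read) - k + 1):
                kmer = read[i:i + k]
                graph[kmer[:-1]].append(kmer[1:])
        return graph

    reads = ["ATGGCGT", "GGCGTGC", "GTGCAAT"]  # overlapping fragments of one sequence
    graph = debruijn(reads)
    # Greedy walk from a node with no incoming edges
    # (fine for this tiny, error-free example).
    starts = set(graph) - {d for dests in graph.values() for d in dests}
    node = starts.pop()
    sequence = node
    while graph.get(node):
        node = graph[node].pop(0)
        sequence += node[-1]
    print(sequence)  # reconstructs ATGGCGTGCAAT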

Alex Boyd is a software development wiz. In his short time at the JGI he has re-architected several critical pipelines for the sequence data management group (they keep things organized as data comes off the sequencers). Most recently he has helped write an archiving tool that the JGI will use to optimize storage of data on spinning disk and tape. Without this archiving tool, the massive quantities of data generated today at the JGI by sequencers and analysis would stand a high chance of being completely inaccessible in a decade.
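
As a purely illustrative sketch of the disk-versus-tape idea (not the JGI tool's actual logic), a tiering policy might migrate files that have gone cold:

    # Toy tiering policy: keep recently used files on spinning disk,
    # send files idle beyond a threshold to tape.
    import time

    DAYS_BEFORE_TAPE = 90

    def tier_for(last_access_epoch, now=None):
        now = now or time.time()
        idle_days = (now - last_access_epoch) / 86400
        return "tape" if idle_days > DAYS_BEFORE_TAPE else "disk"

    print(tier_for(time.time() - 200 * 86400))  # tape
    print(tier_for(time.time()))                # disk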