|CCT Colloquium Series|
|Software architecture for simulating fires and explosions|
|Steven G. Parker, School of Computing, University of Utah|
|Johnston Hall 338|
|March 02, 2007 - 03:00 pm|
One of the most challenging problems in software architecture is to integrate disparate components and programs in an expressive yet efficient manner. Computer science has long advocated the benefits of assembling complex software out of smaller pieces. Programming abstractions ranging from the humble function/subroutine to libraries, objects, and software components are all designed to enable the composition of sophisticated applications from isolated, potentially reusable parts. However, achieving a sufficient level of modularity in a scientific simulation can be a significant challenge, especially where high performance and parallelism are required. I will describe these challenges and discuss potential solutions for achieving composable parallel scientific computing applications.

Computational scientists continue to push the capabilities of current computer hardware to the limits in order to simulate complex real-world phenomena. These simulations necessitate the use of ever-increasing computational resources. Furthermore, the software written to model real-world scientific and engineering problems is typically very complex. Grid generation, non-linear and linear solvers, visualization systems, and parallel runtime systems all combine to provide a very powerful environment for solving scientific and engineering problems. However, the complexities are compounded when multiple simulation codes are combined to simulate the interaction of multiple physical phenomena.

I will describe how these challenges have been addressed in the Uintah computational framework. Uintah uses a non-traditional approach to achieving parallelism, employing an abstract taskgraph representation to describe computation and communication. This representation has a number of advantages, including efficient fine-grained coupling of multi-physics components, flexible load balancing mechanisms, and a separation of application concerns from parallelism concerns.
I will describe the algorithms employed in the Uintah taskgraph representation and the research challenges in making this architecture efficient and scalable. The Uintah architecture enables multi-physics simulations that scale to thousands of processors to be built from smaller simulation components. Uintah has been used by the Center for Simulation of Accidental Fires and Explosions to simulate the response of an explosive device subjected to harsh environments such as a fire.
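The taskgraph idea described above can be illustrated with a minimal sketch (the names and structure here are hypothetical, not Uintah's actual API): each task declares the variables it requires and computes, dependency edges are inferred from those declarations, and a scheduler derives a valid execution order.

```python
# Minimal taskgraph sketch: tasks declare "requires" (variables read)
# and "computes" (variables produced); edges run producer -> consumer.
from collections import defaultdict, deque

class Task:
    def __init__(self, name, requires, computes):
        self.name = name
        self.requires = set(requires)   # variables this task reads
        self.computes = set(computes)   # variables this task produces

def build_and_schedule(tasks):
    """Infer dependency edges and return a valid execution order."""
    # Map each variable to the task that produces it.
    producer = {}
    for t in tasks:
        for var in t.computes:
            producer[var] = t
    # Build edges producer -> consumer and count in-degrees.
    edges = defaultdict(set)
    indeg = {t.name: 0 for t in tasks}
    for t in tasks:
        for var in t.requires:
            p = producer.get(var)
            if p is not None and t.name not in edges[p.name]:
                edges[p.name].add(t.name)
                indeg[t.name] += 1
    # Kahn's algorithm: repeatedly run tasks whose inputs are ready.
    order = []
    ready = deque(n for n, d in indeg.items() if d == 0)
    while ready:
        n = ready.popleft()
        order.append(n)
        for m in edges[n]:
            indeg[m] -= 1
            if indeg[m] == 0:
                ready.append(m)
    return order

# Tasks may be declared in any order; the schedule respects data flow.
tasks = [
    Task("interpolate", requires={"velocity"}, computes={"face_velocity"}),
    Task("solve_momentum", requires={"face_velocity", "pressure"},
         computes={"velocity_new"}),
    Task("init", requires=set(), computes={"velocity", "pressure"}),
]
print(build_and_schedule(tasks))
# → ['init', 'interpolate', 'solve_momentum']
```

Because the runtime, not the application code, owns the graph, it can analyze the same declarations to place communication, overlap it with computation, and rebalance work, which is the separation of application concerns from parallelism concerns mentioned above.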
Steven G. Parker is an Assistant Professor at the School of Computing as well as a member of the Scientific Computing and Imaging (SCI) Institute. He performs research in software architectures for scientific computing, and in interactive ray tracing for large-scale scientific visualization and computer graphics applications. He is the chief software architect for the Center for Simulation of Accidental Fires and Explosions, where he has helped create a problem solving environment for a complex multi-physics simulation running on thousands of processors. He is also the initial developer of the SCIRun problem-solving environment, the SCIJump distributed component environment, and the Manta interactive ray tracing system. Dr. Parker's primary research interests are scientific visualization, computational steering, high performance computing, interactive ray tracing, computer graphics, problem solving environments, and component architectures. Professor Parker received his B.S. in Electrical Engineering at the University of Oklahoma in 1992 and his Ph.D. in Computer Science from the University of Utah in 1999.
|This lecture will be followed by a reception; refreshments will be served.|