USSCC Planning Meeting


Ensuring Our Nation's Energy Security: Computational Challenges and Directions in the Office of Science
Science for DOE and the Nation
Fred Johnson, Advanced Scientific Computing Research
SOS7, March 2003
[Image: NCSX]

Outline
- Background: computational science in the Office of Science
- SciDAC
- Ultrascale Scientific Computing Capability
- FY04: Next Generation Computer Architecture
- FY05: the future revisited

The Office of Science
- Supports basic research that underpins DOE missions.
- Constructs and operates large scientific facilities for the U.S. scientific community: accelerators, synchrotron light sources, neutron sources, etc.
- Comprises five offices: Basic Energy Sciences; Biological and Environmental Research; Fusion Energy Sciences; High Energy and Nuclear Physics; and Advanced Scientific Computing Research.

Computational Science Is Critical to the Office of Science Mission
Scientific problems of strategic importance typically:
- Involve physical scales that range over 5-50 orders of magnitude;

- Couple scientific disciplines, e.g., chemistry and fluid dynamics to understand combustion;
- Must be addressed by teams of mathematicians, computer scientists, and application scientists; and
- Use facilities that generate millions of gigabytes of data shared among scientists throughout the world.

The Scale of the Problem
Two layers of Fe-Mn-Co containing 2,176 atoms correspond to a wafer approximately fifty nanometers (50 x 10^-9 m) on a side and five nanometers (5 x 10^-9 m) thick. A simulation of the properties of this configuration ran for 100 hours on the IBM SP at NERSC at a sustained rate of 2.46 teraflops (trillions of floating-point operations per second). To explore material imperfections, the simulation would need to be at least 10 times more compute intensive.
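A quick back-of-the-envelope check of the Scale-of-the-Problem figures (my own arithmetic as an editorial aside, not part of the original deck) shows what those numbers imply in total operation counts:

```python
# Illustrative arithmetic only: totals implied by the slide's reported
# run time and sustained rate for the Fe-Mn-Co simulation at NERSC.
SUSTAINED_TFLOPS = 2.46   # sustained teraflop/s reported for the IBM SP run
RUN_HOURS = 100           # reported duration of the simulation

# Total floating-point operations = rate (flop/s) * seconds.
total_ops = SUSTAINED_TFLOPS * 1e12 * RUN_HOURS * 3600
print(f"total operations: {total_ops:.2e}")  # ~8.86e+17

# A 10x more compute-intensive imperfection study at the same sustained
# rate would take 10x the wall-clock time.
hours_10x = 10 * total_ops / (SUSTAINED_TFLOPS * 1e12) / 3600
print(f"10x study at the same rate: {hours_10x:.0f} hours")  # 1000 hours
```

Roughly 9 x 10^17 operations for the baseline run, and over a thousand hours for the 10x study, which is why a capability jump rather than longer runs is the argument being made.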

Scientific Discovery through Advanced Computing (SciDAC)
- SciDAC brings the power of terascale computing and information technologies to several scientific areas -- breakthroughs through simulation.
- SciDAC is building community simulation models through collaborations among application scientists, mathematicians, and computer scientists -- research tools for plasma physics, climate prediction, combustion, etc.
- State-of-the-art electronic collaboration tools give the broader scientific community access to these tools, bringing simulation to parity with theory and observation in the scientific enterprise.

Introduction
SciDAC is a pilot program for a new way of doing science:

- spans the entire Office of Science (ASCR, BES, BER, FES, HENP)
  [Figure: funding breakdown -- $37M, 2M+, 8M+, 3M, 7M]
- involves all DOE labs and many universities
- builds on 50 years of DOE leadership in computation and mathematical software (EISPACK, LINPACK, LAPACK, BLAS, etc.)

Addressing the Performance Gap through Software
- Peak performance is skyrocketing: in the 1990s, peak performance increased 100x; in the 2000s, it will increase 1000x.
- But ...

- Efficiency for many science applications has declined from 40-50% on the vector supercomputers of the 1990s to as little as 5-10% on the parallel supercomputers of today.
- Research is needed on:
  - mathematical methods and algorithms that achieve high performance on a single processor and scale to thousands of processors
  - more efficient programming models for massively parallel supercomputers

[Chart: peak vs. real performance in teraflops, 1996-2004, showing a widening performance gap]

It's Not Only Hardware!
Updated version of a chart appearing in "Grand Challenges: High Performance Computing and Communications," OSTP Committee on Physical, Mathematical, and Engineering Sciences, 1992.
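The efficiency numbers above can be made concrete with a small illustration (my own arithmetic under stated assumptions, not from the deck): even a 1000x peak gain delivers far less when the achieved fraction of peak collapses.

```python
# Hypothetical illustration: delivered speedup when peak grows 1000x but
# application efficiency falls from ~45% (1990s vector machines, midpoint
# of the slide's 40-50% range) to ~7.5% (midpoint of today's 5-10%).
peak_speedup = 1000.0
eff_vector = 0.45     # assumed midpoint, 1990s vector supercomputers
eff_parallel = 0.075  # assumed midpoint, today's parallel supercomputers

delivered = peak_speedup * eff_parallel / eff_vector
print(f"delivered speedup: {delivered:.0f}x of a 1000x peak gain")  # 167x
```

This is the "performance gap" in one line: five-sixths of the nominal gain is lost to falling efficiency, which is why the slide argues for software and algorithms research, not just faster hardware.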

SciDAC Goals
An INTEGRATED program to:
1. Create a new generation of scientific simulation codes that take full advantage of the extraordinary capabilities of terascale computers.
2. Create the mathematical and computing systems software to enable scientific simulation codes to use terascale computers effectively and efficiently.
3. Create a collaboratory software environment to enable geographically distributed scientists to work together effectively as a TEAM, and to facilitate remote access to both facilities and data through appropriate hardware and middleware infrastructure.
The ultimate goal: advancing fundamental research in science central to the DOE mission.

CSE Is Team-Oriented
Successful CSE usually requires teams with members and/or expertise from at least mathematics, computer science, and (several) application areas:

- language and culture differences
- usual reward structures focus on the individual
- incompatible with traditional academia
SciDAC will help break down barriers and lead by example; DOE labs are a critical asset.

The Computer Scientist's View
[Cartoon: "Must study climate!" "Must have Fortran!" "Must have cycles!" "Must move data!"]

The Applications Scientist's View
[Cartoon: the computer scientist and the applications scientist caricature each other's priorities]

Future SciDAC Issues
- Additional computing and network resources
  - initial SciDAC focus is on software, but new hardware will be needed within the next two years
  - both capability and capacity computing needs are evolving rapidly
  - limited architectural options are available in the U.S. today
  - topical computing may be a cost-effective way of providing extra computing resources
  - math and CS research will play a key role
- Expansion of the SciDAC program
  - many important SC research areas (e.g., materials/nanoscience,

    functional genomics/proteomics) are not yet included in SciDAC; NSRCs, GTL

Motivation: UltraScale Simulation Computing Capability
Mission need: energy production, novel materials, climate science, biological systems.
- Systems too complex for direct calculation; descriptive laws absent.
- Involve physical scales up to 50 orders of magnitude.
- Couple several scientific disciplines, e.g., combustion and

  materials science.
- Experimental data may be costly to develop, insufficient, inadequate, or unavailable.
- Involve large data files (millions of gigabytes) shared among scientists throughout the world.
History of accomplishments: MPI, math libraries, the first dedicated high-performance computing center, SciDAC.

ASCAC Statement
Without a robust response to the Earth Simulator, the U.S. is open to losing its leadership in defining and advancing the frontiers of computational science as a new approach to science. This area is critical to both our national security and economic vitality. (Advanced Scientific Computing Advisory Committee, May 21, 2002)

Simulation Capability Needs, FY2004-05 Timeframe
(sustained computational capability needed, in teraflops)

Climate Science
  Need: Calculate chemical balances in the atmosphere, including clouds, rivers, and vegetation; properly represent and predict extreme weather conditions in a changing climate.
  Capability: > 50 Tflops
  Significance: Provides U.S. policymakers with leadership data to support policy decisions.

Magnetic Fusion Energy
  Need: Optimize the balance between self-heating of the plasma and heat leakage caused by electromagnetic turbulence.
  Capability: > 50 Tflops
  Significance: Underpins U.S. decisions about future international fusion collaborations. Integrated simulations of burning plasma are crucial for quantifying the prospects for commercial fusion.

Combustion Science
  Need: Understand interactions between combustion and turbulent fluctuations in burning fluid.
  Capability: > 50 Tflops
  Significance: Understand detonation dynamics (e.g., engine knock) in combustion systems. Solve the soot problem in diesel engines.

Environmental Molecular Science
  Need: Reliably predict chemical and physical properties of radioactive substances.
  Capability: > 100 Tflops
  Significance: Develop innovative technologies to remediate contaminated soils and groundwater.

Astrophysics
  Need: Realistically simulate the explosion of a supernova for the first time.
  Capability: >> 100 Tflops
  Significance: Measure the size, age, and expansion rate of the Universe. Gain insight into inertial fusion processes.

Key Ideas
- Deliver a full suite of leadership-class computers for science with broad applicability.
- Establish a model for the computational sciences (SciDAC and base programs) that couples application scientists, mathematicians, and computational and computer scientists with computer designers, engineers, and semiconductor researchers.
- Develop partnerships with domestic computer vendors to ensure that leadership-class computers are designed, developed, and produced with science needs as an explicit design criterion.
- Partner with other agencies.
- Partner with industry on applications.

FY 2004 Request to OMB: USSCC (UltraScale Scientific Computing Capability)

Supporting R&D -- 30%
- Research with domestic vendors: develop ultrascale hardware and software capabilities for advancing science, focusing on faster interconnects and switches.
  - Continue 2 partnerships begun in FY2003
  - Initiate additional partnerships (up to 3) in FY2004, based on competitive review
- Operating systems, software environments, and tools
  - Address issues to ensure scalability of operating systems to meet science needs
  - Develop enhanced numerical libraries for scientific simulations
  - Develop tools to analyze application performance on ultrascale computer systems
- University-based computer architecture research: explore future generations of computer architectures for ultrascale science simulation.

FY 2004 Request to OMB: USSCC (UltraScale Scientific Computing Capability)
Computing and Network Facilities -- 70%

- Computer architecture evaluation partnerships: evaluate computer architectures at scale to ensure that computer hardware and systems software are balanced for science and likely to scale successfully.
  - Continue the partnership established in FY2002 between ORNL and Cray, Inc.
  - Initiate one new partnership comprising scientists and engineers from a domestic computer vendor together with computer scientists and application scientists supported by the Office of Science.
  - Award the partnership from a competition among invited vendors.
- Begin installation of the first ultrascale computing system for science.

Next Generation Computer Architecture
Goal: Identify and address major hardware and software architectural bottlenecks to the performance of existing and planned DOE science applications.
Main activities:
- Architecture impacts on application performance
- OS/runtime research
- Evaluation testbeds

How Full Is the Glass?

- Support and enthusiasm within the Office of Science
- Office of Science Strategic Plan
- Interagency cooperation/coordination
  - NSA SV2
  - DOD IHEC
  - DARPA HPCS
  - NNSA: program reviews, open source, NAS study, Red Storm
  - DARPA/DOD/SC USSCC meeting
  - OSTP/NNSA/DOD/SC NGCA meeting
  - OSTP support
- International coordination
  - Hawaii meeting
  - ES benchmarking

Agency Coordination Overview Matrix
(rows: agency; columns: research, development, and strategy coordination)

NNSA
  Research: $17M research funded at NNSA laboratories
  Development: Red Storm development
  Strategy: Formal coordination documents
DOD DUSD Science and Technology
  Research: IHEC study
DARPA
  Research: HPCS review team
  Strategy: HPCS evaluation system plan
NSA
  Research: UPC
  Development: Cray SV2/X1 development
All Agencies
  Strategy: HECCWG

NNSA Details
  Research coordination: $17M research funded at NNSA laboratories -- lightweight kernel, common component architecture, performance engineering
  Development coordination: Red Storm development -- quarterly review meetings, ASCI Q review, ASCI PSE review, SciDAC reviews
  Strategy coordination: Formal coordination documents -- jointly funded NAS study, open source software thrust, platform evaluation

NSA Details
  Research coordination: UPC (Lauren Smith), programming models (Bill Carlson), benchmarking (Candy Culhane)
  Development coordination: Cray SV2/X1 development, Cray Black Widow development (quarterly review meetings)

DOD and DARPA Details
  DOD DUSD Science and Technology, research coordination: IHEC study, agreement on SC role in IHEC
  DARPA, research coordination: HPCS review team for Phases I, II, and III; review of the Cray, IBM, HP, Sun, and SGI projects
  DARPA, strategy coordination: HPCS evaluation system plan, agreement on SC role as HPCS early evaluator at scale

DARPA High Productivity Computing Systems (HPCS) Program
Goal: Provide a new generation of economically viable, high-productivity computing systems for the national security and industrial user community (2007-2010).
Impact:
- Performance (efficiency): speed up critical national security applications by a factor of 10x to 40x
- Productivity (time-to-solution)
- Portability (transparency): insulate research and operational application software from the system
- Robustness (reliability): apply all known techniques to protect against outside attacks, hardware faults, and programming errors
HPCS program focus areas -- applications: intelligence/surveillance, reconnaissance, cryptanalysis, weapons analysis, airborne contaminant modeling, and biotechnology.
HPCS fills the critical technology and capability gap between today's late-1980s HPC technology and future quantum/bio computing.

Computing Metric Evolution
Early computing metrics: clock frequency, raw performance (flops) -- the GHz race.
Current computing metrics: clock frequency, point performance, acquisition price -- the teraflop race (top ten HPC centers).
HPCS value-based metrics:

- System performance relative to application diversity
- Scalability (flops to petaflops)
- Idea-to-solution and time-to-solution
- Mean time-to-recovery
- Robustness (includes security)
- Evolvability
- Application life-cycle costs: acquisition (facilities and equipment) costs; ownership (facilities, support staff, training) costs

Memory System Performance Limitations
Why applications with limited memory reuse perform inefficiently today:

[Chart: measured % of peak on STREAMS ADD vs. clock speed (10-10000 MHz), machines introduced 1988-2000: Cray C90, Y-MP, T90, J90, EL-90, SV1 and Intel, Alpha, SGI MIPS, and Sun microprocessors]
- STREAMS ADD computes A + B for long vectors A and B (historical data available).
- New microprocessor generations reset performance to at most 6% of peak.
- Performance degrades to 1-3% of peak as clock speed increases within a generation.
- Goal: benchmarks that relate application performance to memory reuse and other factors.

Phase I HPCS Industry Teams
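The STREAMS ADD kernel referenced above can be sketched in a few lines (illustrative only; the official STREAM benchmark is a C/Fortran code with strict rules about array sizing and timing that this sketch does not follow):

```python
# Minimal sketch of the STREAM "add" pattern: C = A + B over long vectors.
# One flop per element, but three memory streams (two reads, one write),
# so achieved rate is bound by memory bandwidth, not arithmetic.
import time

def stream_add(a, b):
    """Element-wise add of two long vectors; one flop per element."""
    return [x + y for x, y in zip(a, b)]

n = 1_000_000
a = [1.0] * n
b = [2.0] * n

t0 = time.perf_counter()
c = stream_add(a, b)
elapsed = time.perf_counter() - t0

# Achieved rate in Mflop/s; dividing by the machine's peak rate gives the
# "% of peak" quantity plotted on the slide's chart.
print(f"{n / elapsed / 1e6:.1f} Mflop/s")
```

Because the kernel reuses no data, its achieved fraction of peak is a direct probe of the memory system, which is why the slide uses it to compare vector machines against cache-based microprocessors.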

- Cray, Incorporated
- International Business Machines Corporation (IBM)
- Silicon Graphics, Inc. (SGI)
- Sun Microsystems, Inc.
- Hewlett-Packard Company

The Future of Supercomputing
- National Academy CSTB study, co-funded by ASCR and NNSA
- 18-month duration
- Co-chairs: Susan Graham, Marc Snir
- Kick-off meeting 3/6/03
The committee will assess the status of supercomputing in the United

States, including the characteristics of relevant systems and architecture research in government, industry, and academia, and the characteristics of the relevant market.

High End Computing Revitalization Task Force
- OSTP interagency thrust; HEC is an administration priority for FY05.
- Task Force to address: HEC core technology R&D; Federal HEC capability, capacity, and accessibility; and issues related to Federal procurement of HEC systems.
- The Task Force's recommendations are expected to be considered in preparing the President's budget for FY2005 and beyond.
- Kick-off meeting March 10, 2003; co-chairs: John Grosh, DOD, and Alan Laub, DOE.

Links
- SciDAC
- Genomes to Life
- Nanoscale Science, Engineering, and Technology Research
- UltraScale Simulation Planning
