High-Performance Scientific
Computing in the UK –
STFC and the Hartree Centre
Mike Ashworth
Scientific Computing Department and
STFC Hartree Centre STFC Daresbury Laboratory
• High Performance Computing in the UK
• STFC's Scientific Computing Department
• STFC's Hartree Centre
History of UK
academic HPC provision
[Chart: Linpack Gflop/s (TOP500 list), Jan 1993 to Jan 2013, spanning Tera- to Petascale: Moore's Law (doubling every 14 months), total UK university provision, and total UK national provision (incl. HPCx, HECToR, Hartree, DIRAC)]
History of UK
academic HPC provision
[Chart as above, annotated with the successive national services (CSAR, HPCx, HECToR, Hartree & DIRAC) and the SRIF-funded regional centres]
The UK National
Supercomputing facilities
The UK National Supercomputing
Services are managed by EPSRC
on behalf of the UK academic
communities
HPCx ran from 2002 to 2009 using
IBM POWER4 and POWER5
HECToR is the current system (2007–2014),
located at Edinburgh and operated jointly by EPCC and STFC
HECToR Phase 3: 90,112-core Cray XE6 (660 Tflop/s Linpack)
ARCHER is the new service, due for installation around …
DiRAC
DiRAC (Distributed Research utilising
Advanced Computing)
Integrated supercomputing facility for
UK research in theoretical modelling and
HPC-based simulation in particle physics,
astronomy and cosmology
Funded by STFC
Computer-simulated image of the glow of dark matter (Credit: Virgo)
Flagship DiRAC system is a 6-rack
IBM Blue Gene/Q
98,304 cores
1.0 Pflop/s Linpack
Located at Edinburgh
UK Tier-2 regional
university centres
N8 Research Partnership
• N8 HPC centre
• £3.25M facility
• Based at University of Leeds
• SGI system 5312 cores
• Possibly to become “S6” with Cambridge & Imperial College
• x86 system at University of Southampton – IBM, 11,088 cores
• GPU system at STFC RAL – HP with 372 Fermi GPUs, the #3 GPU system in Europe
UK academic
HPC pyramid
Tier-0: PRACE systems
Tier-1 (National): HECToR, Hartree, DiRAC
Tier-2 (Regional): university centres
Local: universities and institutes
• High Performance Computing in the UK
• STFC's Scientific Computing Department
• STFC's Hartree Centre
HM Government (& HM Treasury)
RCUK Executive Group
Joint Astronomy Centre Hawaii
Isaac Newton Group of Telescopes
La Palma UK Astronomy Technology
Centre, Edinburgh, Scotland
Polaris House Swindon, Wiltshire
Chilbolton Observatory
Stockbridge, Hampshire
Daresbury Laboratory
Daresbury Science and Innovation Campus Warrington, Cheshire
Rutherford Appleton Laboratory
Harwell Science and Innovation Campus Didcot, Oxfordshire
Understanding our Universe
STFC’s Science Programme
Particle Physics
Large Hadron Collider (LHC), CERN - the structure and forces of nature
Ground based Astronomy
European Southern Observatory (ESO), Chile
Very Large Telescope (VLT), Atacama Large Millimeter Array (ALMA), European Extremely Large Telescope (E-ELT),
Square Kilometre Array (SKA)
Space based Astronomy
European Space Agency (ESA)
Herschel/Planck/GAIA/James Webb Space Telescope (JWST) Bi-laterals – NASA, JAXA, etc.
STFC Space Science Technology Department
Nuclear Physics
Facility for Antiproton and Ion Research (FAIR), Germany
Nuclear skills for medicine (isotopes and radiation applications), energy (nuclear power plants) and environment (nuclear waste disposal)
STFC’s Facilities
Neutron Sources
ISIS – pulsed neutron and muon source – and the Institut Laue-Langevin (ILL), Grenoble
Providing powerful insights into key areas of energy, biomedical research, climate,
environment and security.
High Power Lasers
Central Laser Facility – supporting applications in bioscience and nanotechnology
HiPER
Demonstrating laser driven fusion as a future source of sustainable, clean energy
Light Sources
Diamond Light Source Limited (86%) - providing new breakthroughs in medicine, environmental and materials science, engineering, electronics and cultural heritage
Major funded activities
• 160 staff supporting over 7500 users
• Applications development and support
• Compute and data facilities and services
• Research: over 100 publications per annum
• Deliver over 3500 training days per annum
• Systems administration, data services, high-performance computing, numerical analysis & software engineering
Major science themes and capabilities
• Expertise across the length and time scales from processes occurring inside atoms to environmental modelling
Scientific Computing
Department
Director: Adrian Wander, appointed 24th July 2012
Scientific Highlights
Journal of Materials Chemistry 16 no. 20 (May 2006) - issue devoted to HPC in materials chemistry (esp. use of HPCx);
Phys. Stat. Sol.(b) 243 no. 11 (Sept 2006) - issue featuring scientific highlights of the Psi-k Network (the European network on the electronic structure of condensed matter coordinated by our Band Theory Group);
Molecular Simulation 32 no. 12-13 (Oct-Nov 2006) - special issue on applications of the DL_POLY MD program, written and developed by Bill Smith (the 2nd special edition of Mol Sim on DL_POLY; the 1st was about 5 years earlier);
Acta Crystallographica Section D 63 part 1 (Jan 2007) - proceedings of the CCP4 Study Weekend on protein crystallography.
The Aeronautical Journal, Volume 111, Number 1117 (March 2007), UK Applied Aerodynamics Consortium, Special Edition.
Proc Roy Soc A Volume 467, Number 2131 (July 2011), HPC in the Chemistry and Physics of Materials.
Metrics from the last 5 years:
– 67 grants totalling around £13M
– 422 refereed papers and 275 presentations
– Three senior staff have joint appointments with universities
– Seven staff have visiting professorships
– Six members of staff awarded Senior Fellowships or Fellowships by the Research Councils' individual merit scheme
• High Performance Computing in the UK
• STFC's Scientific Computing Department
• STFC's Hartree Centre
Opportunities: Political, Business, Scientific and Technical
• Demonstrate growth through economic and societal impact from investments in HPC
• Engage industry in HPC simulation for competitive advantage
• Exploit multi-core
• Exploit new Petascale and Exascale architectures
• Adapt to multi-core and hybrid architectures
• Build large-scale, multi-physics coupled apps
• Tackle complex Grand Challenge problems
Government Investment
in e-infrastructure - 2011
17th Aug 2011: Prime Minister David Cameron confirmed a £10M investment in STFC's Daresbury Laboratory, £7.5M of it for computing infrastructure
3rd Oct 2011: Chancellor George Osborne announced £145M for e-infrastructure at the Conservative Party Conference
4th Oct 2011: Science Minister David Willetts indicated a £30M investment in the Hartree Centre
30th Mar 2012: John Womersley, CEO STFC, and Simon Pendlebury, IBM, signed a major collaboration agreement at the Hartree Centre
Intel collaboration
STFC and Intel have signed an MOU to develop and test
technology that will be required to power the
supercomputers of tomorrow.
Karl Solchenbach, Director of
European Exascale Computing
at Intel said "We will use STFC's
leading expertise in scalable
applications to address the
challenges of exascale
computing in a co-design
approach."
Tildesley Report
BIS commissioned a report on the
strategic vision for a UK e-Infrastructure
for Science and Business.
Prof Dominic Tildesley led the team
including representatives from
Universities, Research Councils, industry
and JANET. The scope included
compute, software, data, networks,
training and security.
Mike Ashworth, Richard Blake and John
Bancroft from STFC provided input.
Hartree Centre
capital spend 2011/12
Approximate capital spend (£M):
• Blue Gene/Q – 12
• iDataPlex – 6
• Data Intensive – 6
• Disk & Tape – 6
• Visualization – 2.2
• Infrastructure – 5.3
Total £37.5M
TOP500:
#18 in the Jun 2013 list
#6 in Europe
#1 system in the UK
6 racks:
• 98,304 cores
• 6144 nodes
• 16 cores & 16 GB per node
• 1.25 Pflop/s peak
1 rack to be configured as BGAS (Blue Gene Advanced Storage):
• 16,384 cores
• Up to 1 PB of Flash memory
Hartree Centre
IBM BG/Q Blue Joule
Hartree Centre
IBM iDataPlex Blue Wonder
TOP500: #222 in the Jun 2013 list
8192 cores, 170 Tflop/s peak
Each node has 16 cores (2 sockets) of Intel Sandy Bridge (AVX etc.)
• 252 nodes with 32 GB
• 4 nodes with 256 GB
• 12 nodes with X3090 GPUs
• 256 nodes with 128 GB
• ScaleMP virtualization software provides up to 4 TB of virtual shared memory
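The peak figures quoted for the two machines follow from simple cores × clock × flops-per-cycle arithmetic. A minimal sketch, assuming the published clock rates for these processor generations (1.6 GHz for the BG/Q A2 core, 2.6 GHz for Sandy Bridge) and 8 double-precision flops per core per cycle on both:

```python
def peak_tflops(cores, ghz, flops_per_cycle):
    """Peak performance in Tflop/s = cores x clock (GHz) x flops per cycle / 1000."""
    return cores * ghz * flops_per_cycle / 1e3

# Blue Joule: 98,304 BG/Q cores; each core can issue a 4-wide FMA per cycle (8 flops)
blue_joule = peak_tflops(98_304, 1.6, 8)   # ~1258 Tflop/s, i.e. ~1.25 Pflop/s

# Blue Wonder: 8192 Sandy Bridge cores; AVX gives 4 adds + 4 muls per cycle
blue_wonder = peak_tflops(8_192, 2.6, 8)   # ~170 Tflop/s

print(f"Blue Joule:  {blue_joule:.0f} Tflop/s peak")
print(f"Blue Wonder: {blue_wonder:.0f} Tflop/s peak")
```

Both results agree with the slide figures, which suggests the quoted peaks are indeed theoretical (Rpeak) rather than measured Linpack (Rmax) numbers.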
Hartree Centre
Datastore
Storage:
5.76 PB usable disk storage
15 PB tape store
Hartree Centre
Visualization
Four major facilities:
Hartree Vis-1: a large visualization “wall” supporting stereo
Hartree Vis-2: a large surround and immersive visualization system
Hartree ISIC: a large visualization “wall” supporting stereo at ISIC
Hartree Atlas: a large visualization “wall” supporting stereo in the Atlas Building at RAL, part of the Harwell Imaging Partnership (HIP)
Hartree Centre
Mission
The Hartree Centre at the STFC Daresbury Science and
Innovation Campus will be an International Centre of
Excellence for Computational Science and Engineering.
It will bring together academic, government and industry
communities and focus on multi-disciplinary, multi-scale,
efficient and effective computation.
The goal is to provide a step-change in modelling
capabilities for strategic themes including energy, life
sciences, the environment, materials and security.
Douglas Rayner Hartree
Father of Computational Science
• Hartree–Fock method
• Appleton–Hartree equation
• Differential Analyser
• Numerical Analysis
Douglas Hartree with Phyllis Nicolson at the Hartree Differential Analyser at Manchester University
“It may well be that the
high-speed digital
computer will have as
great an influence on
civilization as the advent
of nuclear power” 1946
Douglas Rayner Hartree PhD, FRS (1897–1958)
Responding to the Challenges
Expert optimisation of existing software
• Profiling and identification of hotspots, use of libraries, tackling issues associated with large core counts, serial bottlenecks, redesign of I/O, etc.
Application Co-design
• Software and hardware must evolve together
• Requires close collaboration between hardware architects, computer scientists, and application software experts
Re-engineering software requires a specialised
development platform
• Highest possible core count
• Configured and operated for software development (interactive use, profiling, debugging, etc.)
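As an illustration of the profiling step named above, a minimal sketch using Python's standard cProfile module; the function names here are hypothetical stand-ins for real application hotspots:

```python
import cProfile
import io
import pstats

def slow_kernel(n):
    # deliberately naive O(n^2) loop, standing in for a real hotspot
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += (i * j) % 7
    return total

def driver():
    slow_kernel(300)
    sum(range(100_000))  # a cheap call, for contrast

profiler = cProfile.Profile()
profiler.enable()
driver()
profiler.disable()

# Report the top entries by cumulative time; slow_kernel should dominate
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())
```

The same workflow (measure first, then attack the dominant routine) applies whatever the profiler: gprof, CrayPAT, Scalasca, or vendor tools on the systems described here.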
Government Investment
in e-infrastructure - 2013
1st Feb 2013: Chancellor George Osborne and Science Minister David
Willetts opened the Hartree Centre and announced a further £185M of
funding for e-Infrastructure:
£19M for the Hartree Centre for power-efficient computing technologies
£11M for the UK's participation in the Square Kilometre Array
George Osborne opens the Hartree Centre, 1st February 2013
This investment forms part of the £600
million investment for science
announced by the Chancellor at the
Autumn Statement 2012.
“By putting our money into science we
are supporting the economy of …”
Collaboration with
Unilever
1st Feb 2013: Also announced was a key partnership with Unilever in the
development of Computer Aided Formulation (CAF)
Months of laboratory bench work can be completed within minutes by a tool designed to run as an “App” on a tablet or laptop connected remotely to the Blue Joule supercomputer at Daresbury.
John Womersley, CEO STFC, and Jim Crilly, Senior Vice President, Strategic Science Group at Unilever
This tool predicts the behaviour and structure of different concentrations of liquid compounds, both in the bottle and in use, and helps researchers plan fewer, more focussed experiments.
The aggregation of surfactant molecules into micelles is an important process in product …
£19M investment in power-efficient technologies
• System with latest NVIDIA Kepler GPUs
• System based on Intel Xeon Phi
• System based on ARM processors
• Active storage project using IBM BGAS
• Dataflow architecture based on FPGAs
• Instrumented machine room
Power-efficient
Technologies ‘Shopping List’
Systems will be made available
for development and evaluation
projects with Hartree Centre
partners from industry, …
• High Performance Computing in the UK
• STFC's Scientific Computing Department
• STFC's Hartree Centre
Hartree Centre
Projects in the First 12 Months
56 projects; 200M BG/Q hours; 14M iDataPlex hours
Current
Unified Model
[Chart: performance of the UM (dark blue) versus a basket of models (Met Office, ECMWF, USA, France, Germany, Japan, Canada, Australia), measured by 3-day surface pressure errors, 2003–2011]
From Nigel Wood, Met Office
Met Office Unified Model
“Unified” in the sense of using the same code for weather forecasting and for climate research
Combines dynamics on a lat/long grid with physics (radiation, clouds, precipitation, convection etc.)
Also couples to other models (ocean, sea-ice, land surface, chemistry/aerosols etc.) for improved forecasting and earth system modelling
“New Dynamics”: Davies et al (2005)
“ENDGame” to be operational in 2013
[Chart: scaling of the UM at 17 km resolution on IBM POWER7 nodes, compared against perfect scaling]
Limits to
Scalability of the UM
The problem lies with the convergence of the lat/long grid at the poles:
at 25 km resolution, the grid spacing near the poles is 75 m;
at 10 km this reduces to 12 m!
The current version (New Dynamics) has limited scalability.
The latest ENDGame code improves this, but a more radical solution is required for Petascale and beyond.
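The quoted numbers follow from the geometry of a lat/long grid: east-west spacing shrinks with the cosine of latitude, so it collapses quadratically as resolution is refined near the poles. A simplified sketch (the slide's exact values, 75 m and 12 m, depend on the UM's grid staggering, which is not modelled here, but the quadratic shrinkage is the point):

```python
import math

R_EARTH = 6_371_000.0  # mean Earth radius, metres

def near_pole_spacing(resolution_m, rows_from_pole=0.5):
    """East-west spacing (m) of a lat/long grid row near the pole.

    resolution_m is the grid spacing at the equator; the row considered
    sits rows_from_pole latitude steps away from the pole."""
    dphi = resolution_m / R_EARTH               # latitude step in radians
    lat = math.pi / 2 - rows_from_pole * dphi   # latitude of that row
    return resolution_m * math.cos(lat)         # spacing ~ resolution * cos(lat)

for res_km in (25, 10):
    spacing = near_pole_spacing(res_km * 1000)
    print(f"{res_km} km grid: ~{spacing:.0f} m spacing near the pole")
```

Refining the grid by 2.5x (25 km to 10 km) shrinks the polar spacing by 2.5² ≈ 6.25x, which is why the time-step constraint, and hence scalability, deteriorates so quickly with resolution.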
Challenging Solutions
Candidate pole-free grids: cube-sphere, triangles, yin-yang
GUNG-HO targets a brand new dynamical core
Scalability – choose a globally uniform grid which has no poles (see below)
Speed – maintain performance at high & low resolution and for high & low core counts
Accuracy – maintain the standing of the model
Space weather implies a 600 km deep model
Five year project 2011-2015
Operational weather forecasts around 2020!
From Nigel Wood, Met Office
GUNG-HO: Globally Uniform Next Generation Highly Optimized
(“Working together harmoniously”)

Summary
• HPC is flourishing in the UK
• New Government investment supports a wide range of e-Infrastructure projects (incl. data centres, networks, ARCHER, Hartree)
• The Hartree Centre is a new centre developing and deploying new software for industry, government and academia
• We are very happy to talk about partnership with centres in China
For more information on the Hartree Centre see
http://www.stfc.ac.uk/Hartree
If you have been …
… thank you for listening
Mike Ashworth
mike.ashworth@stfc.ac.uk