Resources


Resources Available

Last modified: 2/5/2017


STFC's Scientific Computing Department has over thirty years' experience in the design, implementation and development of world-leading scientific software. It is internationally recognised as a centre for the parallelisation, optimisation and porting of existing software to leading-edge and novel-architecture systems. In addition to domain expertise in a wide range of disciplines, the Department also has strong software engineering and numerical algorithms expertise.

Following recent capital investments from the UK government, the Department has established the Hartree Centre in collaboration with IBM. This outward-facing centre offers a range of services, including collaborative software development and access to a range of novel hardware platforms (98,304-core Blue Gene/Q, Blue Gene Active Storage, iDataPlex, NextScale, ARM, nVidia Tesla GPU, Intel Xeon Phi and Maxeler MPC-X).

For access to the Hartree Centre machines and support in your research projects, please contact David Holder, Head of the Programme Support Office, The Hartree Centre, STFC Daresbury Laboratory, WA4 4AD (e-mail: david.holder@stfc.ac.uk; tel: 01925 603120).

For current status and system information, see the Status Page: http://community.hartree.stfc.ac.uk/wiki/site/admin/status.html .
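If you prefer to check the status page from a script rather than a browser, the short Python sketch below simply downloads and prints the raw page. It is an illustration only: it assumes the page is publicly reachable over plain HTTP and uses nothing beyond the Python standard library.

    # Sketch: fetch the Hartree Centre status page and print its contents.
    # Assumes the URL above is reachable without authentication.
    import urllib.request

    STATUS_URL = "http://community.hartree.stfc.ac.uk/wiki/site/admin/status.html"

    def fetch_status(url: str = STATUS_URL, timeout: float = 10.0) -> str:
        """Return the raw HTML of the status page as text."""
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return response.read().decode("utf-8", errors="replace")

    if __name__ == "__main__":
        print(fetch_status())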

Summary of system compute nodes:

Name | System Type | Core Type | Number | Comments
Ace | Arm-64 Cluster | Arm-v8 | | currently being upgraded
Bantam | BlueGene/Q BGAS | BGP | 2,048x16 + 512x16 (BGAS) | decommissioned Mar 2017
Bifort | BlueGene/Q | BGP | 6,144x16 | decommissioned end Apr 2016
CDS | Central Data Store | storage | | currently being upgraded
Dawson | Big Data Cluster | X86_64 | |
DeLorean | Maxeler DFE | X86_64 and FPGA | 6x12? IB + 5x8 DFE |
Fairthorpe | login cluster | various | |
Iden | IBM iDataPlex | X86_64 IvyBridge and Xeon Phi 5110P | 84x24 IB + 42x Phi |
Invicta | IBM iDataPlex | X86_64 SandyBridge E5-2670 | 512x16 | decommissioned end Apr 2016
Jade | Atos nVidia DGX-1 cluster | | | currently being installed
Napier | IBM NextScale | X86_64 IvyBridge E5-2697 | 360x24 |
Neale | ClusterVision Cluster | X86_64 IvyBridge E5-2650 | 120x16 |
Palmerston | Power-8 Cluster | Power8E 824 | 72x24 (192 threads per node) | decommissioned
Panther | Power-8 Cluster | Power8 8335-GTA | 32x16 (128 threads per node) |
Paragon | Power-8 Cluster | Power8 8335-GTB | 32x16 (128 threads per node) |
Scafell Pike | Atos Sequana cluster | | | currently being installed
Viceroy | interactive and visualisation cluster | 2x iDataPlex nodes with nVidia Quadro 5000 | |

Phase-1 Computer Systems

The Hartree Centre has a heterogeneous collection of computational, storage and networking resources. Each resource is designed to fulfil a specific requirement in high-performance and scientific computing, and using the right resources for each task is an essential part of getting the most out of any project.

Procurement in early 2012 included refurbishment of the existing HPCx computer room at Daresbury Laboratory. It is now split into two to form a state-of-the-art data centre, half with water cooling and half with conventional air cooling.

Phase-1 computer systems were Invicta (aka Blue Wonder), Bantam and Bifort (aka Blue Joule). These have now been decommissioned to make way for more efficient, higher-performing architectures.

Phase-2 Computer Systems

There is a Flash time-lapse video showing the installation during March 2014 here: http://avon.dl.ac.uk/hartree/phase2_installation.flv .

The login cluster for Iden and Napier is named Calcott.

Iden - iDataPlex and Xeon Phi Cluster (Blue Wonder Phase-2)

Napier - NextScale Cluster (Blue Wonder Phase-2)

Scafell Pike

From 18 April to 12 May, the main components of the new Hartree Centre system, Scafell Pike, will be delivered to site and installed in the HDR. This includes 22 racks and more than 80 pallets of equipment, with water and electrical hook-ups to take place in early May. The Impact & Engagement team are arranging time-lapse footage of the installation for sharing at a future date.

Jade

The Joint Academic Data Science Endeavour (JADE), an NVIDIA DGX-1 system owned by the University of Oxford and run as part of a Tier-2, EPSRC-funded consortium, has been installed at the Hartree Centre. We will host and support the system in partnership with Atos, who are providing the hardware. JADE will be the largest GPU facility in the UK, supporting world-leading research in machine learning.

Big Data Systems

Dawson - X86 Cluster for Data Analytics

We have a wide range of Big Data software; for more information, contact the business development team.

See separate page on Big Data.

Energy Efficient Computing Research Systems

Note that there is a separate area describing the systems related to our Energy Efficient Computing Research Programme and the projects running on them: http://community.hartree.stfc.ac.uk/portal/site/eecrp .

Iden - Xeon Phi Cluster

See above.

Neale - Novel Cooling Technology

This is a novel cooling demonstrator from Insight and ClusterVision.

DeLorean - Data Flow Technologies

This is an FPGA system from Viglen and Maxeler.

Ace - Low Power Processors

This is an Arm-v8 based system from Misco, OCF, IBM and ARM.

For more information on these systems, go to the Energy Efficient Computing Systems page.

Power-8 Research Systems

Login via Fairthorpe

Panther

Paragon

Room Names

How to find your way around The Hartree Centre and how many people the rooms can accommodate:

Number | Name | Purpose | Seating | Standing
Hartree Centre Glass meeting room | Hartree VC1 | meeting room | 15 |
Hartree Centre S.16 | Hartree VC2 | conference room | 16 |
Hartree Centre Ground floor Seminar room | Hartree G.08 | seminar room | 30 |
Hartree Centre meeting room | Hartree G.09 | meeting room | 20 |
A52 | Thomson | Meeting and Seminar Room | 40 | 70
A53 | Leverhulme | Surround Wall Visualisation Suite | 25 | 50
A54 | Crosfield | Quad Wall Visualisation Suite | 30 | 60
A56 | Brunner-Mond | Training Workstations | 60 | 100
C-Block | HDR | high density (water cooled) data centre | |
C-Block | LDR | low density (air cooled) data centre | |
C-Block | Green Room | energy efficient computing area | |

If you would like to organise an event at the Hartree Centre, such as a workshop, training course or tour, please see http://www.hartree.stfc.ac.uk/Hartree/Training+and+skills/45515.aspx, where there is a link to an Event Proposal Form. The visualisation suites and Brunner-Mond can also be booked via this form, or you can send an email to the Hartree Centre Service Desk at hartree@stfc.ac.uk.

A list of events can be seen on the STFC Events Booking page at https://eventbooking.stfc.ac.uk/news-events/

Visualisation Suite

The visualisation suite comprises two parts: the Surround Wall (or Curved Wall) in the Leverhulme room; and the Quad Wall in the Crosfield room, which is adjacent to the training suite. These facilities support full 3D active stereo and a wide range of visual applications.

For more information, see http://tyne.dl.ac.uk/twiki/bin/view/Visualisation/WebHome .

Training Suite

The training suite in the Brunner-Mond room consists of several rows of desks with a total of around 50 workstations. There are two presenters' areas with separate workstations and projectors onto wall-mounted screens. This is primarily designed as an area for hands-on, computer-based training. There are separate facilities for seminars, lectures and workshops.

For more information, see http://community.hartree.stfc.ac.uk/library/content/gateway/training.html .

Back to Contents Page