Category Archives: Case study


Planetary Climates

Institute: University of East Anglia

[quote style=”boxed” float=”right”] “My research is about understanding climate and climate change, mostly on Earth, but also on other planets. To further this aim, I use a combination of theory and models, ranging from simple numerical codes to very complex global circulation models.”

– Dr Manoj Joshi, UEA Lecturer – School of Environmental Sciences[/quote]

The benefits of using Grace for High Performance Computing

The models run on dozens of processors, enabling century-scale integrations of the climate in a matter of hours to days, which represents a step-change in the science that I can do. Without a fast, interconnected parallel facility, the models would run far more slowly, keeping many scientific questions out of reach.

Grace has advantages over a remote HPC facility: it is a local machine, which means fast access, access to the local filestore when needed, and a local support team I can talk to if all else fails.

My work

An intermediate global circulation model is now running on Grace, taking advantage of its parallel architecture. The reason for using “intermediate” models is that they provide an excellent and flexible framework for understanding climate, complementing both theory and the more complex state-of-the-art climate models, which are far more computationally expensive.
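At the very simplest end of this hierarchy of climate models sits the zero-dimensional energy-balance model, which captures the basic idea of balancing absorbed sunlight against emitted heat. The sketch below is purely illustrative and is not the intermediate model described here; the parameter values are textbook Earth-like assumptions.

```python
# Minimal zero-dimensional energy-balance model: the simplest end of the
# hierarchy of climate models. All parameter values are illustrative
# Earth-like assumptions, not taken from the UEA model.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant (W m^-2 K^-4)
S0 = 1361.0        # solar constant (W m^-2)
ALBEDO = 0.3       # planetary albedo
EPSILON = 0.61     # effective emissivity (crude greenhouse parameterisation)

def equilibrium_temperature(solar_constant=S0, albedo=ALBEDO, emissivity=EPSILON):
    """Surface temperature at which absorbed solar flux balances emitted IR."""
    absorbed = solar_constant * (1.0 - albedo) / 4.0  # averaged over the sphere
    return (absorbed / (emissivity * SIGMA)) ** 0.25

print(round(equilibrium_temperature(), 1))  # roughly 288 K for Earth-like values
```

Changing the solar constant or albedo in this tiny model is the kind of experiment that, scaled up to a full circulation model, motivates the parallel runs described above.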

An example of this work is an experimental high-resolution version of the intermediate global circulation model. The movie loop (beware: it is 10MB, so do not view it on slow network connections) shows surface brightness (in white), clouds (in shades of grey) and rainfall (in colours) evolving over 3 months in Northern winter. The Greenwich meridian lies on the extreme left and right hand sides, with the dateline in the centre.

Weather systems, and their associated rainfall and clouds, can be seen moving from left to right (eastwards) in the southern and northern extratropics, while tropical weather systems move slowly from right to left (westwards). Snow cover can be seen over northern Eurasia (top left), North America (top centre-right) and over Antarctica (bottom). A few tropical cyclones are generated: these are almost circular blobs coloured orange (denoting very high rainfall), which slowly move westwards in the tropics before being pulled into the extratropical eastward flow.


Using new computer architecture to help improve cancer treatments

Institute: University College London

[quote style=”boxed” float=”right”] “Although we also have a PRACE (Partnership for Advanced Computing in Europe) allocation of 24 million CPU hours, we can use this up very easily. Each simulation uses multiple replicas (at different temperatures) and so it has billions of sampling steps. The CPU time this takes is greatly reduced using Emerald’s GPU architecture because the code can be very highly parallelised, making it much more efficient.”

– Professor Francesco Gervasio, UCL Chemistry[/quote]

Professor Francesco Gervasio of UCL’s Department of Chemistry used the Emerald supercomputer (built in partnership with the Centre for Innovation) to simulate the effect of gene mutations linked to the spread of cancer. His research could help develop more robust and effective cancer treatments.


Epidermal Growth Factor Receptor (EGFR) is a protein that causes cell growth and differentiation. Gene mutations that lead to EGFR over-activity have been associated with a number of cancers, particularly lung cancer. Not surprisingly, EGFR is the target of a rapidly expanding class of anti-cancer drugs.

Recent efforts in treating cancer have focused on a part of the EGFR: the tyrosine kinase enzyme, which functions like an ‘off’ and ‘on’ switch for many cellular functions. Mutations of the EGFR kinase can cause it to become stuck in the ‘on’ or active position, causing tumour growth. Certain tyrosine kinase inhibitors (TKIs) have proven to be successful in slowing down tumour growth. Unfortunately, many cancer patients develop resistance to them.

Researchers need to understand more about common EGFR kinase mutations such as T790M and L858R, so as to develop more effective inhibitor drugs for cancer patients. Not much is known about these mutations, especially at an atomic level, but it is believed that changes to the transition between their active (‘on’) and inactive (‘off’) states are linked to their cancer-causing potential.

Because proteins are very flexible and can easily change conformation, simulating these transitions requires enormous computational effort. Recent research, using very complex molecular dynamics simulations (involving billions of steps) run on the multi-million-dollar specialised supercomputer Anton, gave further evidence of how the kinase mutations affect the protein’s conformation. Even with this vast computational power, the researchers observed only two conformational changes, barely sufficient to grasp the full impact of the mutations.

What we did

Francesco’s work took advantage of recent progress in molecular dynamics and the availability of specialised computer hardware to try to understand more about EGFR kinase’s conformational changes, in particular the interaction between the mutations T790M and L858R. He chose to simulate this double mutant, along with both of the single mutants and a control mutant.

Francesco used an extremely efficient sampling technique called ‘parallel-tempered metadynamics’ to study the dynamics of the kinase very close to its equilibrium state. Here, he could observe a large number of small conformational transitions that occur over very short timescales.
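The replica-exchange idea behind parallel tempering can be sketched in a few lines: several copies of the system run at different temperatures, and neighbouring copies periodically attempt to swap configurations using a Metropolis criterion, so that hot replicas explore broadly while cold replicas sample accurately. The toy double-well potential, temperature ladder and step sizes below are invented for illustration; the actual study used GROMACS with the PLUMED metadynamics plug-in.

```python
import math, random

# Illustrative sketch of replica exchange ("parallel tempering"), the sampling
# idea behind parallel-tempered metadynamics. The potential and all parameters
# here are toy values for demonstration only.

random.seed(0)

def energy(x):
    return (x * x - 1.0) ** 2  # 1D double well with minima at x = +/-1

TEMPS = [0.1, 0.3, 0.9, 2.7]                      # temperature ladder
replicas = [random.uniform(-2, 2) for _ in TEMPS]  # one configuration per T

def mc_step(x, T):
    """One Metropolis move at temperature T."""
    x_new = x + random.gauss(0, 0.2)
    if random.random() < math.exp(min(0.0, -(energy(x_new) - energy(x)) / T)):
        return x_new
    return x

for sweep in range(2000):
    replicas = [mc_step(x, T) for x, T in zip(replicas, TEMPS)]
    # Attempt a swap between a random neighbouring pair of temperatures.
    i = random.randrange(len(TEMPS) - 1)
    delta = (1.0 / TEMPS[i] - 1.0 / TEMPS[i + 1]) * (
        energy(replicas[i]) - energy(replicas[i + 1]))
    if random.random() < math.exp(min(0.0, delta)):
        replicas[i], replicas[i + 1] = replicas[i + 1], replicas[i]

# The coldest replica should now sit near one of the two wells.
print(round(replicas[0], 2))
```

The swaps are what let the cold replica escape a local minimum without waiting for a rare barrier-crossing event, which is the same reason the real simulations need many replicas running in parallel.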

He used the GROMACS molecular dynamics software with the PLUMED metadynamics plug-in, adapted to run on the Centre for Innovation’s Emerald high performance computing system. Emerald uses specialised, highly parallelised GPU architecture (Graphics Processing Units, similar to the video cards used to accelerate the display on desktop PCs), and was able to carry out the simulations much more quickly than would be possible on a machine with conventional CPU architecture.

[quote style=”boxed” float=”right”]”The system got so hot that at least one graphics card broke every month. People think computing power is now very cheap, but they don’t realise the true costs of a system like this.”

– Professor Francesco Gervasio, UCL Chemistry[/quote]

Francesco’s previous research group in Spain had purchased their own high performance system – consisting of rack-mounted servers with cheap PC graphics cards. It was expensive and difficult to keep running, due to the heat generated by the processors.

In contrast, Emerald has a specialised cooling system and dedicated staff to support and maintain it, with expert GPU developers to assist researchers with migrating their codes to the new architecture. UCL has partnered with three other universities and the Science & Technology Facilities Council (STFC) to form the consortium that funds the Centre for Innovation, sharing the costs of this infrastructure.

Results / impact of the work

Francesco’s results have helped shed new light on how mutations of the EGFR kinase affect its likelihood of causing tumour growth.

This new knowledge is important because it can be used to devise new strategies for developing tyrosine kinase inhibitors that are less susceptible to the development of drug resistance.




HemeLB: vascular modelling and blood flow simulation

Institute: University College London


[quote style=”boxed” float=”right”]HemeLB is an open-source computational suite for fluid dynamics simulations of blood flow in the vasculature of the human body.[/quote]


The HemeLB suite is able to generate 3D models of the vasculature of individual human body parts based on medical images such as an angiogram, MRI or CT scan.

These models are then used to run sophisticated fluid dynamics simulations (using the Lattice Boltzmann method) which can provide accurate haemodynamic estimates for blood vessels; for example, blood pressure, flow rate, and wall shear stress at different locations.
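The Lattice Boltzmann method alternates two steps on a regular lattice: a local “collision” that relaxes particle distribution functions towards equilibrium, and a “streaming” step that shifts them to neighbouring sites. A minimal two-dimensional sketch of these two steps (not HemeLB’s production scheme, which is three-dimensional with complex boundary handling) might look like:

```python
import numpy as np

# Minimal sketch of the two core steps of the Lattice Boltzmann method
# (BGK collision + streaming) on a D2Q9 lattice with periodic boundaries.
# Grid size, relaxation time and initial condition are arbitrary toy values.

# D2Q9 velocity set and weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)
tau = 0.8                       # relaxation time (sets the viscosity)
nx, ny = 32, 32

def equilibrium(rho, ux, uy):
    """Maxwell-Boltzmann equilibrium truncated to second order in velocity."""
    cu = 3.0 * (c[:, 0, None, None] * ux + c[:, 1, None, None] * uy)
    usq = 1.5 * (ux**2 + uy**2)
    return w[:, None, None] * rho * (1.0 + cu + 0.5 * cu**2 - usq)

# Start from rest with a small density perturbation in the centre.
rho = np.ones((nx, ny))
rho[nx // 2, ny // 2] += 0.1
f = equilibrium(rho, np.zeros((nx, ny)), np.zeros((nx, ny)))

for step in range(100):
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    f += (equilibrium(rho, ux, uy) - f) / tau           # BGK collision
    for i in range(9):                                  # streaming (periodic)
        f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)

print(round(f.sum(), 6))  # 1024.1: total mass is conserved
```

Because both steps are local (collision) or nearest-neighbour (streaming), the method parallelises extremely well, which is what makes it attractive for large vascular simulations.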

HemeLB has been developed by Professor Peter Coveney’s group at the UCL Centre for Computational Science.

The research software development team provided assistance with programming a key component of HemeLB: the ‘setup tool’ which creates 3D models of the vasculature based on medical images.

To date, HemeLB has primarily been applied to the simulation of blood flow in cerebral aneurysms: balloon-like malformations in the arteries of the brain and a potential cause of stroke. However, HemeLB is just one example of a wide array of models being developed as part of the Virtual Physiological Human project.

This project aims to bring together computational models, from the molecular to the organ scale, to simulate larger biological systems and eventually the whole body.

A primary concern for the group working on HemeLB was that, owing to the complexities of computing 3D geometrical operations, the setup tool used to generate the initial models of the vasculature failed in approximately 5% of cases.

This problem was likely to be a major blocker in the path towards deployment of HemeLB in production environments.

The Research Software Development Team (RSDT) was brought in to help identify the reason for these failures and to provide a solution to this problem, enabling the setup tool to work reliably in all cases.

What we did

HemeLB uses a 3D image of the vasculature which is split up into small cubes called voxels (like three dimensional pixels).

The primary job of the setup tool is to construct this representation by defining each voxel as interior (fluid) or exterior (solid) to a blood vessel’s surface.

Starting with a known voxel, the status of nearby voxels is determined by counting the number of intersections with the vessel’s surface when travelling in a straight line from one to the other: an odd number of intersections indicates a change in status from fluid to solid, or vice versa.
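As a toy illustration of this parity rule, consider classifying points against a sphere rather than a triangulated vessel surface (the sphere, start point and targets below are invented for demonstration):

```python
import math

# Toy illustration of the inside/outside parity rule: classify points
# against a sphere by counting how often the segment from a known-exterior
# point crosses the surface. (HemeLB performs the analogous test against
# triangulated vessel surfaces.)

def segment_sphere_crossings(p, q, centre, radius):
    """Number of intersections of the segment p->q with the sphere surface."""
    d = [q[i] - p[i] for i in range(3)]       # segment direction
    m = [p[i] - centre[i] for i in range(3)]  # offset from sphere centre
    a = sum(di * di for di in d)
    b = 2.0 * sum(di * mi for di, mi in zip(d, m))
    c = sum(mi * mi for mi in m) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return 0
    roots = [(-b - math.sqrt(disc)) / (2 * a), (-b + math.sqrt(disc)) / (2 * a)]
    return sum(1 for t in roots if 0.0 <= t <= 1.0)  # crossings on the segment

outside = (10.0, 0.0, 0.0)          # known exterior starting point
sphere = ((0.0, 0.0, 0.0), 1.0)     # centre and radius

# Odd count => status flips relative to the start (inside); even => outside.
print(segment_sphere_crossings(outside, (0.0, 0.0, 0.0), *sphere))   # 1 (inside)
print(segment_sphere_crossings(outside, (5.0, 0.0, 0.0), *sphere))   # 0 (outside)
print(segment_sphere_crossings(outside, (-3.0, 0.0, 0.0), *sphere))  # 2 (outside)
```

The last case shows why counting matters: the segment passes clean through the sphere, so two crossings cancel out and the target point is correctly classified as exterior.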

The HemeLB team identified the problem of resolving voxels on the surface of blood vessels (i.e. those which are partially inside and outside) as being the point of failure for the setup tool.

The cause was that computers can represent numbers only to limited precision, which in some cases made it impossible to tell whether a voxel and a surface intersect.

The chances of this occurring were very small when considering an individual voxel, but the setup tool had to perform millions of these calculations for each model.

The result was that, approximately 5% of the time, the setup tool produced inconsistent results because two assessments of the same voxel taken from different directions disagreed as to whether it was fluid or solid.
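A tiny, contrived example shows the kind of ambiguity involved: deciding which side of a line a point lies on. In exact arithmetic the point below lies precisely on the line, but rounded binary floating point returns a small nonzero value whose sign is essentially arbitrary; robust geometry libraries resolve such borderline cases exactly. The values are illustrative and unrelated to HemeLB’s actual geometry.

```python
from fractions import Fraction

# Demonstration of the numerical ambiguity described above: a sidedness
# ("orientation") predicate that floating point gets wrong near degeneracy.

def orient(ax, ay, bx, by, cx, cy):
    """Sign of the cross product: >0 left of line a->b, <0 right, 0 on it."""
    return (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)

# The point (0.1 + 0.2, 0.3) lies exactly on the line through (0,0) and (1,1),
# but rounded binary floating point disagrees:
print(orient(0.0, 0.0, 1.0, 1.0, 0.1 + 0.2, 0.3))   # tiny nonzero value
print(orient(*map(Fraction, (0, 0, 1, 1)),
             Fraction(1, 10) + Fraction(2, 10), Fraction(3, 10)))  # exactly 0
```

A single wrong sign like this, buried among millions of intersection tests, is enough to make two traversals of the same voxel disagree, which is exactly the inconsistency the setup tool exhibited.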

The RSDT had the difficult job of identifying all possible types of failures and writing exceptions into the code which could detect and resolve these failures. It was quickly determined that the set of geometrical algorithms initially chosen for the job was unsuitable due to numerical accuracy issues. Replacing those algorithms with a state-of-the-art computational graphics library allowed the identification of different, well separated, modes of failure in the intersection counting.

This wasn’t a case of writing just one exception: there were many special cases to detect, and each needed its own handling code.

Jens Nielsen from the RSDT worked methodically to test the code for many possible points of failure and produce a solution for each one. His background in physics made him the ideal person to carry out this difficult mathematical and computational work.

Results / impact of the work

Remedying the problems with the setup tool is a significant step towards producing a version of HemeLB which can be used as a tool in medicine.

Specifically, aneurysm treatment planning should benefit from the availability of accurate haemodynamic estimates in and around the aneurysm.

Miguel Bernabeu from the Centre for Computational Science describes the significance of HemeLB:

“Current medical imaging modalities don’t allow doctors to observe the forces exerted by blood on the vessel’s inner surface to a level of detail that becomes useful for treatment planning. The technological developments carried out at UCL, and elsewhere, are opening a window into cardiovascular physiology and pathology.”

So, in future neurosurgeons could use HemeLB to help decide whether or not to operate.

However, HemeLB will have to obtain conformity certification (e.g. CE marking) if it is going to be used in the routine treatment of patients. To achieve this, Professor Coveney’s group will have to demonstrate that good software development practices have been followed by ensuring that HemeLB is well documented and well tested.

The RSDT encourages all software development projects to apply good coding practices from the start, and regularly runs training courses with UCL researchers to introduce these concepts.

A HemeLB analysis of the pressure exerted on the blood vessel walls in the area surrounding an aneurysm.

In the academic environment, HemeLB is being applied in other areas of research: Miguel Bernabeu is now using HemeLB to study the relationship between haemodynamics and vascular remodelling.

His recent work, studying the developing mouse retina, has confirmed that vessels with low wall shear stress go on to be pruned later in development. This is the beginning of a larger project which aims to understand the parameters underlying vascular growth and change.

This work has potential significance for the development of drugs to treat cancer as it may help us to understand the processes underlying the quick and extensive vascularisation that supports tumour growth.

One of the next stages for development of HemeLB is to find a way to couple it with other computational models; for example, to model the relationship between fluid dynamics in blood vessels and mechanobiological responses to these stresses at the cellular level.




SgurrEnergy: Using High Performance Computing for Large Scale Renewable Energy Projects

Institute: University of Strathclyde


[quote style=”boxed” float=”right”]To date SgurrEnergy has assessed over 85,000MW of renewable energy developments internationally and this figure is growing rapidly every month. Clients include utilities, financiers, developers and many other public and private sector organisations.[/quote]

SgurrEnergy is a leading independent engineering consultancy specialising in worldwide renewable energy projects, with the capability to deliver at every phase of a project: from the early stages of site selection, feasibility and design, right through to project management of the construction, operation and maintenance phases. The company’s multi-disciplinary consultants have extensive sustainable energy experience worldwide.

ARCHIE-WeSt and the Wind Energy Preparation Programme (WEPP)

SgurrEnergy is the lead partner for the Wind Energy Preparation Programme (WEPP), part of the larger Malawi Renewable Energy Acceleration Programme (M-REAP). The M-REAP programme is funded by the Scottish Government and is led by the University of Strathclyde along with SgurrEnergy, IOD Parc and Community Energy Scotland. Partners in Malawi include the Government of Malawi (GoM), Department of Energy Affairs (DoEA), University of Malawi Polytechnic, Mzuzu University, Concern Universal, Mulanje Renewable Energy Agency, Opportunity International Bank Malawi and Energy Technology Partnership.

This programme has been established to accelerate the adoption of renewable energy technologies, on a domestic and commercial scale in Malawi, with a particular focus on the alleviation of poverty. The WEPP programme itself will encompass initial feasibility studies for a number of small wind farm sites, including anemometry mast installation and data collection on the preferred sites.

As part of an initial feasibility study, SgurrEnergy used the ARCHIE-WeSt HPC facility to run a Weather Research and Forecasting (WRF) mesoscale wind resource model of Malawi (Fig. 1). Running a WRF mesoscale wind resource model over such a large area is very computationally intensive (12,600 core-hours were spent on preliminary calculations alone), so the use of a dedicated HPC facility allowed the model to be set up, tested and run in a timely fashion. The outputs of the wind resource model were used to identify areas within Malawi with a promising wind resource. They will also be used on an ongoing basis to inform various elements of the detailed feasibility studies within the scope of the WEPP programme.
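The quantity such a wind resource model ultimately feeds into is wind power density, which scales with the cube of wind speed; this is why time-resolved model output matters far more than a simple mean wind speed. A back-of-envelope sketch, using invented sample speeds rather than WEPP data:

```python
# Back-of-envelope illustration of the quantity a wind resource model is
# after: mean wind power density, P/A = 0.5 * rho * v^3, averaged over a
# time series of wind speeds. The sample speeds below are invented.

RHO = 1.225  # air density at sea level (kg m^-3)

def mean_wind_power_density(speeds_m_per_s):
    """Mean power per unit swept area (W m^-2). Note the cube: we must
    average v^3, not cube the average v."""
    return 0.5 * RHO * sum(v**3 for v in speeds_m_per_s) / len(speeds_m_per_s)

hourly_speeds = [4.0, 6.5, 8.0, 5.5, 7.2, 9.1, 3.8, 6.0]  # illustrative m/s
print(round(mean_wind_power_density(hourly_speeds), 1))  # 184.9 W m^-2
```

Because of the cube, a site with gusty winds can have a far higher power density than one with the same average speed, which is exactly the distinction a mesoscale model resolves.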

The results of the project will allow the wind farms to progress through to the final development and financing stages, and on to construction and operation with a low level of risk.

Leveraging the power of ARCHIE-WeSt for consultancy services

SgurrEnergy has also been able to leverage the power of ARCHIE-WeSt to provide consultancy services to the industry. For example, energy yield prediction and mesoscale wind resource assessment studies were performed for various proposed offshore wind farms off the north and west coasts of France. This involved performing a 10-year hindcast simulation using the WRF software, employing 170,000 core-hours of computation time on ARCHIE-WeSt.


Coronary Artery Stent Design for Challenging Disease

Institute: University of Southampton
Research Team: Georgios Ragkousis
Investigators: Neil Bressloff


[quote style=”boxed” float=”right”]It is well known that, across all populations (based on geographic location, race, ethnicity, age, sex) coronary artery disease (CAD) is the most common cause of death.[/quote]

CAD is a result of atherosclerosis and affects the coronary arteries that surround the heart. When the inner walls of the coronary arteries thicken due to a build-up of plaque (cholesterol, fatty deposits, calcium, etc.), the vessel narrows and blood flow through the vessel is reduced, leading to a reduced supply of oxygen to the heart muscle. Percutaneous coronary intervention (PCI), or “stenting”, has become the main method for treating coronary occlusions in recent years. Stents are cylindrical mesh-shaped devices that undergo plastic deformation so as to scaffold reopened artery walls and restore physiologically healthy blood flow. PCI involves the intra-vascular insertion (typically through the groin) and deployment of a stent on a balloon-catheter-guidewire system. Once deployed, however, the presence of the stent can lead to unwanted events including in-stent restenosis (ISR) and stent thrombosis (ST).

Although current generation stents have been shown to significantly reduce the rate of ISR, sub-optimal delivery of stents still occurs, leading to ISR and, in particular, ST. This is especially a problem in challenging anatomy with long lesions. A common complication of PCI in such challenging cases is stent malapposition, or incomplete stent apposition (ISA): the lack of contact between stent struts and the underlying wall. This procedural challenge is not new, yet there has been a paucity of data aimed at understanding and improving the mechanical impact of the implanted stent platform within a challenging geometry.

A method has been set up to (i) reconstruct diseased, patient-specific coronary artery segments; (ii) simulate the deployment of state-of-the-art stents into these segments; and (iii) assess the degree of stent malapposition. The aim now is to devise a stent delivery system that can mitigate this problem.
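Step (iii) can be illustrated with a deliberately idealised geometry: treating the vessel as a straight cylinder and measuring the gap between each strut point and the wall. The radius, threshold and strut points below are invented; the real workflow used patient-specific reconstructions and finite-element output from Abaqus.

```python
import math

# Hedged sketch of quantifying stent malapposition: the fraction of strut
# points whose gap to the vessel wall exceeds a threshold, for a vessel
# idealised as a cylinder along the z-axis. All values are illustrative.

VESSEL_RADIUS = 1.5   # mm, illustrative
THRESHOLD = 0.1       # mm, gap beyond which a strut point counts as malapposed

def malapposed_fraction(strut_points):
    """Fraction of (x, y, z) strut points lying more than THRESHOLD
    inside the idealised cylindrical wall."""
    gaps = [VESSEL_RADIUS - math.hypot(x, y) for x, y, z in strut_points]
    return sum(1 for g in gaps if g > THRESHOLD) / len(gaps)

# Three well-apposed points and one sitting 0.4 mm off the wall:
struts = [(1.45, 0.0, 0.0), (0.0, 1.48, 1.0), (-1.47, 0.0, 2.0), (1.1, 0.0, 3.0)]
print(malapposed_fraction(struts))  # 0.25
```

On a real, curved patient-specific segment the gap would be measured against the reconstructed wall surface rather than a cylinder, but the metric (fraction of struts beyond a clinically motivated gap threshold) works the same way.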

[toggle_container keep_open=”true” initial_open=””]

[toggle title=”Categories”]

  • Life sciences simulation: Biomedical
  • Physical Systems and Engineering simulation: Biomechanics, Structural dynamics
  • Algorithms and computational methods: Finite elements, Optimisation
  • Simulation software: Abaqus
  • Programming languages and libraries: Python
  • Computational platforms: Iridis, Linux, Windows
  • Transdisciplinary tags: Design


[/toggle]

[/toggle_container]