by Thomas Macaulay

Met Office CIO Charles Ewen on how supercomputers forecast the weather

Interview
Feb 15, 2018
Cloud Computing, Data Center, IT Leadership

Met Office Director of Technology and CIO Charles Ewen had much to celebrate in 2017, as the world’s most powerful supercomputer dedicated to weather entered its first year in operation on time and under budget.

“It’s one of the largest machines in the world, and it’s capable of doing more calculations in a second than there are grains of sand on every beach in the world,” says Ewen.

The CIO 100 highflyer started his current role in 2014, 160 years after the Met Office was founded. The forecaster is the oldest national meteorological service in the world and has been a tech pioneer since its inception.

It began life in 1854 as an experimental government department to establish meteorology as a science and developed the first storm warning service five years later, after the 1859 sinking of the Royal Charter steamship caused the deaths of nearly 500 passengers.

The Met Office has come a long way since one of its first systems – a combination of cones and drums hoisted on a staff to warn ships of approaching gales – but the innovation mindset remains the same under the current CIO.

“Technology has always been at the very heart of weather forecasting and, more latterly, climate simulation and analysis,” says Ewen, who’s responsible for the Met Office’s ICT strategy and for the technical teams in its technology directorate.

“The reason for that is the mathematical and computational complexity of the physical simulations used. Whilst generally we call this weather forecasting, at a technical level what the Met Office is doing is simulating the future state of the atmosphere.”

The Cray XC40 supercomputer helps those simulations reach new levels of power and precision. It is capable of completing more than 14,000 trillion arithmetic operations per second, which allows the Met Office to take in 215 billion weather observations from around the world every day and to make more regular and accurate forecasts.

If the team runs a number of different simulations at once, each will produce a different outcome, and the team can then weigh these against each other to find out which result is most likely.

“When you do that you get a different outcome, and if you do that, say five times, you will get five different outcomes,” explains Ewen.

“Now sometimes those five different outcomes are very different, and sometimes those outcomes are tightly grouped. This gives us a statistical understanding of the confidence and probability of a given forecast.”
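What Ewen describes is ensemble forecasting. As a rough, hypothetical illustration – the run values, threshold and variable names below are invented, not Met Office output – a handful of simulation runs can be summarised into a mean, a spread and a simple probability:

```python
# Toy illustration of ensemble forecasting: several runs of the same
# forecast quantity are summarised into a mean, a spread and a probability.
# The numbers below are invented for illustration only.
from statistics import mean, stdev

# Forecast peak wind speed (mph) for the same place and time, produced by
# five runs started from slightly different initial states.
ensemble_runs = [42.0, 45.5, 44.0, 61.0, 43.5]

ensemble_mean = mean(ensemble_runs)
spread = stdev(ensemble_runs)           # tight spread = higher confidence
threshold = 50.0                        # e.g. a hypothetical gale-warning threshold
prob_exceed = sum(r > threshold for r in ensemble_runs) / len(ensemble_runs)

print(f"mean forecast: {ensemble_mean:.1f} mph")
print(f"spread (std dev): {spread:.1f} mph")
print(f"probability of exceeding {threshold} mph: {prob_exceed:.0%}")
```

A tightly grouped ensemble signals high confidence; widely scattered runs signal an uncertain forecast.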

The supercomputer also provides the power to run larger simulations that extend coverage of the UK further out into westerly conditions coming in from the Atlantic Ocean and easterly winds from Europe.

Worldwide weather

The international component is crucial in the interconnected world of weather forecasting.

“For 70 or 80 years now, there’s been a formal interchange of open data between about 200 agencies that do weather across the world so that all of us can build a picture of the current state of the global atmosphere,” says Ewen.

“This has been going on for a long time across geopolitical boundaries. It’s all orchestrated by the United Nations, through an organisation called the World Meteorological Organisation (WMO). They lay out how that interoperability works.”

All the WMO’s member nations exchange their state-of-the-atmosphere reports, which are based on information transmitted to the earth from satellites.

About 70% of the information the Met Office uses to initialise simulations of the atmosphere comes from satellites. The other 30% draws on a wide range of observations from other space-based, atmosphere-based, marine-based and land-based platforms.

The Met Office runs about 400 land-based weather stations, which capture information such as rainfall, temperature and humidity. All of this helps paint a picture of the state of the atmosphere, which is fed into the Cray XC40.
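A single surface observation is a small, structured record. As a purely hypothetical example – the field names and values below are invented, and real observation messages follow WMO-standardised formats rather than this layout – one station report might look something like this before it is merged with millions of others:

```python
# Hypothetical example of a single surface-station observation record.
# Field names and values are invented for illustration; real observation
# messages follow WMO-standardised formats.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SurfaceObservation:
    station_id: str
    observed_at: datetime
    latitude: float
    longitude: float
    temperature_c: float
    humidity_pct: float
    rainfall_mm: float

obs = SurfaceObservation(
    station_id="UK-EXETER-01",           # invented identifier
    observed_at=datetime(2018, 2, 15, 9, 0, tzinfo=timezone.utc),
    latitude=50.73,
    longitude=-3.47,
    temperature_c=6.2,
    humidity_pct=87.0,
    rainfall_mm=0.4,
)
print(obs)
```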

The supercomputer then analyses the data through huge simulation codes developed in the Met Office’s science programme of about 600 scientists.

“Their job is to constantly develop our understanding of how the atmosphere and climate system work at a physical level and encode that learning in the simulation codes,” explains Ewen.

“The scientists compile simulation codes so that when we feed the supercomputer with an understanding of the current state of the atmosphere, we can set the supercomputer off and it can wind time forwards and deliver a simulation of the future state of the atmosphere.”
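The real simulation codes solve the equations of atmospheric physics on a global grid at enormous resolution. As a drastically simplified, hypothetical sketch of the “wind time forwards” idea – the grid, wind speed and time step below are invented – a toy model can step an initial field forward in time from an observation-based starting state:

```python
# Drastically simplified sketch of "winding time forwards": a 1-D advection
# model moves an initial temperature-like field along with a constant wind.
# Real atmospheric models solve far more complex equations on a global grid;
# everything here (grid size, wind speed, time step) is invented.
import numpy as np

nx, dx = 100, 1.0          # grid points and spacing (arbitrary units)
wind, dt = 1.0, 0.5        # constant "wind" speed and time step (CFL-safe)

# Initial state: a smooth bump, standing in for the analysed current state
# of the atmosphere that the observations provide.
x = np.arange(nx) * dx
state = np.exp(-((x - 20.0) ** 2) / 25.0)

for _ in range(60):        # wind time forwards in 60 small steps
    # First-order upwind scheme with periodic boundaries.
    state = state - wind * dt / dx * (state - np.roll(state, 1))

print(f"bump has moved to around x = {x[state.argmax()]:.0f}")
```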

The observations programme alone is a vast data endeavour. The Met Office exchanges about 400 million observation messages a day, the equivalent of billions of discrete observations.

The supercomputer crunches all this data at a speed of up to 15 petaflops to produce enormous simulations.

“To put some scale around that, we’re currently working at about two terabytes an hour of operational weather data alone, and weather data itself isn’t actually the biggest of the big data,” says Ewen.

“The biggest of the big data are the climate simulations. We’re now archiving something in the order of a petabyte a week.”
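Those scale figures are easy to sanity-check with back-of-envelope arithmetic; the sketch below simply restates the numbers Ewen quotes rather than adding any new data:

```python
# Back-of-envelope restatement of the scale figures quoted above.
ops_per_second = 14_000 * 10**12          # "more than 14,000 trillion" ops/s
print(f"compute: ~{ops_per_second / 1e15:.0f} petaflops")    # ~14 petaflops

weather_tb_per_hour = 2                   # operational weather data
print(f"weather data: ~{weather_tb_per_hour * 24} TB per day")

climate_pb_per_week = 1                   # climate simulation archive
print(f"climate archive: ~{climate_pb_per_week * 1000 / 7:.0f} TB per day")
```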

These simulations depend on the ability to make a huge set of calculations on the data and deliver the results quickly.

The Cray XC40 operates at a latency of less than a microsecond, a speed that today’s cloud high-performance computing vendors can’t match.

“It’s not saying it can’t be done – it has been done,” explains Ewen. “But if you tried to do it at our scale, it would take too long and it would cost too much money. That’s why we’ve got a supercomputer.”

The benefits of better forecasts

All this science and technology serves a range of practical purposes. The Met Office advises the NHS on the impact of weather on hospital admissions, the transport sector on the effects on roads, air and sea travel, the military on its bearing on strategic decisions, and the government on the consequences of climate change.

The supercomputer alone is expected to provide £2 billion in socio-economic benefits by providing enhanced resilience to weather and related hazards.

“There is huge diversity in the way that the simulations are used ranging from helping a fighter pilot decide if it is safe to take off to helping a store manager understand when to grit a supermarket car park,” says Ewen.

In every case, the objective is to help people make better-informed decisions by understanding what will happen in the future.

To improve on this service, the Met Office recently completed a project called ‘decoupler’. It enables the organisation to quickly create innovative, data-driven products and services by using a highly scaled set of APIs powered by a range of open source technologies.

“We’re shipping that to Amazon Web Services as a native artefact and then we’re rebuilding, re-engineering, re-platforming – all those words – all of the technologies that you need to go ahead and generate something useful like an app or a website or a product or a service in the cloud,” says Ewen.

“That’s the future and that’s the big change in our strategy. It’s a very committed, customer-focused approach.”
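Ewen doesn’t go into the design of those APIs, so the sketch below is a minimal, hypothetical illustration of what a cloud-hosted, data-driven forecast endpoint could look like – the framework choice (FastAPI), route and response fields are assumptions, not the Met Office’s actual decoupler service:

```python
# Minimal, hypothetical sketch of a forecast-serving API endpoint.
# The framework (FastAPI), route and response fields are assumptions for
# illustration; they are not the Met Office's actual decoupler service.
from fastapi import FastAPI

app = FastAPI(title="Hypothetical forecast API")

@app.get("/forecast/{site_id}")
def forecast(site_id: str, hours_ahead: int = 24):
    # A real service would query post-processed model output held in cloud
    # storage; here we return a fixed placeholder payload.
    return {
        "site_id": site_id,
        "hours_ahead": hours_ahead,
        "temperature_c": 6.5,
        "precipitation_probability": 0.2,
    }

# If saved as forecast_api.py, run locally with:
#   uvicorn forecast_api:app --reload
```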

Digital strategy at the Met Office

Ewen is sceptical about the trend to employ a range of C-level tech leaders at a single organisation to guide policy. He prefers to communicate the Met Office’s digital strategy through interdisciplinary working methods, such as embedding various members of the team in the same cross-functional projects.

The aim is to devolve knowledge and skills across the Met Office so that different teams can make their own informed decisions.

Ewen takes a multi-modal approach to software development, in which a waterfall model can be used for big infrastructure projects and more innovative ventures such as the Met Office app can be run as agile projects. The approach offers a deeper alternative to Gartner’s bi-modal model.

“We have squads and tribes and we work in sprints and we have retrospectives and epics and all that kind of stuff, and we deploy that where it’s needed, but equally we can work in more traditional project approaches,” he explains.

“We have a very flexible delivery model that is essentially a combination of an [Agile to Waterfall] delivery style and also in-source, internally managed and full out-source as a kind of matrix. We are always clear which approach we are taking when we initialise production activity.”

Around 315 dedicated engineers work under Head of Technology Richard Bevan, but a lot of Met Office technology is developed outside of those core teams.

The Met Office has also set up an Informatics Lab headed up by a physicist called Dr. Alberto Arribas. His 10-person multi-disciplinary team includes business people, engineers, scientists and designers focused on addressing the problems of the future, which helps the Met Office develop longer-term strategies.

The team identified the advantages of migrating to the cloud rather than using Red Hat’s OpenShift container application platform, and concluded that Hadoop didn’t offer enough long-term value to justify the investment. Both decisions saved the Met Office a lot of money.

The lab also allows the Met Office to experiment with new technology from startups. Ewen balances their benefits against work with larger vendors in more traditional supply chains and with specialist manufacturers such as Cray.

Even though the current Cray supercomputer only recently celebrated its first birthday at the Met Office, the organisation is already making plans for its successor. The current Cray is about 15 times more powerful than the last one, but the need for more power will eventually grow beyond its enormous capacity.

Ewen says the Met Office will address this by “becoming the cloud” through a high-scale connection of supercomputing, edge and cloud platforms: “What’s next is really understanding in detail how we’re going to extend our successful technology approach more widely across the rest of our [organisation] and our partner organisations.”