by Bob Violino

IT confronts climate change

Apr 22, 2021 | 13 mins
Artificial Intelligence | Green IT

Data analytics and AI are taking center stage in efforts to combat the fallout of climate change and environmental disasters across the globe.

A heart-shaped leaf lies on a circuit board.
Credit: Weerapatkiatdumrong / Getty Images

As the world celebrates Earth Day, it’s a good time to acknowledge IT’s important role in helping to fight or address climate change. Here are some examples of how organizations are using IT for projects related to environmental issues.

Keeping cities on pace with climate actions

Nonprofit C40 Cities connects mayors in 97 cities worldwide — representing more than 700 million people and one quarter of the global economy — to take action against climate change. Data analytics plays a key role in the effort.

“The goal of C40 Cities is to enable as many cities as possible to develop climate action plans and individual climate actions that are in line with or exceed the Paris Climate Agreement,” says Jared Pruzan, head of knowledge management at the organization. “Quality data management and analysis is a critical component in planning and implementation, and we are investing in building capacity within the organization and our cities to make the data we collect as productive as possible.”

The data mining and analytics strategy supports C40’s current three-year business plan and is overseen by Rachel Huxley, director of knowledge and learning, and Pruzan.

C40’s knowledge management team develops internal dashboards that help track engagement, commitments, and city actions. “This real-time analysis helps to focus our efforts on key opportunities and identify priorities for further attention at the earliest possible moment,” Pruzan says.

In parallel, C40 builds and maintains a portfolio of analytics tools on themes that are central to city climate action, including greenhouse gas emissions, clean energy, air quality, adaptation, buildings and construction, transport and mobility, and waste.

“These tools enable city decision-makers to better understand the factors and contexts they will need to consider as they make certain decisions,” Pruzan says. They are published and made accessible to all cities, regardless of membership, on C40’s online Knowledge Hub. “Ultimately, the goal of these initiatives is to complement our direct-support and network services and expand the audience of cities who are able to benefit from environment and climate data,” he says.

A key component of C40’s efforts is a data and analytics platform from Qlik. The platform enables the organization to track key performance indicators (KPIs) for member cities, including their climate actions and other metrics.

“Our external server and a public Qlik stream allows us to share dashboards publicly with everyone, without users being required to log in or authenticate,” Pruzan says. “Transparency is crucial for holding member cities accountable to our leadership standards and supporting our community as they strive to reach their goals.”

The analytics platform is connected to C40’s Salesforce customer relationship management (CRM) system and its internal data warehouse to pull the data the organization needs and maintain the dashboards. It is also integrated with Google Analytics through a Google Analytics connector, which allows C40 to pull Google Analytics data from the Knowledge Hub directly into dashboards. This enables users to conduct a deeper analysis of metrics “that we otherwise wouldn’t have been able to,” Pruzan says.

The organization also uses Qlik GeoAnalytics extensions to conduct geospatial analyses and create maps. In addition, C40 embeds graphs and dashboards in websites through iframes and mashups. “We also use the ‘tasks’ functionalities that allow us to create periodical data refresh tasks daily and weekly, and maintain our internal dashboards,” Pruzan says.

Monitoring glacial melt — and preventing catastrophes

Peru contains about 70% of the world’s tropical glacier mass, and in the past four decades has lost about 54% of that mass, according to Christian Yarlequé, director of the Directorate of Information and Knowledge Management (DIGC) at the National Institute for Research on Glaciers and Mountain Ecosystems (INAIGEM). The loss is driven by the warming generated by climate change.

A number of communities situated across the tropical Andes are at increased risk of being affected by these glacial changes, due to the sudden occurrence of floods from avalanches and glacier lake outburst flooding, Yarlequé says.

INAIGEM is a research institute established by the Peruvian government in 1970 to focus on reducing the impact of future hazards such as avalanches from glacier lakes across the Peruvian Andes. Its work has become more crucial as a result of the glacial activity driven by climate change. Nearly 30% of Peru’s glaciers have melted away since 2000.

One important objective of the institute is to reduce the amount of time it takes to warn people about the possibility of these potentially life-threatening events. To achieve that, the organization has been working with cloud provider Amazon Web Services (AWS) to implement a real-time monitoring system of glacier lakes that have a high probability of a glacier ice avalanche that could impact the local population.

In recent years, INAIGEM has been developing early warning and monitoring systems to inform the population about the dangers of these events.

The first real-time monitoring system was installed at the Lake Palcacocha cryospheric system in 2017. The lake currently holds about 16 million cubic meters of water, fed by meltwater runoff from the surrounding glaciers. The monitoring system uses a set of telecommunication antennas to transfer data in real time, including high-definition video of the lake and the glacier landscape, along with climatic data from an automatic weather station.

The recorded data is analyzed to enable real-time avalanche detection using machine learning and AI. “Those kinds of tools help us to optimize the early avalanche detection” and quantify properties such as the speed, volume, mass, direction, and impact of glaciers, and the possibility that an avalanche can present a risk to people living downhill, Yarlequé says. Such early detection could help save lives, he says.
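As a rough illustration of how video-based detection of this kind can work, the sketch below flags sudden large-scale change between consecutive frames of glacier footage. The thresholds, frame sizes, and simulated data are hypothetical assumptions for illustration; INAIGEM’s actual models are not public here.

```python
import numpy as np

def detect_motion(prev_frame, curr_frame, pixel_thresh=30, area_thresh=0.01):
    """Flag large-scale change between two grayscale video frames.

    A pixel counts as changed when its intensity shifts by more than
    pixel_thresh; an alert fires when the changed fraction of the frame
    exceeds area_thresh. Both thresholds are illustrative values only.
    """
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed_fraction = float((diff > pixel_thresh).mean())
    return changed_fraction > area_thresh, changed_fraction

# Simulated 100x100 frames: a bright region appears in the second frame,
# standing in for a sudden movement of ice or water.
rng = np.random.default_rng(0)
frame_a = rng.integers(0, 40, size=(100, 100)).astype(np.uint8)
frame_b = frame_a.copy()
frame_b[20:60, 20:60] = 255

alert, fraction = detect_motion(frame_a, frame_b)
```

A production system would add temporal smoothing and object tracking on top of this, to estimate the speed and direction properties Yarlequé describes rather than just a binary alert.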

INAIGEM is using Amazon Elastic Compute Cloud instances to run its applications and relies on Amazon Simple Storage Service to store and help safeguard its growing data resources. It also leverages Amazon CloudFront for highly secure content delivery and AWS CloudFormation to quickly provision resources.

Cameras connected by radio link continuously record areas of interest, and videos are stored in the cloud. Any scientist can access the videos from anywhere in the world, and select a part to analyze. “In this way, scientists no longer need to camp in the mountains for days or weeks near the glacier to obtain the data,” Yarlequé says. “Technology brings the scientists closer to their study objective and they can collect data from miles away.”

INAIGEM’s DIGC and the institute’s IT department are also working on a platform that enables scientists to analyze glacier dynamics: how glaciers behave over time, in which months and at what times avalanches are most likely to occur, and at what rate.

“This analysis of glacial dynamics is extremely useful because it allows us to better understand the situation in the area and to prevent a disaster,” Yarlequé says. “We have developed some artificial intelligence algorithms to detect movements and certain thresholds of changes.”

INAIGEM is working with AWS to allow anyone to watch video of the surveillance areas from a cell phone in real time, and to access any part of the recorded videos from wherever they are. The teams are also working on a notification system that would alert subscribers whenever an avalanche occurs.

Saving sea turtles

Researchers at Texas A&M University–Corpus Christi (TAMUCC) are using data analytics and machine learning to better understand how climate change is affecting local sea turtle populations. They’re also using the data to build predictive tools that help responders prepare for extreme events and mitigate their impact on threatened and endangered species.

The program was started in the mid-2000s by a group that includes the TAMUCC Conrad Blucher Institute, the Coastal Conservation Association, Texas Parks and Wildlife, and the Gulf Intracoastal Canal Association. The Padre Island National Seashore Division of Sea Turtle Science and Recovery also plays a critical role in the effort, and many other agencies, nonprofit organizations, and volunteers are involved in the mitigation, rescue, and rehabilitation of the wildlife.

The initial goal was to better understand and mitigate the impact of cold-water events in the Laguna Madre, according to Philippe Tissot, associate research professor at the university. “The Laguna is a unique ecosystem, one of the longest hypersaline lagunas in the world, and home to sea turtles and many fish species,” he says.

The team developed a system to mitigate the impact of cold-water events by predicting them and issuing recommendations for voluntary interruption of navigation and activities, Tissot says.

Machine learning models were at work in February 2021 during a record “cold stunning” event. Cold stunning is a phenomenon that occurs when sea turtles are exposed to unusually cold water for an extended period of time, causing shock, pneumonia, frostbite, and even death.

To more accurately assess the likelihood of these conditions occurring, TAMUCC worked with IBM to create forecasts, plotting 100 air temperature forecast scenarios and then linking those trajectories with an AI-based water temperature model to produce the probability of cold-stunning conditions.
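A minimal sketch of this ensemble approach: each air-temperature scenario is run through a water-temperature model, and the probability of cold stunning is the fraction of scenarios that cross a danger threshold. The relaxation model, its coefficients, the 8°C threshold, and the synthetic data are all illustrative assumptions, not TAMUCC’s or IBM’s actual system.

```python
import numpy as np

rng = np.random.default_rng(42)

# 100 synthetic 15-day air-temperature scenarios (deg C), hourly steps,
# standing in for an ensemble of weather forecasts during a cold front.
n_scenarios, n_hours = 100, 15 * 24
cold_front = 12 + 6 * np.sin(np.linspace(0, 3 * np.pi, n_hours))
scenarios = cold_front + rng.normal(0.0, 3.0, size=(n_scenarios, n_hours))

def water_temp(air_temp, alpha=0.05, start_temp=14.0):
    """Toy lagged model: water temperature relaxes toward air temperature
    at rate alpha per hour. Coefficients are illustrative assumptions."""
    temps = np.empty_like(air_temp)
    prev = start_temp
    for i, a in enumerate(air_temp):
        prev = prev + alpha * (a - prev)
        temps[i] = prev
    return temps

COLD_STUN_C = 8.0  # illustrative cold-stunning threshold (~46 deg F)

# Probability of a cold-stunning event = fraction of scenarios whose
# modeled water temperature ever drops below the threshold.
hits = [water_temp(s).min() < COLD_STUN_C for s in scenarios]
probability = sum(hits) / n_scenarios
```

Reporting a probability over many scenarios, rather than a single forecast, is what lets planners weigh best- and worst-case outcomes against the cost of mobilizing volunteers.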

Having an idea of the range of possible temperatures helps planners prepare for best- and worst-case scenarios. With a longer forecast lead time (now 15 days, up from one week), volunteers can mobilize faster to rescue stunned sea turtles and better mitigate the impact of climate change on the ecosystem.

TAMUCC is a collaborator in the new U.S. National Science Foundation AI Institute for Research on Trustworthy AI in Weather, Climate, and Coastal Oceanography, a team led by the University of Oklahoma. The team aims to generate AI model confidence intervals, ultimately helping to improve trust in these types of predictions.

“Preparation is essential to mitigate the impact, reroute cargo, [and] alert volunteers to rescue potentially thousands of sea turtles,” Tissot says. The February 2021 event saw a record of more than 10,000 cold-stunned sea turtles, he says.

Air temperature predictions and other inputs such as recent measurements are fed into a machine learning algorithm to predict the start of the cold-stunning events. The system has worked well, but longer prediction lead times are needed, particularly to provide guidance for the end of the cold-stunning events, Tissot says.

The university, IBM, and the other partners are working on developing more sophisticated machine learning models and better predictions, which will help the organizations involved better address the impact of extreme events, Tissot says.

“Climate change is heightening the need for adaptations, and our work is providing better tools to help mitigate the impact of these events and, in this case, to help save threatened and endangered sea turtle species,” Tissot says.

Cleaning up water waste

The government in Denmark is continuously looking for ways to improve the environment. For example, in 2019, the Danish Parliament passed a Climate Law that makes the parliament accountable for reducing CO2 emissions by 70% by 2030. All new laws, initiatives, and public investments need to be examined for their ability to help reduce CO2.

IT tools are seen as ideal for creating sustainable solutions that can help the country obtain its environmental goals, says Martin Skjold Grøntved, special consultant for the Danish Agency for Data Supply in the Danish Climate Ministry.

“Digital means are used both to improve the environment but also [for] creating a modern administration, providing citizen-centric service,” Grøntved says.

The City of Aarhus, the second largest city in Denmark, provides a good example of technology’s impact. The Danish Ministry of Climate has created a research and development testbed called TAPAS (testbed in Aarhus for precise positioning and autonomous systems) with the aim of developing solutions to create a more livable city to attract citizens, enterprises, and tourism.

TAPAS acts “as a real-life laboratory” to test products such as drones and robots to help support green efforts, Grøntved says. One program to come out of the testbed is CityShark, which produced a drone system to detect and clean up oil spills and trash in the harbor area of Aarhus.

The sailing drone, called WasteShark, autonomously roams the waters where the Aarhus River flows into the harbor, picking up solid waste such as plastic bottles, single-use cups, and plastic bags.

When not hunting for solid waste, the sailing drone is paired in a network with a flying drone carrying a camera. The flying drone uses an oil-detecting algorithm developed by the Technical University of Denmark (DTU) and cloud-based image sensing on streaming data analyzed in a data warehouse provided by Kinetica.

The data is used to detect even small amounts of oil or gasoline on the water surface and provide a location for the sailing drone to clean up. “When working together with the flying drone, the sailing drone will be equipped with an oil skimming unit that enables it to clean up oil spills,” Grøntved says. The two drones will coordinate their efforts in real time using the TAPAS testbed, 5G communications, Oracle Cloud, and algorithms from Kinetica.
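To illustrate the kind of processing involved, the sketch below locates a bright “sheen” region in a simulated grayscale aerial image and returns its centroid as a cleanup target for the sailing drone. The simple brightness threshold and all values here are hypothetical simplifications; DTU’s actual algorithm is not described in detail in this article.

```python
import numpy as np

def locate_spill(image, sheen_thresh=200, min_pixels=20):
    """Toy oil-sheen detector for a grayscale aerial image.

    An oil film often shows up as an unusually bright, reflective patch.
    Returns the (row, col) centroid of the bright region as a cleanup
    target, or None if too few pixels exceed the threshold. All values
    here are illustrative assumptions.
    """
    mask = image > sheen_thresh
    if mask.sum() < min_pixels:
        return None
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())

# Simulated 200x200 image: dark water with a bright 10x10 patch.
water = np.full((200, 200), 60, dtype=np.uint8)
water[45:55, 115:125] = 230
target = locate_spill(water)
```

In a deployed system, the pixel centroid would be mapped through the drone’s camera geometry and GPS position to a real-world coordinate before being handed to the sailing drone.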

Fed by an on-site weather station through a local open data platform, the CityShark project, when fully implemented, is expected to be close to fully autonomous, taking ultra-local weather conditions into account when calculating the best routes to oil spills.

The aim of the project “is to develop a feasible technology that can be expanded to areas that [have] more severe waste and pollution burdens than Aarhus, and in this way help to clean up the oceans,” Grøntved says.