As the world celebrates Earth Day, it's a good time to acknowledge IT's important role in the fight against climate change. Here are some examples of how organizations are using IT for projects that address environmental issues.
Keeping cities on pace with climate actions
Nonprofit C40 Cities connects mayors in 97 cities worldwide — representing more than 700 million people and one quarter of the global economy — to take action against climate change. Data analytics plays a key role in the effort.
"The goal of C40 Cities is to enable as many cities as possible to develop climate action plans and individual climate actions that are in line with or exceed the Paris Climate Agreement," says Jared Pruzan, head of knowledge management at the organization. "Quality data management and analysis is a critical component in planning and implementation, and we are investing in building capacity within the organization and our cities to make the data we collect as productive as possible."
The data mining and analytics strategy supports C40's current three-year business plan and is overseen by Pruzan and Rachel Huxley, director of knowledge and learning.
C40's knowledge management team develops internal dashboards that help track engagement, commitments, and city actions. "This real-time analysis helps to focus our efforts on key opportunities and identify priorities for further attention at the earliest possible moment," Pruzan says.
In parallel, C40 builds and maintains a portfolio of analytics tools on themes that are central to city climate action, including greenhouse gas emissions, clean energy, air quality, adaptation, buildings and construction,
transport and mobility, and waste.
"These tools enable city decision-makers to better understand the factors and contexts they will need to consider as they make certain decisions," Pruzan says. The tools are published on C40's online Knowledge Hub, accessible to all cities regardless of membership. "Ultimately, the goal of these initiatives is to complement our direct-support and network services and expand the audience of cities who are able to benefit from environment and climate data," he says.
A key component of C40's efforts is a data and analytics platform from Qlik. The platform enables the organization to track critical key performance indicators (KPIs) on member cities, including their climate actions and other metrics.
"Our external server and a public Qlik stream allow us to share dashboards publicly with everyone, without users being required to log in or authenticate," Pruzan says. "Transparency is crucial for holding member cities accountable to our leadership standards and supporting our community as they strive to reach their goals."
The analytics platform is connected with C40's Salesforce customer relationship management (CRM) system and its internal data warehouse to pull the data needed to maintain the dashboards. The platform is also integrated with Google Analytics through a connector, which allows the organization to pull Google Analytics data from the C40 Knowledge Hub directly into dashboards. This enables users to conduct a deeper analysis of metrics "that we otherwise wouldn't have been able to," Pruzan says.
The organization also uses Qlik GeoAnalytics extensions to conduct geospatial analyses and create maps. In addition, C40 embeds graphs and dashboards in websites through iframes and mashups.
"We also use the 'tasks' functionality, which allows us to create periodic daily and weekly data refresh tasks and maintain our internal dashboards," Pruzan says.
Monitoring glacial melt — and preventing catastrophes
Peru contains about 70% of the world's tropical glacier mass, and in the past four decades the country has lost about 54% of that mass, according to Christian Yarlequé, director of the Directorate of Information and Knowledge Management (DIGC) at the National Institute for Research on Glaciers and Mountain Ecosystems (INAIGEM). The loss is driven by the warming generated by climate change.
A number of communities across the tropical Andes are at increased risk from these glacial changes, due to the sudden occurrence of floods from avalanches and glacial lake outburst flooding, Yarlequé says.
INAIGEM is a research institute established by the Peruvian government to focus on reducing the impact of future hazards such as avalanches from glacier lakes across the Peruvian Andes. Its work has become more crucial as a result of the glacial activity driven by climate change: Nearly 30% of Peru's glaciers have melted away since 2000.
One important objective of the institute is to reduce the amount of time it takes to warn people about these potentially life-threatening events. To achieve that, the organization has been working with cloud provider Amazon Web Services (AWS) to implement real-time monitoring of glacial lakes with a high probability of an ice avalanche that could impact the local population.
In recent years, INAIGEM has been developing early warning and monitoring systems to inform the population about the dangers of these events.
The first real-time monitoring system was installed at the Palcacocha Lake cryospheric system in 2017.
The lake currently holds about 16 million cubic meters of water, fed by melting runoff from the glacier bodies around it. The monitoring system uses a set of telecommunication antennas to transfer real-time data: high-definition video of the lake and the glacier landscape, plus climatic data from an automatic weather station.
The recorded data is analyzed to enable real-time avalanche detection using machine learning and AI. "Those kinds of tools help us to optimize the early avalanche detection" and quantify properties such as the speed, volume, mass, direction, and impact of an avalanche, and the possibility that it presents a risk to people living downhill, Yarlequé says. Such early detection could help save lives, he says.
INAIGEM uses Amazon Elastic Compute Cloud instances to run its applications and relies on Amazon Simple Storage Service to store and safeguard its growing data resources. It also uses Amazon CloudFront for secure content delivery and AWS CloudFormation to quickly provision resources.
Cameras connected by radio link continuously record areas of interest, and the videos are stored in the cloud. Any scientist can access the videos from anywhere in the world and select a segment to analyze. "In this way, scientists no longer need to camp in the mountains for days or weeks near the glacier to obtain the data," Yarlequé says.
"Technology brings the scientists closer to their study objective and they can collect data from miles away."
INAIGEM's DIGC and IT department are also working on a platform that lets scientists analyze glacier dynamics over time, to understand, for example, when avalanches are most likely to occur, in which months, and at what rate.
"This analysis of glacial dynamics is extremely useful because it allows us to better understand the situation in the area and to prevent a disaster," Yarlequé says. "We have developed some artificial intelligence algorithms to detect movements and certain thresholds of changes."
INAIGEM is working with AWS to let anyone view real-time video of the surveillance areas from a cell phone, and to access any part of the recorded videos from wherever they are. The institute is also working on a notification system that would alert subscribers whenever there's an avalanche.
Saving sea turtles
Researchers at Texas A&M University–Corpus Christi (TAMUCC) are using data analytics and machine learning to better understand how climate change is affecting local sea turtle populations.
They're also using the data to build predictive tools that help responders prepare for extreme events and mitigate their impact on threatened and endangered species.
The program was started in the mid-2000s by a group that includes the TAMUCC Conrad Blucher Institute, the Coastal Conservation Association, Texas Parks and Wildlife, and the Gulf Intracoastal Canal Association. The Padre Island National Seashore Division of Sea Turtle Science and Recovery also plays a critical role in the effort, and many other agencies, nonprofit organizations, and volunteers are involved in the mitigation, rescue, and rehabilitation of the wildlife.
The initial goal was to better understand and mitigate the impact of cold-water events in the Laguna Madre, according to Philippe Tissot, associate research professor at the university. "The Laguna is a unique ecosystem, one of the longest hypersaline lagoons in the world, and home to sea turtles and many fish species," he says.
The team developed a system to mitigate the impact of cold-water events by predicting them and issuing recommendations for voluntary interruption of navigation and activities, Tissot says.
Machine learning models were at work in February 2021 during a record "cold stunning" event. Cold stunning occurs when sea turtles are exposed to unusually cold water for an extended period, causing shock, pneumonia, frostbite, and even death.
To more accurately assess the likelihood of these conditions, TAMUCC worked with IBM to create forecasts, plotting 100 air temperature forecast scenarios and then feeding those trajectories into an artificial intelligence (AI)-based water temperature model to produce a probability of cold-stunning conditions.
Having an idea of the range of possible temperatures helps the team plan for best- and worst-case scenarios.
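The ensemble approach described above can be sketched in a few lines. This is a minimal illustration, not TAMUCC's or IBM's actual system: the 100 forecast scenarios are synthetic, the lagged moving average is a crude stand-in for the AI-based water temperature model, and the 8°C cold-stunning threshold is illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic ensemble: 100 air-temperature scenarios over a 15-day horizon,
# built by perturbing a base forecast (hypothetical numbers, in °C).
n_scenarios, n_hours = 100, 15 * 24
base = 12 + 8 * np.sin(np.linspace(0, 3 * np.pi, n_hours))
scenarios = base + rng.normal(0, 3, size=(n_scenarios, n_hours))

def water_temp(air, window=48):
    """Stand-in for the AI water-temperature model: water responds slowly
    to air temperature, so use a 48-hour moving average of the air series."""
    kernel = np.ones(window) / window
    return np.convolve(air, kernel, mode="same")

COLD_STUN_C = 8.0  # illustrative threshold, not an official value

water = np.apply_along_axis(water_temp, 1, scenarios)
hit = (water < COLD_STUN_C).any(axis=1)  # does a scenario ever cross the threshold?
prob = hit.mean()                        # fraction of scenarios with cold stunning
print(f"estimated cold-stunning probability: {prob:.0%}")
```

Running each scenario through the water model and counting threshold crossings turns an ensemble of plausible futures into a single probability, which is what lets responders weigh best- and worst-case outcomes.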
With a longer forecast lead time — now 15 days, from what was previously one week — volunteers can mobilize faster to rescue stunned sea turtles and better mitigate the impact of climate change on the ecosystem.
TAMUCC is a collaborator in the new U.S. National Science Foundation AI Institute for Research on Trustworthy AI in Weather, Climate and Coastal Oceanography, a team led by the University of Oklahoma. The team aims to generate confidence intervals for AI models, ultimately to help improve trust in these types of predictions.
"Preparation is essential to mitigate the impact, reroute cargo, [and] alert volunteers to rescue potentially thousands of sea turtles," Tissot says. The February 2021 event saw a record of more than 10,000 cold-stunned sea turtles, he says.
Air temperature predictions and other inputs, such as recent measurements, are fed into a machine learning algorithm to predict the start of cold-stunning events. The system has worked well, but longer prediction lead times are needed, particularly to provide guidance on when cold-stunning events will end, Tissot says.
The university, IBM, and the other partners are working on more sophisticated machine learning models and better predictions, which will help the organizations involved better address the impact of extreme events, Tissot says.
"Climate change is heightening the need for adaptations and our work is providing better tools to help mitigate the impact of these events, and, in this case, to help save threatened and endangered sea turtle species," Tissot says.
Cleaning up water waste
The Danish government is continuously looking for ways to improve the environment. For example, in 2019, the Danish Parliament passed a Climate Law that makes the parliament accountable for reducing CO2 emissions by 70% by 2030.
All new laws, initiatives, and public investments must be examined for their ability to help reduce CO2.
IT tools are seen as ideal for creating sustainable solutions that can help the country reach its environmental goals, says Martin Skjold Grøntved, special consultant for the Danish Agency for Data Supply in the Danish Climate Ministry.
"Digital means are used both to improve the environment but also [for] creating a modern administration, providing citizen-centric service," Grøntved says.
The City of Aarhus, the second-largest city in Denmark, provides a good example of technology's impact. The Danish Ministry of Climate has created a research and development testbed called TAPAS (testbed in Aarhus for precise positioning and autonomous systems) with the aim of developing solutions that make the city more livable and attract citizens, enterprises, and tourism.
TAPAS acts "as a real-life laboratory" to test products such as drones and robots that support green efforts, Grøntved says. One program to come out of the testbed is CityShark, which produced a drone system to detect and clean up oil spills and trash in the harbor area of Aarhus.
The sailing drone, called WasteShark, will autonomously roam the waters where the Aarhus River flows into the harbor and pick up solid waste such as plastic bottles, single-use cups, and plastic bags.
When not hunting for solid waste, the sailing drone will be paired in a network with a flying drone mounted with a camera. The flying drone uses an oil-detecting algorithm developed by the Technical University of Denmark and cloud-based image sensing on streaming data, analyzed from a data warehouse provided by Kinetica.
The data is used to detect even small amounts of oil or gasoline on the water surface and provide a location for the sailing drone to clean up.
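The detect-then-locate loop can be sketched as follows. This is a toy stand-in, not the Technical University of Denmark's algorithm (which is not described here): it assumes an oil film shows up as an unusually dark patch in a grayscale frame, thresholds on pixel intensity, and reports the patch centroid as a pixel location to hand to the sailing drone.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical drone frame: grayscale water surface, intensities 0..255.
frame = rng.integers(120, 200, size=(240, 320)).astype(np.uint8)
# Inject a synthetic dark "slick" patch for the demo.
frame[100:140, 200:260] = rng.integers(10, 40, size=(40, 60))

OIL_THRESHOLD = 60               # illustrative cutoff for "suspiciously dark"
mask = frame < OIL_THRESHOLD     # boolean map of candidate oil pixels

if mask.any():
    ys, xs = np.nonzero(mask)
    centroid = (ys.mean(), xs.mean())  # pixel location for the cleanup drone
    print(f"possible slick at row={centroid[0]:.1f}, col={centroid[1]:.1f}")
else:
    centroid = None
```

A production system would map the pixel centroid to GPS coordinates via the camera's pose and altitude, and filter out shadows and wave glare before dispatching the drone.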
"When working together with the flying drone, the sailing drone will be equipped with an oil-skimming unit that enables it to clean up oil spills," Grøntved says. The two drones will coordinate their efforts in real time using the TAPAS testbed, 5G communications, Oracle Cloud, and algorithms from Kinetica.
When fully implemented, the CityShark project is expected to be close to fully autonomous, using an on-site weather station that feeds data through a local open data platform, so it can take ultra-local weather conditions into account when calculating the best routes to oil spills.
The aim of the project "is to develop a feasible technology that can be expanded to areas that [have] more severe waste and pollution burdens than Aarhus, and in this way help to clean up the oceans," Grøntved says.