Wednesday, 24 February 2016

Written by Andrea Ballor

Sceptics of manmade climate change offer various natural causes to explain why the Earth has warmed 0.8 degrees Celsius since 1880. In this piece, our aim is not to argue over who is at fault. We will simply take as given that the climate is changing, a fact beyond serious dispute.
The term ‘climate change’ can cover many things, some natural and some manmade, including global warming as well as loss of wildlife habitat. Each of these brings its own challenges but, increasingly, big data and analytics are being put to use to come up with new solutions and research methods. Climate scientists have been gathering a great deal of data for a long time; it is only comparatively recently that analytics technology has caught up. Now that massive amounts of processing power are affordable for almost everyone, those data sets are being put to use. On top of that, the growing number of Internet of Things devices we carry around feeds huge amounts of data back to collectors. The rise of social media means more and more people are reporting environmental data and uploading photos and videos of their surroundings, which can also be analysed for clues.

One of the most ambitious projects that employ big data to study the environment is Microsoft’s Madingley, said to be the first ‘General Ecosystem Model’, or GEM. The project already provides a working simulation of the global carbon cycle, and it is hoped that, eventually, everything from deforestation to animal migration, pollution, and overfishing will be modelled in a real-time “virtual biosphere”. It offers decision-makers a tool to explore the potential effects of their choices on the environment, in a computer, before those decisions are rolled out in the real world. The model has been released as open source code, allowing anyone to inspect the current version or develop it further. Its creators hope the project will encourage other scientists to become involved in developing this, or analogous, global models of life on planet Earth. Just a few years ago, this idea would have seemed ridiculous. Today, one of the world’s biggest companies is pouring serious money into the project, a sign that it believes analytical technology has finally caught up with the ability to collect and store data.
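To give a sense of what "simulating the carbon cycle" means in miniature, here is a toy two-box model, purely illustrative and in no way Madingley's actual code; the pool sizes, rates, and emission figures below are round numbers invented for the sketch:

```python
# Toy two-box carbon-cycle model: carbon (GtC) moves between an
# atmosphere pool and a biosphere pool, while emissions add to the
# atmosphere each year. All numbers are illustrative assumptions.

def step(atmosphere, biosphere, uptake_rate=0.02, respiration_rate=0.01,
         emissions=10.0):
    """Advance both carbon pools (GtC) by one simulated year."""
    uptake = uptake_rate * atmosphere          # carbon drawn down by plants
    respiration = respiration_rate * biosphere  # carbon released back
    atmosphere += emissions - uptake + respiration
    biosphere += uptake - respiration
    return atmosphere, biosphere

atm, bio = 850.0, 550.0  # hypothetical starting pools
for year in range(100):
    atm, bio = step(atm, bio)
print(f"after 100 years: atmosphere {atm:.0f} GtC, biosphere {bio:.0f} GtC")
```

A real GEM tracks far more pools and processes, but the principle is the same: encode the fluxes, step the system forward, and observe where the carbon ends up.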

Adding evidence to the trend, last year the UN launched the Big Data Climate Challenge, a competition aimed at promoting innovative data-driven climate change projects. Among the first to receive recognition under the programme is Global Forest Watch, which combines satellite imagery, crowd-sourced witness accounts, and public datasets to track deforestation around the world. The project has been promoted as a way for ethical businesses to ensure that their supply chains are not complicit in deforestation. Other initiatives are targeted at a more personal level, for example by analysing the transit routes that could be used for an individual journey, using Google Maps, and making recommendations based on the carbon emissions of each route. Others are more specific, such as Weathersafe, which helps coffee growers adapt to changing weather patterns and soil conditions.
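The route-recommendation idea is simple enough to sketch. The following is a minimal hypothetical example, not code from any of the projects above, and the emission factors are illustrative averages rather than official figures:

```python
# Hypothetical sketch: rank journey options by estimated CO2 emissions.
# Factors are assumed averages in kg CO2 per passenger-km.
EMISSION_FACTORS = {"car": 0.192, "bus": 0.105, "train": 0.041, "bicycle": 0.0}

def rank_routes(routes):
    """Sort candidate routes (mode, distance_km) by estimated emissions."""
    estimates = [
        (mode, dist_km, dist_km * EMISSION_FACTORS[mode])
        for mode, dist_km in routes
    ]
    return sorted(estimates, key=lambda r: r[2])

options = [("car", 12.0), ("bus", 14.5), ("train", 16.0), ("bicycle", 11.0)]
for mode, dist, kg in rank_routes(options):
    print(f"{mode:8s} {dist:5.1f} km  ~{kg:.2f} kg CO2")
```

A production service would pull the candidate routes and distances from a mapping API and use per-vehicle factors, but the recommendation logic reduces to exactly this kind of comparison.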

Analytics can also help smart cities grow. The Internet of Things, the idea that everyday objects and tools are becoming increasingly connected, interactive, and intelligent, capable of communicating with each other independently of humans, is becoming more and more central, and provides amounts of significant data we never even imagined before. Smart metering allows utility companies to increase or restrict the flow of electricity, gas, or water to reduce waste and ensure adequate supply at peak periods.

IBM has recently entered this sector. Since 2014 it has been helping Beijing combat its air pollution crisis using a data analysis platform called Green Horizons. The software uses machine learning to analyse previous weather forecasts, crunching data to determine how accurate those predictions were in different scenarios, and then builds better forecasting models over time. It was developed after it emerged that weather conditions have a direct effect on how city residents experience air pollution. Based on the forecasts it provides, the software makes it possible to cut pollution emissions in the areas where they would hit worst. "Knowing where pollution is coming from and how much is in the air will drive action to reduce it," said Bob Perciasepe, president of the Centre for Climate and Energy Solutions. "Experience shows that when measurement happens, pollution levels go down and public health is improved. This near-term action improves the liveability of communities and the wellbeing of citizens." Using data analysis to more accurately source, model, and mitigate air pollution is a key strategy for combating climate change in urban environments, and knowing more about the problems drives more cities towards solving them.
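The core idea of learning from past forecast errors can be illustrated in a few lines. This is an assumed, simplified sketch, not IBM's Green Horizons code: it fits a linear correction that maps raw forecasts to what was actually observed, then applies it to new forecasts:

```python
# Illustrative sketch: learn a correction to raw forecasts from their
# historical errors, the simplest form of "analyse previous forecasts
# to build better models". Data values below are invented.

def fit_correction(forecasts, observations):
    """Least-squares fit of observed = a * forecast + b."""
    n = len(forecasts)
    mean_f = sum(forecasts) / n
    mean_o = sum(observations) / n
    cov = sum((f - mean_f) * (o - mean_o)
              for f, o in zip(forecasts, observations))
    var = sum((f - mean_f) ** 2 for f in forecasts)
    a = cov / var
    b = mean_o - a * mean_f
    return a, b

# Suppose past forecasts systematically underestimated levels by ~20%.
past_forecasts = [40.0, 55.0, 60.0, 75.0, 90.0]
past_observed  = [48.0, 66.0, 72.0, 90.0, 108.0]

a, b = fit_correction(past_forecasts, past_observed)
corrected = a * 70.0 + b   # correct a new raw forecast of 70
print(f"corrected forecast: {corrected:.1f}")  # 84.0
```

Real systems use far richer models and many more input variables, but the loop is the same: compare predictions with outcomes, learn the discrepancy, and fold it back into the next forecast.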

It’s apparent that data, big or small, can tell us if, how, and why climate change is happening. Of course, this is only really valuable if it can also tell us what we can do about it. All these projects are built around the principle of predictive modelling. Once a working simulation of a climate change system, whether deforestation, overfishing, ice cap melt, or carbon emissions, has been built based on real, observed data, we can adjust its variables to see how it might be possible to halt or even, in some cases, reverse the damage being done. After all, the whole point of big data analysis, in climate science or otherwise, is to generate actionable insights that can drive growth or change.
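"Adjusting variables to see what happens" is the essence of scenario modelling. As a toy example in that spirit, with every figure invented for illustration rather than drawn from any dataset mentioned above, here is a forest-cover projection run under two different deforestation rates:

```python
# Toy scenario model: project remaining forest area under different
# annual loss rates and regrowth. All parameters are illustrative.

def simulate_forest(initial_km2, annual_loss_rate, regrowth_km2, years):
    """Return the projected forest area (km2) for each simulated year."""
    area = initial_km2
    history = []
    for _ in range(years):
        area = max(0.0, area * (1 - annual_loss_rate) + regrowth_km2)
        history.append(area)
    return history

baseline  = simulate_forest(500_000, 0.010, 1_000, 30)  # current trend
mitigated = simulate_forest(500_000, 0.004, 2_000, 30)  # policy scenario

print(f"after 30 years, baseline:  {baseline[-1]:,.0f} km2")
print(f"after 30 years, mitigated: {mitigated[-1]:,.0f} km2")
```

Changing one input and re-running the projection is exactly the kind of what-if question these modelling platforms let decision-makers ask before acting in the real world.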

Thanks to the growth of big data analysis, it is becoming apparent that the actions of individuals can make a difference, a measurable difference, when they are able to make decisions based on sophisticated analysis of accurate data.


