climateprediction.net (CPDN) is a
volunteer computing project to investigate and reduce uncertainties in
climate modelling. It aims to do this by running hundreds of thousands of different models (a large
climate ensemble) using the donated idle time of ordinary
personal computers, thereby leading to a better understanding of how models are affected by small changes in the many
parameters known to influence the global climate.
The project relies on the
BOINC framework, in which volunteer participants run project tasks on their personal computers, receiving work units from the project's servers and returning the results for analysis.
CPDN, which is run primarily by
Oxford University in
England, has harnessed more computing power and generated more data than any other climate modelling project. It has produced over 100 million model years of data so far. At the time of writing, there are more than 12,000 active participants from 223 countries, with a total
BOINC credit of more than 27 billion, reporting about 55
teraflops
(55 trillion floating-point operations per second) of processing power.
Aims

The aim of the climateprediction.net project is to investigate the uncertainties in various parameterizations that have to be made in state-of-the-art climate models. The model is run thousands of times with slight perturbations to various physics parameters (a 'large
ensemble') and the project examines how the model output changes. These parameters are not known exactly, and the variations are within what is subjectively considered to be a plausible range. This will allow the project to improve understanding of how sensitive the models are to small changes and also to things like changes in
carbon dioxide and
the sulphur cycle. In the past, estimates of climate change have had to be made using one or, at best, a very small ensemble (tens rather than thousands) of model runs. By using participants' computers, the project will be able to improve understanding of, and confidence in, climate change predictions more than would ever be possible using the supercomputers currently available to scientists.
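The perturbed-parameter approach can be illustrated with a short sketch. The parameter names, ranges, and the stand-in model function below are hypothetical, chosen only to show the shape of a large ensemble; CPDN's real runs perturb the physics of a full general circulation model.

```python
import random

# Hypothetical physics parameters and plausible ranges; illustrative only,
# not CPDN's actual parameter set.
PARAMETER_RANGES = {
    "entrainment_coefficient": (0.6, 9.0),
    "ice_fall_speed_m_per_s": (0.5, 2.0),
    "critical_relative_humidity": (0.6, 0.9),
}

def sample_parameters():
    """Draw one perturbation: a random value from each plausible range."""
    return {name: random.uniform(lo, hi)
            for name, (lo, hi) in PARAMETER_RANGES.items()}

def run_climate_model(params):
    """Stand-in for a full model run; returns a mock global-mean warming (K)."""
    return 3.0 + 0.5 * (params["entrainment_coefficient"] - 4.8) / 4.2

# A large ensemble: the same model run many times with slightly different
# physics, keeping each parameter set together with its output.
ensemble = []
for _ in range(10_000):
    params = sample_parameters()
    ensemble.append((params, run_climate_model(params)))
```

The project then examines how the distribution of outputs across the ensemble changes as each parameter varies within its plausible range.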
The climateprediction.net experiment is intended to help "improve methods to quantify uncertainties of climate projections and scenarios, including long-term ensemble simulations using complex models", identified by the
Intergovernmental Panel on Climate Change
(IPCC) in 2001 as a high priority. The project's organisers hope the experiment will give decision-makers a better scientific basis for addressing one of the biggest potential global problems of the 21st century.
The various model versions produce a fairly wide distribution of results over time, and the further into the future a simulation is extended, the wider the variance between the versions becomes. Roughly half of the variation depends on the future
climate forcing scenario rather than on uncertainties in the model. Any reduction in those variations, whether from better scenarios or from improvements in the models, is valuable; climateprediction.net is working on the model uncertainties, not the scenarios.
Currently, scientists can run models and see that x% of the models warm y degrees in response to z climate forcings, but it is uncertain whether x% is a good representation of the probability of that happening in the real world. Some models will be good and some poor at reproducing past climate when given past climate forcings and initial conditions (a
hindcast). It makes sense to trust the models that do well at recreating the past more than those that do poorly, so models that do poorly are down-weighted.
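A sketch of such down-weighting, assuming skill is summarised by the root mean square error (RMSE) of a hindcast against observations; the Gaussian weighting kernel and its scale are illustrative choices, not the project's published method.

```python
import math

def rmse(simulated, observed):
    """Root mean square error of a hindcast series against observations."""
    return math.sqrt(sum((s - o) ** 2 for s, o in zip(simulated, observed))
                     / len(observed))

def skill_weights(hindcasts, observed, scale=0.5):
    """Give each ensemble member a weight that shrinks as its hindcast error
    grows; a Gaussian kernel in RMSE is one simple, illustrative choice."""
    errors = [rmse(h, observed) for h in hindcasts]
    raw = [math.exp(-((e / scale) ** 2)) for e in errors]
    total = sum(raw)
    return [w / total for w in raw]

# Example with made-up series: the second member matches the observations
# best, so it receives the largest normalised weight.
obs = [0.1, 0.2, 0.4, 0.5]
members = [[0.0, 0.1, 0.2, 0.3], [0.1, 0.2, 0.5, 0.5], [0.6, 0.8, 1.0, 1.2]]
print(skill_weights(members, obs))
```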
The experiments
The different models that climateprediction.net has distributed, or plans to distribute, are detailed below in chronological order. Anyone who has joined recently is therefore likely to be running the
transient coupled model.
* Classic Slab Model - The original experiment not under
BOINC. See
the section on the original model below for further details. This model remains in use solely for the Open University short course.
* BOINC Slab Model - The same as the classic
Slab Model, but released under
BOINC.
* ThermoHaline Circulation (THC) Model - An investigation of how the climate might change in the event of a decrease in the strength of the thermohaline circulation. This experiment has been closed to new participants, as sufficient results have been collected. It was a four-phase model totalling 60 model years. The first three phases were identical to the above
Slab Models. The fourth phase represented the effects of a 50% slowdown in the thermohaline circulation by imposing
SST changes in the North Atlantic derived from other runs.
* Sulfur Cycle Model - An investigation of the effect of sulfate aerosols on the climate. The experiment modelled
sulfur in a number of compound forms, including
dimethyl sulfide and sulfate aerosols. It started in August 2005 and was a prerequisite for the
Hindcast. It is a five-phase model totalling 75 model years; each timestep takes around 70% longer to compute and the run covers 75 rather than 45 model years, making it around 2.8 times longer to complete than the initial slab model. While a few results are still trickling in, new models have not been issued since 2006.
* Coupled Spin-Up Model - Inclusion of oceanic influences into the basic model in a more dynamic and realistic way than the initial
Slab Model. This was a prerequisite for the
Hindcast. It has been completed and, as planned, was not publicly released. The fastest 200–500 computers were invited to join, because it is a 200-year model and results were needed by February 2006 for the
transient coupled model launch.
* Transient Coupled Model - This comprises an 80-year
Hindcast and an 80-year
forecast. The Hindcast tests how well the models perform at recreating the climate of 1920 to 2000. It was launched in February 2006 under
BBC Climate Change Experiment branding and later also released from the CPDN site.
* Seasonal Attribution Project - This is a high-resolution model for a single model year, used to look at extreme precipitation events. The experiment is much shorter because it covers only a single model year, but there are 13.5 times as many cells and the timesteps are 10 minutes instead of 30 minutes. This extra resolution means it requires at least 1.5 gigabytes of
RAM. It uses the HadAM3-N144 climate model.
History
Myles Allen first thought about the need for large
climate ensembles in 1997, but was only introduced to the success of
SETI@home in 1999. The first funding proposal in April 1999 was rejected as utterly unrealistic.
Following a presentation at the
World Climate Conference
in
Hamburg in September 1999 and a commentary in
''Nature'' in October 1999, thousands signed up to this supposedly imminently available program. The
bursting of the dot-com bubble did not help, and the project realised it would have to do most of the programming itself rather than outsourcing it.
It was launched on 12 September 2003, and on 13 September 2003 the project exceeded the capacity of the
Earth Simulator
to become the world's largest climate modelling facility.
The 2003 launch only offered a
Windows "classic" client. On 26 August 2004 a
BOINC client was launched which supported Windows,
Linux and
Mac OS X. The "classic" client will continue to be available for a number of years in support of the
Open University course, although the project has stopped distributing classic models under BOINC in favour of sulfur cycle models. A more user-friendly BOINC client and website called GridRepublic, which supports climateprediction.net and other BOINC projects, was released in beta in 2006.
A
thermohaline circulation slowdown experiment was launched in May 2004 under the classic framework to coincide with the film ''
The Day After Tomorrow''. This program can still be run but is no longer downloadable. The scientific analysis has been written up in
Nick Faull's thesis. A paper based on the thesis has yet to be completed. There is no further research planned with this model.
A sulfur cycle model was launched in August 2005. These models took longer to complete than the original models because they had five phases instead of three, and each timestep was also more computationally demanding.
By November 2005, the number of completed results totalled 45,914 classic models, 3,455 thermohaline models, 85,685 BOINC models and 352 sulfur cycle models. This represented over 6 million model years processed.
In February 2006, the project moved on to more realistic climate models. The BBC Climate Change Experiment was launched, attracting around 23,000 participants on the first day. The
transient climate simulation
introduced realistic oceans. This allowed the experiment to investigate changes in the climate response as the
climate forcings are changed, rather than an equilibrium response to a significant change like doubling the
carbon dioxide level. The experiment has therefore moved on to a hindcast of 1920 to 2000 as well as a forecast of 2000 to 2080. This model takes much longer to run.
The
BBC gave the project publicity, and over 120,000 computers participated in the first three weeks.
In March 2006, a high resolution model was released as another project, the
Seasonal Attribution Project.
In April 2006, the coupled models were found to have a data input problem. The completed work was still useful, although for a different purpose than advertised, and new models had to be handed out.
Results to date
The first results of the experiment were published in ''
Nature'' in January 2005, showing that with only slight changes to the parameters within plausible ranges, the models can show climate sensitivities from less than 2 °C to more than 11 °C.
The higher climate sensitivities have been challenged as implausible by, for example, Gavin Schmidt (a climate modeller at the NASA Goddard Institute for Space Studies in New York).
Explanation
Climate sensitivity is defined as the equilibrium response of global mean temperature to doubling levels of carbon dioxide. Current levels of carbon dioxide are around 420 ppm and growing at a rate of 1.8 ppm per year compared with preindustrial levels of 280 ppm.
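On those figures, a back-of-the-envelope calculation shows roughly when a doubling of the preindustrial concentration would be reached, assuming the growth rate stays constant; this is simple arithmetic on the numbers quoted above, not a projection.

```python
preindustrial = 280.0  # ppm
current = 420.0        # ppm
growth = 1.8           # ppm per year, assumed constant

doubled = 2 * preindustrial                    # 560 ppm
years = (doubled - current) / growth           # (560 - 420) / 1.8
print(f"about {years:.0f} years to doubling")  # about 78 years
```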
Climate sensitivities of greater than 5 °C are widely accepted as being catastrophic. The possibility of such high sensitivities being plausible given observations had been reported prior to the climateprediction.net experiment, but "this is the first time
GCMs have produced such behaviour".
Even the models with very high climate sensitivity were found to be "as realistic as other state-of-the-art climate models". The test of realism was done with a root mean square error test. This does not check the realism of seasonal changes, and it is possible that more diagnostic measures may place stronger constraints on what is realistic. Improved realism tests are being developed.
It is important to the experiment, and to the goal of obtaining a probability distribution function (pdf) of climate outcomes, to get a very wide range of behaviours, even if only to rule some of them out as unrealistic. Larger sets of simulations produce more reliable pdfs. Therefore, models with climate sensitivities as high as 11 °C are included despite their limited accuracy. The sulfur cycle experiment is likely to extend the range downwards.
Piani ''et al.'' (2005)
Published in ''Geophysical Research Letters'', this paper concludes:
When an internally consistent representation of the origins of model-data discrepancy is used to calculate the probability density function of climate sensitivity, the 5th and 95th percentiles are 2.2 K and 6.8 K respectively. These results are sensitive, particularly the upper bound, to the representation of the origins of model data discrepancy.
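Percentiles such as these can be read off a weighted ensemble by accumulating weight through the sorted sensitivities. The helper below is a generic sketch with made-up numbers, not Piani et al.'s statistical machinery.

```python
def weighted_percentile(values, weights, q):
    """Return the q-th percentile (0-100) of a weighted sample: walk the
    sorted values until the cumulative weight passes q% of the total."""
    pairs = sorted(zip(values, weights))
    threshold = q / 100.0 * sum(weights)
    cumulative = 0.0
    for value, weight in pairs:
        cumulative += weight
        if cumulative >= threshold:
            return value
    return pairs[-1][0]

# Hypothetical climate sensitivities (K) and skill weights:
sens = [1.9, 2.4, 2.8, 3.1, 3.6, 4.2, 5.0, 6.1, 7.5, 9.8]
wts = [0.6, 1.0, 1.0, 1.0, 0.9, 0.8, 0.6, 0.4, 0.2, 0.1]
print(weighted_percentile(sens, wts, 5), weighted_percentile(sens, wts, 95))
```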
Use in education
There is an
Open University short course, and there is teaching material available for schools on subjects relating to climate and climate modelling, for use in Key Stage 3/4 Science, A-level Physics (Advanced Physics), Key Stage 3/4 Mathematics, Key Stage 3/4 Geography, 21st Century Science, Science for Public Understanding, Use of Mathematics, and primary education.
The original model
The original experiment is run with
HadSM3, which is the HadAM3 atmosphere from the
HadCM3 model but with only a "slab" ocean rather than a full dynamic ocean. This is faster (and requires less memory) than the full model, but lacks dynamical feedbacks from the ocean, which are incorporated into the full coupled-ocean-atmosphere models used to make projections of climate change out to 2100.
Each downloaded model comes with a slight variation in the various model
parameters.
In the initial "calibration phase" of 15 model years, the model calculates the "flux correction"; extra ocean-atmosphere fluxes that are needed to keep the model ocean in balance (the model ocean does not include currents; these fluxes to some extent replace the heat that would be transported by the missing currents).
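A schematic of that calibration step, under the simplifying assumption that the flux correction is just the monthly mean residual heat flux recorded while sea surface temperatures are pinned to a reference climatology; the real HadSM3 calculation is done per grid cell inside the model.

```python
def calibrate_flux_correction(monthly_imbalances):
    """Average the residual ocean-atmosphere heat flux (W/m^2) for each
    calendar month while SSTs are held fixed; the result stands in for
    the heat transport of the missing ocean currents."""
    totals, counts = [0.0] * 12, [0] * 12
    for month, flux in monthly_imbalances:  # pairs of (month 0..11, W/m^2)
        totals[month] += flux
        counts[month] += 1
    return [t / max(n, 1) for t, n in zip(totals, counts)]
```

In the later phases, the stored monthly correction would then be added back to the slab ocean to keep it in balance.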
In the "control phase" of 15 years, the ocean temperatures are allowed to vary. The flux correction ought to keep the model stable, but
feedbacks developed in some of the runs. There is a quality control check, based on the annual mean temperatures, and models which fail this check are discarded.
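The check might look like the sketch below: reject a run whose control-phase global annual mean temperature drifts too fast. The drift threshold is an illustrative number, not CPDN's actual acceptance criterion.

```python
def passes_quality_control(annual_means, max_drift=0.02):
    """Accept a control run only if the average year-on-year change in
    global annual mean temperature (K/yr) stays below a threshold.
    The 0.02 K/yr figure is illustrative only."""
    drift = (annual_means[-1] - annual_means[0]) / (len(annual_means) - 1)
    return abs(drift) <= max_drift
```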
In the "double CO
2 phase", the CO
2 content is instantaneously doubled and the model run for a further 15 years, which in some cases is not quite sufficient model time to settle down to a new (warmer) equilibrium. In this phase some models which produced physically unrealistic results were again discarded.
The quality control checks in the control and 2*CO
2 phases were quite weak: they suffice to exclude obviously unphysical models but do not include (for example) a test of the simulation of the seasonal cycle; hence some of the models passed may still be unrealistic. Further quality control measures are being developed.
The temperature in the doubled-CO2 phase is exponentially extrapolated to work out the equilibrium temperature. The difference in temperature between this and the control phase then gives a measure of the
climate sensitivity of that particular version of the model.
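A sketch of that extrapolation, assuming the warming follows an exponential approach T(t) = T_eq - (T_eq - T_0)·exp(-t/tau) toward a new equilibrium T_eq; the fitted T_eq minus the control-phase mean then estimates the sensitivity. The curve-fitting details are illustrative, not the project's analysis code.

```python
import numpy as np
from scipy.optimize import curve_fit

def relaxation(t, t_eq, t0, tau):
    """Exponential approach from an initial temperature t0 toward t_eq."""
    return t_eq - (t_eq - t0) * np.exp(-t / tau)

def estimate_sensitivity(years, doubled_co2_temps, control_mean):
    """Fit the 15-year doubled-CO2 temperature series, extrapolate to
    equilibrium, and return the warming relative to the control phase."""
    guess = [doubled_co2_temps[-1] + 1.0, doubled_co2_temps[0], 5.0]
    (t_eq, _t0, _tau), _cov = curve_fit(relaxation, years,
                                        doubled_co2_temps, p0=guess)
    return t_eq - control_mean
```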
Visualisations
Many volunteer computing projects have
screensavers to visually indicate the activity of the application, but they do not usually show its results as they are being calculated. By contrast, climateprediction.net not only uses a built-in visualisation to show the climate of the world being modelled, but makes it interactive, allowing different aspects of the climate (temperature, rainfall, etc.) to be displayed. In addition, there are other, more advanced visualisation programs that allow the user to see more of what the model is doing (usually by analysing previously generated results) and to compare different runs and models.
The real-time desktop visualisation for the model launched in 2003 was developed by Jeremy Walton at
NAG, enabling users to track the progress of their simulation as the cloud cover and temperature change over the surface of the globe. Other, more advanced visualisation programs in use include ''CPView'' and ''IDL Advanced Visualisation'', which have similar functionality. CPView was written by Martin Sykes, a participant in the experiment. The IDL Advanced Visualisation was written by Andy Heaps of the
University of Reading (
UK), and modified to work with the BOINC version by Tessella Support Services plc.
Only CPView allows the user to examine unusual diagnostics, beyond the usual temperature, pressure, rainfall, snow and cloud fields. Up to five sets of data can be displayed on a map. It also has a wider range of functions, such as maximum and minimum values, further memory functions, and other features.
The Advanced Visualisation has functions for graphs of local areas over one day, two days and seven days, as well as the more usual graphs of seasonal and annual averages (which both packages provide). There are also latitude-height and time-height plots.
CPView has a much smaller download size and works with
Windows 98.
As of December 2008, there is no visualisation tool that works with the newer CPDN models; neither CPView nor Advanced Visualisation has been updated to display data gathered from those models, so users can only visualise the data through the screensaver.
BBC Climate Change Experiment
The BBC Climate Change Experiment was a
BOINC project led by
Oxford University with several partners including the UK
Met Office
, the
BBC, the
Open University and
Reading University. It was the
transient coupled model of the climateprediction.net project.
Many participants joined the project, with over 120,000 people signing up in teams.
Results continued to be collected for some time, with the follow-up television programme airing in January 2007. On 8 March 2009, climateprediction.net officially declared that the BBC Climate Change Experiment was finished, before shutting down the project.
See also
* Climate model
* Global climate model
* Climate ensemble
* Volunteer computing
* List of volunteer computing projects
* BOINC
* Sensitivity analysis and Uncertainty analysis
External links
* Statistics for climateprediction.net