Data assimilation

Data assimilation is a mathematical discipline that seeks to optimally combine theory (usually in the form of a numerical model) with observations. There may be a number of different goals sought – for example, to determine the optimal state estimate of a system, to determine initial conditions for a numerical forecast model, to interpolate sparse observation data using (e.g. physical) knowledge of the system being observed, to set numerical parameters based on training a model from observed data. Depending on the goal, different solution methods may be used. Data assimilation is distinguished from other forms of machine learning, image analysis, and statistical methods in that it utilizes a dynamical model of the system being analyzed. Data assimilation initially developed in the field of
numerical weather prediction. Numerical weather prediction models are equations describing the dynamical behavior of the atmosphere, typically coded into a computer program. In order to use these models to make forecasts, initial conditions are needed for the model that closely resemble the current state of the atmosphere. Simply inserting point-wise measurements into the numerical models did not provide a satisfactory solution. Real-world measurements contain errors both due to the quality of the instrument and how accurately the position of the measurement is known. These errors can cause instabilities in the models that eliminate any level of skill in a forecast. Thus, more sophisticated methods were needed in order to initialize a model using all available data while making sure to maintain stability in the numerical model. Such data typically include the measurements as well as a previous forecast valid at the same time the measurements are made. If applied iteratively, this process begins to accumulate information from past observations into all subsequent forecasts. Because data assimilation developed out of the field of numerical weather prediction, it initially gained popularity amongst the geosciences. In fact, one of the most cited publications in all of the geosciences is an application of data assimilation to reconstruct the observed history of the atmosphere.


Details of the data assimilation process

Classically, data assimilation has been applied to chaotic dynamical systems that are too difficult to predict using simple extrapolation methods. The cause of this difficulty is that small changes in initial conditions can lead to large changes in prediction accuracy. This is sometimes known as the butterfly effect – the sensitive dependence on
initial conditions in which a small change in one state of a deterministic nonlinear system
can result in large differences in a later state. At any update time, data assimilation usually takes a forecast (also known as the first guess, or background information) and applies a correction to the forecast based on a set of observed data and estimated errors that are present in both the observations and the forecast itself. The difference between the forecast and the observations at that time is called the departure or the innovation (as it provides new information to the data assimilation process). A weighting factor is applied to the innovation to determine how much of a correction should be made to the forecast based on the new information from the observations. The best estimate of the state of the system based on the correction to the forecast determined by a weighting factor times the innovation is called the analysis. In one dimension, computing the analysis could be as simple as forming a weighted average of a forecasted and observed value. In multiple dimensions the problem becomes more difficult. Much of the work in data assimilation is focused on adequately estimating the appropriate weighting factor based on intricate knowledge of the errors in the system. The measurements are usually made of a real-world system, rather than of the model's incomplete representation of that system, and so a special function called the observation operator (usually depicted by ''h()'' for a nonlinear operator or H for its linearization) is needed to map the modeled variable to a form that can be directly compared with the observation.
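
To make the one-dimensional case concrete, the short sketch below (in Python) computes an analysis as the forecast plus a weight times the innovation; the numerical values and error variances are illustrative assumptions only, not taken from any real observing system.

```python
# Minimal one-dimensional illustration of the analysis step described above.
# All numbers are assumed, illustrative values.
forecast = 21.0        # background / first guess (e.g. a temperature in deg C)
observation = 23.0     # measured value at the same time and place
innovation = observation - forecast   # the "departure" of the observation from the forecast

# The weight reflects the assumed error variances of forecast and observation:
# the more accurate source receives the larger share of the correction.
forecast_error_var = 1.0
observation_error_var = 0.5
weight = forecast_error_var / (forecast_error_var + observation_error_var)

analysis = forecast + weight * innovation   # weighted average of forecast and observation
print(analysis)   # 22.33..., pulled toward the (here more accurate) observation
```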


Data assimilation as statistical estimation

One of the common mathematical philosophical perspectives is to view data assimilation as a Bayesian estimation problem. From this perspective, the analysis step is an application of
Bayes' theorem
and the overall assimilation procedure is an example of recursive Bayesian estimation. However, the probabilistic analysis is usually simplified to a computationally feasible form. Advancing the probability distribution in time would be done exactly in the general case by the
Fokker–Planck equation, but that is not feasible for high-dimensional systems; so, various approximations operating on simplified
representations
of the probability distributions are used instead. Often the probability distributions are assumed
Gaussian
so that they can be represented by their mean and covariance, which gives rise to the
Kalman filter. Many methods represent the probability distributions only by the mean and input some pre-calculated covariance. An example of a direct (or sequential) method to compute this is called optimal statistical interpolation, or simply optimal interpolation (OI). An alternative approach is to iteratively minimize a cost function that solves an identical problem. These are called variational methods, such as 3D-Var and 4D-Var. Typical minimization algorithms are the
conjugate gradient method
or the
generalized minimal residual method. The ensemble Kalman filter is a sequential method that uses a Monte Carlo approach to estimate both the mean and the covariance of a Gaussian probability distribution by an ensemble of simulations. More recently, hybrid combinations of ensemble approaches and variational methods have become more popular (e.g. they are used for operational forecasts both at the
European Centre for Medium-Range Weather Forecasts (ECMWF) and at the NOAA National Centers for Environmental Prediction (NCEP)).
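
As a sketch of the ensemble idea, the following Python/NumPy example implements a basic stochastic (perturbed-observation) ensemble Kalman filter analysis step on a purely synthetic problem; the state size, ensemble size, observation operator and error covariances are all assumptions, and operational systems add refinements such as covariance localization and inflation.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, N = 40, 10, 20               # state size, observation count, ensemble size (all assumed)
H = np.zeros((m, n))               # linear observation operator: observe every 4th state variable
H[np.arange(m), np.arange(0, n, 4)] = 1.0
R = 0.5 * np.eye(m)                # assumed observation error covariance

# Synthetic truth, a background estimate with unit-variance error, and an ensemble
# whose spread is meant to represent that background error.
x_true = np.sin(np.linspace(0.0, 2.0 * np.pi, n))
x_background = x_true + rng.normal(0.0, 1.0, size=n)
ensemble = x_background[:, None] + rng.normal(0.0, 1.0, size=(n, N))
y = H @ x_true + rng.multivariate_normal(np.zeros(m), R)   # noisy synthetic observations

# Sample mean and covariance estimated from the ensemble spread.
x_mean = ensemble.mean(axis=1, keepdims=True)
X = (ensemble - x_mean) / np.sqrt(N - 1)
B_sample = X @ X.T

# Kalman gain built from the sampled background covariance.
K = B_sample @ H.T @ np.linalg.inv(H @ B_sample @ H.T + R)

# Perturbed-observation update: each member assimilates its own noisy copy of y.
y_pert = y[:, None] + rng.multivariate_normal(np.zeros(m), R, size=N).T
analysis = ensemble + K @ (y_pert - H @ ensemble)

print("background mean RMSE:", np.sqrt(np.mean((x_mean.ravel() - x_true) ** 2)))
print("analysis mean RMSE:  ", np.sqrt(np.mean((analysis.mean(axis=1) - x_true) ** 2)))
```

In a real system the ensemble would be propagated by the forecast model between analysis times, which is how the covariance information evolves from one update to the next.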


Weather forecasting applications

In numerical weather prediction applications, data assimilation is most widely known as a method for combining observations of meteorological variables such as
temperature and atmospheric pressure
with prior forecasts in order to initialize numerical forecast models.


Why it is necessary

The atmosphere is a fluid. The idea of numerical weather prediction is to sample the state of the fluid at a given time and use the equations of
fluid dynamics and thermodynamics
to estimate the state of the fluid at some time in the future. The process of entering observation data into the model to generate
initial conditions is called ''initialization''. On land, terrain maps, available globally at high resolution, are used to help model atmospheric circulations within regions of rugged topography, in order to better depict features such as downslope winds, mountain waves and related cloudiness that affects incoming solar radiation. The main inputs from country-based weather services are observations from devices (called
radiosondes) in weather balloons that measure various atmospheric parameters and transmit them to a fixed receiver, as well as from
weather satellites. The
World Meteorological Organization
acts to standardize the instrumentation, observing practices and timing of these observations worldwide. Stations either report hourly in
METAR
reports, or every six hours in
SYNOP
reports. These observations are irregularly spaced, so they are processed by data assimilation and objective analysis methods, which perform quality control and obtain values at locations usable by the model's mathematical algorithms. Some global models use
finite differences, in which the world is represented as discrete points on a regularly spaced grid of latitude and longitude; other models use spectral methods that solve for a range of wavelengths. The data are then used in the model as the starting point for a forecast. A variety of methods are used to gather observational data for use in numerical models. Sites launch radiosondes in weather balloons which rise through the
troposphere and well into the stratosphere. Information from weather satellites is used where traditional data sources are not available. Commerce provides
pilot reports along aircraft routes and ship reports along shipping routes. Research projects use
reconnaissance aircraft
to fly in and around weather systems of interest, such as
tropical cyclones. Reconnaissance aircraft are also flown over the open oceans during the cold season into systems which cause significant uncertainty in forecast guidance, or are expected to be of high impact from three to seven days into the future over the downstream continent. Sea ice began to be initialized in forecast models in 1971. Efforts to involve
sea surface temperature
in model initialization began in 1972 due to its role in modulating weather in higher latitudes of the Pacific.


History

In 1922, Lewis Fry Richardson published the first attempt at forecasting the weather numerically. Using a hydrostatic variation of Bjerknes's primitive equations, Richardson produced by hand a 6-hour forecast for the state of the atmosphere over two points in central Europe, taking at least six weeks to do so. His forecast calculated an unrealistically large change in surface pressure, incorrect by two orders of magnitude. The large error was caused by an imbalance in the pressure and wind velocity fields used as the initial conditions in his analysis, indicating the need for a data assimilation scheme.

Originally "subjective analysis" had been used, in which numerical weather prediction (NWP) forecasts had been adjusted by meteorologists using their operational expertise. Then "objective analysis" (e.g. the Cressman algorithm) was introduced for automated data assimilation. These objective methods used simple interpolation approaches, and thus were 3DDA (three-dimensional data assimilation) methods. Later, 4DDA (four-dimensional data assimilation) methods, called "nudging", were developed, such as in the MM5 model. They are based on the simple idea of Newtonian relaxation (the 2nd axiom of Newton). They introduce into the right-hand side of the dynamical equations of the model a term that is proportional to the difference between the calculated meteorological variable and the observed value. This term, which has a negative sign, keeps the calculated state vector closer to the observations. Nudging can be interpreted as a variant of the Kalman–Bucy filter (a continuous-time version of the Kalman filter) with the gain matrix prescribed rather than obtained from covariances.

A major development was achieved by L. Gandin (1963), who introduced the "statistical interpolation" (or "optimal interpolation") method, which developed earlier ideas of Kolmogorov. This is a 3DDA method and is a type of regression analysis which utilizes information about the spatial distributions of the covariance functions of the errors of the "first guess" field (previous forecast) and the "true field". These functions are never known; in practice, different approximations are assumed. The optimal interpolation algorithm is a reduced version of the Kalman filter (KF) algorithm in which the covariance matrices are not calculated from the dynamical equations but are pre-determined in advance. Attempts to introduce the KF algorithms as a 4DDA tool for NWP models came later. However, this was (and remains) a difficult task because the full version requires solving an enormous number of additional equations (~N*N ~ 10^12, where N = Nx*Ny*Nz is the size of the state vector, with Nx ~ 100, Ny ~ 100, Nz ~ 100 the dimensions of the computational grid). To overcome this difficulty, approximate or suboptimal Kalman filters were developed, including the ensemble Kalman filter and the reduced-rank Kalman filters (RRSQRT).

Another significant advance in the development of the 4DDA methods was utilizing optimal control theory (the variational approach) in the works of Le Dimet and Talagrand (1986), based on the previous works of J.-L. Lions and G. Marchuk, the latter being the first to apply that theory in environmental modeling. The significant advantage of the variational approaches is that the meteorological fields satisfy the dynamical equations of the NWP model and at the same time minimize the functional characterizing their difference from observations; thus, a problem of constrained minimization is solved. The 3DDA variational methods were developed for the first time by Sasaki (1958). As was shown by Lorenc (1986), all the above-mentioned 4DDA methods are in some limit equivalent, i.e. under some assumptions they minimize the same cost function. In practical applications, however, these assumptions are never fulfilled, the different methods perform differently, and generally it is not clear which approach (Kalman filtering or variational) is better. Fundamental questions also arise in the application of advanced DA techniques, such as convergence of the computational method to the global minimum of the functional to be minimised; for instance, the cost function or the set in which the solution is sought may not be convex. The 4DDA method which is currently most successful is hybrid incremental 4D-Var, where an ensemble is used to augment the climatological background error covariances at the start of the data assimilation time window, but the background error covariances are evolved during the time window by a simplified version of the NWP forecast model. This data assimilation method is used operationally at forecast centres such as the Met Office.


Cost function

The process of creating the analysis in data assimilation often involves minimization of a cost function. A typical cost function would be the sum of the squared deviations of the analysis values from the observations weighted by the accuracy of the observations, plus the sum of the squared deviations of the forecast fields and the analyzed fields weighted by the accuracy of the forecast. This has the effect of making sure that the analysis does not drift too far away from observations and forecasts that are known to usually be reliable.
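
As a one-dimensional illustration (with the symbols below chosen for this example rather than taken from a particular reference), write x_b for the forecast, y for the observation, and \sigma_b^2, \sigma_o^2 for their assumed error variances. The cost function just described is then

J(x) = \frac{(x - x_b)^2}{\sigma_b^2} + \frac{(x - y)^2}{\sigma_o^2},

and setting dJ/dx = 0 gives the analysis

x_a = \frac{\sigma_o^2\, x_b + \sigma_b^2\, y}{\sigma_b^2 + \sigma_o^2} = x_b + \frac{\sigma_b^2}{\sigma_b^2 + \sigma_o^2}\,(y - x_b),

i.e. exactly the weighted average of forecast and observation, with the weight applied to the innovation as described earlier.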


3D-Var

J(\mathbf{x}) = (\mathbf{x}-\mathbf{x}_b)^{\mathrm{T}}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b) + (\mathbf{y}-\mathit{H}[\mathbf{x}])^{\mathrm{T}}\mathbf{R}^{-1}(\mathbf{y}-\mathit{H}[\mathbf{x}]), where \mathbf{B} denotes the background error covariance and \mathbf{R} the observational error covariance. The gradient is \nabla J(\mathbf{x}) = 2\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b) - 2\mathit{H}^{\mathrm{T}}\mathbf{R}^{-1}(\mathbf{y}-\mathit{H}[\mathbf{x}]).
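
The sketch below (Python with NumPy and SciPy) evaluates this cost function and its gradient for a small synthetic problem and minimizes it with a conjugate-gradient option, one of the typical minimization algorithms mentioned earlier; the dimensions, covariances and observation operator are arbitrary assumptions, not an operational configuration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

n, m = 6, 3                                   # state and observation sizes (assumed)
B = 0.8 * np.eye(n)                           # assumed background error covariance
R = 0.2 * np.eye(m)                           # assumed observation error covariance
H = rng.normal(size=(m, n))                   # assumed linear observation operator
x_b = rng.normal(size=n)                      # background (first guess)
y = H @ x_b + rng.normal(0.0, np.sqrt(0.2), size=m)   # synthetic observations

B_inv, R_inv = np.linalg.inv(B), np.linalg.inv(R)

def cost(x):
    """3D-Var cost: background term plus observation term."""
    d_b = x - x_b
    d_o = y - H @ x
    return d_b @ B_inv @ d_b + d_o @ R_inv @ d_o

def grad(x):
    """Gradient of the cost function, matching the formula above."""
    return 2.0 * B_inv @ (x - x_b) - 2.0 * H.T @ R_inv @ (y - H @ x)

# Conjugate-gradient minimization starting from the background state.
result = minimize(cost, x_b, jac=grad, method="CG")
x_analysis = result.x

print("cost at background:", cost(x_b))
print("cost at analysis:  ", cost(x_analysis))
```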


4D-Var

J(\mathbf{x}) = (\mathbf{x}-\mathbf{x}_b)^{\mathrm{T}}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b) + \sum_{i=0}^{n}(\mathbf{y}_i-\mathit{H}_i[\mathbf{x}_i])^{\mathrm{T}}\mathbf{R}_i^{-1}(\mathbf{y}_i-\mathit{H}_i[\mathbf{x}_i]), provided that \mathit{H} is a linear operator (matrix).
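
In the same spirit, here is a toy 4D-Var sketch in Python assuming a linear forecast model given by a matrix M (so that the adjoint needed for the gradient is simply its transpose) and a handful of synthetic observation times; every matrix, dimension and the simple gradient-descent loop are assumptions made for illustration, standing in for the adjoint models and minimizers used in practice.

```python
import numpy as np

rng = np.random.default_rng(2)

n, m, steps = 4, 2, 3                 # state size, observations per time, extra time steps (assumed)
M = np.eye(n) + 0.05 * rng.normal(size=(n, n))   # toy linear forecast model: x_{i+1} = M x_i
H = np.eye(m, n)                      # linear observation operator (observe the first m variables)
B = np.eye(n)                         # assumed background error covariance
R = 0.1 * np.eye(m)                   # assumed observation error covariance
B_inv, R_inv = np.linalg.inv(B), np.linalg.inv(R)

x_b = rng.normal(size=n)              # background at the start of the assimilation window
x_truth = x_b + rng.normal(0.0, 0.5, size=n)

# Synthetic observations at times i = 0..steps, generated from the "truth" trajectory.
obs, x = [], x_truth.copy()
for i in range(steps + 1):
    obs.append(H @ x + rng.normal(0.0, np.sqrt(0.1), size=m))
    x = M @ x

def cost_and_grad(x0):
    """4D-Var cost over the window and its gradient, using the adjoint (here just M.T)."""
    J = (x0 - x_b) @ B_inv @ (x0 - x_b)
    g = 2.0 * B_inv @ (x0 - x_b)
    x, MT_pow = x0.copy(), np.eye(n)  # MT_pow accumulates (M^T)^i
    for i in range(steps + 1):
        d = H @ x - obs[i]
        J += d @ R_inv @ d
        g += 2.0 * MT_pow @ H.T @ R_inv @ d
        x = M @ x
        MT_pow = MT_pow @ M.T
    return J, g

# A short plain gradient-descent loop stands in for a proper minimizer here.
x0 = x_b.copy()
for _ in range(200):
    _, g = cost_and_grad(x0)
    x0 -= 0.01 * g

print("cost at background:", cost_and_grad(x_b)[0])
print("cost at analysis:  ", cost_and_grad(x0)[0])
```

In operational incremental 4D-Var the linearized model and its adjoint are derived from the full nonlinear forecast model, which is where much of the implementation effort lies.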


Future development

Factors driving the rapid development of data assimilation methods for NWP models include:
* Utilizing the observations currently offers promising improvement in forecast skill at a variety of spatial scales (from global to highly local) and time scales.
* The number of different kinds of available observations (sodars, radars, satellites) is rapidly growing.


Other applications


Monitoring water and energy transfers

Data assimilation has been used, in the 1980s and 1990s, in several HAPEX (Hydrologic and Atmospheric Pilot Experiment) projects for monitoring energy transfers between the soil, vegetation and atmosphere, for instance:
* HAPEX-MobilHy
* HAPEX-Sahel
* the "Alpilles-ReSeDA" (Remote Sensing Data Assimilation) experiment, a European project in the FP4-ENV program which took place in the Alpilles region, in the South-East of France (1996–97)

A flow-chart diagram excerpted from the final report of that project shows how to infer variables of interest such as canopy state, radiative fluxes, environmental budget, and production in quantity and quality from remote sensing data and ancillary information. In that diagram, small blue-green arrows indicate the direct way the models actually run.


Other forecasting applications

Data assimilation methods are currently also used in other environmental forecasting problems, e.g. in
hydrological
forecasting. Bayesian networks may also be used in a data assimilation approach to assess natural hazards such as landslides. Given the abundance of spacecraft data for other planets in the solar system, data assimilation is now also applied beyond the Earth to obtain re-analyses of the atmospheric state of extraterrestrial planets. Mars is the only extraterrestrial planet to which data assimilation has been applied so far. Available spacecraft data include, in particular, retrievals of temperature and dust/water/ice optical thicknesses from the
Thermal Emission Spectrometer
onboard NASA's
Mars Global Surveyor
and the Mars Climate Sounder onboard NASA's
Mars Reconnaissance Orbiter. Two methods of data assimilation have been applied to these datasets: an Analysis Correction scheme and two Ensemble Kalman Filter schemes (http://www.eps.jhu.edu/~mjhoffman/pages/research.html), both using a global circulation model of the Martian atmosphere as the forward model. The Mars Analysis Correction Data Assimilation (MACDA) dataset is publicly available from the British Atmospheric Data Centre (http://badc.nerc.ac.uk/home/). Data assimilation is a part of the challenge for every forecasting problem. Dealing with biased data is a serious challenge in data assimilation. Further development of methods to deal with biases will be of particular use. If there are several instruments observing the same variable, then intercomparing them using probability distribution functions can be instructive. Numerical forecast models are run at increasingly high resolution due to the increase in
computational power, with operational atmospheric models now running with horizontal resolutions on the order of 1 km (e.g. at the German National Meteorological Service, the Deutscher Wetterdienst (DWD), and at the Met Office in the UK). This increase in horizontal resolution is starting to make it possible to resolve more chaotic features of the non-linear models, e.g. to resolve
convection on the grid scale, or clouds, in the atmospheric models. This increasing non-linearity in the models and observation operators poses a new problem in data assimilation. Existing data assimilation methods, such as the many variants of ensemble Kalman filters and variational methods that are well established with linear or near-linear models, are being assessed on non-linear models. Many new methods are being developed, e.g.
particle filters for high-dimensional problems, and hybrid data assimilation methods. Other uses include trajectory estimation for the Apollo program, GPS, and
atmospheric chemistry.


See also

* Calibration


External links

Other examples of assimilation:
* CDACentral (an example analysis from Chemical Data Assimilation)
* PDFCentral (using PDFs to examine biases and representativeness)
* OpenDA – Open Source Data Assimilation package
* PDAF – open-source Parallel Data Assimilation Framework
* SANGOMA – New Data Assimilation techniques