In mathematical analysis and statistics, leave-one-out error can refer to the following:
* Leave-one-out cross-validation stability (CVloo, for ''stability of Cross Validation with leave one out''): An algorithm f has CVloo stability β with respect to the loss function V if the following holds:

:<math>\forall i\in\{1,\dots,m\},\ \mathbb{P}_S\left\{\left|V(f_S, z_i) - V\left(f_{S^{|i}}, z_i\right)\right| \le \beta_{CV}\right\} \ge 1 - \delta_{CV}</math>
* Expected-to-leave-one-out error stability (<math>Eloo_{err}</math>, for ''Expected error from leaving one out''): An algorithm f has <math>Eloo_{err}</math> stability if for each m there exists a <math>\beta_{EL}^m</math> and a <math>\delta_{EL}^m</math> such that:

:<math>\forall i\in\{1,\dots,m\},\ \mathbb{P}_S\left\{\left|I[f_S] - \frac{1}{m}\sum_{i=1}^m V\left(f_{S^{|i}}, z_i\right)\right| \le \beta_{EL}^m\right\} \ge 1 - \delta_{EL}^m</math>,

with <math>\beta_{EL}^m</math> and <math>\delta_{EL}^m</math> going to zero for <math>m \to \infty</math>.
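The CVloo condition can be probed numerically for a concrete algorithm. A minimal sketch, assuming a toy "mean predictor" learning algorithm, squared loss, and a synthetic data distribution (all of these are assumptions for the example, not part of the definition itself):

```python
import random

def mean_predictor(S):
    """Toy learning algorithm f: maps a training set S to a constant hypothesis."""
    ys = [y for _, y in S]
    c = sum(ys) / len(ys)
    return lambda x: c

def sq_loss(f, z):
    """Squared loss V(f, z) = (f(x) - y)^2."""
    x, y = z
    return (f(x) - y) ** 2

def cvloo_gaps(S, algo=mean_predictor, V=sq_loss):
    """|V(f_S, z_i) - V(f_{S^{|i}}, z_i)| for each i, the quantity bounded by beta_CV."""
    f_S = algo(S)
    gaps = []
    for i, z in enumerate(S):
        S_minus_i = S[:i] + S[i + 1:]   # S^{|i}: remove the i-th element
        f_minus_i = algo(S_minus_i)
        gaps.append(abs(V(f_S, z) - V(f_minus_i, z)))
    return gaps

random.seed(0)
m = 50
# Sample S i.i.d. from an assumed toy distribution D: y = x + Gaussian noise.
S = [(x, x + random.gauss(0, 0.1)) for x in (random.random() for _ in range(m))]
beta = max(cvloo_gaps(S))   # a crude empirical bound on beta_CV for this draw
print(beta)
```

For this algorithm the gap between <math>V(f_S, z_i)</math> and <math>V(f_{S^{|i}}, z_i)</math> shrinks on the order of 1/m, which is exactly the behaviour the definition asks <math>\beta_{CV}</math> to capture.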
Preliminary notations
With X and Y being subsets of the real numbers R, or X and Y ⊂ R, being respectively an input space X and an output space Y, we consider a training set

:<math>S = \{z_1 = (x_1, y_1), \dots, z_m = (x_m, y_m)\}</math>

of size m in <math>Z = X \times Y</math>, drawn independently and identically distributed (i.i.d.) from an unknown distribution, here called "D". Then a learning algorithm is a function <math>f</math> from <math>Z^m</math> into <math>Y^X</math> which maps a learning set S onto a function <math>f_S</math> from the input space X to the output space Y. To avoid complex notation, we consider only deterministic algorithms. It is also assumed that the algorithm <math>f</math> is symmetric with respect to S, i.e. it does not depend on the order of the elements in the training set. Furthermore, we assume that all functions are measurable and all sets are countable, which does not limit the interest of the results presented here.
The loss of a hypothesis f with respect to an example <math>z = (x, y)</math> is then defined as <math>V(f, z) = V(f(x), y)</math>.

The empirical error of f can then be written as <math>I_S[f] = \frac{1}{m}\sum_{i=1}^m V(f, z_i)</math>.

The true error of f is <math>I[f] = \mathbb{E}_z\, V(f, z)</math>.
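The empirical error is computable from S alone, while the true error must be estimated, e.g. by Monte Carlo sampling from D. A minimal sketch, assuming squared loss, a fixed hypothesis, and a synthetic distribution (all assumptions for the example):

```python
import random

def V(f, z):
    """Squared loss V(f, z) = V(f(x), y) = (f(x) - y)^2 (an assumed choice)."""
    x, y = z
    return (f(x) - y) ** 2

def empirical_error(f, S):
    """I_S[f] = (1/m) * sum_i V(f, z_i)."""
    return sum(V(f, z) for z in S) / len(S)

def true_error_mc(f, sample_z, n=100_000):
    """Monte Carlo estimate of I[f] = E_z V(f, z) from fresh draws z ~ D."""
    return sum(V(f, sample_z()) for _ in range(n)) / n

random.seed(1)
f = lambda x: x                              # fixed hypothesis
# Assumed toy distribution D: x uniform on [0, 1], y = x + Gaussian noise.
sample_z = lambda: (lambda x: (x, x + random.gauss(0, 0.1)))(random.random())
S = [sample_z() for _ in range(30)]          # training set of size m = 30

print(empirical_error(f, S))                 # fluctuates around I[f]
print(true_error_mc(f, sample_z))
```

Here the true error reduces to the noise variance, so the Monte Carlo estimate settles near 0.01 while the empirical error fluctuates around it, as one expects for a fixed hypothesis.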
Given a training set S of size m, we will build, for all i = 1, ..., m, modified training sets as follows:
* By removing the i-th element: <math>S^{|i} = \{z_1, \dots, z_{i-1}, z_{i+1}, \dots, z_m\}</math>
* And/or by replacing the i-th element: <math>S^i = \{z_1, \dots, z_{i-1}, z_i', z_{i+1}, \dots, z_m\}</math>
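The two modified training sets can be sketched directly with Python lists standing in for S (the replacement example `z_prime` is an assumption of the example):

```python
def remove_ith(S, i):
    """S^{|i}: the training set with the i-th element removed."""
    return S[:i] + S[i + 1:]

def replace_ith(S, i, z_prime):
    """S^i: the training set with the i-th element replaced by z'."""
    return S[:i] + [z_prime] + S[i + 1:]

S = [("x1", "y1"), ("x2", "y2"), ("x3", "y3")]
print(remove_ith(S, 1))                   # [('x1', 'y1'), ('x3', 'y3')]
print(replace_ith(S, 1, ("x2p", "y2p")))  # [('x1', 'y1'), ('x2p', 'y2p'), ('x3', 'y3')]
```

Both operations leave the original S untouched, matching the convention that each modified set is built fresh for each i.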
See also
* Constructive analysis
* History of calculus
* Hypercomplex analysis
* Jackknife resampling
* Statistical classification
* Timeline of calculus and mathematical analysis
References
* S. Mukherjee, P. Niyogi, T. Poggio, and R. M. Rifkin. "Learning theory: stability is sufficient for generalization and necessary and sufficient for consistency of empirical risk minimization." ''Advances in Computational Mathematics'', 25(1–3):161–193, 2006.