In statistics, kernel regression is a non-parametric technique to estimate the conditional expectation of a random variable. The objective is to find a non-linear relation between a pair of random variables ''X'' and ''Y''.

In any nonparametric regression, the conditional expectation of a variable ''Y'' relative to a variable ''X'' may be written:

: \operatorname{E}(Y \mid X) = m(X)

where m is an unknown function.


Nadaraya–Watson kernel regression

Nadaraya and Watson, both in 1964, proposed to estimate m as a locally weighted average, using a kernel as a weighting function. The Nadaraya–Watson estimator is:

: \widehat{m}_h(x) = \frac{\sum_{i=1}^n K_h(x - x_i) \, y_i}{\sum_{j=1}^n K_h(x - x_j)}

where K_h(t) = \frac{1}{h} K\left(\frac{t}{h}\right) is a kernel with a bandwidth h.


Derivation

Starting from the definition of the conditional expectation,

: \operatorname{E}(Y \mid X=x) = \int y f(y\mid x) \, dy = \int y \frac{f(x,y)}{f(x)} \, dy

Using kernel density estimation for the joint distribution ''f''(''x'',''y'') and the marginal density ''f''(''x'') with a kernel ''K'',

: \hat{f}(x,y) = \frac{1}{n}\sum_{i=1}^n K_h(x-x_i) \, K_h(y-y_i),

: \hat{f}(x) = \frac{1}{n} \sum_{i=1}^n K_h(x-x_i),

we get

: \begin{align} \operatorname{E}(Y \mid X=x) &= \int \frac{y \hat{f}(x,y)}{\hat{f}(x)} \, dy \\ &= \frac{\int y \sum_{i=1}^n K_h(x-x_i) \, K_h(y-y_i) \, dy}{\sum_{j=1}^n K_h(x-x_j)} \\ &= \frac{\sum_{i=1}^n K_h(x-x_i) \int y \, K_h(y-y_i) \, dy}{\sum_{j=1}^n K_h(x-x_j)} \\ &= \frac{\sum_{i=1}^n K_h(x-x_i) \, y_i}{\sum_{j=1}^n K_h(x-x_j)}, \end{align}

which is the Nadaraya–Watson estimator; the last equality uses \int y \, K_h(y-y_i) \, dy = y_i, which holds for any symmetric kernel that integrates to one.


Priestley–Chao kernel estimator

: \widehat{m}_{PC}(x) = h^{-1} \sum_{i=2}^n (x_i - x_{i-1}) \, K\left(\frac{x - x_i}{h}\right) y_i

where h is the bandwidth (or smoothing parameter).
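A corresponding minimal R sketch, assuming the design points x are sorted in increasing order and a Gaussian kernel is used; the function name pc_estimate is an illustrative choice.

# Priestley–Chao estimator (minimal sketch; assumes x is sorted increasingly)
pc_estimate <- function(x0, x, y, h) {
  n <- length(x)
  spacings <- x[2:n] - x[1:(n - 1)]        # x_i - x_{i-1} for i = 2, ..., n
  weights  <- dnorm((x0 - x[2:n]) / h)     # K((x0 - x_i)/h), Gaussian kernel
  sum(spacings * weights * y[2:n]) / h
}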


Gasser–Müller kernel estimator

: \widehat{m}_{GM}(x) = h^{-1} \sum_{i=1}^n \left[ \int_{s_{i-1}}^{s_i} K\left(\frac{x-u}{h}\right) \, du \right] y_i

where s_i = \frac{x_i + x_{i+1}}{2}.
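Under the same assumptions (sorted design points, Gaussian kernel), the integral of the scaled kernel over each interval can be expressed with the normal CDF, giving the following minimal R sketch; the function name gm_estimate and the treatment of the outermost boundaries s_0 and s_n as -Inf and +Inf are illustrative choices.

# Gasser–Müller estimator with a Gaussian kernel (minimal sketch; assumes x is sorted increasingly)
gm_estimate <- function(x0, x, y, h) {
  n <- length(x)
  # interval boundaries s_0, ..., s_n; the outermost ones are set to -Inf and +Inf here
  s <- c(-Inf, (x[1:(n - 1)] + x[2:n]) / 2, Inf)
  # for the Gaussian kernel, h^{-1} * integral of K((x0 - u)/h) du over [s_{i-1}, s_i]
  # equals pnorm((x0 - s_{i-1})/h) - pnorm((x0 - s_i)/h)
  weights <- pnorm((x0 - s[1:n]) / h) - pnorm((x0 - s[2:(n + 1)]) / h)
  sum(weights * y)
}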


Example

This example is based upon Canadian cross-section wage data consisting of a random sample taken from the 1971 Canadian Census Public Use Tapes for male individuals having common education (grade 13). There are 205 observations in total. The figure to the right shows the estimated regression function using a second order Gaussian kernel along with asymptotic variability bounds.


Script for example

The following commands of the R programming language use the npreg() function to deliver optimal smoothing and to create the figure given above. These commands can be entered at the command prompt via cut and paste.

install.packages("np")
library(np) # nonparametric library
data(cps71)
attach(cps71)

m <- npreg(logwage ~ age)

plot(m, plot.errors.method = "asymptotic",
     plot.errors.style = "band",
     ylim = c(11, 15.2))

points(age, logwage, cex = .25)


Related

According to David Salsburg, the algorithms used in kernel regression were independently developed and used in fuzzy systems: "Coming up with almost exactly the same computer algorithm, fuzzy systems and kernel density-based regressions appear to have been developed completely independently of one another."


Statistical implementation

* GNU Octave mathematical program package
* Julia: KernelEstimator.jl
* MATLAB: a free MATLAB toolbox with implementation of kernel regression, kernel density estimation, kernel estimation of hazard function and many others is available on these pages (this toolbox is a part of the book).
* Python: the KernelReg class for mixed data types in the statsmodels.nonparametric sub-package (includes other kernel density related classes), and the kernel_regression package as an extension of scikit-learn (inefficient memory-wise, useful only for small datasets)
* R: the function npreg of the ''np'' package can perform kernel regression.
* Stata: npregress


See also

* Kernel smoother
* Local regression

References


Further reading

* Simonoff, Jeffrey S. (1996). ''Smoothing Methods in Statistics''. Springer. ISBN 0-387-94716-7. https://books.google.com/books?id=dgHaBwAAQBAJ


External links


* Scale-adaptive kernel regression (with Matlab software).
* (with Microsoft Excel).
* An online kernel regression demonstration (requires .NET 3.0 or later).
* Kernel regression with automatic bandwidth selection (with Python).