Design Matrix
In statistics, and in particular in regression analysis, a design matrix, also known as a model matrix or regressor matrix and often denoted by X, is a matrix of values of explanatory variables for a set of objects. Each row represents an individual object, with the successive columns corresponding to the variables and their specific values for that object. The design matrix is used in certain statistical models, e.g., the general linear model. It can contain indicator variables (ones and zeros) that indicate group membership in an ANOVA, or it can contain values of continuous variables. The design matrix contains data on the independent variables (also called explanatory variables) in a statistical model that is intended to explain observed data on a response variable (often called a dependent variable). The theory relating to such models uses the design matrix as input to some linear algebra: see for example linear regression. A notable feature of the concept of a design matrix i ...
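
As a minimal sketch of the idea (not taken from the article), the following Python/NumPy snippet assembles a design matrix from one hypothetical continuous variable and one hypothetical group indicator; the variable names and values are made up for illustration.

    import numpy as np

    # Hypothetical data for 4 objects: one continuous variable and one group label.
    age = np.array([23.0, 31.0, 45.0, 52.0])                 # continuous explanatory variable
    group = np.array(["treated", "control", "treated", "control"])

    # Design matrix X: one row per object, one column per regressor.
    X = np.column_stack([
        np.ones(len(age)),                    # intercept column of ones
        age,                                  # continuous variable
        (group == "treated").astype(float),   # 0/1 indicator of group membership
    ])
    print(X)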

Statistics
Statistics (from German ''Statistik'', "description of a state, a country") is the discipline that concerns the collection, organization, analysis, interpretation, and presentation of data. In applying statistics to a scientific, industrial, or social problem, it is conventional to begin with a statistical population or a statistical model to be studied. Populations can be diverse groups of people or objects such as "all people living in a country" or "every atom composing a crystal". Statistics deals with every aspect of data, including the planning of data collection in terms of the design of surveys and experiments. When census data (comprising every member of the target population) cannot be collected, statisticians collect data by developing specific experiment designs and survey samples. Representative sampling assures that inferences and conclusions can reasonably extend from the sample ...

Arithmetic Mean
In mathematics and statistics, the arithmetic mean, arithmetic average, or just the ''mean'' or ''average'' is the sum of a collection of numbers divided by the count of numbers in the collection. The collection is often a set of results from an experiment, an observational study, or a survey. The term "arithmetic mean" is preferred in some contexts in mathematics and statistics because it helps to distinguish it from other types of means, such as the geometric and harmonic means. Arithmetic means are also frequently used in economics, anthropology, history, and almost every other academic field to some extent. For example, per capita income is the arithmetic average of the income of a nation's population. While the arithmetic mean is often used to report central tendencies, it is not a robust statistic: it is greatly influenced by outliers (values much larger or smaller than ...
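
A small numeric sketch of the definition (sum divided by count), and of the sensitivity to outliers mentioned above; the numbers are arbitrary.

    values = [2.0, 3.0, 5.0, 10.0]
    mean = sum(values) / len(values)     # arithmetic mean = sum / count
    print(mean)                          # 5.0

    # One large outlier pulls the mean far upward, illustrating that the
    # arithmetic mean is not a robust statistic.
    print(sum(values + [1000.0]) / (len(values) + 1))   # 204.0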

Design Of Experiments
The design of experiments (DOE), also known as experiment design or experimental design, is the design of any task that aims to describe and explain the variation of information under conditions that are hypothesized to reflect the variation. The term is generally associated with experiments in which the design introduces conditions that directly affect the variation, but may also refer to the design of quasi-experiments, in which natural conditions that influence the variation are selected for observation. In its simplest form, an experiment aims at predicting the outcome by introducing a change of the preconditions, which is represented by one or more independent variables, also referred to as "input variables" or "predictor variables." The change in one or more independent variables is generally hypothesized to result in a change in one or more dependent variables, also referred to as "output variables" or "response variables." The experimental design may also identify ...


Matrices (mathematics)
Matrix (plural: matrices or matrixes) or MATRIX may refer to:

Science and mathematics
* Matrix (mathematics), a rectangular array of numbers, symbols or expressions
* Matrix (logic), part of a formula in prenex normal form
* Matrix (biology), the material in between a eukaryotic organism's cells
* Matrix (chemical analysis), the non-analyte components of a sample
* Matrix (geology), the fine-grained material in which larger objects are embedded
* Matrix (composite), the constituent of a composite material
* Hair matrix, produces hair
* Nail matrix, part of the nail in anatomy

Technology
* Matrix (mass spectrometry), a compound that promotes the formation of ions
* Matrix (numismatics), a tool used in coin manufacturing
* Matrix (printing), a mould for casting letters
* Matrix (protocol), an open standard for real-time communication
* Matrix (record production), or master, a disc used in the production of phonograph records
** Matrix number, of a gramophone record
* D ...


Vandermonde Matrix
In linear algebra, a Vandermonde matrix, named after Alexandre-Théophile Vandermonde, is a matrix with the terms of a geometric progression in each row: an (m + 1) \times (n + 1) matrix
:V = V(x_0, x_1, \cdots, x_m) = \begin{bmatrix} 1 & x_0 & x_0^2 & \dots & x_0^n \\ 1 & x_1 & x_1^2 & \dots & x_1^n \\ 1 & x_2 & x_2^2 & \dots & x_2^n \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & x_m & x_m^2 & \dots & x_m^n \end{bmatrix}
with entries V_{i,j} = x_i^j, the ''j''th power of the number x_i, for all zero-based indices i and j. Some authors define the Vandermonde matrix as the transpose of the above matrix. The determinant of a square Vandermonde matrix (when n = m) is called a Vandermonde determinant or Vandermonde polynomial. Its value is:
:\det(V) = \prod_{0 \le i < j \le m} (x_j - x_i).
This is non-zero if and only if all x_i are distinct (no two are equal), making the Vandermonde matrix invertible.
Applications
The polynomial interpolation problem is to find a polynomial p(x) = a_0 + a_1 x + a_2 x^2 + \dots + a_n x^n ...
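
A brief sketch of the construction and of its use for polynomial interpolation, using NumPy; numpy.vander is called with increasing=True so that column j holds x_i^j, matching the convention above. The nodes and values are arbitrary examples.

    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0])        # distinct nodes x_0, ..., x_m
    V = np.vander(x, increasing=True)         # V[i, j] = x[i] ** j
    print(V)

    # Distinct nodes => non-zero Vandermonde determinant => V is invertible.
    print(np.linalg.det(V))                   # 12.0 for these nodes

    # Polynomial interpolation: solve V a = y for the coefficients a_0, ..., a_n.
    y = np.array([1.0, 8.0, 27.0, 64.0])      # values of y = x**3 at the nodes
    a = np.linalg.solve(V, y)
    print(a)                                  # approximately [0, 0, 0, 1], i.e. p(x) = x**3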




Gram Matrix
In linear algebra, the Gram matrix (or Gramian matrix, Gramian) of a set of vectors v_1, \dots, v_n in an inner product space is the Hermitian matrix of inner products, whose entries are given by the inner product G_{ij} = \left\langle v_i, v_j \right\rangle. If the vectors v_1, \dots, v_n are the columns of matrix X then the Gram matrix is X^\dagger X in the general case that the vector coordinates are complex numbers, which simplifies to X^\top X for the case that the vector coordinates are real numbers. An important application is to compute linear independence: a set of vectors are linearly independent if and only if the Gram determinant (the determinant of the Gram matrix) is non-zero. It is named after Jørgen Pedersen Gram.
Examples
For finite-dimensional real vectors in \mathbb{R}^n with the usual Euclidean dot product, the Gram matrix is G = V^\top V, where V is a matrix whose columns are the vectors v_k and V^\top is its transpose whose rows are the vectors ...
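
A minimal real-valued sketch of the definition (G = V^\top V for vectors stored as columns) and of the linear-independence test via the Gram determinant; the vectors are arbitrary examples.

    import numpy as np

    # Columns of V are the vectors v_1, v_2, v_3 in R^3; here v_3 = v_1 + v_2.
    V = np.array([[1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0],
                  [0.0, 0.0, 0.0]])

    G = V.T @ V                  # Gram matrix: G[i, j] = <v_i, v_j>
    print(G)

    # The Gram determinant is zero exactly when the vectors are linearly dependent.
    print(np.linalg.det(G))      # 0.0, reflecting the dependence v_3 = v_1 + v_2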


Scatter Matrix
: ''For the notion in quantum mechanics, see scattering matrix.''
In multivariate statistics and probability theory, the scatter matrix is a statistic that is used to make estimates of the covariance matrix, for instance of the multivariate normal distribution.
Definition
Given ''n'' samples of ''m''-dimensional data, represented as the m-by-n matrix X = [\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_n], the sample mean is
:\overline{\mathbf{x}} = \frac{1}{n} \sum_{j=1}^n \mathbf{x}_j
where \mathbf{x}_j is the ''j''-th column of X. The scatter matrix is the ''m''-by-''m'' positive semi-definite matrix
:S = \sum_{j=1}^n (\mathbf{x}_j - \overline{\mathbf{x}})(\mathbf{x}_j - \overline{\mathbf{x}})^T = \sum_{j=1}^n (\mathbf{x}_j - \overline{\mathbf{x}}) \otimes (\mathbf{x}_j - \overline{\mathbf{x}}) = \left( \sum_{j=1}^n \mathbf{x}_j \mathbf{x}_j^T \right) - n \overline{\mathbf{x}}\, \overline{\mathbf{x}}^T
where (\cdot)^T denotes matrix transpose, and multiplication is with regard to the outer product. The scatter matrix may be expressed more succinctly as
:S = X\, C_n\, X^T ...
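
A short sketch of the definition with the data stored one sample per column, as above; the data values are arbitrary.

    import numpy as np

    # n = 5 samples of m = 2 dimensional data, one sample per column.
    X = np.array([[1.0, 2.0, 3.0, 4.0, 5.0],
                  [2.0, 1.0, 4.0, 3.0, 5.0]])
    m, n = X.shape

    x_bar = X.mean(axis=1, keepdims=True)     # sample mean, m x 1
    D = X - x_bar                             # centred data
    S = D @ D.T                               # scatter matrix, m x m, positive semi-definite
    print(S)

    # Equivalent form S = X C_n X^T with the centering matrix C_n = I - (1/n) * ones.
    C_n = np.eye(n) - np.ones((n, n)) / n
    print(X @ C_n @ X.T)                      # same matrix as S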

Jacobian Matrix And Determinant
In vector calculus, the Jacobian matrix of a vector-valued function of several variables is the matrix of all its first-order partial derivatives. If this matrix is square, that is, if the number of variables equals the number of components of function values, then its determinant is called the Jacobian determinant. Both the matrix and (if applicable) the determinant are often referred to simply as the Jacobian. They are named after Carl Gustav Jacob Jacobi. The Jacobian matrix is the natural generalization to vector-valued functions of several variables of the derivative and the differential of a usual function. This generalization includes generalizations of the inverse function theorem and the implicit function theorem, where the non-nullity of the derivative is replaced by the non-nullity of the Jacobian determinant, and the multiplicative inverse of the derivative is replaced by the inverse of the Jacobian matrix. The Jacobian determinant is fundamentally use ...
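
As a numerical sketch, the snippet below approximates the Jacobian matrix of a map from R^2 to R^2 with forward differences; the particular function f is an arbitrary illustration, not one taken from the article.

    import numpy as np

    def f(v):
        x, y = v
        # f(x, y) = (x^2 * y, 5x + sin y)
        return np.array([x**2 * y, 5.0 * x + np.sin(y)])

    def jacobian(f, v, h=1e-6):
        # Forward-difference approximation of the matrix of first-order partials.
        v = np.asarray(v, dtype=float)
        f0 = f(v)
        J = np.zeros((f0.size, v.size))
        for j in range(v.size):
            step = np.zeros_like(v)
            step[j] = h
            J[:, j] = (f(v + step) - f0) / h
        return J

    # Exact Jacobian at (1, 2) is [[2xy, x^2], [5, cos y]] = [[4, 1], [5, cos 2]].
    print(jacobian(f, [1.0, 2.0]))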

Projection Matrix
In statistics, the projection matrix (\mathbf{P}), sometimes also called the influence matrix or hat matrix (\mathbf{H}), maps the vector of response values (dependent variable values) to the vector of fitted values (or predicted values). It describes the influence each response value has on each fitted value. The diagonal elements of the projection matrix are the leverages, which describe the influence each response value has on the fitted value for that same observation.
Definition
If the vector of response values is denoted by \mathbf{y} and the vector of fitted values by \hat{\mathbf{y}},
:\hat{\mathbf{y}} = \mathbf{H} \mathbf{y}.
As \hat{\mathbf{y}} is usually pronounced "y-hat", the projection matrix \mathbf{H} is also named the ''hat matrix'' as it "puts a hat on \mathbf{y}".
Application for residuals
The formula for the vector of residuals \mathbf{e} can also be expressed compactly using the projection matrix:
:\mathbf{e} = \mathbf{y} - \hat{\mathbf{y}} = \mathbf{y} - \mathbf{H} \mathbf{y} = \left( \mathbf{I} - \mathbf{H} \right) \mathbf{y}.
where ...
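
The excerpt is cut off before giving an explicit formula; for ordinary least squares with design matrix X the hat matrix takes the standard form H = X (X^T X)^{-1} X^T, and the sketch below assumes that case with made-up data.

    import numpy as np

    # Hypothetical design matrix (intercept + one regressor) and response vector.
    X = np.column_stack([np.ones(5), np.array([1.0, 2.0, 3.0, 4.0, 5.0])])
    y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

    # Hat matrix for ordinary least squares: H = X (X^T X)^{-1} X^T.
    H = X @ np.linalg.inv(X.T @ X) @ X.T

    y_hat = H @ y                       # fitted values: H "puts a hat on y"
    e = (np.eye(len(y)) - H) @ y        # residuals: (I - H) y
    leverages = np.diag(H)              # diagonal entries of H are the leverages
    print(y_hat, e, leverages, sep="\n")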


Moment Matrix
In mathematics, a moment matrix is a special symmetric square matrix whose rows and columns are indexed by monomials. The entries of the matrix depend on the product of the indexing monomials only (cf. Hankel matrices). Moment matrices play an important role in polynomial fitting, polynomial optimization (since positive semidefinite moment matrices correspond to polynomials which are sums of squares) and econometrics.
Application in regression
A multiple linear regression model can be written as
:y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \dots + \beta_k x_k + u
where y is the dependent variable, x_1, x_2, \dots, x_k are the independent variables, u is the error, and \beta_0, \beta_1, \dots, \beta_k are unknown coefficients to be estimated. Given observations \left\{ y_i, x_{i1}, \dots, x_{ik} \right\}_{i=1}^{n}, we have a system of n linear equations that can be expressed in matrix notation:
:\begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{bmatrix} = \begin{bmatrix} 1 & x_{11} & x_{12} & \dots & x_{1k} \\ 1 & x_{21} & x_{22} & \dots & x_{2k} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 ...
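
A compact sketch of the regression system above written in matrix form; the snippet solves the normal equations through the moment matrix X^T X of the regressors (a standard least-squares step, assumed here rather than quoted from the truncated excerpt), with made-up data.

    import numpy as np

    # Hypothetical data: n = 4 observations, k = 2 regressors plus an intercept.
    x1 = np.array([1.0, 2.0, 3.0, 4.0])
    x2 = np.array([0.0, 1.0, 0.0, 1.0])
    y = np.array([2.1, 4.0, 5.9, 8.1])

    X = np.column_stack([np.ones(4), x1, x2])    # rows match the system written above

    # Moment matrix of the regressors and normal equations X^T X b = X^T y.
    M = X.T @ X
    beta_hat = np.linalg.solve(M, X.T @ y)
    print(M)
    print(beta_hat)                              # estimated coefficients beta_0, beta_1, beta_2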

Linear Regression
In statistics, linear regression is a model that estimates the relationship between a scalar response (dependent variable) and one or more explanatory variables (regressor or independent variable). A model with exactly one explanatory variable is a ''simple linear regression''; a model with two or more explanatory variables is a multiple linear regression. This term is distinct from multivariate linear regression, which predicts multiple correlated dependent variables rather than a single dependent variable. In linear regression, the relationships are modeled using linear predictor functions whose unknown model parameters are estimated from the data. Most commonly, the conditional mean of the response given the values of the explanatory variables (or predictors) is assumed to be an affine function of those values; less commonly, the conditional median or some other quantile is used. Like all forms of regression analysis, ...
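
A minimal sketch of simple linear regression with one explanatory variable, fitted by ordinary least squares (one common way of estimating the unknown parameters; the data are made up).

    import numpy as np

    # Hypothetical observations of one explanatory variable and a scalar response.
    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    y = np.array([1.2, 2.9, 5.1, 7.2, 8.8])

    # Linear predictor function y ~ b0 + b1 * x, estimated by least squares.
    X = np.column_stack([np.ones_like(x), x])
    (b0, b1), *_ = np.linalg.lstsq(X, y, rcond=None)
    print(b0, b1)                   # intercept and slope estimates
    print(b0 + b1 * x)              # fitted conditional mean of y given x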