Tucker Decomposition

In mathematics, Tucker decomposition decomposes a tensor into a set of matrices and one small core tensor. It is named after Ledyard R. Tucker, although it goes back to Hitchcock in 1927. Initially described as a three-mode extension of factor analysis and principal component analysis, it may actually be generalized to higher-mode analysis, which is also called higher-order singular value decomposition (HOSVD) or the M-mode SVD. The algorithm to which the literature typically refers when discussing the Tucker decomposition or the HOSVD is the M-mode SVD algorithm introduced by Vasilescu and Terzopoulos, but misattributed to Tucker or De Lathauwer et al. It may be regarded as a more flexible PARAFAC
(parallel factor analysis) model. In PARAFAC the core tensor is restricted to be "diagonal".

In practice, Tucker decomposition is used as a modelling tool. For instance, it is used to model three-way (or higher-way) data by means of relatively small numbers of components for each of the three or more modes, and the components are linked to each other by a three- (or higher-) way core array. The model parameters are estimated in such a way that, given fixed numbers of components, the modelled data optimally resemble the actual data in the least-squares sense. The model gives a summary of the information in the data, in the same way as principal component analysis does for two-way data.

For a 3rd-order tensor T \in F^{n_1 \times n_2 \times n_3}, where F is either \mathbb{R} or \mathbb{C}, the Tucker decomposition can be denoted as follows,

: T = \mathcal{T} \times_1 U^{(1)} \times_2 U^{(2)} \times_3 U^{(3)}

where \mathcal{T} \in F^{d_1 \times d_2 \times d_3} is the ''core tensor'', a 3rd-order tensor that contains the 1-mode, 2-mode and 3-mode singular values of T, which are defined as the ''Frobenius norms'' of the 1-mode, 2-mode and 3-mode slices of the tensor \mathcal{T} respectively. U^{(1)}, U^{(2)}, U^{(3)} are unitary matrices in F^{d_1 \times n_1}, F^{d_2 \times n_2}, F^{d_3 \times n_3} respectively. The ''k''-mode product (''k'' = 1, 2, 3) of \mathcal{T} by U^{(k)} is denoted as \mathcal{T} \times_k U^{(k)}, with entries

:\begin{align}
(\mathcal{T} \times_1 U^{(1)})(i_1, j_2, j_3) &= \sum_{j_1=1}^{d_1} \mathcal{T}(j_1, j_2, j_3) \, U^{(1)}(j_1, i_1) \\
(\mathcal{T} \times_2 U^{(2)})(j_1, i_2, j_3) &= \sum_{j_2=1}^{d_2} \mathcal{T}(j_1, j_2, j_3) \, U^{(2)}(j_2, i_2) \\
(\mathcal{T} \times_3 U^{(3)})(j_1, j_2, i_3) &= \sum_{j_3=1}^{d_3} \mathcal{T}(j_1, j_2, j_3) \, U^{(3)}(j_3, i_3)
\end{align}

Altogether, the decomposition may also be written more directly as

: T(i_1, i_2, i_3) = \sum_{j_1=1}^{d_1} \sum_{j_2=1}^{d_2} \sum_{j_3=1}^{d_3} \mathcal{T}(j_1, j_2, j_3) \, U^{(1)}(j_1, i_1) \, U^{(2)}(j_2, i_2) \, U^{(3)}(j_3, i_3)

Taking d_i = n_i for all i is always sufficient to represent T exactly, but often T can be compressed, or efficiently approximated, by choosing d_i < n_i. A common choice is d_1 = d_2 = d_3 = \min(n_1, n_2, n_3), which can be effective when the difference in dimension sizes is large.

There are two special cases of Tucker decomposition:

Tucker1: if U^{(2)} and U^{(3)} are identity, then T = \mathcal{T} \times_1 U^{(1)}

Tucker2: if U^{(3)} is identity, then T = \mathcal{T} \times_1 U^{(1)} \times_2 U^{(2)}.

RESCAL decomposition can be seen as a special case of Tucker where U^{(3)} is identity and U^{(1)} is equal to U^{(2)}.
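The following is a minimal sketch of how such a decomposition can be computed numerically with the M-mode SVD (HOSVD) mentioned above, assuming NumPy; the function names (mode_k_product, tucker_hosvd, tucker_reconstruct) are illustrative, not from any particular library. Each factor matrix is taken from the leading left singular vectors of the mode-''k'' unfolding of T, and the core tensor is obtained by projecting T onto the resulting subspaces, following the index convention above (U^{(k)} of shape d_k × n_k).

    import numpy as np

    def mode_k_product(tensor, U, k):
        # k-mode product in the convention above: contract index k of
        # `tensor` with the first index of U, then move the new axis
        # back to position k.
        return np.moveaxis(np.tensordot(tensor, U, axes=(k, 0)), -1, k)

    def tucker_hosvd(T, ranks):
        # Truncated HOSVD: returns (core, factors) such that
        # T ~= core x_1 U1 x_2 U2 x_3 U3, with factors[k] of shape (d_k, n_k).
        factors = []
        for k, d in enumerate(ranks):
            # Mode-k unfolding of T: shape (n_k, product of the other sizes).
            unfolding = np.moveaxis(T, k, 0).reshape(T.shape[k], -1)
            # Leading d left singular vectors span the dominant mode-k subspace.
            U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
            factors.append(U[:, :d].T)  # shape (d_k, n_k)
        core = T
        for k, Uk in enumerate(factors):
            # Project mode k onto its subspace (size n_k -> d_k);
            # .conj() is a no-op for real data.
            core = mode_k_product(core, Uk.conj().T, k)
        return core, factors

    def tucker_reconstruct(core, factors):
        # Apply the k-mode products to rebuild the (approximate) tensor.
        T = core
        for k, Uk in enumerate(factors):
            T = mode_k_product(T, Uk, k)
        return T

    rng = np.random.default_rng(0)
    T = rng.standard_normal((8, 9, 10))
    core, factors = tucker_hosvd(T, ranks=(8, 9, 10))         # full ranks
    print(np.allclose(tucker_reconstruct(core, factors), T))  # True: exact
    core_c, factors_c = tucker_hosvd(T, ranks=(4, 4, 4))      # compressed
    print(np.linalg.norm(tucker_reconstruct(core_c, factors_c) - T))

With full ranks (d_k = n_k) the reconstruction is exact up to round-off; with d_k < n_k the same code gives the truncated-HOSVD approximation, which is a good starting point for, but not identical to, the least-squares optimum described above (that optimum is typically refined by higher-order orthogonal iteration).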


See also

* Higher-order singular value decomposition
* Multilinear principal component analysis

