Direct linear transformation
Direct linear transformation (DLT) is an algorithm which solves a set of variables from a set of similarity relations: : \mathbf{x}_k \propto \mathbf{A} \, \mathbf{y}_k   for \, k = 1, \ldots, N where \mathbf{x}_k and \mathbf{y}_k are known vectors, \, \propto denotes equality up to an unknown scalar multiplication, and \mathbf{A} is a matrix (or linear transformation) which contains the unknowns to be solved. This type of relation appears frequently in projective geometry. Practical examples include the relation between 3D points in a scene and their projection onto the image plane of a pinhole camera, and homographies.


Introduction

An ordinary system of linear equations : \mathbf{x}_k = \mathbf{A} \, \mathbf{y}_k   for \, k = 1, \ldots, N can be solved, for example, by rewriting it as a matrix equation \mathbf{X} = \mathbf{A} \, \mathbf{Y} where the matrices \mathbf{X} and \mathbf{Y} contain the vectors \mathbf{x}_k and \mathbf{y}_k in their respective columns. If a unique solution exists, it is given by : \mathbf{A} = \mathbf{X} \, \mathbf{Y}^T \, (\mathbf{Y} \, \mathbf{Y}^T)^{-1} . Solutions can also be described in the case that the equations are over- or under-determined. What makes the direct linear transformation problem distinct from the standard case above is the fact that the left and right sides of the defining equation can differ by an unknown multiplicative factor which depends on ''k''. As a consequence, \mathbf{A} cannot be computed as in the standard case. Instead, the similarity relations are rewritten as proper linear homogeneous equations which can then be solved by a standard method. The combination of rewriting the similarity equations as homogeneous linear equations and solving them by standard methods is referred to as a direct linear transformation algorithm, or DLT algorithm. DLT is attributed to Ivan Sutherland.
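The standard (non-similarity) case can be sketched in a few lines of NumPy. The data below are synthetic, generated from a known ground-truth matrix so that the closed-form solution can be checked:

```python
import numpy as np

# Synthetic data: generate x_k = A y_k from a known ground-truth A,
# so the ordinary (no unknown scaling) system is exactly solvable.
rng = np.random.default_rng(0)
A_true = rng.standard_normal((2, 3))
Y = rng.standard_normal((3, 5))   # five known vectors y_k as columns
X = A_true @ Y                    # corresponding x_k as columns

# A = X Y^T (Y Y^T)^{-1}, valid when Y Y^T is invertible
A = X @ Y.T @ np.linalg.inv(Y @ Y.T)
print(np.allclose(A, A_true))     # True
```

With noisy data the same formula gives the least-squares estimate; it is exactly this closed form that fails once each equation carries its own unknown scale factor.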


Example

Suppose that k \in \{1, \ldots, N\} . Let \mathbf{x}_k = (x_{1k}, x_{2k}) \in \mathbb{R}^2 and \mathbf{y}_k = (y_{1k}, y_{2k}, y_{3k}) \in \mathbb{R}^3 be two known vectors, and we want to find the 2 \times 3 matrix \mathbf{A} such that : \alpha_k \, \mathbf{x}_k = \mathbf{A} \, \mathbf{y}_k where \alpha_k \neq 0 is the unknown scalar factor related to equation ''k''. To get rid of the unknown scalars and obtain homogeneous equations, define the anti-symmetric matrix : \mathbf{H} = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} and multiply both sides of the equation with \mathbf{x}_k^T \, \mathbf{H} from the left : \begin{align} (\mathbf{x}_k^T \, \mathbf{H}) \, \alpha_k \, \mathbf{x}_k &= (\mathbf{x}_k^T \, \mathbf{H}) \, \mathbf{A} \, \mathbf{y}_k \\ \alpha_k \, \mathbf{x}_k^T \, \mathbf{H} \, \mathbf{x}_k &= \mathbf{x}_k^T \, \mathbf{H} \, \mathbf{A} \, \mathbf{y}_k \end{align} Since \mathbf{x}_k^T \, \mathbf{H} \, \mathbf{x}_k = 0, the following homogeneous equations, which no longer contain the unknown scalars, are at hand : \mathbf{x}_k^T \, \mathbf{H} \, \mathbf{A} \, \mathbf{y}_k = 0 In order to solve \mathbf{A} from this set of equations, consider the elements of the vectors \mathbf{x}_k and \mathbf{y}_k and matrix \mathbf{A} : : \mathbf{x}_k = \begin{pmatrix} x_{1k} \\ x_{2k} \end{pmatrix} ,   \mathbf{y}_k = \begin{pmatrix} y_{1k} \\ y_{2k} \\ y_{3k} \end{pmatrix} ,   and   \mathbf{A} = \begin{pmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \end{pmatrix} and the above homogeneous equation becomes : 0 = a_{11} \, x_{2k} \, y_{1k} - a_{21} \, x_{1k} \, y_{1k} + a_{12} \, x_{2k} \, y_{2k} - a_{22} \, x_{1k} \, y_{2k} + a_{13} \, x_{2k} \, y_{3k} - a_{23} \, x_{1k} \, y_{3k}   for \, k = 1, \ldots, N. This can also be written in the matrix form: : 0 = \mathbf{b}_k^T \, \mathbf{a}   for \, k = 1, \ldots, N where \mathbf{b}_k and \mathbf{a} both are 6-dimensional vectors defined as : \mathbf{b}_k = \begin{pmatrix} x_{2k} \, y_{1k} \\ -x_{1k} \, y_{1k} \\ x_{2k} \, y_{2k} \\ -x_{1k} \, y_{2k} \\ x_{2k} \, y_{3k} \\ -x_{1k} \, y_{3k} \end{pmatrix}   and   \mathbf{a} = \begin{pmatrix} a_{11} \\ a_{21} \\ a_{12} \\ a_{22} \\ a_{13} \\ a_{23} \end{pmatrix}. Each value of ''k'' thus provides one equation in the 6 unknowns. The set of homogeneous equations can be written in the matrix form : \mathbf{0} = \mathbf{B} \, \mathbf{a} where \mathbf{B} is a N \times 6 matrix which holds the known vectors \mathbf{b}_k in its rows. The unknown \mathbf{a} can be determined, for example, by a singular value decomposition of \mathbf{B} ; \mathbf{a} is a right singular vector of \mathbf{B} corresponding to a singular value that equals zero. Once \mathbf{a} has been determined, the elements of matrix \mathbf{A} can be rearranged from vector \mathbf{a}. Notice that the scaling of \mathbf{a} or \mathbf{A} is not important (except that it must be non-zero) since the defining equations already allow for unknown scaling. In practice the vectors \mathbf{x}_k and \mathbf{y}_k may contain noise, which means that the similarity equations are only approximately valid. As a consequence, there may not be a vector \mathbf{a} which solves the homogeneous equation \mathbf{0} = \mathbf{B} \, \mathbf{a} exactly. In these cases, a total least squares solution can be used by choosing \mathbf{a} as a right singular vector corresponding to the smallest singular value of \mathbf{B}.
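The whole procedure of this example, building \mathbf{B} from the rows \mathbf{b}_k^T and taking the right singular vector of the smallest singular value, can be sketched with NumPy. The function name and the synthetic test data are our own; only the row layout and the unknown ordering follow the derivation above:

```python
import numpy as np

def dlt_2x3(xs, ys):
    """Estimate the 2x3 matrix A (up to scale) from relations
    alpha_k x_k = A y_k, given xs of shape (N, 2) and ys of shape (N, 3)."""
    rows = []
    for (x1, x2), (y1, y2, y3) in zip(xs, ys):
        # b_k^T = (x2 y1, -x1 y1, x2 y2, -x1 y2, x2 y3, -x1 y3)
        rows.append([x2 * y1, -x1 * y1, x2 * y2, -x1 * y2, x2 * y3, -x1 * y3])
    B = np.array(rows)
    # Right singular vector of the smallest singular value: the exact
    # null vector for noise-free data, total least squares otherwise.
    _, _, Vt = np.linalg.svd(B)
    a = Vt[-1]
    # a = (a11, a21, a12, a22, a13, a23)  ->  reassemble A
    return a.reshape(3, 2).T

# Synthetic check: generate exact data from a known A_true.
rng = np.random.default_rng(1)
A_true = rng.standard_normal((2, 3))
ys = rng.standard_normal((8, 3))
xs = (A_true @ ys.T).T       # alpha_k = 1 here; any nonzero alpha_k works
A_est = dlt_2x3(xs, ys)

# A_est equals A_true only up to scale; fit the scale and compare.
s = np.sum(A_true * A_est) / np.sum(A_est * A_est)
print(np.allclose(s * A_est, A_true))   # True
```

With noisy correspondences the same code returns the total least squares estimate; in that setting the input points are usually normalized first, a refinement not shown here.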


More general cases

The above example has \mathbf{x}_k \in \mathbb{R}^2 and \mathbf{y}_k \in \mathbb{R}^3 , but the general strategy for rewriting the similarity relations into homogeneous linear equations can be generalized to arbitrary dimensions for both \mathbf{x}_k and \mathbf{y}_k. If \mathbf{x}_k \in \mathbb{R}^2 and \mathbf{y}_k \in \mathbb{R}^q the previous expressions can still lead to an equation : 0 = \mathbf{x}_k^T \, \mathbf{H} \, \mathbf{A} \, \mathbf{y}_k   for   \, k = 1, \ldots, N where \mathbf{A} now is 2 \times q. Each ''k'' provides one equation in the 2q unknown elements of \mathbf{A} and together these equations can be written \mathbf{B} \, \mathbf{a} = \mathbf{0} for the known N \times 2 \, q matrix \mathbf{B} and unknown ''2q''-dimensional vector \mathbf{a}. This vector can be found in a similar way as before. In the most general case \mathbf{x}_k \in \mathbb{R}^p and \mathbf{y}_k \in \mathbb{R}^q . The main difference compared to previously is that the matrix \mathbf{H} now is p \times p and anti-symmetric. When p > 2 the space of such matrices is no longer one-dimensional; it is of dimension : M = \frac{p\,(p-1)}{2}. This means that each value of ''k'' provides ''M'' homogeneous equations of the type : 0 = \mathbf{x}_k^T \, \mathbf{H}_m \, \mathbf{A} \, \mathbf{y}_k   for   \, m = 1, \ldots, M   and for \, k = 1, \ldots, N where \mathbf{H}_m is a ''M''-dimensional basis of the space of p \times p anti-symmetric matrices.
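A basis \mathbf{H}_m of the p \times p anti-symmetric matrices is easy to enumerate: one matrix per index pair i < j. The following sketch (helper name our own) constructs such a basis and confirms that its size matches M = p(p-1)/2:

```python
import numpy as np

def antisym_basis(p):
    """Return a basis H_1, ..., H_M of the p x p anti-symmetric
    matrices, one basis matrix per index pair i < j."""
    basis = []
    for i in range(p):
        for j in range(i + 1, p):
            H = np.zeros((p, p))
            H[i, j] = 1.0
            H[j, i] = -1.0
            basis.append(H)
    return basis

for p in (2, 3, 4):
    M = len(antisym_basis(p))
    print(p, M, p * (p - 1) // 2)   # the two counts agree for every p
```

For p = 2 this recovers the single matrix \mathbf{H} used in the example above, up to sign.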


Example ''p'' = 3

In the case that ''p'' = 3 the following three matrices \mathbf{H}_m can be chosen : \mathbf{H}_1 = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & -1 \\ 0 & 1 & 0 \end{pmatrix} ,   \mathbf{H}_2 = \begin{pmatrix} 0 & 0 & 1 \\ 0 & 0 & 0 \\ -1 & 0 & 0 \end{pmatrix} ,   \mathbf{H}_3 = \begin{pmatrix} 0 & -1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix} . In this particular case, the homogeneous linear equations can be written as : \mathbf{0} = [\mathbf{x}_k]_{\times} \, \mathbf{A} \, \mathbf{y}_k   for   \, k = 1, \ldots, N where [\mathbf{x}_k]_{\times} is the matrix representation of the vector cross product. Notice that this last equation is vector valued; the left hand side is the zero element in \mathbb{R}^3 . Each value of ''k'' provides three homogeneous linear equations in the unknown elements of \mathbf{A} . However, since [\mathbf{x}_k]_{\times} has rank 2, at most two of the equations are linearly independent. In practice, therefore, it is common to use only two of the three matrices \mathbf{H}_m , for example, for ''m'' = 1, 2. However, which equations are linearly dependent varies with \mathbf{x}_k , which means that in unlucky cases it would have been better to choose, for example, ''m'' = 2, 3. As a consequence, if the number of equations is not a concern, it may be better to use all three equations when the matrix \mathbf{B} is constructed. The linear dependence between the resulting homogeneous linear equations is a general concern for the case ''p'' > 2 and has to be dealt with either by reducing the set of anti-symmetric matrices \mathbf{H}_m or by allowing \mathbf{B} to become larger than necessary for determining \mathbf{a}.
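As a concrete instance of the ''p'' = 3 case, the sketch below estimates a 3 × 3 matrix (a homography when the vectors are homogeneous image points) by stacking, for every correspondence, all three rows of [\mathbf{x}_k]_{\times} \mathbf{A} \mathbf{y}_k = \mathbf{0}, accepting the rank-2 redundancy discussed above. Function names and the synthetic data are our own:

```python
import numpy as np

def skew(x):
    """[x]_x : matrix representation of the cross product; rank 2."""
    return np.array([[0.0, -x[2], x[1]],
                     [x[2], 0.0, -x[0]],
                     [-x[1], x[0], 0.0]])

def dlt_homography(xs, ys):
    """Estimate a 3x3 matrix A (up to scale) from alpha_k x_k = A y_k,
    given xs and ys of shape (N, 3) (homogeneous coordinates)."""
    rows = []
    for x, y in zip(xs, ys):
        # [x]_x A y = (kron([x]_x, y^T)) vec(A) for row-major vec(A):
        # all three rows per correspondence, rather than picking two.
        rows.append(np.kron(skew(x), y))      # one 3 x 9 block
    B = np.vstack(rows)                       # shape (3N, 9)
    _, _, Vt = np.linalg.svd(B)
    return Vt[-1].reshape(3, 3)               # smallest singular value

# Synthetic check with exact data from a known ground truth.
rng = np.random.default_rng(2)
A_true = rng.standard_normal((3, 3))
ys = rng.standard_normal((6, 3))
xs = (A_true @ ys.T).T
A_est = dlt_homography(xs, ys)

# Compare up to the free scale factor.
s = np.sum(A_true * A_est) / np.sum(A_est * A_est)
print(np.allclose(s * A_est, A_true))   # True
```

Since each correspondence contributes at most two independent rows, six points give twelve independent equations here, comfortably more than the eight needed to fix the nine elements of \mathbf{A} up to scale.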


References

*{{cite book |author=Richard Hartley and Andrew Zisserman |title=Multiple View Geometry in Computer Vision |publisher=Cambridge University Press |year=2003 |isbn=978-0-521-54051-3}}


External links


* Homography Estimation, by Elan Dubrofsky (§2.1 sketches the "Basic DLT Algorithm")
* A DLT Solver based on MATLAB, by Hsiang-Jen (Johnny) Chien