Monday, June 6, 2016

To Estimate the Transformation Between Two Images

The transformation of a 3D object from one point cloud to another is characterized by a rotation matrix and an offset. For 2D images, if multiple images are taken of the same scene, we face the similar task of matching between the different images. Assume that, by feature point matching, tracking, or some other algorithm, we have found N matching pairs of points \((x, y)\) and \((x', y')\) in two images. Our task is to identify a transformation \((x, y) \rightarrow (x', y')\).


For 3D points under a similar scenario, given two \(3 \times N\) matrices of corresponding points, we compute their \(3 \times 3\) cross-covariance matrix (the product of one matrix with the transpose of the other, after centering each point set), run SVD on it, and the rotation is \(VU^{T}\) (up to a possible sign correction to avoid a reflection), where \(U\) and \(V\) are the two unitary matrices obtained from the SVD; the offset then follows from the centroids (a short MATLAB sketch of this step is given after the identity below). For 2D matching, the procedure is more complicated. The first step is to add one dimension: instead of \((x, y)\) and \((x', y')\), we now have \((x, y, 1)\) and \((x', y', 1)\). The added dimension can be treated as a homogeneous coordinate; a zero third coordinate corresponds to a point at infinity. The new triplet is insensitive to scaling, which means that \((x, y, 1)\) and \((kx, ky, k)\) represent the same pixel in the image. The cross product between these two triplets is \({\bf 0}\):
\[ \begin{vmatrix}
    {\bf i}   &    {\bf j} & {\bf k}  \\
    x    &   y & 1  \\
    kx     &  ky & k  \\
\end{vmatrix}={\bf 0}
\]
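As a quick illustration of the 3D procedure mentioned above, here is a minimal MATLAB sketch. It assumes noise-free, already-centered correspondences and a synthetic rotation, so it only demonstrates the SVD step; the variable names are just for illustration.

% minimal sketch: recover the rotation between two centered 3-D point sets
P3 = randn(3, 10);                                 % source points, 3 x N
Rtrue = orth(randn(3));                            % random orthogonal matrix
if det(Rtrue) < 0, Rtrue(:,1) = -Rtrue(:,1); end   % make it a proper rotation
Q3 = Rtrue * P3;                                   % target points
H = P3 * Q3.';                                     % 3 x 3 cross-covariance matrix
[U, ~, V] = svd(H);
d = sign(det(V * U.'));                            % guard against a reflection
Rhat = V * diag([1 1 d]) * U.'                     % estimate, should match Rtrue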
The transformation between the two images is described by the projective transformation (homography) matrix \({\bf P}\), defined by
\[

\begin{bmatrix}   kx'   \\
   ky'   \\
    k
\end{bmatrix} = {\bf P} \begin{bmatrix}
   x   \\
   y   \\
    1
\end{bmatrix}
\]
Assuming that
\[ {\bf P} = \begin{bmatrix}
    h_{1}   &    h_{2}&  h_{3}  \\
    h_{4}    &   h_{5}& h_{6}  \\
    h_{7}     &  h_{8}& h_{9}  \\
\end{bmatrix}
\]
Then
\[

\begin{bmatrix}
   kx'   \\
   ky'   \\
    k
\end{bmatrix} = \begin{bmatrix}
   h_{1}x+h_{2}y+h_{3}   \\
  h_{4}x+h_{5}y+h_{6}   \\
    h_{7}x+h_{8}y+h_{9}
\end{bmatrix}
\]
Using the property that the cross product between \((x', y', 1)\) and \((kx', ky', k)\) is 0, we have
\[ \begin{vmatrix}
    {\bf i}   &    {\bf j} & {\bf k}  \\
    h_{1}x+h_{2}y+h_{3}    &   h_{4}x+h_{5}y+h_{6} & h_{7}x+h_{8}y+h_{9}  \\
    x'     &  y' & 1  \\
\end{vmatrix}={\bf 0}
\]
This gives us three equations, but only two of them are linearly independent. The reason is that, given
\[ \begin{vmatrix}
    {\bf i}   &    {\bf j} & {\bf k}  \\
    a    &   b & c  \\
    u     &  v & w  \\
\end{vmatrix}={\bf 0}
\]
if \(aw-cu=0\) and \(bw-cv=0\), then \(av-bu=0\) as well (assuming \(w \neq 0\)), since \(w(av-bu) = v(aw-cu) - u(bw-cv) = 0\).
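Expanding the first two components of the cross product of \({\bf P}(x, y, 1)^{T}\) and \((x', y', 1)^{T}\) explicitly (and flipping the sign of the second so it matches the matrix row below) gives:
\[
\begin{aligned}
(h_{4}x+h_{5}y+h_{6}) - y'(h_{7}x+h_{8}y+h_{9}) &= 0, \\
(h_{1}x+h_{2}y+h_{3}) - x'(h_{7}x+h_{8}y+h_{9}) &= 0.
\end{aligned}
\]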

With two linearly independent equations, we have
\[ \begin{bmatrix}
    0    &   0 & 0 & x & y & 1 & -xy' & -yy' & -y'  \\
    x    &  y & 1 & 0 & 0 & 0 & -x'x & -x'y & -x' \\
\end{bmatrix}
\begin{bmatrix}
    h_{1} \\
   h_{2} \\
   h_{3} \\
   h_{4} \\
h_{5} \\
h_{6} \\
h_{7} \\
h_{8} \\
h_{9}
\end{bmatrix}
={\bf 0}
\]


With N pairs of correspondences, we can construct a matrix \({\bf M}\) with 2N rows:
\[ {\bf M}=\begin{bmatrix}
    0    &   0 & 0 & x_{1} & y_{1} & 1 & -x_{1}y_{1}' & -y_{1}y_{1}' & -y_{1}'  \\
    x_{1}    &  y_{1} & 1 & 0 & 0 & 0 & -x_{1}'x_{1} & -x_{1}'y_{1} & -x_{1}' \\
...\\
 0    &   0 & 0 & x_{N} & y_{N} & 1 & -x_{N}y_{N}' & -y_{N}y_{N}' & -y_{N}'  \\
    x_{N}    &  y_{N} & 1 & 0 & 0 & 0 & -x_{N}'x_{N} & -x_{N}'y_{N} & -x_{N}' \\
\end{bmatrix}
\]
The last step is the SVD \({\bf M = USV}^{T}\). The last column of \({\bf V}\), which corresponds to the smallest singular value, gives the elements of the matrix \({\bf P}\). It is the closest solution we can find to making
\[ {\bf M}
\begin{bmatrix}
    h_{1} \\
   h_{2} \\
   h_{3} \\
   h_{4} \\
h_{5} \\
h_{6} \\
h_{7} \\
h_{8} \\
h_{9}
\end{bmatrix}
={\bf 0}
\]
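In other words, the vector obtained this way solves
\[
\hat{\bf h} = \arg\min_{\|{\bf h}\|=1} \|{\bf M}{\bf h}\|,
\]
which, by a standard property of the SVD, is the right singular vector associated with the smallest singular value of \({\bf M}\).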

Sample MATLAB source code for the whole procedure is shown below:
clear all; close all;
% to generate (x, y, 1)
srcPos = [-1483         853        4172        2572       -1196       -4241         308        4340         688       -4881
    3308         497       -2142        2537         678       -4460        2792       -3701        -306       -1629];


[~, N] = size(srcPos);                 % N correspondences
srcPos = [srcPos.' ones(N, 1)].';      % append the homogeneous coordinate
% ground-truth projective transformation matrix
PM = [1 2 0
    0 1 0
    -0.01 0.01 1];
dstTemp = PM * srcPos;
% normalize by the homogeneous (third) coordinate
dstPos(1,:) = dstTemp(1,:) ./ dstTemp(3,:);
dstPos(2,:) = dstTemp(2,:) ./ dstTemp(3,:);
dstPos(3,:) = dstTemp(3,:) ./ dstTemp(3,:);


% (x', y', 1), quantized to integer pixel coordinates
dstPos = round(dstPos);

% stack two constraint rows per correspondence into a 2N x 9 matrix
M = zeros(2*N, 9);
for i = 1:N
    M(2*i-1,:) = [0 0 0 srcPos(1,i) srcPos(2,i) 1 -srcPos(1,i)*dstPos(2,i) -srcPos(2,i)*dstPos(2,i) -dstPos(2,i)];
    M(2*i,:) = [srcPos(1,i) srcPos(2,i) 1 0 0 0 -srcPos(1,i)*dstPos(1,i) -srcPos(2,i)*dstPos(1,i) -dstPos(1,i)];
end


% To estimate matrix P: take the right singular vector for the smallest
% singular value and rescale it so that h(9) = 1
[u, s, v] = svd(M);
h = v(:,9)/v(9,9)

PM_hat = [h(1) h(2) h(3)
    h(4) h(5) h(6)
    h(7) h(8) h(9)]
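As a quick sanity check (reusing srcPos, dstPos, and PM_hat from the script above), one can re-project the source points with the estimated matrix and look at the residual; this is only a sketch:

% re-project the source points with the estimated matrix
est = PM_hat * srcPos;
est = est ./ repmat(est(3,:), 3, 1);                    % normalize by the homogeneous row
reprojErr = max(max(abs(est(1:2,:) - dstPos(1:2,:))))   % should be small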


For the ground-truth transformation matrix

\[ {\bf P} =  \begin{bmatrix}
    1 & 2 & 0 \\
0 & 1 & 0 \\
-0.01 & 0.01 & 1
\end{bmatrix}\]

the result of the above code is
\[ {\bf \hat{P}} =  \begin{bmatrix}
    1.0060  &  2.0137 & -56.3469\\
    0.0014  &  1.0073 & -14.1412\\
   -0.0100  &  0.0100   & 1.0000
\end{bmatrix}\]

The offsets of -56.3469 and -14.1412 are due to the rounding of the destination coordinates to integer pixels. If we change dstPos = round(dstPos) in the code to dstPos = dstPos (i.e., skip the rounding), then \( {\bf \hat{P}}\) becomes

\[ {\bf \hat{P}} =  \begin{bmatrix}
        1.0000  &  2.0000 &   0.0000\\
   -0.0000  &  1.0000  &  0.0000\\
   -0.0100  &  0.0100  &  1.0000
\end{bmatrix}\]

Ordinary least-squares estimation does not work well directly here because we do not know the per-point scale \(k\) in
\[

\begin{bmatrix}   kx'   \\
   ky'   \\
    k
\end{bmatrix} = {\bf P} \begin{bmatrix}
   x   \\
   y   \\
    1
\end{bmatrix}
\]


(Added on 08/13/2016) Another way to solve this equation is the following: since the parameters \(h_{1},\dots,h_{9}\) are only defined up to a scale factor, we can force \(h_{9}=1\). Then the equation can be rewritten as:
\[ {\bf M'}
\begin{bmatrix}
    h_{1} \\
   h_{2} \\
   h_{3} \\
   h_{4} \\
h_{5} \\
h_{6} \\
h_{7} \\
h_{8}
\end{bmatrix}
=\begin{bmatrix}
    y_{1}'  \\
    x_{1}' \\
...\\
 y_{N}'  \\
    x_{N}' \\
\end{bmatrix}
\]
where
\[ {\bf M'}=\begin{bmatrix}
    0    &   0 & 0 & x_{1} & y_{1} & 1 & -x_{1}y_{1}' & -y_{1}y_{1}'  \\
    x_{1}    &  y_{1} & 1 & 0 & 0 & 0 & -x_{1}'x_{1} & -x_{1}'y_{1} \\
...\\
 0    &   0 & 0 & x_{N} & y_{N} & 1 & -x_{N}y_{N}' & -y_{N}y_{N}' \\
    x_{N}    &  y_{N} & 1 & 0 & 0 & 0 & -x_{N}'x_{N} & -x_{N}'y_{N} \\
\end{bmatrix}
\]

An equation in this form can be solved by the ordinary least-squares method.
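For example, a minimal MATLAB sketch of this variant, reusing srcPos, dstPos, and N from the script above (the names Mp, b, and PM_hat_ls are just for illustration):

% build M' and the right-hand side, two rows per correspondence
Mp = zeros(2*N, 8);
b  = zeros(2*N, 1);
for i = 1:N
    Mp(2*i-1,:) = [0 0 0 srcPos(1,i) srcPos(2,i) 1 -srcPos(1,i)*dstPos(2,i) -srcPos(2,i)*dstPos(2,i)];
    b(2*i-1)    = dstPos(2,i);
    Mp(2*i,:)   = [srcPos(1,i) srcPos(2,i) 1 0 0 0 -srcPos(1,i)*dstPos(1,i) -srcPos(2,i)*dstPos(1,i)];
    b(2*i)      = dstPos(1,i);
end
hls = Mp \ b;                            % ordinary least-squares solution
PM_hat_ls = [hls(1) hls(2) hls(3); hls(4) hls(5) hls(6); hls(7) hls(8) 1]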
