This app mainly consists of two parts. The first part is a customized library that establishes the wireless link between the Wiimote and the smart phone. Most of the code in this part is written in C and then incorporated into the Android project through the Java Native Interface; it lives in the ./jni folder and benefits greatly from an open source library named fWIIne (http://sourceforge.net/projects/fwiine/). The second part is the signal processing code that enables the virtual object on the phone screen to track the 3D rotation of the real Wiimote, with the quaternion as the backbone of the algorithm.
The inputs to 3D rotation tracking come from the inertial sensors in the Wiimote: an accelerometer and a gyroscope. Each accelerometer reading contains three values, ax, ay and az, which represent the acceleration along the x-axis, y-axis and z-axis, respectively. When the object is not moving, the accelerometer reveals the Wiimote's orientation thanks to gravity. For example, [0, 0, 9.8] means the object is lying on the desk facing upwards. The other inertial sensor, the gyroscope, measures the rotation speed of the object. It also outputs three values, gx, gy and gz, which represent the rotation speed around the x-axis, y-axis and z-axis, respectively. When the object is stationary, all three values are zero; when it starts to move, the gyroscope produces meaningful outputs. In fact, accelerometer and gyroscope complement each other well. When the Wiimote is moving, it is difficult to derive its orientation from the accelerometer alone, since gravity is mixed with motion acceleration. This is not a problem for the gyroscope, which records the rotation at each moment and thus provides useful orientation information. The gyroscope has its own drawback, however: error accumulation. Each gyroscope reading contains a small error, and when many readings are added together, the total error grows without bound. The accelerometer does not suffer from error accumulation. Therefore, while the gyroscope tracks short-term movement accurately, its accumulated error can be corrected with the help of the accelerometer. Because of these complementary properties, many 3D control algorithms fuse the inputs from both sensors.
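To make the point that a static accelerometer reading reveals orientation concrete, here is a small sketch (illustrative code, not from the app; the class and method names are made up) that derives tilt angles from gravity:

```java
// Hypothetical helper, not part of the app: when the sensor is not
// accelerating, the reading is dominated by gravity, so tilt can be
// recovered from the three acceleration components.
public class TiltFromAccel {
    // Pitch (rotation about the x-axis), in radians.
    static double pitch(double ax, double ay, double az) {
        return Math.atan2(-ax, Math.sqrt(ay * ay + az * az));
    }
    // Roll (rotation about the y-axis), in radians.
    static double roll(double ay, double az) {
        return Math.atan2(ay, az);
    }
    public static void main(String[] args) {
        // Lying flat, facing up: [0, 0, 9.8] gives zero pitch and roll.
        System.out.println(pitch(0, 0, 9.8)); // prints 0.0
        System.out.println(roll(0, 9.8));     // prints 0.0
    }
}
```

Note this only works while the device is stationary; during motion the gravity vector is mixed with motion acceleration, which is exactly why the gyroscope is needed.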
In general, there are two ways to represent rotation in 3D space: Euler angles and quaternions. Euler angles are the more intuitive way to interpret a 3D rotation. To rotate an object from one orientation to another, one can first rotate it around the x-axis by a certain angle, then around the y-axis, and finally around the z-axis; these angles are called Euler angles. In fact, by going through the axes in different orders, multiple decompositions of the same rotation are available. Despite their intuitiveness, Euler angles have some serious issues. The most famous one is gimbal lock, in which the object no longer rotates the way it is supposed to. I still remember my own encounter with gimbal lock: when I initially coded the rotation in my app with Euler angles, pointing the Wiimote in a certain direction (I think it was up) made the virtual 3D object on my phone screen rotate violently, showing that the rotation tracking algorithm had collapsed. Another issue with Euler angles is that they are not signal processing friendly. For example, if someone asks you to perform 1/n of the rotation from vector u to vector v, how to do that with Euler angles is far from clear. By comparison, the quaternion is a concept that seems intriguing at first but becomes easy to use once you get familiar with it. The idea is simple. Given two 3D vectors u and v, the rotation between them can be described by an axis, which is the direction of their cross product u x v, and an angle around that axis. In total, four elements are sufficient to define the rotation: three for the axis and one for the angle. Unlike Euler angles, the quaternion representation of a rotation is essentially unique, and it does not suffer from gimbal lock. For signal processing, quaternions are also easier than Euler angles: posed the same question of 1/n of the rotation from u to v, you can just keep the same axis and make the angle 1/n of the original value. An important property of a (unit) quaternion is that the squares of its four elements add up to 1: q0^2+q1^2+q2^2+q3^2=1.
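The axis-angle view above can be sketched in a few lines (illustrative code, not from the app): a quaternion is built from an axis and an angle, and 1/n of the rotation is simply the same axis with 1/n of the angle.

```java
// Sketch: quaternion from axis-angle. For a rotation of 'angle' radians
// about the unit axis (x, y, z):
//   q = (cos(angle/2), x*sin(angle/2), y*sin(angle/2), z*sin(angle/2))
public class AxisAngleQuat {
    static double[] fromAxisAngle(double x, double y, double z, double angle) {
        double s = Math.sin(angle / 2);
        return new double[] { Math.cos(angle / 2), x * s, y * s, z * s };
    }
    public static void main(String[] args) {
        double angle = Math.PI / 2;                            // 90 degrees about z
        double[] q      = fromAxisAngle(0, 0, 1, angle);
        double[] qThird = fromAxisAngle(0, 0, 1, angle / 3);   // 1/3 of the rotation
        // Applying qThird three times equals q (same axis), and both
        // satisfy the unit-norm property q0^2+q1^2+q2^2+q3^2 = 1.
        System.out.println(q[0] + " " + qThird[0]);
    }
}
```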
Now we can explain how quaternion-based signal processing is used in the Wii motion monitor app. The bulk of it is in the IMUupdate() function of CubeRenderer.java under .\src\com\goldsequence\motionmonitorpro. This quaternion-based motion tracking algorithm follows Madgwick's IMU code, with some changes to the filtering method (his code can be found at https://code.google.com/p/imumargalgorithm30042010sohm/). The function has six inputs: ax/ay/az from the accelerometer and gx/gy/gz from the gyroscope. Gyroscope and accelerometer data are combined in the manner of a complementary filter.
First, the accelerometer input is normalized:
norm = (float) Math.sqrt(ax*ax + ay*ay + az*az);
ax = ax / norm;
ay = ay / norm;
az = az / norm;
From the quaternion we can derive the corresponding 3D rotation matrix:
M =
[q0*q0+q1*q1-q2*q2-q3*q3 2*q1*q2-2*q0*q3 2*q1*q3+2*q0*q2 ]
[2*q1*q2+2*q0*q3 q0*q0+q2*q2-q1*q1-q3*q3 2*q2*q3-2*q0*q1 ]
[2*q1*q3-2*q0*q2 2*q2*q3+2*q0*q1 q0*q0+q3*q3-q1*q1-q2*q2]
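As a sanity check, the matrix above can be built directly from a unit quaternion (illustrative code, not from the app). For a 90-degree rotation about the z-axis, M maps the x-axis onto the y-axis:

```java
// Sketch: build M exactly as written above and probe it with a known rotation.
public class QuatToMatrix {
    static double[][] toMatrix(double q0, double q1, double q2, double q3) {
        return new double[][] {
            { q0*q0+q1*q1-q2*q2-q3*q3, 2*q1*q2-2*q0*q3,         2*q1*q3+2*q0*q2 },
            { 2*q1*q2+2*q0*q3,         q0*q0+q2*q2-q1*q1-q3*q3, 2*q2*q3-2*q0*q1 },
            { 2*q1*q3-2*q0*q2,         2*q2*q3+2*q0*q1,         q0*q0+q3*q3-q1*q1-q2*q2 }
        };
    }
    public static void main(String[] args) {
        // 90 degrees about z: q = (cos 45, 0, 0, sin 45)
        double c = Math.cos(Math.PI / 4), s = Math.sin(Math.PI / 4);
        double[][] m = toMatrix(c, 0, 0, s);
        // First column of M is the image of the x-axis; here it is (0, 1, 0).
        System.out.printf("%.3f %.3f %.3f%n", m[0][0], m[1][0], m[2][0]);
    }
}
```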
The last row of M is parallel to the z-axis, which is the direction of gravity: since MM' = I, M*[vx vy vz]' = [0 0 1]'. Taking the cross product of this row with the accelerometer reading gives the discrepancy between the accelerometer measurement and the quaternion:
// estimated direction of gravity
vx = 2*(q1*q3 - q0*q2);
vy = 2*(q0*q1 + q2*q3);
vz = q0*q0 - q1*q1 - q2*q2 + q3*q3;
// error is sum of cross product between reference direction of field and direction measured by sensor
ex = (ay*vz - az*vy);
ey = (az*vx - ax*vz);
ez = (ax*vy - ay*vx);
These discrepancies ex/ey/ez are then used to correct the gyroscope measurements:
// low-pass filtered error (replaces the pure integral term in Madgwick's original code)
exInt = filtCoef*exInt + (1-filtCoef)*ex;
eyInt = filtCoef*eyInt + (1-filtCoef)*ey;
ezInt = filtCoef*ezInt + (1-filtCoef)*ez;
// adjusted gyroscope measurements
gx = filtCoef2*gx + (1-filtCoef2)*(Kp*ex + Ki*exInt)/T;
gy = filtCoef2*gy + (1-filtCoef2)*(Kp*ey + Ki*eyInt)/T;
gz = filtCoef2*gz + (1-filtCoef2)*(Kp*ez + Ki*ezInt)/T;
Finally, the calibrated gyroscope measurement is incorporated into the quaternion. When a quaternion q is multiplied by another quaternion p, the product can be written as the matrix-vector product
[q0 -q1 -q2 -q3]
[q1  q0 -q3  q2]  *  [p0 p1 p2 p3]'
[q2  q3  q0 -q1]
[q3 -q2  q1  q0]
where p = [p0 p1 p2 p3]' is a 4 x 1 vector. Using the quaternion properties i^2 = j^2 = k^2 = ijk = -1, we get (q0+q1*i+q2*j+q3*k)*(p0+p1*i+p2*j+p3*k) = (q0p0-q1p1-q2p2-q3p3) + (q0p1+q1p0+q2p3-q3p2)*i + (q0p2+q2p0+q3p1-q1p3)*j + (q0p3+q1p2-q2p1+q3p0)*k, which explains the quaternion multiplication matrix.
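The component formulas above translate directly into code (a sketch, not the app's code):

```java
// Sketch: quaternion product q*p, component by component, matching the
// expansion derived from i^2 = j^2 = k^2 = ijk = -1.
public class QuatMul {
    static double[] mul(double[] q, double[] p) {
        return new double[] {
            q[0]*p[0] - q[1]*p[1] - q[2]*p[2] - q[3]*p[3],
            q[0]*p[1] + q[1]*p[0] + q[2]*p[3] - q[3]*p[2],
            q[0]*p[2] + q[2]*p[0] + q[3]*p[1] - q[1]*p[3],
            q[0]*p[3] + q[1]*p[2] - q[2]*p[1] + q[3]*p[0]
        };
    }
    public static void main(String[] args) {
        // Multiplying by the identity quaternion (1, 0, 0, 0) changes nothing.
        double[] p = { 0, 0.1, 0.2, 0.3 };
        double[] r = mul(new double[] { 1, 0, 0, 0 }, p);
        System.out.println(r[1] + " " + r[2] + " " + r[3]); // prints 0.1 0.2 0.3
    }
}
```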
Let p = [0 gx gy gz]. Multiplying the product of q and p by T/2 turns it into a delta quaternion, which gives the code below; the T/2 factor converts angular speed into an angle over one sampling interval.
// integrate quaternion rate and normalise
q0 = q0 + (-q1*gx - q2*gy - q3*gz)*halfT*T;
q1 = q1 + (q0*gx + q2*gz - q3*gy)*halfT*T;
q2 = q2 + (q0*gy - q1*gz + q3*gx)*halfT*T;
q3 = q3 + (q0*gz + q1*gy - q2*gx)*halfT*T;
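The comment above says "integrate quaternion rate and normalise", but the normalization lines themselves are not shown. In Madgwick's original code the quaternion is renormalized after this update so that the unit-norm property q0^2+q1^2+q2^2+q3^2 = 1 keeps holding despite numerical drift. A minimal sketch (illustrative names, not the app's code):

```java
// Sketch: renormalize a quaternion after integration so its norm stays 1.
public class QuatNormalize {
    static double[] normalize(double q0, double q1, double q2, double q3) {
        double norm = Math.sqrt(q0*q0 + q1*q1 + q2*q2 + q3*q3);
        return new double[] { q0 / norm, q1 / norm, q2 / norm, q3 / norm };
    }
    public static void main(String[] args) {
        // Slightly off-unit after many integration steps.
        double[] q = normalize(1.001, 0.01, -0.02, 0.005);
        System.out.println(q[0]*q[0] + q[1]*q[1] + q[2]*q[2] + q[3]*q[3]);
    }
}
```

Without this step, small integration errors compound and the rotation matrix derived from q gradually stops being orthogonal.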
This concludes the explanation of our quaternion-based motion tracking algorithm. We hope this example gives you a better understanding of quaternion-based signal processing.
Here is one paper on using inertial sensor data: http://users.cecs.anu.edu.au/~Jonghyuk.Kim/pdf/2008_Euston_iros_v1.04.pdf