I am working on an application to draw in the air with an Android phone.
As my phone is moving, thanks to the accelerometer, I retrieve the acceleration on each axis: ax, ay, az. What I am interested in is the position: x, y, z.
From what I have read in forums and in some tutorials, integrating the acceleration twice gives huge errors.
So what is the best solution for me to get information on the displacement of the phone?
Thanks for your help.
Not exactly what you are looking for:
Store orientation to an array - and compare
Tracking orientation works well. Perhaps you can do something similar with the accelerometer data (without any integration).
As you mentioned in your post, integrating the acceleration twice does not work.
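To make that concrete, here is a minimal sketch of what tracking orientation looks like on Android, assuming the rotation-vector sensor is available; the class name and the idea of keeping a history list are my own, not taken from the linked answer:

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;

    import java.util.ArrayList;
    import java.util.List;

    // Sketch: record the phone's orientation (azimuth, pitch, roll) over time
    // and compare successive readings, instead of integrating acceleration.
    public class OrientationTracker implements SensorEventListener {

        private final List<float[]> history = new ArrayList<float[]>();

        public void start(SensorManager sensorManager) {
            Sensor rotation = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
            sensorManager.registerListener(this, rotation, SensorManager.SENSOR_DELAY_GAME);
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            float[] rotationMatrix = new float[9];
            float[] orientation = new float[3]; // azimuth, pitch, roll in radians
            SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
            SensorManager.getOrientation(rotationMatrix, orientation);
            history.add(orientation.clone()); // store to an array and compare later
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }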
Update:
If accuracy is not important at all, then the double integral might work for a few seconds, but expect very poor results.
However, the gyro mouse is a better choice in my opinion. See between 37:00-38:25 in
Sensor Fusion on Android Devices: A Revolution in Motion Processing.
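Roughly, a gyro mouse maps angular rate to cursor movement rather than integrating acceleration into position. A minimal sketch of that idea, assuming a standard gyroscope sensor; the scale constant and class name are my own guesses, not from the talk:

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;

    // Sketch: turn angular rate (rad/s) from the gyroscope into 2D cursor
    // movement, like an air mouse. No position is integrated from acceleration.
    public class GyroMouse implements SensorEventListener {

        private static final float PIXELS_PER_RADIAN = 600f; // tuning constant, my guess
        private float cursorX, cursorY;
        private long lastTimestampNs;

        public void start(SensorManager sensorManager) {
            Sensor gyro = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE);
            sensorManager.registerListener(this, gyro, SensorManager.SENSOR_DELAY_GAME);
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            if (lastTimestampNs != 0) {
                float dt = (event.timestamp - lastTimestampNs) * 1e-9f; // ns -> s
                // Rotation around the y axis moves the cursor horizontally,
                // rotation around the x axis moves it vertically.
                cursorX += event.values[1] * dt * PIXELS_PER_RADIAN;
                cursorY += event.values[0] * dt * PIXELS_PER_RADIAN;
            }
            lastTimestampNs = event.timestamp;
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }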
Take a look at:
http://www.youtube.com/watch?v=C7JQ7Rpwn2k
http://www.freescale.com/files/sensors/doc/app_note/AN3397.pdf
Also, I'm working on a similar project, and I don't think you need to double-integrate the acceleration and calculate the distance moved. Instead, use the acceleration itself to drive the movement.
But if you insist on using distance: by sampling the acceleration (a[0], a[1], ..., a[n]) at short time intervals (t[0], t[1], ..., t[n]), and assuming the acceleration within each interval is the average of its boundary values, you get something like this:
Vs = (t[1]-t[0])*(a[1]+a[0])/2 + (t[2]-t[1])*(a[2]+a[1])/2 + ... + (t[n]-t[n-1])*(a[n]+a[n-1])/2
T = t[n] - t[0]
Xs = Vs * T
It is a little bit different from double integration (a small Java sketch of this sum is below). Good luck.
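In code, that sum could look like the following, assuming the samples have already been collected into arrays (plain Java; the class and method names are mine):

    // Trapezoidal sum of the sampled acceleration, exactly as in the formula above:
    // Vs is the velocity gained over the whole window, Xs approximates the distance.
    public final class DisplacementEstimator {

        /** t[i] in seconds, a[i] in m/s^2, both arrays of length n+1. */
        public static double estimateDistance(double[] t, double[] a) {
            double vs = 0.0;
            for (int i = 1; i < t.length; i++) {
                vs += (t[i] - t[i - 1]) * (a[i] + a[i - 1]) / 2.0;
            }
            double totalTime = t[t.length - 1] - t[0]; // T = t[n] - t[0]
            return vs * totalTime;                     // Xs = Vs * T
        }
    }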