I am trying to understand Android sensor management.
Am I right that if I want the gyroscope to be included in computing the phone orientation, this happens automatically when I call getOrientation(..) and the phone has a gyroscope sensor?
So if the phone has both acceleration and gyroscope sensors, the orientation result will probably be better than if it only has acceleration sensors?
Thanks!
The methods included in the Android APIs to get the orientation do not include readings from the gyroscope sensor. The gyroscope on its own does not provide information about orientation, since it only measures rotational speed.
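For reference, this is roughly what the gyroscope-free path looks like. A minimal sketch, assuming a listener registered for the accelerometer and magnetic field sensors (the class name is illustrative):

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

/** Sketch: orientation from accelerometer + magnetic field only; no gyroscope. */
public class OrientationListener implements SensorEventListener {
    private final float[] gravity = new float[3];
    private final float[] geomagnetic = new float[3];
    private final float[] rotationMatrix = new float[9];
    private final float[] orientation = new float[3]; // azimuth, pitch, roll (radians)

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
            System.arraycopy(event.values, 0, gravity, 0, 3);
        } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
            System.arraycopy(event.values, 0, geomagnetic, 0, 3);
        }
        // getRotationMatrix() uses only gravity and the magnetic field;
        // the gyroscope never enters this computation.
        if (SensorManager.getRotationMatrix(rotationMatrix, null, gravity, geomagnetic)) {
            SensorManager.getOrientation(rotationMatrix, orientation);
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```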
You can, however, use the gyroscope readings to get a better estimation of the orientation:
- Step 1: You have an initial estimation of the orientation.
- Step 2: Some time later, you want a new estimation of the orientation, and you have the gyroscope readings collected in between.
- You know the time elapsed (delta-t) between step 1 and step 2, so you can integrate the rotational speed over this interval to get an estimation of the rotation between the two states.
- You also have a new absolute orientation reading at step 2 (e.g. from the accelerometer and magnetic sensor).
- You can fuse these two sources of information ([orientation @ step 1 + integrated rotation] and [orientation reading @ step 2]) to get a refined estimation of the orientation at step 2.
- This can be done in a rather simple way using a complementary filter; see the sketch after this list.
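Here is a minimal one-axis sketch of such a complementary filter. The weight ALPHA and the single-axis simplification are illustrative assumptions, not part of any Android API; real implementations usually filter a full rotation (e.g. a quaternion):

```java
/**
 * Sketch of a one-axis complementary filter.
 * ALPHA close to 1 trusts the gyro integration (smooth, but drifts);
 * the remaining (1 - ALPHA) lets the absolute reading correct the drift.
 */
public class ComplementaryFilter {
    private static final float ALPHA = 0.98f; // illustrative weight
    private float angle;                      // current orientation estimate, radians
    private long lastTimestampNs = -1;

    /**
     * @param gyroRate      rotational speed around the axis, rad/s (gyroscope)
     * @param measuredAngle absolute orientation reading, radians (accel/magnetic)
     * @param timestampNs   sensor event timestamp in nanoseconds
     */
    public float update(float gyroRate, float measuredAngle, long timestampNs) {
        if (lastTimestampNs < 0) {
            angle = measuredAngle;                              // step 1: initial estimate
        } else {
            float dt = (timestampNs - lastTimestampNs) * 1e-9f; // delta-t in seconds
            float predicted = angle + gyroRate * dt;            // integrate rotation speed
            angle = ALPHA * predicted + (1 - ALPHA) * measuredAngle; // fuse both sources
        }
        lastTimestampNs = timestampNs;
        return angle;
    }
}
```

The design choice is exactly the trade-off described above: the gyro term reacts quickly and smoothly but accumulates drift, while the small weight on the absolute reading slowly pulls the estimate back to the truth.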
If you are working with API Level 9 and above, and your device has a gyroscope, you can benefit from Sensor.TYPE_ROTATION_VECTOR. It takes care of the "fusion" of data retrieved from the accelerometer, the magnetic sensor and the gyroscope.
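A minimal sketch of that approach (class name illustrative); getRotationMatrixFromVector() converts the fused vector into a rotation matrix that getOrientation() understands:

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

/** Sketch: letting TYPE_ROTATION_VECTOR do the sensor fusion for you. */
public class RotationVectorListener implements SensorEventListener {
    private final float[] rotationMatrix = new float[9];
    private final float[] orientation = new float[3];

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
            // The system has already fused accelerometer, magnetic sensor
            // and (when available) gyroscope data into this vector.
            SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
            SensorManager.getOrientation(rotationMatrix, orientation);
            // orientation[0..2] = azimuth, pitch, roll in radians
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```

You would register it with something like sensorManager.registerListener(listener, sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR), SensorManager.SENSOR_DELAY_GAME).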