Transforming accelerometer data from device coordinates to real-world coordinates

Posted 2019-01-11 00:09

Question:

I'm really sorry if this is a very basic question, but I have no choice but to ask it: how do you translate accelerometer data from the device's coordinates to real-world coordinates?

I mean, assuming that the accelerometer is giving me something like (Ax, Ay, Az) in the device's coordinates, what transformations should I apply to turn those values into (Ax', Ay', Az') in real-world coordinates, so I can use the acceleration vector in real-world coordinates to work out whether the device is accelerating north, east, south-west, etc.?

I have been working on this issue for the past few days. At first I thought it shouldn't be difficult, but after searching dozens of pages I haven't come up with anything functional.

By the way, here is the code I've implemented so far:

    private SensorEventListener mSensorEventListener = new SensorEventListener() {

        public void onAccuracyChanged(Sensor sensor, int accuracy) {
        }

        public void onSensorChanged(SensorEvent event) {
            switch (event.sensor.getType()) {
            case Sensor.TYPE_ACCELEROMETER:
                accelerometervalues = event.values.clone();
                AX.setText(accelerometervalues[0] + "");
                AY.setText(accelerometervalues[1] + "");
                AZ.setText(accelerometervalues[2] + "");
                break;
            case Sensor.TYPE_ORIENTATION:
                orientationvalues = event.values.clone();
                azimuth.setText(orientationvalues[0] + "");
                pitch.setText(orientationvalues[1] + "");
                roll.setText(orientationvalues[2] + "");
                break;
            case Sensor.TYPE_MAGNETIC_FIELD:
                geomagneticmatrix = event.values.clone();
                TAX.setText(geomagneticmatrix[0] + "");
                TAY.setText(geomagneticmatrix[1] + "");
                TAZ.setText(geomagneticmatrix[2] + "");
                break;
            }
            if (geomagneticmatrix != null && accelerometervalues != null) {
                float[] R = new float[16];
                float[] I = new float[16];
                SensorManager.getRotationMatrix(R, I, accelerometervalues, geomagneticmatrix);
                // What should I do here to transform the components of accelerometervalues
                // into real-world acceleration components?
            }
        }
    };

I have:

A vector of accelerations in native coordinates in accelerometervalues.

A vector of magnetic field values in geomagneticmatrix.

Azimuth, pitch and roll in orientationvalues.

Rotation matrix R. Inclination matrix I.

I think all the necessary information is there: azimuth, pitch and roll should describe the orientation of the device's coordinate system relative to the real-world coordinate system. Also, I believe R can even be used as a true-north vector expressed in the device's coordinates.

It seems to me that obtaining the acceleration values in real-world coordinates is just a mathematical transformation away from those data. I just can't figure it out.
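For instance, I imagine it should be something along these lines (just a sketch of the kind of transformation I mean, assuming the R returned by getRotationMatrix() is a row-major 4x4 matrix mapping device coordinates to world coordinates; I'm not sure whether R or its transpose is the right factor):

    // Sketch of what I expect the transformation to look like (not working code).
    float[] worldAccel = new float[3];
    worldAccel[0] = R[0]*accelerometervalues[0] + R[1]*accelerometervalues[1] + R[2]*accelerometervalues[2];
    worldAccel[1] = R[4]*accelerometervalues[0] + R[5]*accelerometervalues[1] + R[6]*accelerometervalues[2];
    worldAccel[2] = R[8]*accelerometervalues[0] + R[9]*accelerometervalues[1] + R[10]*accelerometervalues[2];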

Thanks in advance.

Edited:

I have tried directly multiplying the components of accelerometervalues by the rotation matrix R (trueaccel = accel * R), but it didn't work:

    trueacceleration[0] = accelerometervalues[0]*R[0] + accelerometervalues[1]*R[1] + accelerometervalues[2]*R[2];
    trueacceleration[1] = accelerometervalues[0]*R[1] + accelerometervalues[1]*R[4] + accelerometervalues[2]*R[7];
    trueacceleration[2] = accelerometervalues[0]*R[2] + accelerometervalues[1]*R[5] + accelerometervalues[2]*R[8];

I have also tried multiplying accelerometervalues by the inclination matrix I, and by both R and I (trueaccel = accel * R * I), and that didn't work either. Neither does calling remapcoordinates() and then multiplying in any of the previous forms.

Does anybody have an idea of what I am doing wrong?

Answer 1:

OK, I have worked this out mathematically myself, so please bear with me.

If you want to translate an acceleration vector accelerometervalues into an acceleration vector trueacceleration expressed in real-world coordinates, once you have azimuth, pitch and roll stored in an orientationvalues vector, just do the following:

    // Note: the angles must be in radians. TYPE_ORIENTATION reports degrees,
    // so convert with Math.toRadians() first if that is where they come from.
    double azimuth = orientationvalues[0];
    double pitch = orientationvalues[1];
    double roll = orientationvalues[2];

    trueacceleration[0] = (float) (accelerometervalues[0] * (Math.cos(roll) * Math.cos(azimuth) + Math.sin(roll) * Math.sin(pitch) * Math.sin(azimuth))
            + accelerometervalues[1] * (Math.cos(pitch) * Math.sin(azimuth))
            + accelerometervalues[2] * (-Math.sin(roll) * Math.cos(azimuth) + Math.cos(roll) * Math.sin(pitch) * Math.sin(azimuth)));
    trueacceleration[1] = (float) (accelerometervalues[0] * (-Math.cos(roll) * Math.sin(azimuth) + Math.sin(roll) * Math.sin(pitch) * Math.cos(azimuth))
            + accelerometervalues[1] * (Math.cos(pitch) * Math.cos(azimuth))
            + accelerometervalues[2] * (Math.sin(roll) * Math.sin(azimuth) + Math.cos(roll) * Math.sin(pitch) * Math.cos(azimuth)));
    trueacceleration[2] = (float) (accelerometervalues[0] * (Math.sin(roll) * Math.cos(pitch))
            + accelerometervalues[1] * (-Math.sin(pitch))
            + accelerometervalues[2] * (Math.cos(roll) * Math.cos(pitch)));
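If you would rather not rely on the deprecated TYPE_ORIENTATION sensor, one way to obtain azimuth, pitch and roll directly in radians is SensorManager.getOrientation(); a minimal sketch, reusing accelerometervalues and geomagneticmatrix from the question:

    // Sketch: derive azimuth/pitch/roll (in radians) from the rotation matrix
    // instead of reading the deprecated TYPE_ORIENTATION sensor (which reports degrees).
    float[] R = new float[9];
    float[] I = new float[9];
    float[] orientationvalues = new float[3];
    if (SensorManager.getRotationMatrix(R, I, accelerometervalues, geomagneticmatrix)) {
        SensorManager.getOrientation(R, orientationvalues);
        // orientationvalues[0] = azimuth, [1] = pitch, [2] = roll, all in radians
    }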


Answer 2:

You need to know a reference coordinate system that also gives you the orientation of your device within 'real world' coordinates. Without that information, it looks impossible to transform your data into anything useful.

For example, does your device have some kind of 'directional' sensor that would help make sense of the accelerometer data (a gyro and compass, for example)?
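For what it's worth, you can check at runtime whether the device actually exposes those sensors; a minimal sketch (it assumes you have a Context available as context):

    // Sketch: getDefaultSensor() returns null when the device lacks that sensor type.
    SensorManager sm = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
    boolean hasCompass = sm.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD) != null;
    boolean hasGyro = sm.getDefaultSensor(Sensor.TYPE_GYROSCOPE) != null;
    // Without at least an accelerometer and a magnetometer, getRotationMatrix()
    // cannot give you a device-to-world rotation.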



Answer 3:

I am dealing with the same problem. What you can do, since you already have the R[] matrix, is multiply your acceleration vector by it and voilà.

    float[] trueacceleration = new float[4];
    // R must be a 16-element (4x4) matrix for multiplyMV().
    android.opengl.Matrix.multiplyMV(trueacceleration, 0, R, 0, accelerometervalues, 0);

PS: accelerometervalues must be a 4-element vector; just set the last element to 0.
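Putting that PS together with the call above, a minimal sketch could look like this (accel4 is just a padded copy introduced for the sketch, and R is assumed to be filled by getRotationMatrix() as a 16-element 4x4 matrix, which is what multiplyMV() expects):

    // Sketch: build a 4x4 rotation matrix and pad the accelerometer vector to 4 elements.
    float[] R = new float[16];
    float[] I = new float[16];
    SensorManager.getRotationMatrix(R, I, accelerometervalues, geomagneticmatrix);

    float[] accel4 = { accelerometervalues[0], accelerometervalues[1], accelerometervalues[2], 0 };
    float[] trueacceleration = new float[4];
    android.opengl.Matrix.multiplyMV(trueacceleration, 0, R, 0, accel4, 0);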



Answer 4:

Try this, it's working for me:

    private float[] gravityValues = null;
    private float[] magneticValues = null;
    private SensorManager mSensorManager = null;

    private void registerSensorListener(Context context) {
        mSensorManager = (SensorManager) context.getSystemService(SENSOR_SERVICE);
        mSensorManager.registerListener(this,
                mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                SensorManager.SENSOR_DELAY_FASTEST);

        mSensorManager.registerListener(this,
                mSensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE),
                SensorManager.SENSOR_DELAY_FASTEST);

        mSensorManager.registerListener(this,
                mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
                SensorManager.SENSOR_DELAY_FASTEST);

        mSensorManager.registerListener(this,
                mSensorManager.getDefaultSensor(Sensor.TYPE_GRAVITY),
                SensorManager.SENSOR_DELAY_FASTEST);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        if ((gravityValues != null) && (magneticValues != null)
                && (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)) {

            float[] deviceRelativeAcceleration = new float[4];
            deviceRelativeAcceleration[0] = event.values[0];
            deviceRelativeAcceleration[1] = event.values[1];
            deviceRelativeAcceleration[2] = event.values[2];
            deviceRelativeAcceleration[3] = 0;

            Log.d("Raw Acceleration::","Values: (" + event.values[0] + ", " + event.values[1] + ", " + event.values[2] + ")");

            // Change the device relative acceleration values to earth relative values
            // X axis -> East
            // Y axis -> North Pole
            // Z axis -> Sky

            float[] R = new float[16], I = new float[16], earthAcc = new float[16];

            SensorManager.getRotationMatrix(R, I, gravityValues, magneticValues);

            float[] inv = new float[16];

            android.opengl.Matrix.invertM(inv, 0, R, 0);
            android.opengl.Matrix.multiplyMV(earthAcc, 0, inv, 0, deviceRelativeAcceleration, 0);
            Log.d("Earth Acceleration", "Values: (" + earthAcc[0] + ", " + earthAcc[1] + ", " + earthAcc[2] + ")");

        } else if (event.sensor.getType() == Sensor.TYPE_GRAVITY) {
            gravityValues = event.values;
        } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
            magneticValues = event.values;
        }
    }
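A small note on this answer: the rotation matrix returned by getRotationMatrix() is orthonormal, so its inverse equals its transpose. If you prefer, transposeM() can stand in for invertM(); a sketch of that alternative (not what the answer originally used):

    // Sketch: for an orthonormal rotation matrix, the transpose is the inverse (and cheaper to compute).
    float[] transposed = new float[16];
    android.opengl.Matrix.transposeM(transposed, 0, R, 0);
    android.opengl.Matrix.multiplyMV(earthAcc, 0, transposed, 0, deviceRelativeAcceleration, 0);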


Answer 5:

This is what I used to map accelerometer data from the local (mobile) frame of reference to the Earth frame of reference, to get rid of the orientation dependency. In the Earth frame the Z-axis points towards the sky and should read roughly 9.81 m/s^2. One phenomenon I couldn't understand: when I put the phone on a revolving chair, in any orientation, and rotate it at constant speed, the Earth-frame X and Y values oscillate like sine/cosine waves with a 90-degree phase shift, which I assume correspond to the North and East axes.

    public void onSensorChanged(SensorEvent event) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_ACCELEROMETER:
                System.arraycopy(event.values, 0, accel, 0, 3);
                // Get the quaternion representation of the accelerometer data
                SensorManager.getQuaternionFromVector(quatA, event.values);
                q1.w = quatA[0]; q1.x = quatA[1]; q1.y = quatA[2]; q1.z = quatA[3];
                break;

            case Sensor.TYPE_ROTATION_VECTOR:
                SensorManager.getRotationMatrixFromVector(rotationMatrix1, event.values);
                System.arraycopy(event.values, 0, rotationVector, 0, 3);
                SensorManager.getQuaternionFromVector(quat, event.values);
                q2.w = quat[0]; q2.x = quat[1]; q2.y = quat[2]; q2.z = quat[3];
                rotationMatrix2 = getRotationMatrixFromQuaternion(q2);
                rotationResult = matrixMultiplication(accel, rotationMatrix2);
                // You can use rotationMatrix1 or rotationMatrix2
                break;
        }
        // Accelerometer data rotated into the Earth frame of reference:
        // rotationResult[0], rotationResult[1], rotationResult[2]
    }

    private float[] getRotationMatrixFromQuaternion(Quaternion q22) {
        float[] q = new float[4];
        float[] result = new float[9];
        q[0] = q22.w;
        q[1] = q22.x;
        q[2] = q22.y;
        q[3] = q22.z;

        result[0] = q[0]*q[0] + q[1]*q[1] - q[2]*q[2] - q[3]*q[3];
        result[1] = 2 * (q[1]*q[2] - q[0]*q[3]);
        result[2] = 2 * (q[1]*q[3] + q[0]*q[2]);

        result[3] = 2 * (q[1]*q[2] + q[0]*q[3]);
        result[4] = q[0]*q[0] - q[1]*q[1] + q[2]*q[2] - q[3]*q[3];
        result[5] = 2 * (q[2]*q[3] - q[0]*q[1]);

        result[6] = 2 * (q[1]*q[3] - q[0]*q[2]);
        result[7] = 2 * (q[2]*q[3] + q[0]*q[1]);
        result[8] = q[0]*q[0] - q[1]*q[1] - q[2]*q[2] + q[3]*q[3];

        return result;
    }

    // Multiplies the 3x3 row-major matrix B by the column vector A (i.e. result = B * A).
    private float[] matrixMultiplication(float[] A, float[] B) {
        float[] result = new float[3];

        result[0] = A[0] * B[0] + A[1] * B[1] + A[2] * B[2];
        result[1] = A[0] * B[3] + A[1] * B[4] + A[2] * B[5];
        result[2] = A[0] * B[6] + A[1] * B[7] + A[2] * B[8];

        return result;
    }
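The snippet above assumes a number of fields (accel, rotationVector, quat, quatA, q1, q2, rotationMatrix1, rotationMatrix2, rotationResult) and a small Quaternion holder class are declared elsewhere; a minimal sketch of those declarations, just so the code compiles:

    // Sketch: field declarations the snippet relies on (names taken from the code above;
    // the Quaternion class is assumed to be a plain w/x/y/z holder).
    private final float[] accel = new float[3];
    private final float[] rotationVector = new float[3];
    private final float[] quat = new float[4];
    private final float[] quatA = new float[4];
    private float[] rotationMatrix1 = new float[9];
    private float[] rotationMatrix2 = new float[9];
    private float[] rotationResult = new float[3];
    private final Quaternion q1 = new Quaternion();
    private final Quaternion q2 = new Quaternion();

    private static class Quaternion {
        float w, x, y, z;
    }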