What's wrong with my sensor monitoring technique?

Posted 2020-05-27 01:28

(Please read UPDATE 3 at the end.) I'm developing an app that continually works with the device's sensors: it uses the accelerometer and magnetic field sensors to retrieve the device's orientation (the purpose is mentioned here). In other words, my app needs to know the orientation of the device in real time (which is never truly possible, so as fast as possible instead, but really as fast as possible!). As mentioned in Professional Android 4 Application Development by Reto Meier:

The accelerometers can update hundreds of times a second...

I must not lose any data the sensors report, and I also want to perform time-consuming operations on that data (retrieve the orientation and then do further calculations...). I decided to solve the problem by using a LinkedBlockingQueue:

    // The queue must be a field (not a local variable) so that doCalculations() can read it.
    // sm, sensorListenerForOrientation and executor are existing fields
    // (SensorManager, SensorEventListener and an ExecutorService).
    private final LinkedBlockingQueue<float[][]> array = new LinkedBlockingQueue<>();
    private float[] aValues, mValues;

    public void startSensors() {
        sensorListenerForOrientation = new SensorEventListener() {

            @Override
            public void onSensorChanged(SensorEvent event) {
                if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
                    aValues = event.values.clone();
                else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
                    mValues = event.values.clone();
                // Enqueue a pair once at least one sample of each type has arrived.
                if (aValues != null && mValues != null) {
                    try {
                        array.put(new float[][] { aValues, mValues });
                    } catch (InterruptedException e) {
                    }
                }
            }

            @Override
            public void onAccuracyChanged(Sensor sensor, int accuracy) {
            }
        };
        Sensor aSensor = sm.getSensorList(Sensor.TYPE_ACCELEROMETER).get(
                sm.getSensorList(Sensor.TYPE_ACCELEROMETER).size() - 1);
        Sensor mSensor = sm.getSensorList(Sensor.TYPE_MAGNETIC_FIELD).get(
                sm.getSensorList(Sensor.TYPE_MAGNETIC_FIELD).size() - 1);
        sm.registerListener(sensorListenerForOrientation, aSensor,
                SensorManager.SENSOR_DELAY_FASTEST);
        sm.registerListener(sensorListenerForOrientation, mSensor,
                SensorManager.SENSOR_DELAY_FASTEST);
        // Consumer thread: drains the queue and does the heavy work off the sensor callback.
        executor.execute(new Runnable() {
            @Override
            public void run() {
                doCalculations();
            }
        });
    }

and

    public void doCalculations() {
        for (;;) {
            float[][] result = null;
            try {
                // Blocks until the listener has queued a new accelerometer/magnetometer pair.
                result = array.take();
            } catch (InterruptedException e) {
            }
            float[] aValues = result[0];
            float[] mValues = result[1];

            int[] degrees = getOrientation(aValues, mValues);
            Log.e("", String.valueOf(degrees[0]));

            // other calculations...
        }
    }
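getOrientation() above is my own helper; it basically wraps the standard SensorManager calls. A minimal sketch, assuming that is all it does (the exact rounding in my version may differ):

    private int[] getOrientation(float[] aValues, float[] mValues) {
        float[] rotationMatrix = new float[9];
        float[] orientation = new float[3];
        int[] degrees = new int[3];

        // Combine the accelerometer and magnetometer samples into a rotation matrix.
        if (SensorManager.getRotationMatrix(rotationMatrix, null, aValues, mValues)) {
            // azimuth, pitch, roll in radians
            SensorManager.getOrientation(rotationMatrix, orientation);
            for (int i = 0; i < 3; i++) {
                degrees[i] = (int) Math.round(Math.toDegrees(orientation[i]));
            }
        }
        return degrees;
    }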

Now I pick up my device, rotate it about 90 degrees to the right, and then quickly return it to the initial position (for example, in 1.5 seconds). But when I look at the orientations that are logged, I see something like: 0,1,2,3,4,5,...,40,39,38,37,...,0

In other words, I don't see the full range of degrees in my result. Based on what I have done and what I have researched, I am confident that I am NOT losing any data: every new value reported by the sensors is recorded.

Any idea or solution?!

Regards!

UPDATE 1: I did another experiment with my device and got surprising results! If I rotate my device 90 degrees around one axis quickly (in less than a second), I can see all the degrees in my result: 0,1,2,3,...,89,90 (for example). But if I rotate it 90 degrees and then rotate it back to its initial position, the result is 0,1,2,...,36,37,36,...,2,1,0 (for example). Really confusing!

UPDATE 2: I updated the doCalculations() method to make it clearer what I have done.

UPDATE 3: I think maybe we can solve the problem in another way! I have a clear purpose for this code; please have a look at this. I have described what is supposed to happen: I need to detect a specific movement gesture. So maybe the whole approach I have chosen (the technique above) is not a good way to solve this problem; maybe it's better to detect that gesture using other sensors, or the same sensors in a different way. What do you think?

7 Answers
疯言疯语
#2 · 2020-05-27 01:55

The usual thing to do within an event callback is almost nothing, so that it stays really fast. "Almost" is the important word. In your case, the callback could just add the data from the event (from the event parameter) to some data structure (a list, a stack, a circular buffer... your pick). That way you should lose fewer events (if any).

You can then read the stored events (for instance periodically) and decide whether a gesture was made. That way your intensive calculations run less often, but you don't lose any events. I think this is acceptable for your purpose, which is gesture recognition; I assume it doesn't have to be that fast (i.e. you don't have to recalculate every time the sensor updates).
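A minimal sketch of that idea (the list, the 100 ms interval, and the Handler-based polling are just illustrative choices, not something from your code): collect samples cheaply in the callback, then scan whatever has accumulated on a timer. Assuming the listener is registered from the main thread, both the callback and the Handler below run on the same thread, so no extra locking is needed:

    final List<float[]> pending = new ArrayList<>();

    SensorEventListener listener = new SensorEventListener() {
        @Override
        public void onSensorChanged(SensorEvent event) {
            // Do almost nothing here: just store a copy of the sample.
            pending.add(event.values.clone());
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) {
        }
    };

    final Handler handler = new Handler(Looper.getMainLooper());
    handler.postDelayed(new Runnable() {
        @Override
        public void run() {
            // Process everything that arrived since the last pass, then clear it.
            for (float[] sample : pending) {
                // ... feed the sample into the gesture-detection logic ...
            }
            pending.clear();
            handler.postDelayed(this, 100); // check again in ~100 ms
        }
    }, 100);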

Note: this is one common way interrupts are handled in the Linux world.
