Location-based augmented reality Android application

Posted 2019-02-15 09:04

I am developing an augmented reality Android application based on real-time location. The concept is simple: the application should show some places around me. I have researched this intensively, yet I am still running into issues. I have my own GPS coordinates and the target place's GPS coordinates.

My question is: how can I work out what my phone's camera is looking at (for example, a building)? What is the logical way to solve something like this?

5 Answers
聊天终结者
#2 · 2019-02-15 09:07

First check which sensors are available on the device. If the device supports TYPE_ROTATION_VECTOR, register a sensor listener for it; otherwise fall back to TYPE_MAGNETIC_FIELD and TYPE_ACCELEROMETER.

SensorManager mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
rSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
if (rSensor == null) {
    mSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
    aSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
}
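
Once the sensors are resolved, register a listener for whichever ones are available. This is a minimal sketch, assuming the enclosing class implements SensorEventListener and reusing the fields from the snippet above; the sampling rate is just a reasonable default:

// Typically done in onResume().
if (rSensor != null) {
    mSensorManager.registerListener(this, rSensor, SensorManager.SENSOR_DELAY_UI);
} else {
    mSensorManager.registerListener(this, mSensor, SensorManager.SENSOR_DELAY_UI);
    mSensorManager.registerListener(this, aSensor, SensorManager.SENSOR_DELAY_UI);
}

// And unregister in onPause() to save battery:
// mSensorManager.unregisterListener(this);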

Then, in the onSensorChanged callback, calculate the azimuth:

@Override
public void onSensorChanged(SensorEvent event) {
    if (rSensor == null) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_MAGNETIC_FIELD:
                magnetic = event.values.clone();
                break;
            case Sensor.TYPE_ACCELEROMETER:
                accelerometer = event.values.clone();
                break;
        }

        if (magnetic != null && accelerometer != null) {
            Rot = new float[9];
            I = new float[9];
            SensorManager.getRotationMatrix(Rot, I, accelerometer, magnetic);

            float[] outR = new float[9];
            // Remap the axes for a device looking out through the back camera.
            SensorManager.remapCoordinateSystem(Rot, SensorManager.AXIS_X,
                    SensorManager.AXIS_Z, outR);
            SensorManager.getOrientation(outR, values);

            // getOrientation() returns radians; keep azimuth in degrees so it
            // matches the bearing() calculation below.
            azimuth = Math.toDegrees(values[0]);
            magnetic = null;
            accelerometer = null;
        }
    } else {
        SensorManager.getRotationMatrixFromVector(mRotationMatrix, event.values);
        SensorManager.getOrientation(mRotationMatrix, mValues);

        azimuth = Math.toDegrees(mValues[0]);
    }
}

You can use this azimuth to plot the location in the camera view:

// bearing() gives the direction to the target; subtracting the device azimuth
// gives the target's angle relative to where the camera is pointing.
double angle = bearing(myLatitude, myLongitude, dLatitude, dLongitude) - azimuth;
double xAxis, yAxis;

if (angle < 0)
    angle = (angle + 360) % 360;

// dist is the ground distance from the device to the target.
xAxis = Math.sin(Math.toRadians(angle)) * dist;
yAxis = Math.sqrt(Math.pow(dist, 2) - Math.pow(xAxis, 2));

if (angle > 90 && angle < 270)
    yAxis *= -1;

// Map the relative angle onto the screen, assuming roughly a 90-degree field of view.
double xAxisPosition = angle * (screenWidth / 90d);

xAxis = xAxisPosition - spotImageWidth / 2;
float x, y;
if (angle <= 45)
    x = (float) ((screenWidth / 2) + xAxis);
else if (angle >= 315)
    x = (float) ((screenWidth / 2) - ((screenWidth * 4) - xAxis));
else
    x = (float) (screenWidth * 9);   // far off screen: target is outside the field of view

// i is the index of the spot, used to stack markers vertically.
y = (float) ((screenHeight - 300) - (i * 100));

protected static double bearing(double lat1, double lon1, double lat2, double lon2) {
    double longDiff = Math.toRadians(lon2 - lon1);
    double la1 = Math.toRadians(lat1);
    double la2 = Math.toRadians(lat2);
    double y = Math.sin(longDiff) * Math.cos(la2);
    double x = Math.cos(la1) * Math.sin(la2) - Math.sin(la1) * Math.cos(la2) * Math.cos(longDiff);

    double result = Math.toDegrees(Math.atan2(y, x));
    return (result+360.0d)%360.0d;
}

x and y are the screen coordinates at which to draw the marker for the destination.
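
The dist used above (device-to-target distance) is not shown in the snippet; a minimal sketch of how it could be obtained with android.location.Location.distanceBetween (variable names are placeholders):

float[] results = new float[1];
Location.distanceBetween(myLatitude, myLongitude, dLatitude, dLongitude, results);
double dist = results[0];   // distance in meters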

Melony?
#3 · 2019-02-15 09:09

There are two directions involved here: the azimuth of the device and the azimuth to the target.

The azimuth of the device (its heading relative to north) is shown below:

[Figure: device azimuth]

This information can be collected from the sensors. However, if the orientation of the device is not fixed, you should call SensorManager.remapCoordinateSystem first.
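
A minimal sketch of that remap for an app looking out through the back camera with the device held upright (the axis choice follows the example in the SensorManager documentation; deviceAzimuth is just an illustrative name):

float[] inR = new float[9];
float[] outR = new float[9];
float[] orientation = new float[3];

SensorManager.getRotationMatrixFromVector(inR, event.values);
// Remap so the view axis points out of the back camera.
SensorManager.remapCoordinateSystem(inR, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
SensorManager.getOrientation(outR, orientation);
double deviceAzimuth = Math.toDegrees(orientation[0]);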

The azimuth to the target is shown below:

[Figure: azimuth from the device location to the target]

It is probably the best figure I could find on the internet. Once you have the device location and the target location, the azimuth to the target can be computed as:

azi = Math.abs(Math.toDegrees(Math.atan((tlon-lon)/(tlat-lat))));

where tlat and tlon are the target's GPS coordinates and lat and lon are the device's. The value of this expression lies between -90 and +90 degrees, which is not the full azimuth range, so three additional corrections are needed:

if ((tlon - lon) > 0 && (tlat - lat) < 0) {
    azi = 180 - azi;
}
if ((tlon - lon) < 0 && (tlat - lat) < 0) {
    azi = 180 + azi;
}
if ((tlon - lon) < 0 && (tlat - lat) > 0) {
    azi = 360 - azi;
}
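
As an aside, if you already have android.location.Location objects for both points, Location.bearingTo computes this bearing directly (it returns degrees in the range -180..180, so normalize to 0..360):

float bearing = deviceLocation.bearingTo(targetLocation);
double azi = (bearing + 360) % 360;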

Once you have both azimuths, it is easy to check whether a target is within the camera's field of view.
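
A minimal sketch of such a check, assuming a horizontal field of view of about 60 degrees; azi is the azimuth to the target from above and deviceAzimuth stands for the device heading computed from the sensors:

double fov = 60.0;                        // assumed horizontal field of view in degrees
double diff = (azi - deviceAzimuth + 360) % 360;
if (diff > 180) {
    diff -= 360;                          // now in the range -180..180
}
boolean inSight = Math.abs(diff) <= fov / 2;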

Hope this helps.

孤傲高冷的网名
#4 · 2019-02-15 09:13

The first step is to use the sensors to get the direction the back camera is facing. You can read more about the sensors at http://developer.android.com/reference/android/hardware/SensorManager.html.
After you are done coding with the sensors, come back and ask the next question.

一纸荒年 Trace。
#5 · 2019-02-15 09:15

Augmented reality transforms a real-world coordinate system into the camera's coordinate system. In location-based AR, the real-world system is the geographic coordinate system: we convert GPS coordinates (latitude, longitude, altitude) to navigation coordinates (East, North, Up), then transform the navigation coordinates to camera coordinates and draw the result on the camera view.
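
The first of those steps can be sketched as follows. This is an illustration only (not code from the linked demo): geodetic coordinates are converted to ECEF using the WGS84 ellipsoid, then rotated into the observer's East-North-Up frame.

// WGS84 ellipsoid constants.
static final double A = 6378137.0;              // semi-major axis (m)
static final double E2 = 6.69437999014e-3;      // first eccentricity squared

// Geodetic (lat, lon in degrees, alt in meters) to Earth-centered Earth-fixed.
static double[] geodeticToEcef(double latDeg, double lonDeg, double alt) {
    double lat = Math.toRadians(latDeg), lon = Math.toRadians(lonDeg);
    double n = A / Math.sqrt(1 - E2 * Math.sin(lat) * Math.sin(lat));
    double x = (n + alt) * Math.cos(lat) * Math.cos(lon);
    double y = (n + alt) * Math.cos(lat) * Math.sin(lon);
    double z = (n * (1 - E2) + alt) * Math.sin(lat);
    return new double[] { x, y, z };
}

// ECEF difference rotated into the East/North/Up frame of the observer.
static double[] ecefToEnu(double[] target, double[] observer, double latDeg, double lonDeg) {
    double lat = Math.toRadians(latDeg), lon = Math.toRadians(lonDeg);
    double dx = target[0] - observer[0];
    double dy = target[1] - observer[1];
    double dz = target[2] - observer[2];
    double east  = -Math.sin(lon) * dx + Math.cos(lon) * dy;
    double north = -Math.sin(lat) * Math.cos(lon) * dx
                 -  Math.sin(lat) * Math.sin(lon) * dy
                 +  Math.cos(lat) * dz;
    double up    =  Math.cos(lat) * Math.cos(lon) * dx
                 +  Math.cos(lat) * Math.sin(lon) * dy
                 +  Math.sin(lat) * dz;
    return new double[] { east, north, up };
}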

I have created a demo for you: https://github.com/dat-ng/ar-location-based-android

萌系小妹纸
#6 · 2019-02-15 09:28

Try the DroidAR SDK https://github.com/bitstars/droidar . It is an AR SDK for Android, and most of your problems should already be solved by it. There are also video tutorials, and you can look into the code if you only need parts of it for your project.
