How to use RajawaliVR or Rajawali to play a 360 Video

Posted 2019-03-16 21:01

Question:

I am having a hard time figuring out how to use Rajawali to play a 360 video. I tried every solution I could find on the Internet, but none of them worked.

Firstly, I used RajawaliCardboard and made MainActivity extend CardboardActivity. At the same time, I made my MyRenderer class extend RajawaliCardboardRenderer. In MyRenderer, I overrode the initScene() function:

protected void initScene() {
    StreamingTexture mTexture = null;
    if (externalMemoryAvailable()) {
        mVideoPath = Environment.getExternalStorageDirectory().getAbsolutePath() + "/testVideo.mp4";
        try {
            mPlayer = new MediaPlayer();
            mPlayer.setDataSource(mVideoPath);
        } catch (IllegalArgumentException | SecurityException | IllegalStateException | IOException e) {
            e.printStackTrace();
        }
        try {
            mPlayer.prepare();
        } catch (IOException t) {
            t.printStackTrace();
        }
        mTexture = new StreamingTexture("video", mPlayer);
    }
    Sphere sphere = createPhotoSphereWithTexture(mTexture);
    getCurrentScene().addChild(sphere);
    getCurrentCamera().setPosition(Vector3.ZERO);
    getCurrentCamera().setFieldOfView(75);
}

private Sphere createPhotoSphereWithTexture(ATexture texture) {
    Material material = new Material();
    material.setColor(0);
    try {
        material.addTexture(texture);
    } catch (ATexture.TextureException e) {
        throw new RuntimeException(e);
    }
    Sphere sphere = new Sphere(50, 64, 32);
    sphere.setScaleX(-1);
    sphere.setMaterial(material);
    return sphere;
}

The program runs without any errors, but the screen stays black and no image is shown.
What should I do to fix my program, and what is the correct way to play video using Rajawali? Can anyone help me?

Answer 1:

I succeeded in playing video with Rajawali.

public class VideoRenderer extends RajawaliCardboardRenderer {

    Context mContext;

    private MediaPlayer mMediaPlayer;
    private StreamingTexture mVideoTexture;

    public VideoRenderer(Context context) {
        super(context);
        mContext = context;
    }

    @Override
    protected void initScene() {
        mMediaPlayer = MediaPlayer.create(getContext(), R.raw.video);
        mMediaPlayer.setLooping(true);

        mVideoTexture = new StreamingTexture("sintelTrailer", mMediaPlayer);
        Material material = new Material();
        material.setColorInfluence(0);
        try {
            material.addTexture(mVideoTexture);
        } catch (ATexture.TextureException e) {
            e.printStackTrace();
        }

        Sphere sphere = new Sphere(50, 64, 32);
        sphere.setScaleX(-1);
        sphere.setMaterial(material);
        getCurrentScene().addChild(sphere);

        getCurrentCamera().setPosition(Vector3.ZERO);
        getCurrentCamera().setFieldOfView(75);

        mMediaPlayer.start();
    }

    @Override
    protected void onRender(long elapsedRealtime, double deltaTime) {
        super.onRender(elapsedRealtime, deltaTime);
        // Pull the latest decoded video frame into the GL texture every frame.
        mVideoTexture.update();
    }

    @Override
    public void onPause() {
        super.onPause();
        if (mMediaPlayer != null)
            mMediaPlayer.pause();
    }

    @Override
    public void onResume() {
        super.onResume();
        if (mMediaPlayer != null)
            mMediaPlayer.start();
    }

    @Override
    public void onRenderSurfaceDestroyed(SurfaceTexture surfaceTexture) {
        super.onRenderSurfaceDestroyed(surfaceTexture);
        mMediaPlayer.stop();
        mMediaPlayer.release();
    }
    public void nextVideo(String nextVideoPath) {
        try {
            mMediaPlayer.stop();
            mMediaPlayer.reset();

            mMediaPlayer.setDataSource(nextVideoPath);
            mMediaPlayer.prepare();
            mMediaPlayer.start();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
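For completeness, the renderer above still has to be attached to a Cardboard-enabled surface view in the Activity. The following is only a sketch: the RajawaliCardboardView class and setter names are assumptions based on the RajawaliCardboard sample project, so verify them against your library version.

```java
// Hypothetical Activity wiring for VideoRenderer; view/setter names are
// assumptions from the RajawaliCardboard sample and may differ per version.
public class VideoActivity extends CardboardActivity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // The Rajawali-backed Cardboard view hosts the GL surface.
        RajawaliCardboardView view = new RajawaliCardboardView(this);
        setContentView(view);
        setCardboardView(view);

        // Attach the renderer from Answer 1.
        VideoRenderer renderer = new VideoRenderer(this);
        view.setRenderer(renderer);
    }
}
```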


Answer 2:

I think your main error is calling MediaPlayer.prepare() instead of MediaPlayer.prepareAsync().
You have to take into account the different states a MediaPlayer goes through when a video is played; here is a link to the state diagram. You should only call MediaPlayer.start() once the player has finished preparing, so that playback begins from a valid state.
I'm working on the same thing (a player for 360 videos) with Rajawali, and so far I've managed to play them in normal gyroscope and touch mode, but I'm running into lots of problems making it work with the Google Cardboard integration, so at the moment I'm trying to write my own "side by side" renderer.

If my comments aren't enough, here is a sample of the code I'm currently using to render the video as a streaming texture on the sphere. It is part of the overridden initScene() method in a class that extends RajawaliRenderer:

// create a 100-segment sphere
earthSphere = new Sphere(1, 100, 100);
// try to set the MediaPlayer data source
mMediaPlayer = new MediaPlayer();
try {
    mMediaPlayer.setDataSource(context, Uri.parse("android.resource://" + context.getPackageName() + "/" + R.raw.pyrex));
} catch (IOException ex) {
    Log.e("ERROR", "couldn't attach data source to the media player");
}
mMediaPlayer.setLooping(true);  // enable video looping
video = new StreamingTexture("pyrex", mMediaPlayer); // create video texture
// register the listener before preparing, so the callback cannot be missed
mMediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.start(); // start the player only when it is prepared
    }
});
mMediaPlayer.prepareAsync();    // prepare the player (asynchronously)
// add texture to a new material
Material material = new Material();
material.setColorInfluence(0f);
try {
    material.addTexture(video);
} catch (ATexture.TextureException ex) {
    Log.e("ERROR", "texture error when adding video to material");
}
// set the material to the sphere
earthSphere.setMaterial(material);
earthSphere.setPosition(0, 0, 0);
// add the sphere to the rendering scene
getCurrentScene().addChild(earthSphere);
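One detail the snippet above does not show: as in the other answers, the StreamingTexture only displays new frames if it is refreshed on every render pass. A minimal sketch of the missing override, assuming the same `video` field as in the snippet:

```java
// Refresh the streaming texture each frame so decoded video frames
// actually reach the GL surface (otherwise the sphere stays black).
@Override
protected void onRender(long elapsedRealtime, double deltaTime) {
    super.onRender(elapsedRealtime, deltaTime);
    if (video != null) {
        video.update();
    }
}
```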


Answer 3:

Since you want to play 360 video, you need some kind of orientation tracker. Here is an example for a Cardboard activity.

public class CardboardRendererExample extends Renderer implements CardboardView.StereoRenderer {
    public static final int FIELD_OF_VIEW = 90;
    public static final float PLANE_WIDTH = 64.0f;
    public static final float PLANE_HEIGHT = 36.0f;
    public static final float PLANE_DISTANCE = -64.0f;

    private final MediaPlayer mMediaPlayer;
    protected StreamingTexture mStreamingTexture;

    protected Quaternion mOrientation = Quaternion.getIdentity();
    protected Quaternion mEyeOrientation = Quaternion.getIdentity();

    protected float[] mHeadView = new float[16];
    private Matrix4 mEyeMatrix = new Matrix4();
    private Vector3 mEyePosition = new Vector3();
    private Matrix4 mHeadViewMatrix4 = new Matrix4();

    public CardboardRendererExample(Context context, MediaPlayer mediaPlayer) {
        super(context);
        mMediaPlayer = mediaPlayer;
    }

    @Override
    protected void initScene() {
        getCurrentCamera().setPosition(Vector3.ZERO);
        getCurrentCamera().setFieldOfView(FIELD_OF_VIEW);

        mStreamingTexture = new StreamingTexture("give_it_some_name", mMediaPlayer);
        mStreamingTexture.shouldRecycle(true);
        setSceneCachingEnabled(true);

        final Plane projectionScreen = new Plane(PLANE_WIDTH, PLANE_HEIGHT, 64, 64);
        final Material material = new Material();
        material.setColor(0);
        material.setColorInfluence(0f);
        try {
            material.addTexture(mStreamingTexture);
        } catch (ATexture.TextureException e) {
            e.printStackTrace();
            throw new RuntimeException(e);
        }

        projectionScreen.setDoubleSided(true);
        projectionScreen.setMaterial(material);
        projectionScreen.setTransparent(true);
        projectionScreen.setPosition(0, 0, PLANE_DISTANCE);
        getCurrentScene().addChild(projectionScreen);
    }

    @Override
    public void onNewFrame(HeadTransform headTransform) {
        headTransform.getHeadView(mHeadView, 0);

        mHeadViewMatrix4.setAll(mHeadView).inverse();
        mOrientation.fromMatrix(mHeadViewMatrix4);
    }

    @Override
    public void onDrawEye(Eye eye) {
        getCurrentCamera().updatePerspective(
                eye.getFov().getLeft(),
                eye.getFov().getRight(),
                eye.getFov().getBottom(),
                eye.getFov().getTop());

        mEyeMatrix.setAll(eye.getEyeView());
        mEyeOrientation.fromMatrix(mEyeMatrix);
        getCurrentCamera().setOrientation(mEyeOrientation);
        mEyePosition = mEyeMatrix.getTranslation(mEyePosition).inverse();
        getCurrentCamera().setPosition(mEyePosition);

        super.onRenderFrame(null);
    }

    @Override
    public void onFinishFrame(Viewport viewport) {
    }

    @Override
    public void onSurfaceChanged(int width, int height) {
        super.onRenderSurfaceSizeChanged(null, width, height);
    }

    @Override
    public void onSurfaceCreated(EGLConfig eglConfig) {
        super.onRenderSurfaceCreated(eglConfig, null, -1, -1);
    }

    @Override
    public void onRenderSurfaceCreated(EGLConfig config, GL10 gl, int width, int height) {
        super.onRenderSurfaceCreated(config, gl, width, height);
    }

    @Override
    public void onRendererShutdown() {
    }

    @Override
    protected void onRender(long elapsedRealTime, double deltaTime) {
        super.onRender(elapsedRealTime, deltaTime);
        if (mStreamingTexture != null) {
            // Pull the latest decoded video frame into the GL texture.
            mStreamingTexture.update();
        }
    }

    @Override
    public void onOffsetsChanged(float xOffset, float yOffset, float xOffsetStep, float yOffsetStep, int xPixelOffset, int yPixelOffset) {
    }

    @Override
    public void onTouchEvent(MotionEvent event) {
    }
}

Alternatively, you can implement your own tracker based (for example) on

com.google.vrtoolkit.cardboard.sensors.HeadTracker

Sure, you can get rid of all those preallocated fields, but they are there to avoid per-frame allocations and make the garbage collector's life easier.
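If you skip the Cardboard stereo view entirely, a tracker-driven mono renderer can be sketched like this. This is only a sketch: the HeadTracker method names follow the vrtoolkit Cardboard SDK as I know it (createFromContext, startTracking, getLastHeadView), so check them against your SDK version.

```java
// Sketch: drive the Rajawali camera from HeadTracker directly (no stereo pass).
// HeadTracker API names are assumptions based on the vrtoolkit Cardboard SDK.
public class TrackedRenderer extends Renderer {
    private final HeadTracker mHeadTracker;
    private final float[] mHeadView = new float[16];
    private final Matrix4 mHeadMatrix = new Matrix4();
    private final Quaternion mOrientation = Quaternion.getIdentity();

    public TrackedRenderer(Context context) {
        super(context);
        mHeadTracker = HeadTracker.createFromContext(context);
        mHeadTracker.startTracking(); // call stopTracking() from the Activity's onPause()
    }

    @Override
    protected void initScene() {
        // scene setup (sphere + StreamingTexture) as in the other answers
    }

    @Override
    protected void onRender(long elapsedRealTime, double deltaTime) {
        // Read the latest head pose and apply it to the scene camera.
        mHeadTracker.getLastHeadView(mHeadView, 0);
        mHeadMatrix.setAll(mHeadView).inverse();
        mOrientation.fromMatrix(mHeadMatrix);
        getCurrentCamera().setOrientation(mOrientation);
        super.onRender(elapsedRealTime, deltaTime);
    }
}
```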