I'm supposed to write an app for iOS and Android that sometimes shows a customized video player on part of the screen. I have to be able to control it (seek, play, pause, set speed, choose the video, ...). I know that media playback like this is not yet supported in Gluon.
But would it be possible to write such a component in Xcode and Android Studio and somehow embed it in a Gluon app?
Following the design patterns in the Gluon Charm Down library, this could be a basic Android implementation of a VideoService. It is based on this tutorial, adapted to work on top of the SurfaceView that JavaFX currently uses: it creates a TextureView that is placed in the center of the screen, over the current view, taking 95% of its width.
With the Gluon plugin for your IDE, create a Single View Project.
- Place these two classes under Source Packages, package com.gluonhq.charm.down.plugins:
VideoService interface
package com.gluonhq.charm.down.plugins;

public interface VideoService {

    void play(String videoName);
    void stop();
    void pause();
    void resume();
}
VideoServiceFactory class
package com.gluonhq.charm.down.plugins;

import com.gluonhq.charm.down.DefaultServiceFactory;

public class VideoServiceFactory extends DefaultServiceFactory<VideoService> {

    public VideoServiceFactory() {
        super(VideoService.class);
    }
}
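With the factory in place, the cross-platform code can look up the service through Charm Down's Services API, which resolves the platform implementation by naming convention (the same package plus .android, and an Android prefix on the class name). A minimal usage sketch (the class name here is just illustrative; the BasicView sample below does the same thing):

import com.gluonhq.charm.down.Services;
import com.gluonhq.charm.down.plugins.VideoService;

public class VideoDemo {

    // Resolves AndroidVideoService on Android and starts playback,
    // or does nothing on platforms without an implementation.
    public static void playSample() {
        Services.get(VideoService.class)
                .ifPresent(video -> video.play("big_buck_bunny.mp4"));
    }
}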
- Android Package: Place this class under Android/Java Packages, package com.gluonhq.charm.down.plugins.android:
AndroidVideoService class
package com.gluonhq.charm.down.plugins.android;

import android.content.Context;
import android.content.res.AssetFileDescriptor;
import android.graphics.SurfaceTexture;
import android.media.MediaMetadataRetriever;
import android.media.MediaPlayer;
import android.util.DisplayMetrics;
import android.util.Log;
import android.view.Surface;
import android.view.TextureView;
import android.view.WindowManager;
import android.widget.RelativeLayout;
import com.gluonhq.charm.down.plugins.VideoService;
import java.io.IOException;
import javafxports.android.FXActivity;

public class AndroidVideoService implements VideoService, TextureView.SurfaceTextureListener {

    private static final String TAG = AndroidVideoService.class.getName();

    private MediaPlayer mMediaPlayer;
    private String videoName;

    private final RelativeLayout relativeLayout;
    private final TextureView textureView;
    private final DisplayMetrics displayMetrics;

    public AndroidVideoService() {
        displayMetrics = new DisplayMetrics();
        WindowManager windowManager = (WindowManager) FXActivity.getInstance().getSystemService(Context.WINDOW_SERVICE);
        windowManager.getDefaultDisplay().getMetrics(displayMetrics);

        // The TextureView is wrapped in a RelativeLayout so it can be
        // centered on top of the JavaFX view.
        relativeLayout = new RelativeLayout(FXActivity.getInstance());
        textureView = new TextureView(FXActivity.getInstance());
        textureView.setSurfaceTextureListener(this);
        relativeLayout.addView(textureView);
    }

    @Override
    public void play(String videoName) {
        this.videoName = videoName;
        stop();
        // Adding the layout triggers onSurfaceTextureAvailable, where the
        // MediaPlayer is created and started.
        FXActivity.getInstance().runOnUiThread(() ->
                FXActivity.getViewGroup().addView(relativeLayout));
    }

    @Override
    public void stop() {
        if (mMediaPlayer != null) {
            mMediaPlayer.stop();
            mMediaPlayer.release();
            mMediaPlayer = null;
        }
        if (relativeLayout != null) {
            FXActivity.getInstance().runOnUiThread(() ->
                    FXActivity.getViewGroup().removeView(relativeLayout));
        }
    }

    @Override
    public void pause() {
        if (mMediaPlayer != null) {
            mMediaPlayer.pause();
        }
    }

    @Override
    public void resume() {
        if (mMediaPlayer != null) {
            mMediaPlayer.start();
        }
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int width, int height) {
        Surface surface = new Surface(surfaceTexture);
        try {
            AssetFileDescriptor afd = FXActivity.getInstance().getAssets().openFd(videoName);
            calculateVideoSize(afd);
            mMediaPlayer = new MediaPlayer();
            mMediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
            mMediaPlayer.setSurface(surface);
            mMediaPlayer.setLooping(true);
            mMediaPlayer.setOnPreparedListener(mediaPlayer -> mediaPlayer.start());
            mMediaPlayer.prepareAsync();
        } catch (IllegalArgumentException | SecurityException | IllegalStateException | IOException e) {
            Log.d(TAG, e.getMessage());
        }
    }

    @Override public void onSurfaceTextureSizeChanged(SurfaceTexture surfaceTexture, int width, int height) { }
    @Override public boolean onSurfaceTextureDestroyed(SurfaceTexture surfaceTexture) { return true; }
    @Override public void onSurfaceTextureUpdated(SurfaceTexture surfaceTexture) { }

    // Sizes the TextureView to 95% of the screen width, keeping the video's
    // aspect ratio, and centers it over the JavaFX view.
    private void calculateVideoSize(AssetFileDescriptor afd) {
        try {
            MediaMetadataRetriever metaRetriever = new MediaMetadataRetriever();
            metaRetriever.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
            String height = metaRetriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_HEIGHT);
            String width = metaRetriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_WIDTH);
            double factor = Double.parseDouble(width) > 0 ? Double.parseDouble(height) / Double.parseDouble(width) : 1d;
            // 95% screen width
            RelativeLayout.LayoutParams lp = new RelativeLayout.LayoutParams(
                    (int) (0.95 * displayMetrics.widthPixels),
                    (int) (0.95 * displayMetrics.widthPixels * factor));
            lp.addRule(RelativeLayout.CENTER_IN_PARENT);
            textureView.setLayoutParams(lp);
        } catch (NumberFormatException e) {
            Log.d(TAG, e.getMessage());
        }
    }
}
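The question also asks for seeking and changing the playback speed, which the interface above doesn't expose. As a rough, illustrative extension (the method names seekTo and setRate are not part of Charm Down, and setPlaybackParams needs Android API 23+), the Android implementation could delegate those calls to the same MediaPlayer:

// Hypothetical additions; declare them in the VideoService interface as well:
//     void seekTo(int milliseconds);
//     void setRate(double rate);

@Override
public void seekTo(int milliseconds) {
    if (mMediaPlayer != null) {
        mMediaPlayer.seekTo(milliseconds);
    }
}

@Override
public void setRate(double rate) {
    // Playback speed is only adjustable via PlaybackParams (API 23+);
    // note that setting a non-zero speed also resumes a paused player.
    if (mMediaPlayer != null && android.os.Build.VERSION.SDK_INT >= 23) {
        mMediaPlayer.setPlaybackParams(
                mMediaPlayer.getPlaybackParams().setSpeed((float) rate));
    }
}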
- Sample: Place a video file, like big_buck_bunny.mp4 (which can be downloaded from here), in the src/android/assets folder.
BasicView
import com.gluonhq.charm.down.Services;
import com.gluonhq.charm.down.plugins.VideoService;
import com.gluonhq.charm.glisten.control.AppBar;
import com.gluonhq.charm.glisten.mvc.View;
import com.gluonhq.charm.glisten.visual.MaterialDesignIcon;

public class BasicView extends View {

    private boolean paused;

    public BasicView(String name) {
        super(name);
    }

    @Override
    protected void updateAppBar(AppBar appBar) {
        appBar.setNavIcon(MaterialDesignIcon.MENU.button());
        appBar.setTitleText("Video View");

        // big_buck_bunny.mp4 video in src/android/assets:
        Services.get(VideoService.class).ifPresent(video -> {
            appBar.getActionItems().add(MaterialDesignIcon.PLAY_ARROW.button(e -> video.play("big_buck_bunny.mp4")));
            appBar.getActionItems().add(MaterialDesignIcon.PAUSE.button(e -> {
                if (!paused) {
                    video.pause();
                    paused = true;
                } else {
                    video.resume();
                    paused = false;
                }
            }));
            appBar.getActionItems().add(MaterialDesignIcon.STOP.button(e -> video.stop()));
        });
    }
}
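For reference, the Single View Project template already generates the application class that registers this view; it looks roughly like this (the class and view names are whatever the template generated for you):

import com.gluonhq.charm.glisten.application.MobileApplication;

public class GluonVideoApplication extends MobileApplication {

    @Override
    public void init() {
        // Registers the view shown above as the home view.
        addViewFactory(HOME_VIEW, () -> new BasicView("Video View"));
    }
}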
Deploy on your Android device and test it.
Note that the TextureView will stay on top of the JavaFX view until you remove it by pressing the stop button.
The native video player (or, in this case, a way of "previewing" a video) was used in the following example:
https://gist.github.com/bgmf/d87a2bac0a5623f359637a3da334f980
Aside from some prerequisites, the code looks like this:
package my.application;

import org.robovm.apple.foundation.*;
import org.robovm.apple.uikit.UIApplication;
import org.robovm.apple.uikit.UIDocumentInteractionController;
import org.robovm.apple.uikit.UIDocumentInteractionControllerDelegateAdapter;
import org.robovm.apple.uikit.UIViewController;
import java.util.logging.Logger;

public class NativeVideoServiceIOS extends PathHelperIOS implements NativeVideoService {

    private static final Logger LOG = Logger.getLogger(NativeVideoServiceIOS.class.getName());

    public NativeVideoServiceIOS() {
        LOG.warning("Initialized Native Video Service with path: " + this.pathBase);
    }

    @Override
    public void triggerPlatformApp(String filename) {
        String fullfile = pathBase.getAbsolutePath() + filename;

        // Let iOS preview the file (videos included) with a document interaction
        // controller, anchored on the root view controller of the JavaFX window.
        NSURL url = new NSURL(NSURLScheme.File, "", fullfile);
        UIDocumentInteractionController popup = new UIDocumentInteractionController(url);
        popup.setDelegate(new UIDocumentInteractionControllerDelegateAdapter() {
            @Override
            public UIViewController getViewControllerForPreview(UIDocumentInteractionController controller) {
                return UIApplication.getSharedApplication()
                        .getWindows().first().getRootViewController();
            }
        });
        popup.presentPreview(true);
    }
}
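The document preview above does not give you programmatic control over playback. For play/pause/speed on iOS, a similar class could present the native player through AVKit instead. This is only a sketch, assuming the RoboVM AVFoundation/AVKit bindings (AVPlayer, AVPlayerViewController) mirror the native API as usual; seeking would go through AVPlayer's seekToTime: with a CMTime in the same way. The class name is made up:

package my.application;

import org.robovm.apple.avfoundation.AVPlayer;
import org.robovm.apple.avkit.AVPlayerViewController;
import org.robovm.apple.foundation.NSURL;
import org.robovm.apple.foundation.NSURLScheme;
import org.robovm.apple.uikit.UIApplication;

// Sketch only: assumes the RoboVM bindings expose these constructors and accessors.
public class AVPlayerVideoServiceIOS {

    private AVPlayer player;

    // Presents the native iOS player for a file on disk and keeps a handle
    // to the AVPlayer so playback can be controlled afterwards.
    public void play(String absolutePath) {
        player = new AVPlayer(new NSURL(NSURLScheme.File, "", absolutePath));
        AVPlayerViewController controller = new AVPlayerViewController();
        controller.setPlayer(player);
        UIApplication.getSharedApplication().getWindows().first()
                .getRootViewController()
                .presentViewController(controller, true, null);
        player.play();
    }

    public void pause() {
        if (player != null) {
            player.pause();
        }
    }

    public void setRate(double rate) {
        if (player != null) {
            player.setRate((float) rate); // 1.0 is normal speed; a non-zero rate also resumes playback
        }
    }
}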