I would like to implement a feature like Instagram Stories (text overlay only).
I got to the point where the user can add some text on top of the video, as in the screenshot below (the upper-right icon starts text entry, the upper-left one just goes back to the previous page).
After the user has added some text, I want to store the video in Firebase Storage.
But the problem is: how can I keep this text in the video?
Is there any way to rewrite (re-encode) the file so that it contains the text overlay the user added?
Or do I have to store the text info in a database, then fetch it and display it every time?
I can only give a partial answer; I hope it helps anyway.
You can use a PictureRecorder to export a bitmap or PNG of a Canvas in Flutter.
The PNG image should have the same size as the source video, and you can overlay it on the video with a simple Image widget.
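For example, if you play the video with the video_player plugin (just an assumption, any player widget works), a rough, untested sketch of the overlay could look like this:

import 'dart:typed_data';

import 'package:flutter/material.dart';
import 'package:video_player/video_player.dart';

/// Sketch: stacks the generated PNG (as raw bytes) on top of the playing video.
/// [controller] is assumed to be an initialized VideoPlayerController and
/// [overlayPngBytes] the bytes produced by the example further below.
Widget buildVideoWithOverlay(VideoPlayerController controller, Uint8List overlayPngBytes) {
  return AspectRatio(
    aspectRatio: controller.value.aspectRatio,
    child: Stack(
      fit: StackFit.expand,
      children: <Widget>[
        VideoPlayer(controller),
        // The PNG has the same size as the video, so it scales along with it.
        Image.memory(overlayPngBytes, fit: BoxFit.fill),
      ],
    ),
  );
}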
You can also upload this PNG image to Firebase, then download it on other clients to get exactly the same appearance (even when fonts are not installed).
The cool thing is that you can even save things like hand drawings, stickers, gradients and complex shapes (everything you can draw on a canvas) in the PNG image.
I guess you could also use some kind of native library to bake the PNG image into the video if that is a requirement.
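For example, FFmpeg's overlay filter can burn the transparent PNG into every frame. I haven't tried this from Flutter myself, but with a wrapper plugin such as ffmpeg_kit_flutter (just an assumption, and its API may differ slightly) it could look roughly like this:

import 'package:ffmpeg_kit_flutter/ffmpeg_kit.dart';

/// Untested sketch: re-encodes the video with the PNG overlay baked in.
/// The file paths are placeholders; use your real local paths.
Future<void> bakeOverlayIntoVideo(String videoPath, String overlayPath, String outputPath) async {
  // "overlay=0:0" draws the second input (the PNG) on top of the first (the video);
  // "-codec:a copy" keeps the original audio track untouched.
  await FFmpegKit.execute(
    '-i $videoPath -i $overlayPath -filter_complex overlay=0:0 -codec:a copy $outputPath',
  );
}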
Here is a simple example that shows how to generate and display such a PNG image:
import 'dart:async';
import 'dart:typed_data';
import 'dart:ui' as ui;

import 'package:flutter/material.dart';

/// Draws [text] onto a transparent canvas of [size] (the size of the video)
/// and returns the result as an image.
Future<ui.Image> createTextImage(Size size, TextSpan text) {
  final recorder = ui.PictureRecorder();
  final cullRect = Offset.zero & size;
  final canvas = Canvas(recorder, cullRect);

  final textPainter = TextPainter(textDirection: TextDirection.ltr, text: text);
  textPainter.layout();

  // Draw the text in the center of the canvas; adjust the offset as you like.
  final textOffset =
      cullRect.center.translate(-textPainter.width / 2, -textPainter.height / 2);
  textPainter.paint(canvas, textOffset);

  // You can also draw other geometrical shapes, gradients, paths...
  canvas.drawCircle(Offset(100.0, 100.0), 50.0, Paint()..color = Color(0xffff00ff));

  final picture = recorder.endRecording();
  return picture.toImage(size.width.toInt(), size.height.toInt());
}

void main() => runApp(MyApp());

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'Canvas Test',
      home: MyHomePage(),
    );
  }
}

class MyHomePage extends StatefulWidget {
  @override
  _MyHomePageState createState() => _MyHomePageState();
}

class _MyHomePageState extends State<MyHomePage> {
  /// Bytes of the generated PNG image.
  Future<Uint8List>? _imageBytes;

  void _generateImage() {
    // Get this size from your video.
    final videoSize = Size(720.0, 1280.0);
    final textStyle = TextStyle(
      fontFamily: 'Roboto',
      fontSize: 80.0,
      color: Colors.red,
    );
    final text = TextSpan(text: 'Hello World', style: textStyle);

    // Generate the image and encode it as a PNG.
    final imageBytes = createTextImage(videoSize, text).then((image) async {
      final byteData = await image.toByteData(format: ui.ImageByteFormat.png);
      return byteData!.buffer.asUint8List();
    });

    setState(() {
      _imageBytes = imageBytes;
    });
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: Text('Canvas Test'),
      ),
      body: Center(
        child: Column(
          mainAxisSize: MainAxisSize.min,
          children: <Widget>[
            FutureBuilder<Uint8List>(
              future: _imageBytes,
              builder: (BuildContext context, AsyncSnapshot<Uint8List> snapshot) {
                if (!snapshot.hasData) return Text('No data');
                // Display the generated image in a box.
                return DecoratedBox(
                  decoration: BoxDecoration(border: Border.all()),
                  child: Image.memory(
                    snapshot.data!,
                    width: 180.0,
                    height: 320.0,
                  ),
                );
              },
            ),
            ElevatedButton(onPressed: _generateImage, child: Text('Generate Image')),
          ],
        ),
      ),
    );
  }
}
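If you go the upload route mentioned above, storing the generated bytes in Firebase Storage could look roughly like this (untested sketch; it assumes the firebase_storage plugin is set up and the storage path is just a placeholder):

import 'dart:typed_data';

import 'package:firebase_storage/firebase_storage.dart';

/// Uploads the generated PNG and returns a download URL that other
/// clients can use to fetch the overlay and stack it on the video.
Future<String> uploadOverlay(Uint8List pngBytes, String storyId) async {
  final ref = FirebaseStorage.instance.ref('stories/$storyId/overlay.png');
  await ref.putData(pngBytes, SettableMetadata(contentType: 'image/png'));
  return ref.getDownloadURL();
}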