I was thinking of using multiple Android devices (e.g. Nexus 7 tablets) to build a photo/video wall, and I'm wondering a) whether it is possible and b) how to synchronize the display across all these devices. Google showed off its Chrome Racer experiment, so clearly it is possible to synchronize displays across many devices.
So here are my questions:
- what technology should I use to synchronize the displays? Android? Chrome? Please point me to existing code if possible.
- what's the minimum lag between devices that could be achieved in such a setup?
- can video and sound playback also be started simultaneously on multiple devices (think video wall)?
- what kind of architecture should be considered for such a project? Centralized server that sends out commands? Should devices talk to each other?
I'm very curious about suggestions!
EDIT:
blinkendroid is the only app I've found so far that might do the job. Pros? Cons? Alternatives?
Technically the screen isn't shared; the game's state is shared, and the phones all render that state as they understand it.
Just a bit of background about Chrome Racer. We have a case study on it here, but it doesn't fully cover the question you are asking.
The primary technology used for communication in Racer is WebSockets. WebSockets let a client push messages to and receive messages from a server in near real time.
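As a rough idea of what that looks like on the client, here is a minimal sketch using the browser WebSocket API. The endpoint URL and the message shape are my own assumptions for illustration, not Racer's actual protocol:

```typescript
// Minimal browser-side WebSocket client (sketch only; the endpoint and
// message format are assumptions, not Racer's real protocol).
const socket = new WebSocket("wss://example.com/racer");

socket.addEventListener("open", () => {
  // Push a message to the server as soon as the connection is up.
  socket.send(JSON.stringify({ type: "hello", clientTime: Date.now() }));
});

socket.addEventListener("message", (event) => {
  // Receive messages pushed by the server in near real time.
  const msg = JSON.parse(event.data as string);
  console.log("server says:", msg);
});
```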
Racer starts a session for a game by giving it a unique ID and holding a WebSocket open to the user. Anyone who subsequently joins the game is told to use the same ID, and the server opens a WebSocket to them as well. Now the server knows all the participants.
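On the server side, that bookkeeping could look something like the sketch below, written against the Node `ws` package. The join message format and the session map are assumptions made for the example, not Racer's real implementation:

```typescript
// Server-side sketch: track participants per session ID so the server
// knows everyone in a game. Uses the "ws" npm package.
import { WebSocketServer, WebSocket } from "ws";

const sessions = new Map<string, Set<WebSocket>>();
const wss = new WebSocketServer({ port: 8080 });

wss.on("connection", (ws) => {
  ws.on("message", (data) => {
    const msg = JSON.parse(data.toString());
    if (msg.type === "join") {
      // The first joiner implicitly creates the session; later joiners
      // supply the same ID and are added to the same participant set.
      const peers = sessions.get(msg.sessionId) ?? new Set<WebSocket>();
      peers.add(ws);
      sessions.set(msg.sessionId, peers);
    }
  });
});
```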
When a game starts, a message is broadcast to all the participants asking them to get ready to start. During this phase the server works out how long it takes to round-trip messages to each client. It does this so that it can determine the latency between devices and attempt to compensate for the slower clients.
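The general technique is a ping/pong round trip per client, then delaying the faster clients by the difference to the slowest one. A simplified sketch (it assumes the first reply received is the pong, and the field names are made up):

```typescript
// Sketch of the "get ready" phase: ping each participant, measure the
// round trip, and derive a per-client delay to compensate slower clients.
import { WebSocket } from "ws";

function measureLatency(ws: WebSocket): Promise<number> {
  return new Promise((resolve) => {
    const sentAt = Date.now();
    // Simplification: treat the next message as the pong reply.
    ws.once("message", () => {
      // One-way latency approximated as half the round-trip time.
      resolve((Date.now() - sentAt) / 2);
    });
    ws.send(JSON.stringify({ type: "ping", sentAt }));
  });
}

// The slowest client sets the baseline; faster clients get an extra delay
// so the "start" takes effect everywhere at roughly the same moment.
async function computeStartDelays(clients: WebSocket[]): Promise<number[]> {
  const latencies = await Promise.all(clients.map(measureLatency));
  const slowest = Math.max(...latencies);
  return latencies.map((l) => slowest - l);
}
```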
Now that the server knows about the clients, the game can start properly. As the users play, their commands are pushed to the server over the WebSocket. The server then relays each message out to all the connected clients (like a satellite does), and it does the same thing for every user connected to the session. This is how the game's state is shared.
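The relay itself is just a fan-out loop over the session's sockets. A sketch, reusing the session set from the earlier example (again, the message envelope is an assumption):

```typescript
// Sketch of the relay step: every command pushed by one player is
// re-broadcast to every participant of the same session.
import { WebSocket } from "ws";

function relayCommand(peers: Set<WebSocket>, command: unknown): void {
  const payload = JSON.stringify({ type: "command", command });
  for (const peer of peers) {
    if (peer.readyState === WebSocket.OPEN) {
      peer.send(payload); // the server acts like a satellite, fanning out
    }
  }
}
```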
As each client receives the commands broadcast to it from the server, it updates its internal representation of the game and renders that to the screen.
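Client-side, that amounts to applying each incoming command to a local copy of the state and letting the render loop draw whatever that copy says. The state shape below is invented purely for illustration:

```typescript
// Client-side sketch: commands from the server mutate a local game state,
// and the render loop draws the current local understanding of it.
interface GameState {
  cars: Record<string, { position: number }>;
}

const state: GameState = { cars: {} };
const socket = new WebSocket("wss://example.com/racer");

socket.addEventListener("message", (event) => {
  const msg = JSON.parse(event.data as string);
  if (msg.type === "command") {
    // Apply the remote player's command to the local representation.
    state.cars[msg.command.playerId] = { position: msg.command.position };
  }
});

function render(): void {
  // Real drawing code (e.g. canvas) would read `state` here.
  requestAnimationFrame(render);
}
requestAnimationFrame(render);
```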
And that is about it.
Actually, we wanted to use the WebRTC Data Channel because it can reduce the number of hops that data has to make to reach a client. In our current solution a client pings the server and the server relays the message (2 hops); we could cut that latency in half if we could send the data directly to the other user (which is the goal of WebRTC). Unfortunately, WebRTC was not ubiquitous enough to deploy as a solution at the time.
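For completeness, this is roughly what the peer-to-peer alternative looks like with the browser WebRTC API. It is only a sketch: the offer/answer negotiation (which still goes through a signalling server) is omitted, and the channel name and message shape are assumptions:

```typescript
// Sketch of the peer-to-peer path: once an RTCPeerConnection has been
// negotiated via a signalling server, commands travel directly between
// players over a data channel, removing the server relay hop.
const pc = new RTCPeerConnection();
const channel = pc.createDataChannel("game-commands");

channel.onopen = () => {
  // Send a command straight to the other peer, no server relay.
  channel.send(JSON.stringify({ type: "command", position: 42 }));
};

channel.onmessage = (event) => {
  const msg = JSON.parse(event.data);
  console.log("peer command:", msg);
};
```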
WebSockets mean the web. You don't need the web to sync multiple devices in the same physical place. For video/music syncing, native apps using local offline technologies like Bluetooth or Wi-Fi sound more reliable.