I am working on a project where I am muxing a raw H.264 stream into fMP4 so I can play it through MSE (Media Source Extensions).
I am a bit confused about frame duration. For example:
If I provide 30 frames and the timescale is 1000, it plays for 1 second.
If I change the timescale to 800, the same 30 frames play for 2 seconds.
If I change the timescale to 500, the same 30 frames play for 3 seconds.
My question is: how does the player calculate the duration of a single frame from the timescale?
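For context, here is a minimal sketch of how I currently understand the timing math. The variable names and the 33-tick sample duration are my own assumptions for illustration, not values from any library:

```ts
// My current understanding: the mdhd "timescale" defines ticks per second,
// and each sample's duration (written into stts / trun) is expressed in
// those ticks. The player should then convert ticks back to seconds.

const timescale = 1000;          // ticks per second (from mdhd) -- assumption
const sampleDurationTicks = 33;  // per-frame duration I write into the fragment

// Duration of one frame in seconds, as I assume the player computes it:
const frameDurationSeconds = sampleDurationTicks / timescale;

// Total playback time for 30 frames:
const totalSeconds = 30 * frameDurationSeconds; // ≈ 0.99 s at timescale 1000

console.log({ frameDurationSeconds, totalSeconds });
```

Based on that reasoning I would expect changing the timescale from 1000 to 500 to double the playback time, but my observed numbers above don't line up with that, which is why I'm confused.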