[Live-devel] Stream h264 - bitrate's effect on latency

Jeff Shanab jshanab at jfs-tech.com
Sat Apr 26 20:03:10 PDT 2014


What is the resolution? At higher resolutions and/or encoding qualities the
encoder/decoder may require multiple calls to pump out the frame. I assume
that you are sticking with a simple baseline profile specifically to avoid B
frames. (Bidirectional predictive B frames would obviously add latency.)
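To put rough numbers on that B-frame point (illustrative figures, not measured; the function name is mine):

```python
# Minimal sketch: a decoder must hold back B frames until their future
# reference arrives, so display lags capture by at least the number of
# consecutive B frames times the frame duration.
def b_frame_delay_ms(consecutive_b_frames: int, fps: float) -> float:
    frame_ms = 1000.0 / fps
    return consecutive_b_frames * frame_ms

# Baseline profile (no B frames) adds nothing; an IBBP pattern at 30 fps
# adds roughly two frame times of reorder delay.
print(b_frame_delay_ms(0, 30))  # 0.0
print(b_frame_delay_ms(2, 30))  # ~66.7 ms
```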

To synchronize, you are at the mercy of the slowest stream. Of course
MJPEG, with each frame standing on its own, would avoid the problem, but
at much higher bandwidth.

Other little things: the smaller the packet, the lower the latency. That is
why transport streams have such tiny 188- (or 204-) byte packets. This allows
mixing streams at a finer granularity and assembling them as soon as you have
all the info from multiple streams. (Each network packet of MTU ~1400 can
contain parts of multiple streams, so it appears more parallel to the
next level of code up.)
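The arithmetic behind that, as a quick sketch (using the 188-byte and ~1400-byte figures above):

```python
# Sketch of why small TS packets help interleaving: within one UDP datagram
# of ~1400 bytes of payload, seven 188-byte TS packets fit, and each of the
# seven can belong to a different elementary stream (different PID).
TS_PACKET = 188
MTU_PAYLOAD = 1400

packets_per_datagram = MTU_PAYLOAD // TS_PACKET
payload_used = packets_per_datagram * TS_PACKET
print(packets_per_datagram, payload_used)  # 7 1316
```

That 7 x 188 = 1316 bytes is the usual TS-over-UDP datagram size, which is why you can see pieces of several streams arrive "at once".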

The transport choice (UDP vs. TCP) can affect latency: if data is lost, TCP
has to re-send it, while UDP can drop it and move on. Therefore, less latency.

How is it encoded? Software or hardware/firmware? Software can be improved
with GPU encoding. (The Raspi may have this available.)
Software may also be improved by periodic intra refresh. Stock H.264 can have
large key frames that require a chunk of work to handle.
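A toy model of why intra refresh helps (all the frame sizes and the link rate here are hypothetical, just to show the shape of the effect):

```python
# Illustrative only: serialization delay of one frame over a fixed-rate link.
# A big IDR key frame stalls the pipe for a long burst; periodic intra
# refresh spreads the same intra data over many frames, so the per-frame
# delay stays nearly constant.
def serialization_ms(frame_bits: int, link_bps: int) -> float:
    return 1000.0 * frame_bits / link_bps

LINK = 2_000_000  # 2 Mbit/s link, hypothetical

idr = serialization_ms(400_000, LINK)     # one large key frame
p_frame = serialization_ms(50_000, LINK)  # typical inter frame
# Same intra data spread as a stripe over 30 consecutive frames:
refresh = serialization_ms(50_000 + 400_000 // 30, LINK)
print(idr, p_frame, round(refresh, 1))  # 200.0 25.0 31.7
```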

Whoa, did I hear you correctly? stdin? I cannot even speculate how you would
quantify stdin timing. (Try sockets or shared memory.)

Make sure you create your timestamps at the end of image capture / the
beginning of encoding. Since you are the source, the timestamps can all be
based on the same clock.
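Something like this sketch, using one monotonic clock for all three cameras (the function name is mine, not a live555 API; live555 normally derives presentation times internally):

```python
import time

# Stamp every frame from the same monotonic clock at capture time, then
# convert to the 90 kHz RTP video clock, wrapping at 32 bits as RTP does.
def rtp_timestamp(capture_s: float, clock_hz: int = 90000) -> int:
    return int(capture_s * clock_hz) & 0xFFFFFFFF

t = time.monotonic()  # one shared clock, read when each frame is captured
print(rtp_timestamp(t))
```

Because all three streams stamp against the same clock at capture, the receiver can line them up regardless of how fast each encoder pushes bits.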

Other idea: while normally the first substream of a video stream is the
audio, perhaps you could create a stream with n substreams, each a video.
Then they would be synced.




On Sat, Apr 26, 2014 at 7:02 AM, Mark Theunissen
<mark.theunissen at gmail.com>wrote:

> I have three Raspi cameras pushing out h264 elementary streams, which I'm
> reading from stdin and sending out using live555's RTSP server. All works
> great.
>
> The only problem is that the three camera feeds can sometimes have
> different bitrates depending on what they're looking at - I want them
> synchronized. It seems that the cameras with the higher bitrates are closer
> to real-time, and the lower bitrates lag more. I expect this is because
> there is some fixed-size buffer somewhere, and the higher the bitrate, the
> faster the buffer is filled and the more realtime the stream is.
>
> If I manually limit the bitrate so that all cameras are the same, they
> synchronize perfectly.
>
> However, I'd like to rather allow them to have different bitrates, but
> still be synchronized.
>
> Can anyone help with this? Is there a setting in live555 somewhere?
> Perhaps variable buffer size or rather, no buffer so that video is sent
> ASAP? Gstreamer manages extremely low latency, but uses too much CPU for my
> application.
>
> I'm just using the test app modified to read from stdin.
>
> Thanks
> Mark
>
>
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel
>
>

