[Live-devel] OutPacketBuffer::maxSize and TCP vs UDP

Google Admin bill at 2048-bit.com
Wed Sep 12 04:59:06 PDT 2018


I wanted to post this as a learning check, since I think I've finally
wrapped my head around an RTSP/RTP issue I've been having. Specifically,
I've been experimenting with a couple of different IP cameras.

I've found that I have no real issues streaming (or proxying) cameras at
720p or below, but now that I've moved on to some HD and UHD models (4MP
and 4K) I've been seeing nothing but black frames in VLC.

While testing, I found that if I switched the transport protocol to TCP, I
started getting video, even at 4K. I had previously been using UDP to
transport the RTP stream. A few packet captures showed that my average
packet loss was around 20%.
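
For reference, my understanding is that on a live555-based client (such as
the testRTSPClient demo, or the proxy's front end) the TCP-vs-UDP choice
comes down to the streamUsingTCP flag passed to sendSetupCommand(). The
sketch below follows the testRTSPClient pattern; ourRTSPClient,
StreamClientState, and continueAfterSETUP are helper names from that demo,
not from my actual code:

#include "liveMedia.hh"

#define REQUEST_STREAMING_OVER_TCP True  // False would request plain RTP-over-UDP

void setupNextSubsession(RTSPClient* rtspClient) {
  // Helper types and the callback below come from the testRTSPClient demo.
  StreamClientState& scs = ((ourRTSPClient*)rtspClient)->scs;
  scs.subsession = scs.iter->next();
  if (scs.subsession != NULL) {
    // The 4th argument (streamUsingTCP) asks the server to interleave
    // RTP/RTCP over the RTSP TCP connection instead of separate UDP ports.
    rtspClient->sendSetupCommand(*scs.subsession, continueAfterSETUP,
                                 False /*streamOutgoingTCP*/,
                                 REQUEST_STREAMING_OVER_TCP);
  }
}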

In an effort to understand why this was happening, I read Ross's post
about OutPacketBuffer::maxSize and the fact that a single lost packet
prevents the reassembly of an entire picture frame. I suppose I have a few
questions.

I'm now assuming that the mere fact that TCP is a reliable transport
protocol is what ensures I receive all of the packets needed to reassemble
the image frame. Am I missing something, or is it really that simple? Are
there any reasons why TCP would work when UDP does not?
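
Concretely, part of what I've been wondering is whether the UDP loss is
just the kernel dropping datagrams, and whether enlarging the RTP socket's
receive buffer would even matter. Something along these lines is what I
had in mind (increaseReceiveBufferTo() is from GroupsockHelper.hh; the
2 MB figure is an arbitrary value I picked for illustration):

#include "liveMedia.hh"
#include "GroupsockHelper.hh"  // for increaseReceiveBufferTo()

// Hypothetical helper, called after SETUP succeeds for a subsession.
void enlargeRtpReceiveBuffer(UsageEnvironment& env,
                             MediaSubsession& subsession) {
  if (subsession.rtpSource() != NULL) {
    int rtpSocketNum = subsession.rtpSource()->RTPgs()->socketNum();
    increaseReceiveBufferTo(env, rtpSocketNum, 2000000); // bytes, arbitrary
  }
}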

Lastly, I'm trying to understand how the OutPacketBuffer::maxSize value
specifically affects things. I noticed a comment that there is no point in
going larger than 65536 bytes, but I've seen a number of posts saying that
for giant 4K frames you should set that value much higher. If my MTU is
1500, how does raising that value affect the transport of my frames?
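
For reference, the pattern I've seen recommended on this list is to set it
once, before any sinks or server media sessions are created. A minimal
sketch of what I mean; the 4 MB value is just a guess on my part for large
4K I-frames, not a recommendation:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // As I understand it, maxSize bounds the buffer a sink uses to hold ONE
  // outgoing frame before it is fragmented into ~MTU-sized RTP packets; it
  // does not change the on-the-wire packet size. It has to be set before
  // any RTPSink is created.
  OutPacketBuffer::maxSize = 4000000; // bytes; assumed value for 4K streams

  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    return 1;
  }
  // ... add ServerMediaSessions here, then:
  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}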