[Live-devel] Lost packets

Mojtaba Hosseini MHosseini at newheights.com
Thu Jun 28 07:00:37 PDT 2007


Hello,
  I agree. I was also able to create an RTP streaming application for my H.264 encoder within a week, and more documentation would surely be appreciated by new users. Toward that end, I documented my work (with sample code and UML diagrams) and put it here:

http://www.white.ca/patrick/tutorial.tar.gz

Do you think you would be able to do the same, Luc?
I'm also interested in your question about packet loss. I haven't yet had time to look at that part of RTP, but I will have to very soon.
Are we to assume that presentation times will be regular (say, every 33 ms), so that a gap between them on the receiving end means a frame was lost?
I may be wrong, but that doesn't seem like an elegant solution...
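For what it's worth, the timing heuristic could be sketched like this. This is only an illustration, not anything liveMedia provides: it assumes a constant frame interval (e.g. 33 ms), and the function name is made up. It also shows exactly why the approach breaks down for variable-framerate streams, since a long gap is then indistinguishable from a lost frame:

```cpp
#include <cmath>

// Estimate how many frames were lost between two presentation times,
// ASSUMING a constant frame interval (e.g. 0.033 s for ~30 fps).
// Hypothetical helper, not part of the liveMedia API. For a
// variable-framerate stream this estimate is unreliable, because a
// legitimately long inter-frame gap looks identical to packet loss.
int framesLostByTiming(double prevPT, double curPT, double frameInterval) {
    double gap = curPT - prevPT;
    int intervals = static_cast<int>(std::lround(gap / frameInterval));
    return intervals > 1 ? intervals - 1 : 0;  // 0 when frames are consecutive
}
```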

Mojtaba Hosseini


-----Original Message-----
From: live-devel-bounces at ns.live555.com on behalf of Luc Roels
Sent: Thu 6/28/2007 7:14 AM
To: live-devel at ns.live555.com
Subject: [Live-devel] Lost packets
 
Hi Ross,
 
I've been able to create a simple streaming server for my 'modified H.264' video encoder card, and a simple viewing client, in just a couple of days using the liveMedia library; it might have been even faster if good documentation had been available :-). Even so, liveMedia is great; doing this from scratch would have taken me several weeks.

One more question, though, regarding packet loss. In a previous post you told me that I can detect packet loss by inspecting the presentation times at a higher level. I don't see how this can work properly. Suppose we are streaming live MPEG-4 video using RTP over the Internet. If a P frame isn't delivered because one or more of its constituent packets are lost, the client should stop decoding until it receives a new, complete I frame. I don't see how the client can detect the packet loss simply by looking at the presentation time. If the streaming server delivers a variable frame rate, there is no way to know that a frame was lost by looking at its presentation time, or am I wrong? The only way to detect the frame loss would be if the higher level had access to the frame's beginning and ending RTP sequence numbers, or am I mistaken? What would be the simplest way to detect this?
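[Editor's note: if sequence numbers are available at the higher level, gap detection itself is straightforward. The sketch below is a generic illustration of counting lost packets from consecutive RTP sequence numbers, handling the 16-bit wraparound described in RFC 3550; the function name is hypothetical and this is not part of the liveMedia API.]

```cpp
#include <cstdint>

// Count packets lost between two consecutively received RTP sequence
// numbers, handling 16-bit wraparound (RFC 3550 sequence numbers are
// modulo 2^16). Hypothetical helper, not a liveMedia function.
// Note: this treats any gap as loss; heavily reordered packets would
// need the more elaborate bookkeeping of RFC 3550, Appendix A.
int packetsLost(uint16_t prevSeq, uint16_t curSeq) {
    uint16_t expected = static_cast<uint16_t>(prevSeq + 1);
    uint16_t gap = static_cast<uint16_t>(curSeq - expected);
    return static_cast<int>(gap);  // 0 when packets arrive in order
}
```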
 
best regards,
 
Luc Roels


