[Live-devel] RTP header extension
PROMONET Michel
michel.promonet at thalesgroup.com
Mon Jan 21 04:08:33 PST 2013
Hi Ross,
Thanks for your analysis of the appropriate channel to use depending on the kind of data.
I would like to use an RTP header extension in order to send whether a frame is a synchronisation point, along with timestamps (recording time).
Do you think it's possible to provide a callback to the RTSPClient?
On the server side, it seems some work is needed in MultiFramedRTPSink to allow setting the X bit and filling the extension data, isn't it?
Thanks & Regards,
Michel.
From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On behalf of Ross Finlayson
Sent: Wednesday, January 16, 2013 16:05
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] RTP header extension
I guess it could be interesting to carry information inside the stream independently of the codec used.
That might be "interesting", but not necessarily appropriate. It depends on what sort of 'information' this is. The use of an RTP header extension is appropriate ***only if*** the information is directly related to the RTP packets (not just the stream as a whole). For example, one can imagine some RTP packets carrying an extra timestamp (e.g., a 'decoding timestamp'), in addition to the usual RTP timestamp (from which a 'presentation timestamp' is derived).
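For illustration only, here is a rough sketch of what such an extension could look like on the wire, following the RFC 3550 header-extension layout (a 16-bit profile-defined identifier, a 16-bit length in 32-bit words, then the data). The identifier value 0xABAC and the single 32-bit payload are placeholders, not anything defined by live555; the sender must also set the X bit in the fixed RTP header so receivers know the extension is present.

#include <cstdint>
#include <cstring>
#include <arpa/inet.h>

// Appends an RFC 3550-style header extension carrying one extra 32-bit
// timestamp (e.g., a 'decoding timestamp') after the fixed RTP header.
size_t appendHeaderExtension(uint8_t* buf, uint32_t decodingTimestamp) {
  uint16_t id  = htons(0xABAC); // profile-defined identifier (placeholder)
  uint16_t len = htons(1);      // extension length, in 32-bit words
  uint32_t ts  = htonl(decodingTimestamp);
  std::memcpy(buf,     &id,  2);
  std::memcpy(buf + 2, &len, 2);
  std::memcpy(buf + 4, &ts,  4);
  return 8; // total bytes written (4-byte extension header + 1 word of data)
}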
If the 'information' is static, and unchanging, then it could be put in the stream's SDP description (e.g., the 'info' or 'description' SDP lines). There are (optional) parameters to "ServerMediaSession::createNew()" to provide this information, and also - at the receiving end - member functions of "MediaSession" to get this information:
sessionName();
sessionDescription();
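As a minimal sketch (the stream name and the info/description strings below are placeholders), the server passes the strings to "ServerMediaSession::createNew()", and a receiver reads them back from the "MediaSession" it builds from the SDP returned by DESCRIBE:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

void sdpInfoExample(UsageEnvironment& env, char const* sdpDescription) {
  // Server side: the optional 'info' and 'description' arguments end up in
  // the stream's SDP description (the ServerMediaSession would then get its
  // subsessions added, and be added to the RTSPServer as usual):
  ServerMediaSession* sms = ServerMediaSession::createNew(
      env, "cameraStream", "Camera 1, building A", "H.264 stream with metadata");

  // Receiving side: the same information comes back via member functions of
  // the "MediaSession" created from the SDP returned by DESCRIBE:
  MediaSession* session = MediaSession::createNew(env, sdpDescription);
  if (session != NULL) {
    env << "name: " << session->sessionName()
        << ", description: " << session->sessionDescription() << "\n";
  }
}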
Another way to get information that's static (or doesn't change much) is to use the RTSP "GET_PARAMETER" command, as you've done.
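For example, a client-side sketch of polling such a parameter; the parameter name "my_metadata" and the handler are illustrative, not anything defined by live555:

#include "liveMedia.hh"

// Response handler: 'resultString' is heap-allocated by the library and must
// be delete[]'d by the handler.
void afterGetParameter(RTSPClient* client, int resultCode, char* resultString) {
  if (resultCode == 0) {
    client->envir() << "GET_PARAMETER response: " << resultString << "\n";
  }
  delete[] resultString;
}

void pollMetadata(RTSPClient* rtspClient, MediaSession* session) {
  // "my_metadata" is a made-up parameter name:
  rtspClient->sendGetParameterCommand(*session, afterGetParameter, "my_metadata");
}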
For information that is time-based - i.e., changes over time - but is not directly related to an existing media stream (i.e., the audio or video stream), the information could itself be its own RTP media stream - e.g., using the "text" media type. Note, for example, that we support time-varying T.140 text streams over RTP, using the class "T140TextRTPSink". (That's used for transmitting text over RTP; for receiving such streams, we just use "SimpleRTPSource".) We use such streams to transmit the 'subtitle' tracks from Matroska files (and VLC, when used as an RTP receiver, will also display these as subtitles).
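A rough sketch of setting up such a text stream on the sending side is below; the multicast address, port, and dynamic payload type 96 are placeholders, and 'textSource' stands for whatever FramedSource delivers the UTF-8 text:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"

void setupTextStream(UsageEnvironment& env, FramedSource* textSource) {
  struct in_addr destAddr;
  destAddr.s_addr = our_inet_addr("239.255.42.42");            // placeholder address
  Groupsock* rtpGroupsock = new Groupsock(env, destAddr, Port(18888), 255/*TTL*/);

  // Sender: a T.140 text RTP sink, fed by the source of UTF-8 text:
  T140TextRTPSink* textSink = T140TextRTPSink::createNew(env, rtpGroupsock, 96);
  textSink->startPlaying(*textSource, NULL, NULL);

  // A receiver would read such a stream back with a plain "SimpleRTPSource"
  // (96 = payload type, 1000 Hz = the usual T.140 RTP timestamp frequency):
  //   SimpleRTPSource::createNew(env, rtpGroupsock, 96, 1000, "text/T140");
}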
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/