<html>
<head>
<style>
p
{
margin: 0px;
padding: 0px;
}
body
{
font-size: 10pt;
font-family: Tahoma;
}
</style>
</head>
<body>
<BR>
Hi Mojtaba,<BR>
<BR>
Writing a tutorial takes some time, which I don't really have at the moment, sorry. But in short, what I did to create my streaming server was to:<BR>
<BR>
1) Create a custom DeviceSource class derived from FramedSource (see the FAQ; based on liveMedia/DeviceSource.cpp and liveMedia/DeviceSource.hh). Basically, the doGetNextFrame() and deliverFrame() functions contain code similar to the code in your x264VideoStreamFramer.cpp file (the else{} branch of your doGetNextFrame() function is in deliverFrame()). I also created a simple FIFO class, based on an STL queue, to store encoded video frames.<BR>
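The FIFO mentioned in step 1 might look something like the following. This is a minimal sketch of my own; the class and member names (FrameFifo, push, pop) are placeholders, not names from liveMedia:

```cpp
#include <cstdint>
#include <mutex>
#include <queue>
#include <utility>
#include <vector>

// A minimal thread-safe FIFO for encoded video frames, built on an STL
// queue. The encoder thread push()es finished frames; the live555 event
// loop pop()s them from the DeviceSource's deliverFrame().
class FrameFifo {
public:
    // Add an encoded frame to the back of the queue.
    void push(std::vector<uint8_t> frame) {
        std::lock_guard<std::mutex> lock(mutex_);
        queue_.push(std::move(frame));
    }

    // Remove the oldest frame; returns false if the FIFO is empty.
    bool pop(std::vector<uint8_t>& frame) {
        std::lock_guard<std::mutex> lock(mutex_);
        if (queue_.empty()) return false;
        frame = std::move(queue_.front());
        queue_.pop();
        return true;
    }

    bool empty() const {
        std::lock_guard<std::mutex> lock(mutex_);
        return queue_.empty();
    }

private:
    mutable std::mutex mutex_;
    std::queue<std::vector<uint8_t>> queue_;
};
```

In deliverFrame() you would then pop() a frame, copy at most fMaxSize bytes into fTo, and call FramedSource::afterGetting().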
<BR>
2) Create a custom RTPSink class derived from VideoRTPSink. This is essentially a modified copy of liveMedia/MPEG4ESVideoRTPSink.cpp, needed to handle my type of frames.<BR>
<BR>
3) Create a custom MediaSubsession class, overriding three functions: createNewStreamSource(), createNewRTPSink() and getAuxSDPLine(). The first two create instances of your custom DeviceSource and your custom RTPSink. getAuxSDPLine() starts the RTPSink and waits until my system header has been created, so that it can be put in the config=... part of the reply to the DESCRIBE request.<BR>
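In outline, the subsession in step 3 looks roughly like this. This is a non-compilable sketch against the live555 headers; MyDeviceSource, MyVideoRTPSink and fAuxSDPLine are placeholder names of mine, not real liveMedia identifiers:

```cpp
// Sketch only: requires the live555 headers plus the custom classes
// from steps 1 and 2.
class MyMediaSubsession : public OnDemandServerMediaSubsession {
protected:
    virtual FramedSource* createNewStreamSource(unsigned clientSessionId,
                                                unsigned& estBitrate) {
        estBitrate = 500; // kbps; a rough estimate for the encoder output
        return MyDeviceSource::createNew(envir());
    }

    virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                      unsigned char rtpPayloadTypeIfDynamic,
                                      FramedSource* inputSource) {
        return MyVideoRTPSink::createNew(envir(), rtpGroupsock,
                                         rtpPayloadTypeIfDynamic);
    }

    virtual char const* getAuxSDPLine(RTPSink* rtpSink,
                                      FramedSource* inputSource) {
        // Start the sink playing until the stream's system header is
        // known, then return an SDP line containing "config=...".
        // (The event-loop wait for the header is omitted here.)
        return fAuxSDPLine; // assumed member holding the generated line
    }
};
```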
<BR>
4) Create a custom RTSPServer class based on mediaServer/DynamicRTSPServer.hh and mediaServer/DynamicRTSPServer.cpp, changing the createNewSMS() function to handle my media type.<BR>
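The createNewSMS() change in step 4 essentially boils down to creating a ServerMediaSession and attaching the custom subsession from step 3. Again a sketch against the live555 headers, with placeholder names of mine:

```cpp
// Sketch: helper in the style of mediaServer/DynamicRTSPServer.cpp.
// MyMediaSubsession is the placeholder class from step 3.
static ServerMediaSession* createNewSMS(UsageEnvironment& env,
                                        char const* streamName) {
    ServerMediaSession* sms =
        ServerMediaSession::createNew(env, streamName, streamName,
                                      "Session streamed by my server");
    sms->addSubsession(new MyMediaSubsession(env)); // hypothetical ctor
    return sms;
}
```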
<BR>
That's it! I also set the maximum packet length (<FONT face="Times New Roman">OutPacketBuffer::maxSize</FONT>) in MediaSink.cpp to 120 KB.<BR>
<BR>
The whole thing is started up by ( see mediaServer/live555MediaServer.cpp )<BR>
<BR>
- Creating a BasicTaskScheduler<BR>
- Creating a BasicUsageEnvironment<BR>
- Creating the RTSP server<BR>
- Entering doEventLoop()<BR>
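The startup steps above can be sketched as follows (cf. mediaServer/live555MediaServer.cpp). MyRTSPServer stands in for the custom server class from step 4, and the port number is just an example; this will not compile without the live555 headers:

```cpp
// Sketch of the startup sequence for the custom streaming server.
int main() {
    TaskScheduler* scheduler = BasicTaskScheduler::createNew();
    UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

    OutPacketBuffer::maxSize = 120000; // room for large encoded frames

    RTSPServer* server = MyRTSPServer::createNew(*env, 554 /*port*/);
    if (server == NULL) {
        *env << "Failed to create RTSP server: "
             << env->getResultMsg() << "\n";
        return 1;
    }

    env->taskScheduler().doEventLoop(); // does not return
    return 0;
}
```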
<BR>
<BR>
By the way, thanks for your tutorial, it helped...<BR>
<BR>
<BR>
About the packet loss: for my application it will be necessary to create some container format that holds the video and audio frames. To fix the packet-loss detection problem I could, of course, put a frame number in the container format; this would avoid any changes to liveMedia, but it would be nice if some function or mechanism were added to report to the higher level that a frame was lost.<BR>
<BR>
<BR>
<BR>
Hello,<BR>I agree. I was also able to create an RTP streaming application for my H264 encoder within a week. More documentation would surely<BR>be appreciated by new users. Towards this, I documented my work (it has sample code + UML diagrams) and put it here:<BR><BR><A href="http://www.white.ca/patrick/tutorial.tar.gz"><U><FONT color=#0000ff>http://www.white.ca/patrick/tutorial.tar.gz</FONT></U></A><BR><BR>Do you think you would be able to do the same, Luc?<BR>I'm also interested in your question about packet loss. I have not yet had time to look at that part of RTP, but I will have to very soon.<BR>Are we to assume that presentation times will be regular (say, every 33 ms), so that a gap between them on the receiving end means a frame was lost?<BR>I may be wrong, but that doesn't seem like an elegant solution...<BR><BR>Mojtaba Hosseini</body>
</html>