<div dir="ltr">Hi, Ross,<div><br></div><div>Thank you for your reply, I should not express myself well, I think your method provided is the normal way in live555 which add live aac source audio in to current Media session, but I want to remain the aac audio RTP streamed by Gstreamer, and the H264 video is finished by live555, then is that possible that add the audio which RTP streamed out by Gstreamer to current live555 video RTSP media session.</div><div><br></div><div>Thank you.</div><div>Kevin</div><div><br></div></div><div class="gmail_extra"><br><div class="gmail_quote">On Fri, Apr 10, 2015 at 1:43 PM, Ross Finlayson <span dir="ltr"><<a href="mailto:finlayson@live555.com" target="_blank">finlayson@live555.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div style="word-wrap:break-word"><div><span class=""><blockquote type="cite"><div dir="ltr"><div>what kind od ServerMediaSubsession objects I should use?</div></div></blockquote><div><br></div></span><div>As with the video stream, you must write your own subclass of “OnDemandServerMediaSubsession”, and implement the “createNewStreamSource()” and “createNewRTPSink()” virtual functions.</div><div><br></div><div>Like the existing “ADTSAudioFileServerMediaSubsession” class, your subclass will implement the “createNewRTPSink()” virtual function by calling “MPEG4GenericRTPSink::createNew()” (see “liveMedia/include/MPEG4GenericRTPSink.hh”), with the following parameters (based on the SDP description that you gave):</div><div><span style="white-space:pre-wrap"> </span>rtpTimestampFrequency:<span style="white-space:pre-wrap"> </span>48000</div><div><span style="white-space:pre-wrap"> </span>sdpMediaTypeString:<span style="white-space:pre-wrap"> </span>“audio”</div><div><span style="white-space:pre-wrap"> </span>mpeg4Mode:<span style="white-space:pre-wrap"> </span>"AAC-hbr”</div><div><span style="white-space:pre-wrap"> </span>configString:<span style="white-space:pre-wrap"> </span>“1190”</div><div><span style="white-space:pre-wrap"> </span>numChannels:<span style="white-space:pre-wrap"> </span>2</div><div><br></div><div>You must also implement the “createNewStreamSource()” virtual function to deliver one AAC frame at a time from your input source. It’s important that you set “fPresentationTime” properly, if you want the audio to be properly synchronized with the video, once you later combine them in a single “ServerMediaSession”.</div></div><span class=""><br><br><div>
<span style="border-collapse:separate;color:rgb(0,0,0);font-family:Helvetica;font-style:normal;font-variant:normal;font-weight:normal;letter-spacing:normal;line-height:normal;text-align:-webkit-auto;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px"><span style="border-collapse:separate;color:rgb(0,0,0);font-family:Helvetica;font-style:normal;font-variant:normal;font-weight:normal;letter-spacing:normal;line-height:normal;text-align:-webkit-auto;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px">Ross Finlayson<br>Live Networks, Inc.<br><a href="http://www.live555.com/" target="_blank">http://www.live555.com/</a></span></span>
</div>
>
> _______________________________________________
> live-devel mailing list
> live-devel@lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel
-- 
Xingjun Chen
M.S. Degree Earned
Electrical and Computer Engineering
University of Arizona, Tucson AZ 85721