<div dir="ltr"><br><div class="gmail_extra"><br><div class="gmail_quote">On Fri, May 8, 2015 at 1:23 AM, Ross Finlayson <span dir="ltr"><<a href="mailto:finlayson@live555.com" target="_blank">finlayson@live555.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex"><div style="word-wrap:break-word"><div><blockquote type="cite"><div><div dir="ltr" style="font-family:Helvetica;font-size:14px;font-style:normal;font-variant:normal;font-weight:normal;letter-spacing:normal;line-height:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px"><div class="gmail_extra"><div class="gmail_quote"><div>Having ruled out firewall and UDP transmission issues, what else could cause RTP-over-UDP to fail when RTP-over-TCP works fine?</div></div></div></div></div></blockquote><div><br></div>Nothing comes to mind, unfortunately. Assuming that you haven’t modified the supplied code, I can’t think of any reason (other than firewalls) why RTP-over-TCP would work, but RTP-over-UDP would not.</div></div></blockquote><div><br></div><div>Just to be sure, I wiped all the live555 code yesterday, untarballed it and rebuilt from scratch-- same result. We subclass OnDemandServerMediaSubsession, FramedSource (for our DeviceSource class) and create an RTSP server instance based on your example code. </div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex"><div style="word-wrap:break-word"><div><br></div><div>Let’s see your “OnDemandServerMediaSubsession” subclass. In particular, how have you implemented the “createNewStreamSource()” and “createNewRTPSink()” virtual functions?</div></div></blockquote><div><br></div><div>#include "H264SMS.h"</div><div>#include "Log.h"</div><div>#include "ImageRingBuffer.h"</div><div>#include <queue></div><div><br></div><div>#include "leakcheck.h" // always include this last in any file</div><div><br></div><div><br></div><div>H264SMS::H264SMS( UsageEnvironment& env, bool reuseFirstSource,</div><div> GetMoreEncodedDataCallback* moreDataCallback, void* userCtx,</div><div> std::vector<uint8_t> useSPS,</div><div> std::vector<uint8_t> usePPS</div><div> )</div><div> : OnDemandServerMediaSubsession( env, reuseFirstSource ),</div><div> sink(NULL),</div><div> source(NULL),</div><div> myPPS(usePPS),</div><div> mySPS(useSPS),</div><div> moreDataCb(moreDataCallback),</div><div> userContext(userCtx)</div><div>{</div><div><br></div><div>}</div><div><br></div><div>H264SMS *</div><div>H264SMS::createNew( UsageEnvironment& env, bool reuseFirstSource,</div><div> GetMoreEncodedDataCallback moreDataCallback, void* userCtx,</div><div> std::vector<uint8_t> useSPS,</div><div> std::vector<uint8_t> usePPS)</div><div>{</div><div> return new H264SMS( env, reuseFirstSource, moreDataCallback, userCtx,</div><div> useSPS, usePPS</div><div> );</div><div>}</div><div><br></div><div>H264SMS::~H264SMS()</div><div>{</div><div> //Medium::close(sink);</div><div> //Medium::close(source);</div><div>}</div><div><br></div><div>FramedSource* H264SMS::createNewStreamSource(unsigned, unsigned& estBitrate )</div><div>{</div><div><br></div><div> estBitrate = 20000;</div><div> unsigned timePerFrame = 1000000/30; // microseconds</div><div><br></div><div> source = H264DeviceSource::createNew( envir(), timePerFrame);</div><div> source->registerGetMoreEncodedDataCallback(moreDataCb, userContext);</div><div><br></div><div> H264VideoStreamDiscreteFramer* framer = H264VideoStreamDiscreteFramer::createNew(envir(), source);</div><div><br></div><div> return framer;</div><div>}</div><div><br></div><div>RTPSink* H264SMS::createNewRTPSink(Groupsock* rtpGroupsock,</div><div> unsigned char rtpPayloadTypeIfDynamic,</div><div> FramedSource* inputSource)</div><div>{</div><div> (void)inputSource;</div><div><br></div><div> OutPacketBuffer::maxSize = 2000000;</div><div><br></div><div> sink = H264VideoRTPSink::createNew( envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,</div><div> &mySPS[0], mySPS.size(), &myPPS[0], myPPS.size() );</div><div> return sink;</div><div>}</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex"><div style="word-wrap:break-word"><div><blockquote type="cite"><div dir="ltr" style="font-family:Helvetica;font-size:14px;font-style:normal;font-variant:normal;font-weight:normal;letter-spacing:normal;line-height:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px"><div class="gmail_extra"><div class="gmail_quote"><div>Here is how I am sending the encoded frames which I get from libavcodec to the DeviceSource (in case this rings any alarm bells):</div></div></div></div></blockquote><div><br></div>No, this looks OK. But what does your implementation of “doGetNextFrame()” do?</div></div></blockquote><div><br></div><div>I spent some time going through older posts on this mailing list and noticed the requirement that we not send multiple NALs to the H264VideoStreamDiscreteFramer. I put in a test looking for multiple NALs by matching the 0x00000001 start code and found that most encoded frame buffers have two NALs-- the first one appears to be SPS/PPS information. So I added code to split these NALs and deliver them one-at-a-time without the start sequence. The result was no video over TCP or UDP even though the NALs looked correct.</div><div><br></div><div>For reasons I don't fully understand (which may have to do with the behaviour of our HW encoder) we are manually extracting the SPS and PPS and feeding it to the constructor of the ServerMediaSubsession subclass (as you see above). Since this was suspect, I checked both that saveCopyOfSPS and saveCopyOfPPS we being called (they are) and that the SPS and PPS sequences match what we have extracted (they do). Just to be extra sure, I hacked in a pile of code from Fraunhofer H.264 reference codebase to validate the SPS and PPS we were sending you. They appear to parse okay and give the following result:</div><div><br></div><div><div>2015-May-08 09:28:14 SPS: profile_idc</div><div>2015-May-08 09:28:14 SPS: constrained_set0_flag</div><div>2015-May-08 09:28:14 SPS: constrained_set1_flag</div><div>2015-May-08 09:28:14 SPS: constrained_set2_flag</div><div>2015-May-08 09:28:14 SPS: constrained_set3_flag</div><div>2015-May-08 09:28:14 SPS: reserved_zero_4bits</div><div>2015-May-08 09:28:14 SPS: level_idc</div><div>2015-May-08 09:28:14 SPS: seq_parameter_set_id</div><div>2015-May-08 09:28:14 SPS: log2_max_frame_num_minus4</div><div>2015-May-08 09:28:14 SPS: pic_order_cnt_type</div><div>2015-May-08 09:28:14 SPS: num_ref_frames</div><div>2015-May-08 09:28:14 SPS: gaps_in_frame_num_value_allowed_flag</div><div>2015-May-08 09:28:14 SPS: pic_width_in_mbs_minus1</div><div>2015-May-08 09:28:14 SPS: pic_height_in_map_units_minus1</div><div>2015-May-08 09:28:14 SPS: frame_mbs_only_flag</div><div>2015-May-08 09:28:14 SPS: direct_8x8_inference_flag</div><div>2015-May-08 09:28:14 SPS: frame_cropping_flag</div><div>2015-May-08 09:28:14 SPS: vui_parameters_present_flag</div></div><div><br></div><div>Very glad you have you cast your eyes over the implementation below and above. I suspect our code is doing something wrong, but I am still not seeing where or how. Is there any extra tracing or validation you can recommend to make sure the stream and SDP are 100% correct going out the door on the server side?<br></div><div><br></div><div>void H264DeviceSource::doGetNextFrame() {<br></div><div><br></div><div> if (fNeedAFrame) {</div><div> Log::w("doGetNextFrame called while deliverFrame active");</div><div> return;</div><div> }</div><div><br></div><div> fNeedAFrame = True;</div><div> deliverFrameToClient();</div><div><br></div><div>} </div><div><br></div><div><div>void H264DeviceSource::deliverFrameToClient() {</div><div><br></div><div> fNeedAFrame = False;</div><div><br></div><div> pEncodedFrameAndMetadata frameAndMetadata;</div><div><br></div><div><br></div><div><br></div><div><span class="" style="white-space:pre"> </span>// Dequeue the next encoded frame object</div><div><span class="" style="white-space:pre"> </span>while (running){</div><div><br></div><div><span class="" style="white-space:pre"> </span>if (moreDataCallback != NULL)</div><div><span class="" style="white-space:pre"> </span>frameAndMetadata = moreDataCallback(userCtx);</div><div><span class="" style="white-space:pre"> </span>else</div><div><span class="" style="white-space:pre"> </span>return;</div><div><br></div><div><span class="" style="white-space:pre"> </span>if (frameAndMetadata.get() != NULL &&</div><div><span class="" style="white-space:pre"> </span>frameAndMetadata->encodedFrame.get() != NULL &&</div><div><span class="" style="white-space:pre"> </span>!frameAndMetadata->wasStreamed)</div><div><span class="" style="white-space:pre"> </span>{</div><div><span class="" style="white-space:pre"> </span>if (!frameAndMetadata->encodedFrame->empty()) {</div><div><span class="" style="white-space:pre"> </span>break;</div><div><span class="" style="white-space:pre"> </span>} else {</div><div><span class="" style="white-space:pre"> </span>Log::i("Empty frame!!!");</div><div><span class="" style="white-space:pre"> </span>}</div><div><span class="" style="white-space:pre"> </span>}</div><div><br></div><div><span class="" style="white-space:pre"> </span>boost::this_thread::sleep(boost::posix_time::milliseconds(20));</div><div><span class="" style="white-space:pre"> </span>}</div><div><br></div><div><span class="" style="white-space:pre"> </span>frameAndMetadata->wasStreamed = true;</div><div><br></div><div><span class="" style="white-space:pre"> </span>//Log::i("RTSP WE HAVE H264 FRAME.");</div><div><br></div><div><span class="" style="white-space:pre"> </span>lastImageAcquisitionTime = VTime::currentTimeMillis();</div><div><br></div><div><span class="" style="white-space:pre"> </span>// Set the 'presentation time': the time that this frame was sent</div><div><br></div><div><span class="" style="white-space:pre"> </span>gettimeofday(&fPresentationTime, NULL);</div><div><br></div><div><span class="" style="white-space:pre"> </span>fDurationInMicroseconds = 27027;</div><div><br></div><div><span class="" style="white-space:pre"> </span>startCapture(); // prepare the next frame</div><div><br></div><div><br></div><div><span class="" style="white-space:pre"> </span>if (!running){</div><div><span class="" style="white-space:pre"> </span>return;</div><div><span class="" style="white-space:pre"> </span>}</div><div><br></div><div><span class="" style="white-space:pre"> </span>// strip start sequence and copy to the fTo buffer:</div><div><span class="" style="white-space:pre"> </span>populateEncodedImage(frameAndMetadata);</div><div><br></div><div><br></div><div><br></div><div> if (fFrameSize >= fMaxSize) {</div><div> Log::e("H264DeviceSource::deliverFrameToClient(): read maximum buffer size. Frame may be truncated");</div><div> }</div><div><br></div><div> frameDeliveryRate.countEvent();</div><div><br></div><div><br></div><div>}</div></div><div><br>Thanks again for your help.</div><div><br></div><div>Dave</div><div><br></div></div><div><br></div><div class="gmail_signature"><div dir="ltr"></div></div>
</div></div>