<div>Thanks for the response, Debargha.</div>
<div><br><< You need to send NAL units to your H264VideoRTPSink. Is the X264 encoder you are using creating a single NAL unit per video frame encoded? >></div>
<div> </div>
<div>My encoder is creating a single NAL unit per encoded video frame. I verified that this evening just to double-check. I do see that the encoder is capable of generating multiple NAL units per frame, but when I checked the sizes of the incoming buffers, it was clear that I was getting a single NAL unit per frame.</div>
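<div> </div>
<div>For what it's worth, this is roughly how I checked it. The function below is just a sketch of my test (encodeAndCountNals is a placeholder name of mine, and the x264 handle/picture setup isn't shown); it reports how many NAL units x264 hands back for one encoded frame:</div>
<pre>
#include <cstdio>
#include <x264.h>

// Sketch: report how many NAL units x264 returns for a single encoded frame.
// (The x264_t* handle and the input picture come from the usual x264 setup
// code, which isn't shown here.)
static int encodeAndCountNals(x264_t* encoder, x264_picture_t* picIn) {
  x264_nal_t* nals = NULL;
  int numNals = 0;
  x264_picture_t picOut;

  int frameSize = x264_encoder_encode(encoder, &nals, &numNals, picIn, &picOut);
  if (frameSize > 0) {
    // In my tests numNals has always been 1; each nals[i] is one NAL unit
    // (payload at nals[i].p_payload, length nals[i].i_payload bytes).
    fprintf(stderr, "encoded %d bytes in %d NAL unit(s)\n", frameSize, numNals);
  }
  return numNals;
}
</pre>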
<div> </div>
<div>Because of this, I have currentNALUnitEndsAccessUnit() always return true.</div>
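<div> </div>
<div>In case it helps, here is the rough shape of the framer I have right now. Treat it as a sketch: the class name and helper names are my own, the createNew()/constructor plumbing is elided, and the base-class constructor arguments may differ between live555 versions:</div>
<pre>
#include "H264VideoStreamFramer.hh"

// Rough sketch of my x264 framer. Each buffer arriving from the encoder is
// assumed to already be exactly one NAL unit (no Annex-B start code), so it
// is passed straight through to the sink.
class x264VideoStreamFramer : public H264VideoStreamFramer {
public:
  // One NAL unit per encoded frame, so every NAL unit ends an access unit.
  virtual Boolean currentNALUnitEndsAccessUnit() { return True; }

protected:
  x264VideoStreamFramer(UsageEnvironment& env, FramedSource* inputSource)
    : H264VideoStreamFramer(env, inputSource) {} // base-class args may differ by version

  // Ask the upstream source (my encoder wrapper) for the next NAL unit.
  virtual void doGetNextFrame() {
    fInputSource->getNextFrame(fTo, fMaxSize,
                               afterGettingFrame, this,
                               FramedSource::handleClosure, this);
  }

private:
  static void afterGettingFrame(void* clientData, unsigned frameSize,
                                unsigned numTruncatedBytes,
                                struct timeval presentationTime,
                                unsigned durationInMicroseconds) {
    x264VideoStreamFramer* framer = (x264VideoStreamFramer*)clientData;
    // Copy the bookkeeping through unchanged and deliver the NAL unit.
    framer->fFrameSize = frameSize;
    framer->fNumTruncatedBytes = numTruncatedBytes;
    framer->fPresentationTime = presentationTime;
    framer->fDurationInMicroseconds = durationInMicroseconds;
    FramedSource::afterGetting(framer);
  }
};
</pre>
<div> </div>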
<div><< In order to write an x264VideoStreamFramer class, I would also model it based on the MPEG4VideoStreamDiscreteFramer class, and then add an appropriate parsing mechanism to extract NAL units from the encoded frame bit-stream received >></div>
<div> </div>
<div>I haven't looked at that framer, but I will.</div>
<div> </div>
<div>After tonight, I'm convinced something is wrong with my SDP stuff. I fired up Wireshark to look at the traffic between my transmitting application and VLC, and I don't see the SDP anywhere in the bitstream. Would it be contained in the RTP packets, or in the RTCP packets? I feel like I'm pretty close to having this work, but every time I fire up VLC it does absolutely nothing with the stream, so I'm not sure what I'm missing here.</div>
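<div> </div>
<div>For reference, this is roughly the kind of H.264 media description I've been grepping the capture for (the payload type and parameter values below are just placeholders, not my actual ones):</div>
<pre>
m=video 0 RTP/AVP 96
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1;profile-level-id=42E01E;sprop-parameter-sets=&lt;base64 SPS&gt;,&lt;base64 PPS&gt;
</pre>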
<div> </div>
<div>The other thing I don't understand is that I see other people using the string "h264" for the sprop string in the framer--why do I need to include my SPS/PPS parameters when other people have reported implementations working just fine with "h264" passed in for that entire string? I'm definitely confused about how RTSP, RTP and SDP all operate together, and about what Live555 provides with respect to each of these technologies.</div>
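<div> </div>
<div>For context, this is roughly what I've been doing to build that string: base64-encoding the SPS and PPS NAL units from the encoder and joining them with a comma before handing the result to the sink. It's only a sketch -- the helper name, variables, and payload type are placeholders, and the createNew() signature is the one from the live555 version I'm using, which may differ in other versions:</div>
<pre>
#include "liveMedia.hh"
#include "Base64.hh"   // live555's base64Encode()
#include <string>

// Sketch: turn the raw SPS/PPS NAL units from the encoder (no start codes)
// into the "sprop-parameter-sets" value, i.e. base64(SPS),base64(PPS),
// then hand it to the RTP sink. Names here are placeholders of mine.
static H264VideoRTPSink* createSinkWithSprop(UsageEnvironment& env,
                                             Groupsock* rtpGroupsock,
                                             unsigned char const* sps, unsigned spsSize,
                                             unsigned char const* pps, unsigned ppsSize) {
  char* spsBase64 = base64Encode((char const*)sps, spsSize);
  char* ppsBase64 = base64Encode((char const*)pps, ppsSize);
  std::string sprop = std::string(spsBase64) + "," + ppsBase64;
  delete[] spsBase64;
  delete[] ppsBase64;

  // profile_level_id comes from bytes 1-3 of the SPS (profile_idc,
  // constraint flags, level_idc).
  unsigned profileLevelId = (sps[1] << 16) | (sps[2] << 8) | sps[3];

  // 96 is just a dynamic RTP payload type. NOTE: this is the older
  // createNew() signature that takes profile_level_id and a sprop string.
  return H264VideoRTPSink::createNew(env, rtpGroupsock, 96,
                                     profileLevelId, sprop.c_str());
}
</pre>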