[Live-devel] Frames are corrupted

Vikram Singh vikram at vizexperts.com
Thu May 1 09:58:18 PDT 2014


Hi Ross,

I am not able to extract separate SPS and PPS NAL units from the encoder.

I am using the CUDA Video Encoder library, which has a function NVGetSPSPPS().

NVGetSPSPPS() returns a buffer containing the SPS and PPS.

The problem is that I don't know the format of this buffer, so I cannot
separate the SPS and PPS units from each other.

 

In my case the buffer is:

               00 24 67 4d 40 1e f6 04 00 83 7f e0 00 80 00 62
               00 00 07 d2 00 01 d4 c1 c0 00 00 27 a1 20 00 02
               62 5a 17 79 70 50 00 04 68 ee 3c 80

44 bytes in total.

 

The page http://www.cardinalpeak.com/blog/the-h-264-sequence-parameter-set/
gives the following example:

0x00, 0x00, 0x00, 0x01, 0x67, 0x42, 0x00, 0x0a, 0xf8, 0x41, 0xa2  ==> SPS

0x00, 0x00, 0x00, 0x01, 0x68, 0xce, 0x38, 0x80  ==> PPS
 
According to the webpage, the 7 in 0x67 is the nal_unit_type for an SPS,
and the 8 in 0x68 is the nal_unit_type for a PPS.

I see the same 0x67 and 0x68 bytes in my buffer.
Does this mean that 4d 40 1e f6 04 00 83 7f e0 00 80 00 62 00 00 07 d2 00 01 d4
c1 c0 00 00 27 a1 20 00 02 62 5a 17 79 70 50 00 04 is the SPS and
ee 3c 80 is the PPS, leaving out the 00 24 at the start?
 
According to my assumption:

               00 00 00 01 67 4d 40 1e f6 04 00 83 7f e0 00 80
               00 62 00 00 07 d2 00 01 d4 c1 c0 00 00 27 a1 20
               00 02 62 5a 17 79 70 50 00 04                    ==> SPS

               00 00 00 01 68 ee 3c 80                          ==> PPS
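
While typing this out I noticed something else: 0x0024 is 36, which is exactly the
number of bytes from the 67 up to the 50, and 0x0004 is 4, the length of 68 ee 3c 80.
So the 00 24 and 00 04 might be 2-byte big-endian length fields rather than part of
the parameter sets, in which case the trailing 00 04 would not belong to the SPS.
That is only a guess on my part, since I have no documentation for NVGetSPSPPS();
here is a small test program that splits the buffer under that guess:

#include <cstdint>
#include <cstdio>
#include <vector>

// Splits the NVGetSPSPPS() buffer, ASSUMING the layout is
//   [2-byte big-endian length][SPS][2-byte big-endian length][PPS]
// (i.e. 0x0024 = 36 bytes of SPS, 0x0004 = 4 bytes of PPS).
// This layout is a guess from the hex dump above, not from any documentation.
static bool splitSpsPps(const uint8_t* buf, size_t size,
                        std::vector<uint8_t>& sps, std::vector<uint8_t>& pps) {
  std::vector<uint8_t>* out[2] = { &sps, &pps };
  size_t pos = 0;
  for (int i = 0; i < 2; ++i) {
    if (pos + 2 > size) return false;                     // need a length field
    size_t len = (size_t(buf[pos]) << 8) | buf[pos + 1];  // big-endian 16-bit length
    pos += 2;
    if (pos + len > size) return false;                   // length runs past the buffer
    out[i]->assign(buf + pos, buf + pos + len);
    pos += len;
  }
  return true;
}

int main() {
  // The 44-byte buffer shown above.
  const uint8_t buf[] = {
    0x00,0x24,0x67,0x4d,0x40,0x1e,0xf6,0x04,0x00,0x83,0x7f,0xe0,0x00,0x80,0x00,0x62,
    0x00,0x00,0x07,0xd2,0x00,0x01,0xd4,0xc1,0xc0,0x00,0x00,0x27,0xa1,0x20,0x00,0x02,
    0x62,0x5a,0x17,0x79,0x70,0x50,0x00,0x04,0x68,0xee,0x3c,0x80
  };
  std::vector<uint8_t> sps, pps;
  if (splitSpsPps(buf, sizeof buf, sps, pps)) {
    // Expected under the guess: SPS = 36 bytes starting 0x67, PPS = 4 bytes starting 0x68
    std::printf("SPS: %zu bytes (first byte 0x%02x), PPS: %zu bytes (first byte 0x%02x)\n",
                sps.size(), (unsigned)sps[0], pps.size(), (unsigned)pps[0]);
  }
  return 0;
}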
 

Sorry, I have not been able to find any detailed documentation for the
NVGetSPSPPS() function. Please help me.
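
Once the parameter sets are separated, my plan is to follow option 2/ from your
reply (quoted below): not implement "getAuxSDPLine()", and instead pass the two
NAL units to "H264VideoRTPSink::createNew()" inside my "createNewRTPSink()"
implementation. Roughly like this - the subsession class name and the fSPS/fPPS
members are just placeholders from my own code, and I am assuming the NAL units
are passed without start codes:

#include "OnDemandServerMediaSubsession.hh"
#include "H264VideoRTPSink.hh"

class MyH264Subsession: public OnDemandServerMediaSubsession {
  // ... constructor, "createNewStreamSource()", etc. omitted ...
protected:
  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                    unsigned char rtpPayloadTypeIfDynamic,
                                    FramedSource* /*inputSource*/) {
    // Hand the SPS and PPS NAL units (the bytes beginning 0x67... and 0x68...
    // from the NVGetSPSPPS() buffer, no start codes) to the RTP sink:
    return H264VideoRTPSink::createNew(envir(), rtpGroupsock,
                                       rtpPayloadTypeIfDynamic,
                                       fSPS, fSPSSize,
                                       fPPS, fPPSSize);
  }

private:
  u_int8_t* fSPS; unsigned fSPSSize;  // filled from the NVGetSPSPPS() output
  u_int8_t* fPPS; unsigned fPPSSize;
};

Please let me know if that is the right way to supply them.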

 

 

From: live-devel [mailto:live-devel-bounces at ns.live555.com] On Behalf Of
Ross Finlayson
Sent: Tuesday, April 22, 2014 8:46 PM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] Frames are corrupted

 

Does this happen because I have not received SPS and PPS NAL units?

 

Yes.  If you are streaming H.264 video, then you *must* have SPS and PPS NAL
units.  Either

            1/ Your H.264 video source contains SPS and PPS NAL units,
occurring frequently.  In this case, you *should not* modify
"getAuxSDPLine()". Or:

            2/ Your H.264 video source does not contain SPS and PPS NAL
units, but you know them some other way, in advance.  In this case, you
should not implement "getAuxSDPLine()", but you *must* then pass these NAL
units to "H264VideoRTPSink::createNew()", in your implementation of the
"createNewRTPSink()" virtual function.

 

If neither 1/ nor 2/ is true - i.e., if your video source does not contain
SPS and PPS NAL units, nor do you know these in advance - then you will not
be able to successfully stream H.264 video.

 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/ 

 


