[Live-devel] how to make latency as low as possible

xliu at vscenevideo.com
Mon Jan 16 03:46:20 PST 2017


Hi Ross,

I made changes to send every NAL unit separately, and made sure all start codes were cut off. For the "key frame" case (SC+SPS+SC+PPS+SC+Frame in one chunk of data), I separated the parts and called memmove() and afterGetting() for each. See the code below.

void Gm813xSource::deliverFrame(void)
{
    int ret;
    gm_enc_multi_bitstream_t bs;

    if (!isCurrentlyAwaitingData())
        return; // we're not ready for the data yet

    memset(&bs, 0, sizeof(bs));

    bs.bindfd = main_bindfd;//poll_fds[i].bindfd;
    bs.bs.bs_buf = frameBuf;  // set buffer pointer
    bs.bs.bs_buf_len = FRAME_BUF_SIZE;  // set buffer length
    bs.bs.mv_buf = 0;  // do not receive MV data
    bs.bs.mv_buf_len = 0;  // do not receive MV data

    if (bytesInBuf > 0) { // send leftover data
        if (bytesInBuf > fMaxSize) {
            fFrameSize = fMaxSize;
            fNumTruncatedBytes = bytesInBuf - fMaxSize;
            bytesInBuf = fNumTruncatedBytes;
            dataPtr += fFrameSize;
        } else {
            fFrameSize = bytesInBuf;
            bytesInBuf = 0;
        }
        memmove(fTo, dataPtr, fFrameSize);
        FramedSource::afterGetting(this);
    } else { // get a new frame and send
        if ((ret = gm_recv_multi_bitstreams(&bs, 1)) < 0) {
            printf("Error, gm_recv_multi_bitstreams return value %d\n", ret);
        } else {
            if ((bs.retval < 0) && bs.bindfd) {
                printf("Error to receive bitstream. ret=%d\n", bs.retval);
            } else if (bs.retval == GM_SUCCESS) {
                u_int8_t* newFrameDataStart = (u_int8_t*)bs.bs.bs_buf;
                unsigned newFrameSize = bs.bs.bs_len;

                if(newFrameDataStart[0] == 0x00 && newFrameDataStart[1] == 0x00
                        && newFrameDataStart[2] == 0x00 && newFrameDataStart[3] == 0x01) {
                    newFrameDataStart += 4;
                    newFrameSize -= 4;
                }

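                // NOTE: the hard-coded offsets below assume a 10-byte SPS,
                // a 4-byte start code, a 4-byte PPS, and another 4-byte
                // start code, in that order, at the front of every key frame.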
                if(1 == bs.bs.keyframe
                        && newFrameDataStart[10] == 0 && newFrameDataStart[11] == 0
                        && newFrameDataStart[12] == 0 && newFrameDataStart[13] == 1
                        && newFrameDataStart[18] == 0 && newFrameDataStart[19] == 0
                        && newFrameDataStart[20] == 0 && newFrameDataStart[21] == 1
                        ) {
                    fFrameSize = 10;
                    gettimeofday(&fPresentationTime, NULL);
                    memmove(fTo, newFrameDataStart, fFrameSize); // SPS
                    FramedSource::afterGetting(this);

                    gettimeofday(&fPresentationTime, NULL);
                    fFrameSize = 4;
                    memmove(fTo, newFrameDataStart+14, fFrameSize); // PPS
                    FramedSource::afterGetting(this);

                    newFrameDataStart += (10 + 4 + 4 + 4); // SPS + SC + PPS + SC
                    newFrameSize -= 22;
                }

                bytesInBuf = newFrameSize;
                dataPtr = newFrameDataStart;

                // Deliver the data here:
                if (newFrameSize > fMaxSize) {
                    fFrameSize = fMaxSize;
                    fNumTruncatedBytes = newFrameSize - fMaxSize;
                } else {
                    fFrameSize = newFrameSize;
                }

                bytesInBuf -= fFrameSize;
                dataPtr += fFrameSize;

                //fFrameSize = fMaxSize;

                gettimeofday(&fPresentationTime, NULL); // If you have a more accurate time - e.g., from an encoder - then use that instead.

                // If the device is *not* a 'live source' (e.g., it comes instead from a file or buffer), then set "fDurationInMicroseconds" here.
                memmove(fTo, newFrameDataStart, fFrameSize);

                // After delivering the data, inform the reader that it is now available:
                FramedSource::afterGetting(this);
            }
        }
    }
}
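
For reference, here is a minimal sketch of the delivery pattern I was aiming for: one NAL unit handed over per request. If I understand the FramedSource contract correctly, afterGetting() should be called at most once per doGetNextFrame() request, because fTo and fMaxSize are only valid for a single delivery. The NALUnit struct, the pendingNALUs queue, and deliverOneNALUnit() are my own illustrative names, not live555 API:

#include <deque>
#include <string.h>
#include <sys/time.h>
#include <vector>

// Illustrative helper type and queue; not part of the live555 API.
struct NALUnit { std::vector<u_int8_t> data; };
std::deque<NALUnit> pendingNALUs; // would be a Gm813xSource member

void Gm813xSource::deliverOneNALUnit(void)
{
    if (!isCurrentlyAwaitingData())
        return; // we're not ready for the data yet

    if (pendingNALUs.empty())
        return; // nothing queued; try again on the next encoder event

    NALUnit& nal = pendingNALUs.front();
    if (nal.data.size() > fMaxSize) {
        fFrameSize = fMaxSize;
        fNumTruncatedBytes = nal.data.size() - fMaxSize;
    } else {
        fFrameSize = nal.data.size();
        fNumTruncatedBytes = 0;
    }
    memmove(fTo, &nal.data[0], fFrameSize);
    pendingNALUs.pop_front(); // any truncated bytes are dropped
    gettimeofday(&fPresentationTime, NULL);

    // Exactly one afterGetting() per request; the remaining queued NAL
    // units go out when the sink asks again via doGetNextFrame().
    FramedSource::afterGetting(this);
}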

Now that I have made these changes, the app crashes whenever I try to open a session with the server. Since the crash happens inside libliveMedia, I am having difficulty debugging it. Can you give me some pointers here?

Below is the backtrace of the crash.

(gdb) bt
#0  0x76c43e9c in ?? () from /lib/libc.so.0
#1  0x76e7d950 in OutPacketBuffer::enqueue(unsigned char const*, unsigned int)
    () from /mnt/nfs/target/usr/lib/libliveMedia.so.57
#2  0x76e7d988 in OutPacketBuffer::enqueueWord(unsigned int) ()
   from /mnt/nfs/target/usr/lib/libliveMedia.so.57
#3  0x76e8c18c in MultiFramedRTPSink::buildAndSendPacket(unsigned char) ()
   from /mnt/nfs/target/usr/lib/libliveMedia.so.57
#4  0x76dfe938 in AlarmHandler::handleTimeout() ()
   from /mnt/nfs/target/usr/lib/libBasicUsageEnvironment.so.1
#5  0x76dfeefc in BasicTaskScheduler::SingleStep(unsigned int) ()
   from /mnt/nfs/target/usr/lib/libBasicUsageEnvironment.so.1
#6  0x76dfe224 in BasicTaskScheduler0::doEventLoop(char volatile*) ()
   from /mnt/nfs/target/usr/lib/libBasicUsageEnvironment.so.1
#7  0x0000bc44 in main (argc=1, argv=0x7efcfdc4) at ../rtspd_main.cpp:74

Again, thanks for your patience.



Xin
Mobile: +86 186-1245-1524
Email: xliu at vscenevideo.com
QQ: 156678745
 
From: Ross Finlayson
Date: 2017-01-13 23:52
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] how to make latency as low as possible
> For the start code, I have eleminated them all. My camera's working pattern is like this, when encoding key frame, it outputs SC+SPS+SC+PPS+SC+Frame, when encoding non-key frame, it outputs SC+Frame. So, after I eleminated the SC, what I sent to the sink object is SPS+PPS+FRAME for key frame, and FRAME alone for non-key frame.
> 
> Did I missed something here?
 
Yes, you missed the part where I said that:
each ‘frame’ that comes from your input source must be a ***single*** H.264 NAL unit, and MUST NOT be prepended by a 0x00 0x00 0x00 0x01 ‘start code’.
 
So, you must deliver (without a prepended ‘start code’ in each case):
SPS, then
PPS, then
FRAME, etc.
 
***provided that*** FRAME is a single NAL unit.  If, instead, FRAME consists of multiple ‘slice’ NAL units (this is actually preferred for streaming, because it’s much more tolerant of network packet loss), then you must deliver each ‘slice’ NAL unit individually (again, without a ‘start code’).
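 
For instance, a splitter along these lines (just a sketch; "splitNALUnits" is an illustrative name, not a LIVE555 function) would break an Annex B buffer into individual NAL units with the start codes removed:

#include <sys/types.h>
#include <utility>
#include <vector>

// Return (pointer, length) pairs into 'buf', one per NAL unit,
// with the 3- or 4-byte start codes stripped.
std::vector<std::pair<const u_int8_t*, unsigned> >
splitNALUnits(const u_int8_t* buf, unsigned len)
{
    std::vector<std::pair<const u_int8_t*, unsigned> > nals;
    unsigned i = 0, start = 0;
    bool inNAL = false;
    while (i + 2 < len) {
        bool sc4 = i + 3 < len &&
            buf[i] == 0 && buf[i+1] == 0 && buf[i+2] == 0 && buf[i+3] == 1;
        bool sc3 = !sc4 &&
            buf[i] == 0 && buf[i+1] == 0 && buf[i+2] == 1;
        if (sc4 || sc3) {
            if (inNAL) nals.push_back(std::make_pair(buf + start, i - start));
            i += sc4 ? 4 : 3; // skip over the start code
            start = i;
            inNAL = true;
        } else {
            ++i;
        }
    }
    if (inNAL && len > start)
        nals.push_back(std::make_pair(buf + start, len - start));
    return nals;
}

Each element can then be queued and handed to your sink one at a time.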
 
Also, when testing your server, I suggest first using “openRTSP” <http://www.live555.com/openRTSP/> as a client, before using VLC.
 
 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
 
 