[Live-devel] blur images vs frames order incorrect

Luca Zappella luca at filibusta.crema.unimi.it
Mon Feb 7 17:45:47 PST 2005


Hi Ross,
thanks to your guide I can now encode frames with my application and send
the stream via RTSP with liveMedia. I want to show you how I do it,
because the output is not quite correct and I'm not sure I'm doing it
right.
My application (running in a different thread) produces each encoded
frame and copies it into the memory pointed to by my pointer myfTo:

void M4VSource::takeFrame(const unsigned char* buf, const unsigned size,
                          const timeval& time)
{
    // Gain the lock; if it is already held, wait until it is released
    boost::recursive_try_mutex::scoped_lock lock(dataMutex);

    // Copy from buf into myfTo
    actual_size = size;
    frame_time = time;
    memcpy(myfTo, buf, size);

    // New frame available
    newframe = true;
}

takeFrame is invoked by my application, which holds a pointer to this
function.

M4VSource is a class derived from FramedSource; in its doGetNextFrame
I do this:

void M4VSource::doGetNextFrame()
{
    try
    {
        // Try to gain the lock; if it is already held, throw
        boost::recursive_try_mutex::scoped_try_lock lock(dataMutex);

        if (newframe)
        {
            // Lock acquired and a new frame is available
            deliverFrame();
            delayedTaskTime = 0;
            newframe = false;
        }
        else
        {
            // Lock acquired but no new frame yet
            if (delayedTaskTime == 0)
                delayedTaskTime = (1.0 / fps) * 1000000;

            // Maybe a new frame will arrive within half a frame period
            delayedTaskTime = delayedTaskTime / 2;
            fDurationInMicroseconds = delayedTaskTime;
        }

        nextTask() = envir().taskScheduler().scheduleDelayedTask(
            delayedTaskTime, (TaskFunc*)afterGetting, this);
    }
    catch (boost::lock_error&)
    {
        // Lock is held by the other thread; return without waiting
        return;
    }
}

The deliverFrame method:

void M4VSource::deliverFrame() 
{
    if (!isCurrentlyAwaitingData()) return; 	

    if (actual_size > fMaxSize) 
    {
        fNumTruncatedBytes = actual_size - fMaxSize;
        fFrameSize = fMaxSize;
    }
    else
    {
        fNumTruncatedBytes = 0;
        fFrameSize = actual_size;
    }

    fPresentationTime = frame_time;
    fDurationInMicroseconds = (1.0 / fps) * 1000000;

    //Copy frame from encoded buffer 
    memcpy(fTo, myfTo, fFrameSize);
}

Now the problem:

With this code, every 1 or 2 frames the stream seems to contain an old
frame, so if someone walks in front of the camera I see them move
forward, then jump back a little, and so on.

If, in doGetNextFrame(), when there is no new frame I use
delayedTaskTime = delayedTaskTime / 4; (instead of / 2),
the frame order is correct, but the picture blurs progressively until
(I think) a new GOP arrives.

I don't know whether what I'm doing when there is no new frame is correct...

I'm sure the stream itself is correctly encoded, because if I write it
to disk I can play it and everything is fine.

I hope you can help me,
thanks in advance,
Luca


