[Live-devel] H.264 ES streaming

Stas Desyatnikov stas at tech-mer.com
Wed Aug 26 07:45:40 PDT 2009


Hi All,

I'm trying to stream (unicast) an H.264 elementary stream from a file. The final goal is to stream NAL units from a hardware encoder, but that should be easy to adapt once the file-based streamer works (see the sketch below).
The ES file was created by remuxing software and plays fine in mplayer.
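
For reference, this is the kind of source I have in mind for the encoder case, along the lines of the DeviceSource example in the liveMedia code. Only FramedSource and its members are real library API; EncoderSource and readNALUnit() are hypothetical placeholders:

#include "FramedSource.hh"
#include <sys/time.h>

class EncoderSource: public FramedSource {
public:
        static EncoderSource* createNew(UsageEnvironment& env) {
                return new EncoderSource(env);
        }

protected:
        EncoderSource(UsageEnvironment& env) : FramedSource(env) {}

private:
        virtual void doGetNextFrame() {
                // Copy one NAL unit from the encoder into fTo (at most fMaxSize
                // bytes), set the frame size and presentation time, then hand
                // the data to the downstream object:
                fFrameSize = readNALUnit(fTo, fMaxSize); // placeholder for encoder I/O
                gettimeofday(&fPresentationTime, NULL);
                FramedSource::afterGetting(this);
        }

        unsigned readNALUnit(unsigned char* to, unsigned maxSize); // hypothetical
};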

The problem is that my streamer exits after about 1 second and the afterPlaying callback is called. ByteStreamFileSource::doGetNextFrame() fails its feof(fFid) check: feof(fFid) returns 16 (i.e. non-zero), meaning a read has already hit the end-of-file marker.
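
For context, here is what I believe that function does at the point of failure (paraphrasing from my copy of the live555 sources; the exact code may differ between versions):

// Simplified paraphrase of ByteStreamFileSource::doGetNextFrame(),
// not the verbatim live555 code:
void ByteStreamFileSource::doGetNextFrame() {
        if (feof(fFid) || ferror(fFid)) {
                handleClosure(this); // propagates downstream; the sink then calls afterPlaying
                return;
        }
        // ... otherwise fread() up to fMaxSize bytes from fFid into fTo ...
}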

This is what I do to stream it:

        Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl);
        Groupsock rtcpGroupsock(*env, destinationAddress, rtcpPort, ttl);

        videoSink = SimpleRTPSink::createNew(*env, &rtpGroupsock, 33, 90000,
                "video", "mp2t", 1, True, False /*no 'M' bit*/);

        estimatedSessionBandwidth = 5000; // in kbps; for RTCP b/w share
        const unsigned maxCNAMElen = 100;
        unsigned char CNAME[maxCNAMElen+1];
        gethostname((char*)CNAME, maxCNAMElen);
        CNAME[maxCNAMElen] = '\0'; // just in case
        RTCPInstance* rtcp = RTCPInstance::createNew(*env, &rtcpGroupsock,
                estimatedSessionBandwidth, CNAME,
                videoSink, NULL /* we're a server */, False);

        play();

        env->taskScheduler().doEventLoop(&exit_flag); // returns once exit_flag is set
        Medium::close(videoSource);
        Medium::close(rtcp);
        return 0; // only to prevent compiler warning
}
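
For completeness, these are the globals the snippet above relies on (the concrete values shown are placeholders, not my real configuration):

UsageEnvironment* env;              // created from a BasicTaskScheduler
RTPSink* videoSink = NULL;
FramedSource* videoSource = NULL;
char exit_flag = 0;                 // watch variable for doEventLoop()
unsigned estimatedSessionBandwidth;
char const* video = "test.264";     // input ES file (placeholder name)

struct in_addr destinationAddress;  // set to the unicast target address
const unsigned char ttl = 255;
const Port rtpPort(18888), rtcpPort(18889); // example port numbers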

void afterPlaying(void* /*clientData*/)
{
        *env << "...done reading from file\n";
        exit_flag = 1;
}

void play()
{
        unsigned const inputDataChunkSize = 7*188; // read the input in 7*188-byte chunks

        // Open the input file as a 'byte-stream file source':
        ByteStreamFileSource* videoFile
                = ByteStreamFileSource::createNew(*env, video, inputDataChunkSize);

        if (videoFile == NULL)
                exit(1);

        // Wrap the ES in an MPEG-2 Transport Stream:
        videoSource = MPEG2TransportStreamFromESSource::createNew(*env);
        ((MPEG2TransportStreamFromESSource*)videoSource)->addNewVideoSource(videoFile, 10);

        // Finally, start playing:
        *env << "Beginning to read from file...\n";
        videoSink->startPlaying(*videoSource, afterPlaying, videoSink);
}
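
Incidentally, since the file finishes so quickly: if I read the live555 headers right, ByteStreamFileSource::createNew() also takes an optional playTimePerFrame argument (microseconds per chunk), and without it the file is read as fast as the sink requests data. A paced variant would look roughly like this; the 2106 µs figure just assumes my 5 Mbps bandwidth estimate (7*188 bytes = 10528 bits; 10528 / 5e6 s ≈ 2106 µs):

        ByteStreamFileSource* pacedVideoFile
                = ByteStreamFileSource::createNew(*env, video,
                        inputDataChunkSize /*preferredFrameSize*/,
                        2106 /*playTimePerFrame, in microseconds*/);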

What am I missing here?
