[Live-devel] Help with live streaming h.264 frames

Ross Finlayson finlayson at live555.com
Sun Apr 21 17:45:01 PDT 2024



> On Apr 18, 2024, at 1:07 PM, Flavio Alves <flavio.alves at vitalintelligencedata.com> wrote:
> 
> Hello,
> 
> I'm working on a live streaming service using Live555 on an Nvidia Jetson Nano board.
> 
> I'm capturing frames from a USB webcam and encoding them with Nvidia's hardware encoder. Live555 is then responsible for streaming this capture over RTSP.
> 
> The Nvidia encoder API runs its own threads, and I was unable to make it work within a single application, so I implemented two applications: the RTSP server and the capture application.
> 
> They communicate through shared memory on Linux. I implemented a circular buffer in this shared memory to hold the encoded frames, which the RTSP server application then reads.

The problem with this solution is that having the encoder application write H.264 NALs to a shared buffer, and the RTSP server application read H.264 NALs from that buffer, will add some extra latency.  If this latency is OK for you, then fine.
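(For reference, a single-producer/single-consumer ring buffer in POSIX shared memory, of the kind described above, might look roughly like the following sketch.  The segment name, slot count, and fixed-slot layout here are illustrative assumptions, not code from this thread.)

#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>
#include <atomic>
#include <cstring>

constexpr size_t kSlots = 64;              // power of two, so index wraparound is safe
constexpr size_t kSlotBytes = 512 * 1024;  // assumed max size of one encoded NAL

struct Slot {
  size_t length;                           // bytes used in data[]
  unsigned char data[kSlotBytes];
};

struct Ring {
  // Assumes std::atomic<size_t> is lock-free on this platform,
  // so it remains valid when shared between two processes.
  std::atomic<size_t> head;                // next slot the writer fills
  std::atomic<size_t> tail;                // next slot the reader drains
  Slot slots[kSlots];
};

Ring* openRing(bool create) {
  int fd = shm_open("/nal_ring", create ? (O_CREAT | O_RDWR) : O_RDWR, 0600);
  if (fd < 0) return nullptr;
  if (create) ftruncate(fd, sizeof(Ring));
  void* p = mmap(nullptr, sizeof(Ring), PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
  close(fd);
  return p == MAP_FAILED ? nullptr : static_cast<Ring*>(p);
}

// Writer (encoder process): returns false if the NAL doesn't fit or the ring is full.
bool pushNAL(Ring* r, const unsigned char* nal, size_t len) {
  if (len > kSlotBytes) return false;
  size_t head = r->head.load(std::memory_order_relaxed);
  if (head - r->tail.load(std::memory_order_acquire) == kSlots) return false;
  Slot& s = r->slots[head % kSlots];
  s.length = len;
  memcpy(s.data, nal, len);
  r->head.store(head + 1, std::memory_order_release);
  return true;
}

// Reader (RTSP server process): returns false when the ring is empty.
bool popNAL(Ring* r, unsigned char* out, size_t& len) {
  size_t tail = r->tail.load(std::memory_order_relaxed);
  if (r->head.load(std::memory_order_acquire) == tail) return false;
  Slot& s = r->slots[tail % kSlots];
  len = s.length;
  memcpy(out, s.data, len);
  r->tail.store(tail + 1, std::memory_order_release);
  return true;
}

Note that even with a lock-free design like this, the reader still has to poll (or otherwise wait) until a NAL arrives, which is one source of the extra latency mentioned above.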

But if not, then I suggest instead having the encoder and the RTSP server be part of the same application (i.e., process), but as separate threads.  One thread (just one) will run LIVE555 code (the RTSP server).  The other thread (the encoder) would signal the availability of each new, just-encoded H.264 NAL by calling “triggerEvent()”.  (This is the only LIVE555 function that may be called from a non-LIVE555 thread.)

For a model of how to do this, see “liveMedia/DeviceSource.cpp”.
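To make the pattern concrete, here is a rough sketch of a “DeviceSource”-style class, modeled on “liveMedia/DeviceSource.cpp”.  The class name “EncoderSource” and the queue-checking helper “nalIsAvailable()” are placeholders for whatever your encoder provides; error handling and presentation-time handling are omitted.

#include "FramedSource.hh"

class EncoderSource: public FramedSource {
public:
  static EncoderSource* createNew(UsageEnvironment& env) {
    return new EncoderSource(env);
  }

  EventTriggerId eventTriggerId;

protected:
  EncoderSource(UsageEnvironment& env): FramedSource(env) {
    // Create the trigger once, from the LIVE555 thread;
    // the encoder thread will fire it later.
    eventTriggerId = envir().taskScheduler().createEventTrigger(deliverFrame0);
  }

private:
  virtual void doGetNextFrame() {
    // If an encoded NAL is already queued, deliver it now;
    // otherwise just return and wait for the encoder thread to trigger the event.
    if (nalIsAvailable()) deliverFrame();
  }

  static void deliverFrame0(void* clientData) {
    ((EncoderSource*)clientData)->deliverFrame();
  }

  void deliverFrame() {
    if (!isCurrentlyAwaitingData()) return; // the downstream object isn't ready yet

    // Copy the queued NAL into fTo (truncating if larger than fMaxSize),
    // then set fFrameSize, fNumTruncatedBytes, and fPresentationTime, and call:
    FramedSource::afterGetting(this);
  }

  bool nalIsAvailable(); // hypothetical: checks the encoder's output queue
};

// In the encoder (non-LIVE555) thread, after each NAL is produced:
//   ourScheduler->triggerEvent(ourSource->eventTriggerId, ourSource);

The key points are that “createEventTrigger()” is called once, from the LIVE555 thread, and that the encoder thread calls only “triggerEvent()”; the actual delivery (“deliverFrame()”) then runs inside the LIVE555 event loop.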


Ross Finlayson
Live Networks, Inc.
http://www.live555.com/



