[Live-devel] Help with live streaming h.264 frames

Flavio Alves flavio.alves at vitalintelligencedata.com
Thu Apr 18 12:07:51 PDT 2024


Hello,

I'm working on a live streaming service using Live555 on an Nvidia Jetson
Nano board.

I'm capturing frames from a USB webcam and encoding them with Nvidia's
hardware encoder. Live555 is then responsible for streaming this capture
over RTSP.

The Nvidia software API for the encoder spawns its own threads, and I was
unable to make it work inside a single application, so I split the software
into two applications: the RTSP server and the capture application.

The two processes communicate through shared memory on Linux. I implemented
a circular buffer in this shared memory where the capture application
places the encoded frames and the RTSP server application reads them.

I created custom classes: a media subsession (derived from
OnDemandServerMediaSubsession) and a DeviceSource (derived from
FramedSource).

The software is almost working. I had issues with adding SPS and PPS NAL
units and timestamps to the encoded frames, but those now seem to be fine.

But what is happening now is that the application seems to be ignoring the
fDurationInMicroseconds and/or fPresentationTime values: the frame rate is
not respected when streaming, and the video plays back much too fast.

I would like to ask for any advice on how to properly address this problem.

Best regards,

Flavio

