[Live-devel] Streaming NVENC H.264 through testH264VideoStreamer.cpp

Philippe Noël philippe_noel at college.harvard.edu
Fri Sep 27 13:16:16 PDT 2019


After further work, I have found that NVENC returns NAL units, so I will
need to write my own FramedSource. I'm very new to all of this and have no
idea how to do so; are there specific examples of working with NVENC that I
could find somewhere?

Cheers,

Philippe

On Thu, Sep 26, 2019 at 3:33 PM Philippe Noël <
philippe_noel at college.harvard.edu> wrote:

> Hello,
>
> I am using Nvidia NVENC (all in cpp) from a Windows 10 machine to encode
> raw video files to H.264 using the GPU for the purpose of doing
> live-streaming to another device. I would like to use Live555 as my RTSP
> server and client, to send my H.264 stream of frames as UDP packets over
> RTSP/RTP.
>
> I've looked at testH264VideoStreamer.cpp and at the FAQ on the wiki, and
> it seems to be the program I should use. The FAQ mentions changing the
> input filename from test.264 to stdin so that the program reads from
> standard input, and then piping my encoder's stream into it. I'm a little
> lost as to how to proceed, having never done this before, and am seeking
> some help.
>
> The NVENC encoder takes a frame, encodes it to H.264, and keeps it in
> memory before flushing it and repeating, so I would need to pipe my
> stream before the frame is flushed, so that it is sent over RTSP. The
> code I'm working from used to store the frames in a file, but I would
> like openRTSP to get the frames directly from video memory, to spare the
> CPU as much work as possible.
>
> Please let me know, I would really appreciate some help here.
>
> Philippe
>
