[Live-devel] MPEG1or2VideoStreamFramer
Florian Winter
fw at graphics.cs.uni-sb.de
Fri May 14 16:12:11 PDT 2004
I am having some problems transmitting MPEG2 video data over RTP using
libLive.
The stream comes from a Hauppauge WinTV PVR 350 card and is filtered
through our own MPEG2 demux. I pass the video frames from the demux to
the framer using a FramedSource based on libLive's ByteStreamFileSource
(but reading from a buffer instead of a file).
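For context, the per-call delivery step of such a buffer-backed source amounts to copying at most fMaxSize bytes into the downstream buffer on each doGetNextFrame() call. The following is a simplified, library-free sketch of that step; deliverChunk and its parameters are illustrative names, not live555 API:

```cpp
#include <algorithm>
#include <cstddef>
#include <cstring>
#include <vector>

// Sketch of one delivery step of a buffer-backed FramedSource:
// copy at most maxSize bytes (live555's fMaxSize) from the pending
// input buffer into the sink's buffer, keeping any remainder.
std::size_t deliverChunk(std::vector<unsigned char>& pending,
                         unsigned char* to, std::size_t maxSize) {
    std::size_t n = std::min(pending.size(), maxSize);
    std::memcpy(to, pending.data(), n);
    pending.erase(pending.begin(), pending.begin() + n); // remainder stays queued
    return n; // corresponds to setting fFrameSize before calling afterGetting()
}
```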
The problem is that I get strange video artifacts at the receiver.
What is strange is that these artifacts go away if I merge consecutive
packets received from the demux into one big chunk before passing it to
the StreamFramer (the "big chunk" is close to fMaxSize bytes long).
This seems odd because the StreamFramer is essentially an MPEG2 video
parser, so it shouldn't matter how many bytes I feed it in each
iteration. Or are there restrictions I don't know about? Or could the
video stream from the Hauppauge card itself be malformed? But then why
does it work perfectly when I merge the buffers?
When I merge buffers before passing them to the framer, I get no more
video artifacts, but the framerate drops to about 15 fps. If I increase
the socket receive buffer size by raising the limit in
/proc/sys/net/core/rmem_max, I get the full 25 fps. This, however, does
not fix the problem with the video artifacts.
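The merging workaround described above can be sketched as a small accumulator that buffers demux packets until the total approaches a limit (standing in for fMaxSize) and only then hands one large chunk to the framer. This is a self-contained illustration under that assumption, not code from libLive or from my demux:

```cpp
#include <cstddef>
#include <vector>

// Accumulates demux packets and emits one large chunk once the total
// size reaches `limit` (a stand-in for fMaxSize). Illustrative only.
class PacketMerger {
public:
    explicit PacketMerger(std::size_t limit) : limit_(limit) {}

    // Append a packet; returns true once a merged chunk is ready.
    bool add(const std::vector<unsigned char>& pkt) {
        buf_.insert(buf_.end(), pkt.begin(), pkt.end());
        return buf_.size() >= limit_;
    }

    // Hand the merged chunk to the framer and reset the accumulator.
    std::vector<unsigned char> take() {
        std::vector<unsigned char> out;
        out.swap(buf_);
        return out;
    }

private:
    std::size_t limit_;
    std::vector<unsigned char> buf_;
};
```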
I also tried testMPEG1or2VideoStreamer together with
testMPEG1or2VideoReceiver and mplayer. That combination also shows
artifacts and a reduced framerate.