[Live-devel] Multi-File Streaming
Randy Scheifele
rjscheif at motorolasolutions.com
Mon Aug 29 13:24:56 PDT 2016
Hi,
I am using the Live555 Streaming Media library in a project that I am
working on and have a few system constraints that force me to use the
library in a somewhat non-standard way. I was hoping someone could comment
on my proposed use of the library given my constraints.
Suppose the following system:
A -> B -> C -> D
where:
A is a COTS device that hosts an RTSP server
B is a device using Live555 as an RTSP client
C is a device using a Live555 RTSP server
D is a device that runs an RTSP client (say, VLC)
The objective is to stream live data from A to D. A and B are connected
via Ethernet, as are C and D. However, the connection between B and C is a
proprietary communication link that has the ability to transfer files.
I am using Live555 on device B as an RTSP client to connect to the server on
A and save off the RTP payloads. This program is very similar to the
testRTSPClient.cpp test program, but instead of the DummySink used there, I
created a new sink type (based on FileSink), called FileChopperSink, that
can concatenate multiple RTP packet payloads into a single file (IIRC, the
FileSink writes 1 file per packet). The FileChopperSink takes a file-size
argument, say 30K, and writes out packets until it reaches that threshold,
then opens a new file. The files are named similarly to the FileSink
default, so they include the RTP presentation timestamps (but the
presentation timestamps of packets appended to an existing file by the
FileChopperSink are lost).
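In outline, the rollover logic is just size-based grouping. The sketch below is not Live555 code, only a standalone illustration of the chopping rule with a function name of my own (chopIntoFiles):

```cpp
#include <cstddef>
#include <vector>

// Hypothetical sketch of the FileChopperSink rollover rule: given the
// sizes of incoming RTP payloads and a file-size threshold (e.g. 30K),
// group consecutive payloads into output files, opening a new file once
// the current one has reached the threshold.
std::vector<std::vector<std::size_t>> chopIntoFiles(
    const std::vector<std::size_t>& payloadSizes, std::size_t threshold) {
  std::vector<std::vector<std::size_t>> files;
  std::size_t bytesInCurrentFile = 0;
  for (std::size_t sz : payloadSizes) {
    // Open a new file if none is open yet, or the current one hit the threshold.
    if (files.empty() || bytesInCurrentFile >= threshold) {
      files.emplace_back();
      bytesInCurrentFile = 0;
    }
    files.back().push_back(sz);  // append this payload to the current file
    bytesInCurrentFile += sz;
  }
  return files;
}
```

With a 20-byte threshold, payloads of 10, 10, 15, 5 bytes would land in two files of two payloads each; the real sink writes bytes to disk instead of collecting sizes.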
These payload files (as well as the SDP file from the stream) are then
transferred over to C using the proprietary communication link. I now want
to consume these RTP payloads on system D, so I created an RTSP server on C
where the files are hosted, to serve up the RTP files. I created a new
class (using ByteStreamMultiFileSource as a starting point) called
ByteStreamFolderSource. The only difference between the two classes is
that ByteStreamFolderSource accepts a path to a folder (rather than a
sequence of files), scans the folder for files on construction, and
re-scans the folder after each file is sent to keep its "fFileArray" full
of files as they come in.
The above solution does seem to work, but I have a few questions based on
some observations I've made.
The current setup uses an Axis camera as device A, pointed at a digital
clock that displays seconds. It is streaming MPEG4-ES data, so I am using
the MPEG4VideoFileServerMediaSubsession (except that I am passing in my
ByteStreamFolderSource instead of the usual ByteStreamFileSource). When I
connect to C via VLC, the video plays, but several seconds in, the frames
start to get choppy and occasionally I see the seconds on the clock go
backwards and then jump forwards.
Is the video choppy because the Live555 RTSP server does not have access to
the correct timestamp information and/or SDP metadata? Is there a way I
can provide some of this information to the server to get smoother
playback? Given my system constraints, is there a better way to use the
Live555 library, or is the track I am on a good one?
The objective is to eventually parse the SDP information for the data type
(video, audio, etc.) and dynamically create new ServerMediaSubsession
objects, similar to how the testOnDemandRTSPServer program does for its
different input filenames.
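That dispatch could start from something as simple as pulling the media type out of each SDP "m=" line. A minimal standalone sketch (the helper name is mine, not a Live555 API):

```cpp
#include <cstddef>
#include <sstream>
#include <string>
#include <vector>

// Hypothetical helper: extract the media types ("video", "audio", ...)
// from the m= lines of an SDP description. The results could then drive
// which ServerMediaSubsession subclass to instantiate, much as
// testOnDemandRTSPServer chooses one per input file type.
std::vector<std::string> mediaTypesFromSdp(const std::string& sdp) {
  std::vector<std::string> types;
  std::istringstream in(sdp);
  std::string line;
  while (std::getline(in, line)) {
    if (line.rfind("m=", 0) == 0) {  // an SDP media-description line
      // The media type is the token between "m=" and the first space,
      // e.g. "m=video 0 RTP/AVP 96" -> "video".
      std::size_t sp = line.find(' ');
      types.push_back(line.substr(2, sp == std::string::npos
                                         ? std::string::npos
                                         : sp - 2));
    }
  }
  return types;
}
```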
Thanks,
Randall Scheifele