[Live-devel] Streaming with Sensoray 2250 encoded data

Jordan Shelley jps330 at psu.edu
Fri Jul 20 05:46:21 PDT 2007


Ross,

 

Thanks for your reply. I took your advice and worked on getting data written
to a file. I pretty much just added a FileSink object in and dropped my
Encoder2250 object into it. It worked perfectly. Now that this works, do you
have any advice for me to get it working over RTP? I'm sorry to bother you
with this, but this is part of a very important project and I don't exactly
know what to try, since getting it to write to a file was so easy. I just
can't understand why the FileSink object was able to call my doGetNextFrame()
function so easily while the RTPSink object can't. Thanks a lot.

 

-Jordan

 

  _____  

From: live-devel-bounces at ns.live555.com
[mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson
Sent: Tuesday, July 17, 2007 12:12 PM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] Streaming with Sensoray 2250 encoded data

 

I am trying to stream video from a platform that has a Sensoray 2250 encoder
installed on it. I have already verified that all of the hardware works and
that data produced from it can be streamed using the live library by doing
the following:

*         Created a process that grabs MPEG2 frames from the encoder and
dumps them to a named pipe

*         Modified testMPEG1or2VideoStreamer to look at this pipe for data,
and stream it to a specified IP address

*         Works fine, except that, as you can imagine, there is some major
delay in the video, which I cannot have due to the nature of what this video
stream is being used for.

 

You can probably eliminate most of this delay by reducing the maximum
buffering used by your pipe.  (I don't know how you would do this, but there
should be a way.)

 

This method - piping encoded data directly to the (modified)
"testMPEG1or2VideoStreamer" - is *by far* the easiest way to get what you
want.  I recommend sticking with this approach if you can.

 

 

 

I decided to move on and write a FramedSource subclass that encapsulates my
encoder, and deliver this object directly to an MPEG1or2VideoRTPSink.

 

You should insert an "MPEG1or2VideoStreamDiscreteFramer" in front of your
"MPEG1or2VideoRTPSink".
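
[In outline, the wiring would look roughly like this. This is a sketch
only: it requires the live555 headers, "Encoder2250Source" is a stand-in
name for the FramedSource subclass wrapping the encoder, and the RTP/RTCP
setup from "testMPEG1or2VideoStreamer" is omitted:]

```cpp
// Sketch only; "Encoder2250Source" is a hypothetical FramedSource subclass.
#include "liveMedia.hh"

void startStreaming(UsageEnvironment& env, MPEG1or2VideoRTPSink* videoSink) {
  Encoder2250Source* source = Encoder2250Source::createNew(env);

  // The discrete framer marks each delivery from the source as one
  // complete MPEG video frame, which the RTP sink needs for packetization:
  MPEG1or2VideoStreamDiscreteFramer* framer =
      MPEG1or2VideoStreamDiscreteFramer::createNew(env, source);

  videoSink->startPlaying(*framer, NULL /*afterPlaying*/, NULL /*clientData*/);
}
```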

 

 

The thing that is very odd is that it seems like my implementation of
doGetNextFrame() is never executed

 

You should first make sure that you can walk before you try to run.  I
suggest that you begin by trying to write your encoded data to a file rather
than streaming it over a network.  I.e., start by using a "FileSink" instead
of a "MPEG1or2VideoRTPSink".  Once you have shown that you can successfully
write your encoded data into a file (and test that the data is correct by
trying to play the file in a media player), then it would make sense to move
to streaming.
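
[Concretely, that file-writing test can be as small as the following
sketch. It requires the live555 headers, and "Encoder2250Source" again
stands in for your FramedSource subclass:]

```cpp
// Sketch only: write the encoder's output to a file via a FileSink.
#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // "Encoder2250Source" is a hypothetical wrapper around the encoder:
  Encoder2250Source* source = Encoder2250Source::createNew(*env);
  FileSink* fileSink = FileSink::createNew(*env, "test.mpg");

  fileSink->startPlaying(*source, NULL /*afterPlaying*/, NULL /*clientData*/);
  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}
```

[If the resulting "test.mpg" plays in a media player, the source and its
doGetNextFrame() implementation are working.]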

-- 


Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
