[Live-devel] What is a frame in the live-lib ?

Heintje Müller live-devel
Tue Jun 7 02:42:54 PDT 2005


Hi Ross, thanks for your fast reply to Michael's email.

 >
 > >What exactly is a Frame?
 >
 > In the context of the "LIVE.COM Streaming Media" code, a "frame" is a
 > discrete unit of data that gets delivered, by a "FramedSource" object,
 > to a downstream receiver.  That's a deliberately vague explanation.
 >
 > For RTP streaming, for example, a "frame" is a discrete unit of data
 > - with special semantics - that can be packed (perhaps several at a
 > time) into an RTP packet.

So the FramedSource's getNextFrame() could deliver just a single PES 
packet, and the RTPSink would keep requesting frames until an RTP packet 
is "full" and ready to be sent?
The problem is: as long as we don't know exactly what a frame is, we 
cannot serve one. We are currently developing a kind of 
"ByteStreamBufferSink" to communicate with our DVB parser and plan to 
put a framer on top of it, but we thought it would be much better to 
implement a FramedSource directly. Since we are developing the DVB 
parser too, we are not very limited ;o)
 
 > >  We want to implement a DVB source based on
 > >FramedSource. Our DVB-Parser currently exports PES-Frames
 >
 > So, is there one PES source for audio, and another PES source for
 > video?  Or is your input data really a single MPEG Program Stream
 > (in which case you will want to pass your data through a
 > "MPEG1or2Demux" first)?  Also, what do you want to do with this
 > data?  Stream it using RTP (as separate audio and video streams),
 > or convert it to a Transport Stream, or what?
 >
 > Rather than focusing right now on the question of what a 'frame'
 > is, you should first ask yourself what is the format of your input
 > data, and what do you want to do with it?
 >

Yes, our DVB parser currently delivers two PES streams, one for audio 
and one for video. It could deliver elementary streams (ES) instead, but 
then we guess the two would be out of sync ;-)
We want to stream via RTP (two streams) to another liveMedia app we're 
writing: a proxy, which will transcode the video part on the fly and 
transmit the streams to the client (two streams; client: VLC). (I assume 
we have to implement the proxy as a FramedFilter and then do something 
magic to keep both streams in sync, even though the video is delayed by 
the transcoding process, right? Anyway, first we have to write the 
server application streaming from the DVB source.)

We have already tried to study the corresponding classes, but haven't 
figured out what data an implemented FramedSource has to serve (-> a frame).

Hope you can help. ;)

Thank you very much in advance,
Heintje
