[Live-devel] JPEGVideoSource doubts

Ross Finlayson finlayson at live.com
Tue Jul 20 12:24:40 PDT 2004


>-type (I read in the mailing list archives that plain JPEGs use type 1; is
>this correct?)

I believe so.  Type 0 is also possible, but seems to be less common.

>-qfactor (range 1 to 10 or 1 to 100?)

Actually, 0 to 255, but typically 1 to 99.  (See RFC 2435 for an 
explanation of how this value - which will be sent as part of the RTP 
packet - is used by RTP receivers to generate the appropriate quantization 
table for the received image.)

>-width-height (all my frames are the same size. The GIMP says 320 x 240
>pixels. Why do I have to divide the pixel size by 8?)

Because these are the values that get put into the RTP packets (see RFC 
2435).  (Note that the RTP/JPEG payload format supports only images whose 
dimensions are multiples of 8 pixels.)

>last doubt--I'm following the testMPEG1or2VideoStreamer example. What
>changes must be made to correctly use the "JPEGVideoSource" derived class?

You need only define a derived class (that, of course, implements all of 
the pure virtual functions from "JPEGVideoSource").

>  I've tried using the
>JPEGVideoRTPSink type for the videoSink variable, but it is not working.

In what way is it "not working"?  Have you tried using "openRTSP" (e.g., 
with the "-m" option to output each received frame into a separate file) to 
receive and record the JPEG data?  (This requires that you enable the 
built-in RTSP server in your streamer application.)
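For reference, the openRTSP test described above might look like the following, assuming your streamer's RTSP server is reachable at a URL like this one (the address and stream name are placeholders; substitute whatever your server announces):

```shell
# -m writes each received frame to a separate output file, which makes it
# easy to inspect individual JPEG frames for corruption.
openRTSP -m rtsp://192.168.0.10:8554/testStream
```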


	Ross Finlayson
	LIVE.COM
	<http://www.live.com/>


