[Live-devel] WebCam as Frame Source

Scott Hays sdhays at neon.com.tw
Fri Sep 2 00:22:45 PDT 2005


Re-read the comments in DeviceSource.cpp.  I don't know if it's the
cause of all of your problems, but you're not supposed to allocate
memory and assign it to fTo.  The memory should already have been
allocated, and if the buffer isn't big enough, you set fNumTruncatedBytes
to let the class which owns the buffer know that the buffer should be
bigger (it will then expand the buffer).  Also, in general, it's a bad
idea to mix C and C++ memory-allocation methods.

Hope that helps,
Scott


On Sep 1, 2005, at 1:32 PM, Ioannis Skazikis wrote:

> Hello,
>
> I am trying to build an RTSP live video streaming server based on the
> liveMedia library.
>
> I have also read the liveMedia FAQ and the mailing list archive, and
> tried to follow your answers to others who have had the same problem
> as me.
>
> The program is based on the testMPEG4VideoStreamer test program found
> in the liveMedia library.
>
> Based on DeviceSource.cpp I wrote a DeviceSourceCammera subclass of
> FramedSource which opens a USB camera, grabs an image, and encodes it
> to MPEG-4 with the help of the Revel library (an XviD encoding library).
>
> The functions which open the device and start the server look like this:
>
> void init_play(){
>
>   // Open the input source
>  DeviceParameters params;
>
>   fileSource = DeviceSourceCammera::createNew(*env, params);
>   if (fileSource == NULL) {
>     *env << "Unable to open source\n";
>     exit(1);
>   }
>
> }
>
> void play() {
>
>   FramedSource* videoES = fileSource;
>
>   // Create a framer for the Video Elementary Stream:
>   videoSource = MPEG4VideoStreamFramer::createNew(*env, videoES);
>
>   // Finally, start playing:
>   *env << "Beginning to read from file...\n";
>   videoSink->startPlaying(*videoSource, afterPlaying, videoSink);
> }
>
>
> From the debugging output in DeviceSourceCammera.cpp I can see that
> the program really opens the device, grabs a frame, and encodes it
> (the procedure repeats itself).  But when I start mplayer, it connects
> to the server and reads the SDP; at the end it prints "Found video
> stream: 0", but it never starts to play the stream.  It just sits
> there doing nothing, as if it is waiting for frames that never arrive.
>
> Any idea why (or what) is happening?
>
> DeviceSourceCammera.cpp is included in this e-mail.
> lvsMain.cpp is the main file with the server start-up code (also
> included in this e-mail).
> DeviceUtil.hh is the header file containing the basic functions of my
> code for handling the webcam device.  (This part works; I used it to
> encode a video file before starting to work with the liveMedia library.)
>
> The subclass functions look like this.  If you need more details, please
> ask me or have a look at the attached files...
>
> DeviceSourceCammera::DeviceSourceCammera(UsageEnvironment& env,
> DeviceParameters params)
>   : DeviceSource(env, params){
>
> init(); // Open Device
> }
>
> void DeviceSourceCammera::doGetNextFrame() {
>
>   grab_image();  //Take a Frame from the Camera
>
>   if (0 /* the source stops being readable */) {
>     handleClosure(this);
>     fprintf(stderr,"In Grab Image V4l, the source stops being readable !!!!\n");
>     return;
>   }
>
>     deliverFrame(); // call the DeviceSourceCammera::deliverFrame() function
> }
>
> void DeviceSourceCammera::deliverFrame() {
>
>   // Encode the grabbed frame from the camera
>
>   if (get_grab_size() > fMaxSize) {
>     frameSize = fMaxSize;
>   } else {
>     frameSize = get_grab_size();
>   }
>
>   fTo = (unsigned char*)malloc(frameSize);
>
>   // frame.pixels is the buffer where the grabbed frame is now encoded;
>   // frameSize is the size of the encoded frame.
>   memcpy(fTo, frame.pixels, frameSize);
> }
>
> Thank you very much for your help and your time!
> Sorry about my bad English, too.
>
> Ioannis.
>
>
> #include "DeviceSourceCammera.hh"
> #include "DeviceUtil.hh"
> DeviceSourceCammera*
> DeviceSourceCammera::createNew(UsageEnvironment& env,
>             DeviceParameters params) {
>   return new DeviceSourceCammera(env, params);
> }
>
> DeviceSourceCammera::DeviceSourceCammera(UsageEnvironment& env,  
> DeviceParameters params)
>   : DeviceSource(env, params){
>   // Any initialization of the device would be done here
>
>   init();
>
> }
>
> DeviceSourceCammera::~DeviceSourceCammera() {
> //printf("DeviceSourceCammera Deconstructor...");
> }
>
> void DeviceSourceCammera::doGetNextFrame() {
>
>   // Arrange here for our "deliverFrame" member function to be called
>   // when the next frame of data becomes available from the device.
>   // This must be done in a non-blocking fashion - i.e., so that we
>   // return immediately from this function even if no data is
>   // currently available.
>   //
>   // If the device can be implemented as a readable socket, then one easy
>   // way to do this is using a call to
>   //     envir().taskScheduler().turnOnBackgroundReadHandling( ... )
>   // (See examples of this call in the "liveMedia" directory.)
>
>   grab_image();
>   fprintf(stderr,"In Grab Image V4l...\n");
>
>   // If, for some reason, the source device stops being readable
>   // (e.g., it gets closed), then you do the following:
>   if (0 /* the source stops being readable */) {
>     handleClosure(this);
>     fprintf(stderr,"In Grab Image V4l, the source stops being readable !!!!\n");
>     return;
>   }
>     deliverFrame();
> }
>
> void DeviceSourceCammera::deliverFrame() {
>   // This would be called when new frame data is available from the device.
>   // This function should deliver the next frame of data from the device,
>   // using the following parameters (class members):
>   // 'in' parameters (these should *not* be modified by this function):
>   //     fTo: The frame data is copied to this address.
>   //         (Note that the variable "fTo" is *not* modified.  Instead,
>   //          the frame data is copied to the address pointed to by "fTo".)
>   //     fMaxSize: This is the maximum number of bytes that can be copied
>   //         (If the actual frame is larger than this, then it should
>   //          be truncated, and "fNumTruncatedBytes" set accordingly.)
>
>   // 'out' parameters (these are modified by this function):
>   //     fFrameSize: Should be set to the delivered frame size (<= fMaxSize).
>   //     fNumTruncatedBytes: Should be set iff the delivered frame would have been
>   //         bigger than "fMaxSize", in which case it's set to the number of bytes
>   //         that have been omitted.
>   //     fPresentationTime: Should be set to the frame's presentation time
>   //         (seconds, microseconds).
>   //     fDurationInMicroseconds: Should be set to the frame's duration, if known.
>   if (!isCurrentlyAwaitingData()){
>    printf("isCurrentlyAwaitingData: Return; \n");
>    return; // we're not ready for the data yet
>   }
>   // Deliver the data here:
>
>   //---------------------- Start Revel --------------------------------------------
>    fprintf(stderr,"In DeviceSourceCammera deliver Frame\n");
>
>    int width = 160;
>    int height = 120;
>    int numFrames = 128;
>    long frames;
>
>    const char *filename = "test.m4v";
>    Revel_Error revError;
>     // Make sure the API version of Revel we're compiling against matches the
>     // header files!  This is terribly important!
>     if (REVEL_API_VERSION != Revel_GetApiVersion())
>     {
>         printf("ERROR: Revel version mismatch!\n");
>         printf("Headers: version %06x, API version %d\n", REVEL_VERSION,
>             REVEL_API_VERSION);
>         printf("Library: version %06x, API version %d\n", Revel_GetVersion(),
>             Revel_GetApiVersion());
>         exit(1);
>     }
>
>     // Create an encoder
>     int encoderHandle;
>     revError = Revel_CreateEncoder(&encoderHandle);
>     if (revError != REVEL_ERR_NONE)
>     {
>             printf("Revel Error while creating encoder: %d\n", revError);
>             exit(1);
>     }
>
>     // Set up the encoding parameters.  ALWAYS call Revel_InitializeParams()
>     // before filling in your application's parameters, to ensure that all
>     // fields (especially ones that you may not know about) are initialized
>     // to safe values.
>     Revel_Params revParams;
>     Revel_InitializeParams(&revParams);
>     revParams.width = width;
>     revParams.height = height;
>     revParams.frameRate = 8.0f;
>     revParams.quality = 1.0f;
>     revParams.codec = REVEL_CD_XVID;
>
>     // Initiate encoding
>     revError = Revel_EncodeStart(encoderHandle, filename, &revParams);
>     if (revError != REVEL_ERR_NONE)
>     {
>             printf("Revel Error while starting encoding: %d\n", revError);
>             exit(1);
>     }
>
>     // Draw and encode each frame.
>     Revel_VideoFrame frame;
>     frame.width = width;
>     frame.height = height;
>     frame.bytesPerPixel = 3;
>     frame.pixelFormat =  REVEL_PF_BGR;
>     frame.pixels = new int[get_grab_size()*4];
>
>     memset(frame.pixels, 0, get_grab_size());
>
>     grab_image();
>     memcpy(frame.pixels, get_grab_data(), get_grab_size());
>
>         int i=0;
>         int frameSize;
>
>         //Copy raw Frame from the WebCam and encode it
>     grab_image();
>         memcpy(frame.pixels, get_grab_data(), get_grab_size());
>         frames++;
>
>         revError = Revel_EncodeFrame(encoderHandle, &frame, &frameSize);
>         if (revError != REVEL_ERR_NONE)
>         {
>                 printf("Revel Error while writing frame: %d\n", revError);
>                 exit(1);
>         }
>
>     //Set Device Source Params...
>     if (get_grab_size() > fMaxSize){
>       frameSize = fMaxSize;
>     }
>     else frameSize = get_grab_size();
>     fFrameSize = frameSize;
>
>     //Copy to DeviceSource...
>     fTo = (unsigned char*)malloc(frameSize);
>     memcpy(fTo, frame.pixels, frameSize);
>
>     printf("Frame %d : %d bytes\n", i+1, frameSize);
>
>     // Final cleanup.
>     Revel_DestroyEncoder(encoderHandle);
>     delete [] (int*)frame.pixels;
>
> //-------------------------End Revel -------------------------------------------
>
>   // After delivering the data, switch to another task, and inform
>   // the reader that he has data:
>   nextTask()
>     = envir().taskScheduler().scheduleDelayedTask(0, (TaskFunc*)afterGetting, this);
>
> }
>
> // "liveMedia"
> // Copyright (c) 1996-2004 Live Networks, Inc.  All rights reserved.
> // A template for a MediaSource encapsulating an audio/video input device
> // C++ header
>
> #include <DeviceSource.hh>
>
>
>
> class DeviceSourceCammera: public DeviceSource {
> public:
>   static DeviceSourceCammera* createNew(UsageEnvironment& env, DeviceParameters params);
>
> protected:
>   DeviceSourceCammera(UsageEnvironment& env, DeviceParameters params);
>   // called only by createNew(), or by subclass constructors
>   virtual ~DeviceSourceCammera();
>
> private:
>   // redefined virtual functions:
>   virtual void doGetNextFrame();
>
> private:
>   void deliverFrame();
>
> private:
>   DeviceParameters fParams;
> };
>
> #include <DeviceUtil.hh>
> //V4l Glabal Var...
> int grab_fd, grab_size;
> static struct video_mmap grab_buf;
> int yuv_width = 320;
> int yuv_height = 240;
> //int yuv_width = 160;
> //int yuv_height = 120;
> static unsigned char *grab_data = NULL;
>
> unsigned char *get_grab_data(){
> return grab_data;
> }
>
> int get_grab_size(){
> return grab_size;
> }
>
> //V4l Fct...
> void deinit() {
>    munmap(grab_data, grab_size);
>    grab_data = NULL;
>    close(grab_fd);
> }
>
> void init() {
>
>    struct video_capability grab_cap;
>    struct video_channel grab_chan;
>    struct video_mbuf vid_mbuf;
>    struct video_picture pic;
>
>    atexit(deinit);
>
>    if ((grab_fd = open(VIDEO_DEVICE, O_RDWR)) == -1) {
>       fprintf(stderr, "Couldn't open /dev/videoXXX: %s\n", strerror(errno));
>       exit(-1);
>    }
>
>    if (ioctl(grab_fd, VIDIOCGCAP, &grab_cap) == -1) {
>       perror("ioctl VIDIOCGCAP");
>       exit(-1);
>    }
> //------------- VIDEO CAPABILITIES ------------------
>    printf("------------------------------------\n");
>    printf("------------------------------------\n");
>    printf("name hola-> %s\n", grab_cap.name);
>    printf("type -> %d\n", grab_cap.type);
>    printf("channels -> %d\n", grab_cap.channels);
>    printf("audios -> %d\n", grab_cap.audios );
>    printf("maxwidth -> %d\n", grab_cap.maxwidth );
>    printf("maxheight -> %d\n", grab_cap.maxheight);
>    printf("minwidth -> %d\n", grab_cap.minwidth );
>    printf("minheight -> %d\n", grab_cap.minheight );
>    printf("------------------------------------\n");
> //----------------------------------------------------
>
>    grab_chan.channel = 1;  //Or 0 ????
>    grab_chan.type = VIDEO_TYPE_CAMERA;
>
>    if (ioctl(grab_fd, VIDIOCGCHAN, &grab_chan) == -1) {
>       perror("ioctl VIDOCGCHAN");
>       exit(-1);
>    }
>
>    grab_chan.norm = 0;  //   grab_chan.norm = VIDEO_MODE_PAL;
>
>    if (ioctl(grab_fd, VIDIOCSCHAN, &grab_chan) == -1) {
>       perror("ioctl VIDIOCSCHAN");
>       exit(-1);
>    }
> //------------IMAGE PROPERTIES-------------------------
>    pic.depth = 24;
>    pic.palette = VIDEO_PALETTE_RGB24;
>    pic.brightness = 100;
>    pic.contrast = 30;
>    pic.whiteness = 0;
>    pic.colour = 0;
>    pic.hue = 0;
>
>    printf("------------------------------------\n");
>    printf("brightness -> %d \n", pic.brightness/256 );
>    printf("hue -> %d\n", pic.hue/256);
>    printf("colour -> %d\n", pic.colour/256 );
>    printf("contrast -> %d \n", pic.contrast/256 );
>    printf("whiteness -> %d\n", pic.whiteness/256 );
>    printf("depth -> %d\n", pic.depth );
>    printf("palette -> %d \n", pic.palette );
>    printf("------------------------------------\n");
>
>    ioctl(grab_fd, VIDIOCGPICT, &pic );
> //-------------------------------------------------------
>
>    grab_buf.format = VIDEO_PALETTE_RGB24;  // Also VIDEO_PALETTE_YUV420P or VIDEO_PALETTE_RAW
>    grab_buf.frame = 0;
>    grab_buf.width = yuv_width;
>    grab_buf.height = yuv_height;
>
>    ioctl(grab_fd, VIDIOCGMBUF, &vid_mbuf);
>
>    grab_size = vid_mbuf.size;
>    grab_data = (unsigned char*)mmap(0, grab_size, PROT_READ | PROT_WRITE, MAP_SHARED, grab_fd, 0);
>
>    if ((grab_data == NULL) || (-1 == (int) grab_data)) {
>       fprintf(stderr, "Couldn't mmap\n");
>       exit(1);
>    }
>
>    /* Useless? probably. */
>    setpriority(PRIO_PROCESS, 0, 20);
>    nice(20);
> }
>
> void grab_image() {
>    int i = 0;
>    fd_set fds;
>
>    FD_ZERO(&fds);
>    FD_SET(0, &fds);
>    FD_SET(grab_fd, &fds);
>
>    /* Maybe a little nicer to the cpu ? */
>    select(grab_fd + 1, &fds, NULL, NULL, NULL);
>
>    if (ioctl(grab_fd, VIDIOCMCAPTURE, &grab_buf) == -1) {
>       perror("ioctl VIDIOCMCAPTURE");
>       return;
>    }
>
>    if (ioctl(grab_fd, VIDIOCSYNC, &i) == -1) {
>       perror("ioctrl VIDIOCSYNC");
>       return;
>    }
>    return;
> }
> //End V4l Fct...
>
> #include "revel.h"
> #include <stdio.h>
> #include <stdlib.h>
> #include <string.h>
> #include <errno.h>
> #include <unistd.h>
> #include <sys/ipc.h>
> #include <sys/shm.h>
> #include <sys/mman.h>
> #include <sys/ioctl.h>
> #include <sys/time.h>
> #include <sys/resource.h>
> #include <fcntl.h>
> #include <linux/videodev.h>
>
> //#define GUID_I420_PLANAR 0x30323449
> #define    VIDEO_DEVICE "/dev/video1"
>
>
> //End V4l Global Var...
>
> void deinit();
> void init();
> void grab_image();
> int get_grab_size();
> unsigned char *get_grab_data();
>
> /* Skazikis Ioannis
>    Live Video Streaming Program */
> #include <liveMedia.hh>
> #include <BasicUsageEnvironment.hh>
> #include <GroupsockHelper.hh>
> #include "DeviceSourceCammera.hh"
> Boolean reuseFirstSource = true;
>
> UsageEnvironment* env;
> char const* inputFileName = "test.m4v";
> MPEG4VideoStreamFramer* videoSource;
> RTPSink* videoSink;
> DeviceSourceCammera* fileSource;
>
> void init_play();
> void play(); // forward
>
> int main(int argc, char** argv) {
>   // Begin by setting up our usage environment:
>   TaskScheduler* scheduler = BasicTaskScheduler::createNew();
>   env = BasicUsageEnvironment::createNew(*scheduler);
>
>   // Create 'groupsocks' for RTP and RTCP:
>   struct in_addr destinationAddress;
>   destinationAddress.s_addr = chooseRandomIPv4SSMAddress(*env);
>   // Note: This is a multicast address.  If you wish instead to stream
>   // using unicast, then you should use the "testOnDemandRTSPServer"
>   // test program - not this test program - as a model.
>
>   const unsigned short rtpPortNum = 18888;
>   const unsigned short rtcpPortNum = rtpPortNum+1;
>   const unsigned char ttl = 255;
>
>   const Port rtpPort(rtpPortNum);
>   const Port rtcpPort(rtcpPortNum);
>
>   Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl);
>   rtpGroupsock.multicastSendOnly(); // we're a SSM source
>   Groupsock rtcpGroupsock(*env, destinationAddress, rtcpPort, ttl);
>   rtcpGroupsock.multicastSendOnly(); // we're a SSM source
>
>   // Create a 'MPEG-4 Video RTP' sink from the RTP 'groupsock':
>   videoSink = MPEG4ESVideoRTPSink::createNew(*env, &rtpGroupsock, 96);
>
>   // Create (and start) a 'RTCP instance' for this RTP sink:
>   const unsigned estimatedSessionBandwidth = 500; // in kbps; for RTCP b/w share
>   const unsigned maxCNAMElen = 100;
>   unsigned char CNAME[maxCNAMElen+1];
>   gethostname((char*)CNAME, maxCNAMElen);
>   CNAME[maxCNAMElen] = '\0'; // just in case
>   RTCPInstance* rtcp
>   = RTCPInstance::createNew(*env, &rtcpGroupsock,
>                 estimatedSessionBandwidth, CNAME,
>                 videoSink, NULL /* we're a server */,
>                 True /* we're a SSM source */);
>   // Note: This starts RTCP running automatically
>
>   RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
>   if (rtspServer == NULL) {
>     *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
>     exit(1);
>   }
>   ServerMediaSession* sms
>     = ServerMediaSession::createNew(*env, "testStream", inputFileName,
>            "Session streamed by \"testMPEG4VideoStreamer\"",
>                        True /*SSM*/);
>   sms->addSubsession(PassiveServerMediaSubsession::createNew(*videoSink, rtcp));
>   rtspServer->addServerMediaSession(sms);
>
>   char* url = rtspServer->rtspURL(sms);
>   *env << "Play this stream using the URL \"" << url << "\"\n";
>   delete[] url;
>
>   // Start the streaming:
>   *env << "Beginning streaming...\n";
>   init_play();
>   play();
>
>   env->taskScheduler().doEventLoop(); // does not return
>
>   return 0; // only to prevent compiler warning
> }
>
> void afterPlaying(void* /*clientData*/) {
>   *env << "...done reading from file\n";
>
>   Medium::close(videoSource);
>   // Note that this also closes the input file that this source read from.
>
>   // Start playing once again:
>   play();
> }
>
> void init_play(){
>
>   // Open the input source
>  DeviceParameters params;
>
>   fileSource = DeviceSourceCammera::createNew(*env, params);
>   if (fileSource == NULL) {
>     *env << "Unable to open source\n";
>     exit(1);
>   }
>
> }
>
> void play() {
>
>   FramedSource* videoES = fileSource;
>
>   // Create a framer for the Video Elementary Stream:
>   videoSource = MPEG4VideoStreamFramer::createNew(*env, videoES);
>
>   // Finally, start playing:
>   *env << "Beginning to read from file...\n";
>   videoSink->startPlaying(*videoSource, afterPlaying, videoSink);
> }
>
> /** \mainpage Revel (The Really Easy Video Encoding Library)
>  * The Revel library provides the shortest path between your application and
>  * high-quality video output.
>  *
>  * Using Revel generally involves the following sequence of function calls:
>  * - Ensure binary compatibility by testing REVEL_API_VERSION against
>  *   Revel_GetApiVersion().
>  * - Create an encoder with Revel_CreateEncoder().
>  * - Declare a Revel_Params object, initialize it to safe defaults with
>  *   Revel_InitializeParams(), and then fill in the fields to fit your
>  *   application's needs. Pass the object to Revel_EncodeStart() to begin
>  *   the encoding process.
>  * - Use Revel_EncodeFrame() to pass in individual video frames in
>  *   the order you'd like them to appear.  Optionally, you can
>  *   use Revel_EncodeAudio() to provide an audio track for your movie.
>  * - Stop the encoding process with Revel_EncodeEnd().  This finalizes
>  *   the output movie and makes it viewable.  Don't skip this step!
>  * - Destroy the encoder object with Revel_DestroyEncoder().
>  */
>
> /** \example reveltest.cpp
>  * This file contains a complete demonstration of Revel in action.
>  * It generates a short movie of an animating checkerboard
>  * pattern, complete with audio.
>  */
>
> /*
> Copyright (C) (2004) (Cort Stratton) <cort at cortstratton dot org>
>
> This program is free software; you can redistribute it and/or
> modify it under the terms of the GNU General Public License
> as published by the Free Software Foundation; either
> version 2 of the License, or (at your option) any later
> version.
>
> This program is distributed in the hope that it will be useful,
> but WITHOUT ANY WARRANTY; without even the implied warranty of
> MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
> GNU General Public License for more details.
>
> You should have received a copy of the GNU General Public License
> along with this program; if not, write to the Free Software
> Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
> */
>
> #ifndef REVEL_H
> #define REVEL_H
>
> #ifdef __cplusplus
> extern "C" {
> #endif
>
> /* Version information. */
> #define REVEL_VERSION 0x010100 /**< Updated for every release. */
> #define REVEL_API_VERSION 2 /**< Updated only when the public API changes. */
>
> /**
>  * Retrieves the run-time value of REVEL_VERSION.
>  * Use this function for run-time binary compatibility tests.
>  * @return The run-time value of REVEL_VERSION.
>  */
> int Revel_GetVersion(void);
>
> /**
>  * Use this function to check for run-time binary compatibility.
>  * Specifically, if REVEL_API_VERSION != Revel_GetApiVersion(),
>  * then your library and header files are out of date, and
>  * Revel's behavior will be undefined!
>  * @return The run-time value of REVEL_API_VERSION.
>  */
> int Revel_GetApiVersion(void);
>
> /**
>  * List of legal pixel formats for input video frames.
>  */
> typedef enum
> {
>     REVEL_PF_BGRA = 0, /**< 32-bit BGRA */
>     REVEL_PF_ABGR,     /**< 32-bit ABGR */
>     REVEL_PF_RGBA,     /**< 32-bit RGBA */
>     REVEL_PF_ARGB,     /**< 32-bit ARGB */
>     REVEL_PF_BGR,      /**< 24-bit BGR */
>     REVEL_PF_RGB555,   /**< 16-bit RGB 5-5-5 packed */
>     REVEL_PF_RGB565,   /**< 16-bit RGB 5-6-5 packed */
>
>     REVEL_PF_COUNT     /**< Number of pixel formats (this is NOT a legal pixel format!) */
> } Revel_PixelFormat;
>
>
> /**
>  * List of supported video codecs.
>  */
> typedef enum
> {
>     REVEL_CD_XVID = 0, /**< XviD (http://www.xvid.org) */
>
>     REVEL_CD_COUNT     /**< Number of video codecs (this is NOT a legal codec!) */
> } Revel_VideoCodec;
>
> /**
>  * Partial list of supported audio sample formats.
>  * This enum only contains the most common sample formats.
>  * In actuality, any of the Microsoft WAVE_FORMAT_XXX values is
>  * legal.  Please refer to the Windows mmreg.h file (available
>  * on the web at:
>  * http://graphics.cs.uni-sb.de/NMM/dist-0.6.0/Docs/Doxygen/html/mmreg_8h.html)
>  * for a full list of supported sample formats and their associated
>  * numerical values.
>  */
> typedef enum
> {
>     REVEL_ASF_UNKNOWN = 0x0000, /**< Unknown format / no audio present. */
>     REVEL_ASF_PCM     = 0x0001, /**< PCM format (the most common choice). */
>     REVEL_ASF_ADPCM   = 0x0002, /**< ADPCM format. */
>     REVEL_ASF_ALAW    = 0x0006, /**< A-law format. */
>     REVEL_ASF_MULAW   = 0x0007, /**< mu-law format. */
> } Revel_AudioSampleFormat;
>
>
> /**
>  * List of error codes.
>  */
> typedef enum
> {
>     REVEL_ERR_NONE = 0,         /**< No error (the operation completed successfully). */
>     REVEL_ERR_UNKNOWN,          /**< An unknown/unclassified error. */
>     REVEL_ERR_INVALID_HANDLE,   /**< An invalid handle was passed to the function. */
>     REVEL_ERR_PARAMS,           /**< Invalid parameters passed to the function. */
>     REVEL_ERR_FILE,             /**< A file I/O error. */
>     REVEL_ERR_MEMORY,           /**< A memory-related error. */
>     REVEL_ERR_BUSY,             /**< Revel is busy with another operation. */
>
>     REVEL_ERR_COUNT             /**< Number of error types (this is NOT a legal error code!) */
> } Revel_Error;
>
> /**
>  * General info about the video to encode.
>  */
> typedef struct
> {
>     unsigned int width, height; /**< width/height in pixels */
>     float frameRate; /**< frames per second. */
>     float quality; /**< ranges 0 to 1 */
>     Revel_VideoCodec codec; /**< This codec will be used to compress the video frames. */
>
>     /**
>      * If 1, the output movie will include an audio stream.
>      * Any other value means the movie will be silent.
>      */
>     int hasAudio;
>
>     /* The following fields are ignored unless hasAudio is 1 */
>     int audioRate; /**< audio sample rate, e.g. 22050, 44100 */
>     int audioBits; /**< bits per audio sample, e.g. 8, 16 */
>     int audioChannels; /**< Number of channels in the audio stream (1=mono, 2=stereo) */
>     int audioSampleFormat; /**< Format of the audio sample data. */
> } Revel_Params;
>
>
> /**
>  * Represents a frame of video.  Use this to pass raw images
>  * into Revel to be encoded.
>  */
> typedef struct
> {
>     unsigned int width, height; /**< width/height in pixels. */
>     unsigned char bytesPerPixel; /**< color depth of pixels, in bytes. */
>     Revel_PixelFormat pixelFormat; /**< color channel ordering for these pixels. */
>     void *pixels; /**< Pointer to pixel data. */
> } Revel_VideoFrame;
>
>
> /**
>  * Create an encoder for your movie.  Call me first!
>  * @param encoderHandle A handle to the new encoder will be stored here.
>  * @return REVEL_ERR_NONE if success, an error code if not.
>  */
> Revel_Error Revel_CreateEncoder(int *encoderHandle);
>
>
> /**
>  * Initialize a Revel_Params structure to safe default values.
>  *
>  * It is very important that you call this function on your
>  * Revel_Params structure before filling it out yourself!  Otherwise,
>  * if you upgrade to a new version of Revel in which new fields
>  * were added to Revel_Param, they would be left uninitialized in
>  * your application, and Terrible Things could happen.
>  *
>  * You should not rely on any of the specific default values used in this
>  * function -- while they are guaranteed to be safe, they are NOT guaranteed
>  * to be correct for your application.
>  *
>  * @param params This structure will be filled with safe default values.
>  */
> void Revel_InitializeParams(Revel_Params *params);
>
> /**
>  * Start encoding a new movie.
>  *
>  * @param encoderHandle Must be a valid encoder handle.
>  * @param filename The output movie will be written to this file.
>  * @param params Fill this structure with your movie settings before calling.
>  * @return REVEL_ERR_NONE if success, an error code if not.
>  */
> Revel_Error Revel_EncodeStart(int encoderHandle, const char* filename,
>                               Revel_Params *params);
>
>
> /**
>  * Encode a single frame and write it to the video stream.
>  *
>  * @param encoderHandle must be a valid encoder handle.
>  * @param frame The video frame to encode.  It will not be written to at all,
>  *              so it is unnecessary to make a copy of your framebuffer first.
>  * @param frameSize If non-NULL, the size of the encoded frame (in bytes) will
>  *                  be stored here.
>  * @return REVEL_ERR_NONE if success, an error code if not.
>  */
> Revel_Error Revel_EncodeFrame(int encoderHandle,
>                               Revel_VideoFrame *frame, int *frameSize = 0);
>
>
> /**
>  * Encode a chunk of audio data to the movie.
>  *
>  * This function will have no effect if the hasAudio field of Revel_Params
>  * passed to Revel_EncodeStart() was 0.
>  * Each new audio chunk will be appended to the end of the existing audio
>  * data.  See the Revel FAQ (http://www.dangerware.org/cgi-bin/revelfaq.py)
>  * for important ramifications of this behavior!
>  *
>  * @param encoderHandle must be a valid encoder handle.
>  * @param sampleBuffer The array of audio samples.  The sample data must
>  *                     be of the same format specified in the Revel_Params
>  *                     structure passed to Revel_EncodeStart().
>  * @param numBytes The size of the sampleBuffer array, in bytes.
>  * @param numTotalBytes If non-NULL, the total number of bytes of audio
>  *                      encoded so far will be stored here.
>  * @return REVEL_ERR_NONE if success, an error code if not.
>  */
> Revel_Error Revel_EncodeAudio(int encoderHandle, void *sampleBuffer,
>                               int numBytes, int *numTotalBytes = 0);
>
>
> /**
>  * Finalize the new movie.
>  *
>  * This function MUST be called when you're finished encoding, or the output
>  * movie will be corrupted and unviewable!
>  *
>  * @param encoderHandle must be a valid encoder handle.
>  * @param totalSize If non-NULL, the total size of the output movie (in bytes)
>  *                  will be stored here.
>  * @return REVEL_ERR_NONE if success, an error code if not.
>  */
> Revel_Error Revel_EncodeEnd(int encoderHandle, int *totalSize = 0);
>
>
> /**
>  * Destroys an encoder through its handle, and frees the memory associated
>  * with it.
>  *
>  * @param encoderHandle A handle to the encoder to destroy.
>  * @return REVEL_ERR_NONE if success, an error code if not.
>  */
> Revel_Error Revel_DestroyEncoder(int encoderHandle);
>
> #ifdef __cplusplus
> }
> #endif
> #endif
>
>
> <mime-attachment.txt>
>

-----------------------------------------------------------
Scott Hays
R&D Engineer
http://www.neon.com.tw/
Neon Advanced Technology Co., Ltd.





More information about the live-devel mailing list