[Live-devel] Stream my own manipulated image array (push data to stream)
Matias Hernandez Arellano
msdark at archlinux.cl
Fri Apr 1 11:03:47 PDT 2011
(Sorry for my poor English.)
I hope I'm writing to the right list.
I have an application built with OpenCV that does some image processing (it captures frames from a camera and manipulates the images). Now I need to add streaming capability: the idea is to view the result of the processing on another machine, maybe through HTML5. I did a little research and found that several protocols could work for this, and I'm thinking of using this library to take care of it.
Then I tried to find a good tool/library for the job. I tried GStreamer, which has an API called appsrc for pushing data into a buffer (in this case the frames captured with OpenCV) and streaming it, but I had trouble integrating its GLib main loop with my application. So I considered libvlc, and finally found live555, but I can't find any example of how to do the streaming part with externally created frames.
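For reference, this is roughly how far I got with appsrc before getting stuck on the main loop (a rough GStreamer 0.10 sketch only; the pipeline string and the push_frame() helper are my own guesses, and caps/timestamps on the buffers are omitted):

    #include <gst/gst.h>
    #include <gst/app/gstappsrc.h>
    #include <cstring>

    static GstElement *pipeline = NULL;
    static GstElement *appsrc   = NULL;

    // Called from my OpenCV loop for every processed frame.
    static void push_frame(const unsigned char *data, unsigned size)
    {
        GstBuffer *buf = gst_buffer_new_and_alloc(size);
        memcpy(GST_BUFFER_DATA(buf), data, size);
        gst_app_src_push_buffer(GST_APP_SRC(appsrc), buf);  // appsrc takes ownership
    }

    int main(int argc, char **argv)
    {
        gst_init(&argc, &argv);

        // Example pipeline: encode the raw frames and send them out over RTP/UDP.
        // (In practice gst_app_src_set_caps() would also be needed so the raw
        // frames can negotiate with the converter/encoder.)
        pipeline = gst_parse_launch(
            "appsrc name=mysrc ! ffmpegcolorspace ! theoraenc ! "
            "rtptheorapay ! udpsink host=127.0.0.1 port=5000", NULL);
        appsrc = gst_bin_get_by_name(GST_BIN(pipeline), "mysrc");

        gst_element_set_state(pipeline, GST_STATE_PLAYING);

        // This is the part that conflicts with my application:
        // g_main_loop_run() blocks, so my OpenCV capture loop can't run here.
        GMainLoop *loop = g_main_loop_new(NULL, FALSE);
        g_main_loop_run(loop);
        return 0;
    }
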
So: is it possible to use live555 to stream this content?
Any guidelines, ideas or examples?
PS: I get the frames and can convert them to any format (one by one, but it's really quick), so I just need to push them into "the stream", send them over the network, and view the live stream on another device.
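On the live555 side, what I imagine I need is a FramedSource subclass that delivers whatever my OpenCV code produces, roughly along the lines of the DeviceSource.cpp skeleton in the liveMedia directory. This is an untested sketch (the class name, the onNewFrame() hook, and the assumption that the frames are already encoded are all mine):

    #include "FramedSource.hh"
    #include <cstring>
    #include <sys/time.h>

    // Hands externally produced (already encoded) frames to live555.
    class CVFrameSource: public FramedSource {
    public:
        static CVFrameSource* createNew(UsageEnvironment& env) {
            return new CVFrameSource(env);
        }

        // Called from my OpenCV loop whenever a new frame is ready.
        // (If frames arrive from another thread, the event-trigger mechanism
        // shown in DeviceSource.cpp should be used instead of a direct call.)
        void onNewFrame(unsigned char const* data, unsigned size) {
            fPendingData = data;
            fPendingSize = size;
            deliverFrame();
        }

    protected:
        CVFrameSource(UsageEnvironment& env)
            : FramedSource(env), fPendingData(NULL), fPendingSize(0) {}

    private:
        virtual void doGetNextFrame() {
            // live555 asks for the next frame; deliver one if available,
            // otherwise wait until onNewFrame() is called.
            if (fPendingSize > 0) deliverFrame();
        }

        void deliverFrame() {
            if (!isCurrentlyAwaitingData()) return;

            unsigned size = fPendingSize;
            if (size > fMaxSize) {  // downstream buffer too small: truncate
                fNumTruncatedBytes = size - fMaxSize;
                size = fMaxSize;
            } else {
                fNumTruncatedBytes = 0;
            }
            memcpy(fTo, fPendingData, size);
            fFrameSize = size;
            gettimeofday(&fPresentationTime, NULL);
            fPendingSize = 0;

            FramedSource::afterGetting(this);  // notify the downstream object
        }

        unsigned char const* fPendingData;
        unsigned fPendingSize;
    };

Does that look like the right direction (feeding such a source into an RTP sink), or is there a better-supported way to do this with live555?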
Thanks in advance.
Matías Hernandez Arellano
Software/Project Engineer at VisionLabs S.A.
CDA Archlinux-CL
www.msdark.archlinux.cl