[Live-devel] Custom live DeviceSource and custom scheduler problem
Ralf Globisch
RGlobisch at csir.co.za
Mon Mar 10 10:54:05 PDT 2008
Hi,
I've written some classes to integrate the DirectShow framework with the live555 library, but I'm having some weird issues.
I hope someone can tell me if my approach is flawed.
The strange thing is that I first implemented a live audio streaming server as a test project: it captures live audio using DirectShow and then transmits it using the live555 library in a separate thread, via a PassiveServerMediaSubsession.
Following advice on the mailing list, I've written:
1) A custom scheduler that calls scheduleDelayedTask to add events to the event loop when a watch variable is set for that specific source.
2) A class based on DeviceSource, in which the deliverFrame method fills the buffer fTo, sets fFrameSize, fPresentationTime and fDurationInMicroseconds, and then calls FramedSource::afterGetting(this). deliverFrame is triggered via a DirectShow event; the scheduler then calls this method every time new data becomes available. (A rough sketch of this class follows the list below.)
3) Code based on testWAVAudioStreamer that creates the source, sink, RTP and RTCP groupsocks, RTSP server, PassiveServerMediaSubsession, etc.
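To make (2) concrete, here is a rough sketch of the class, modelled on the DeviceSource example that ships with live555. The class name, the queue handling and the capture details are placeholders rather than our actual code:

#include "FramedSource.hh"
#include "GroupsockHelper.hh" // for gettimeofday()
#include <string.h>           // for memmove()

// Rough sketch only: a DeviceSource-style class fed by our DirectShow filter.
class DirectShowSource: public FramedSource {
public:
  static DirectShowSource* createNew(UsageEnvironment& env) {
    return new DirectShowSource(env);
  }

  // Called from the live555 event loop whenever DirectShow has delivered a sample:
  void deliverFrame() {
    if (!isCurrentlyAwaitingData()) return; // the sink hasn't asked for data yet

    // 'frameData'/'frameSize' stand for the sample popped off our internal queue:
    unsigned char* frameData = NULL; // placeholder
    unsigned frameSize = 0;          // placeholder

    if (frameSize > fMaxSize) { // truncate if the sink's buffer is too small
      fNumTruncatedBytes = frameSize - fMaxSize;
      frameSize = fMaxSize;
    }
    fFrameSize = frameSize;
    if (frameData != NULL) memmove(fTo, frameData, fFrameSize);
    gettimeofday(&fPresentationTime, NULL);
    fDurationInMicroseconds = 0; // or the real sample duration for fixed-rate media

    // Hand the frame to the downstream object (ultimately the RTPSink):
    FramedSource::afterGetting(this);
  }

protected:
  DirectShowSource(UsageEnvironment& env): FramedSource(env) {}
  virtual ~DirectShowSource() {}

private:
  virtual void doGetNextFrame() {
    // If a frame were already queued we could deliver it immediately here;
    // otherwise we simply wait for the capture callback to trigger deliverFrame().
  }
};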
I then tested the streamed audio with VLC and everything played back fine.
I then added similar code for video (with different RTP and RTCP ports, etc.) and added another PassiveServerMediaSubsession to the same ServerMediaSession.
Now I find that the queues in my DeviceSource classes aren't being processed properly, and hardly any packets get sent.
While stepping through the code I found that the check "if (!isCurrentlyAwaitingData()) return;" in my DeviceSource class always exits the deliverFrame method before the next frame can be copied into the buffer and sent. With only one media stream this doesn't seem to be a problem.
I've searched for every place where fIsCurrentlyAwaitingData is changed and put breakpoints on those lines, but the breakpoints are never hit.
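For reference, as far as I can tell the flag is only toggled in two places inside the library itself: FramedSource::getNextFrame() sets it when a sink asks for the next frame, and FramedSource::afterGetting() clears it again just before invoking the sink's callback. Roughly (paraphrased and trimmed from FramedSource.cpp; please check against your own copy):

void FramedSource::getNextFrame(unsigned char* to, unsigned maxSize,
                                afterGettingFunc* afterGettingFunc,
                                void* afterGettingClientData,
                                onCloseFunc* onCloseFunc,
                                void* onCloseClientData) {
  // ... it is an internal error if fIsCurrentlyAwaitingData is already True ...
  fTo = to;
  fMaxSize = maxSize;
  fAfterGettingFunc = afterGettingFunc;
  fAfterGettingClientData = afterGettingClientData;
  fOnCloseFunc = onCloseFunc;
  fOnCloseClientData = onCloseClientData;
  fIsCurrentlyAwaitingData = True;   // set when the sink asks for data
  doGetNextFrame();
}

void FramedSource::afterGetting(FramedSource* source) {
  source->fIsCurrentlyAwaitingData = False;  // cleared before the sink's callback runs
  if (source->fAfterGettingFunc != NULL) {
    (*(source->fAfterGettingFunc))(source->fAfterGettingClientData,
                                   source->fFrameSize, source->fNumTruncatedBytes,
                                   source->fPresentationTime,
                                   source->fDurationInMicroseconds);
  }
}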
What I want to do is the following:
I have an audio and a video source.
Every time a frame arrives from one of the external sources, I need to notify the corresponding live555 source object of the new data (by triggering an event in the scheduler).
This results in the deliverFrame method being called, in which the buffer is filled and the sizes and times are set.
deliverFrame then calls FramedSource::afterGetting(this), which should result in the RTP packet being sent. (A short sketch of this notification flow follows below.)
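To make that concrete, here is a minimal sketch of the audio side (video is symmetrical). DirectShowSource is the sketch from earlier, and onNewAudioFrame stands for whatever callback our DirectShow filter invokes on its capture thread; in our real code the hand-off goes through the custom scheduler's watch variable, because the capture callback runs on a different thread from the live555 event loop:

#include "BasicUsageEnvironment.hh"

static DirectShowSource* gAudioSource = NULL; // created alongside the audio RTP sink
static UsageEnvironment* gEnv = NULL;

// Runs inside the live555 event loop:
static void deliverAudioFrame(void* /*clientData*/) {
  if (gAudioSource != NULL) gAudioSource->deliverFrame();
}

// Called by our DirectShow filter, on its capture thread, when a new sample arrives:
void onNewAudioFrame(unsigned char* data, unsigned size) {
  // 1) Copy 'data'/'size' into gAudioSource's internal queue (with locking).
  // 2) Wake the event loop so that deliverAudioFrame() runs there. Our custom
  //    scheduler watches a flag and then, on the event-loop thread, calls:
  //    gEnv->taskScheduler().scheduleDelayedTask(0, deliverAudioFrame, NULL);
  //    (calling scheduleDelayedTask directly from this thread would not be safe).
}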
So here are my questions:
1) What is the difference between having one media subsession and having multiple ones? (Both video and audio stream fine when only one subsession exists in the session.) Or does the problem lie elsewhere?
2) From what I understand, a PassiveServerMediaSubsession is to be used when multicasting media, so it should be no problem adding more than one PassiveServerMediaSubsession to the same ServerMediaSession? (A sketch of the setup I have in mind follows these questions.)
3) Is there anything else I'm missing?
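For reference, the setup behind question 2 looks roughly like this, modelled on testWAVAudioStreamer. audioSink, videoSink and the two RTCPInstance objects stand for what we create around the audio and video groupsocks, and depending on the library version PassiveServerMediaSubsession may expose a public constructor rather than createNew():

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

void addSessions(UsageEnvironment& env, RTSPServer* rtspServer,
                 RTPSink* audioSink, RTCPInstance* audioRTCP,
                 RTPSink* videoSink, RTCPInstance* videoRTCP) {
  ServerMediaSession* sms = ServerMediaSession::createNew(
      env, "dsCapture", "DirectShow capture", "audio+video session", True /*SSM*/);

  // One passive subsession per medium, both added to the same ServerMediaSession:
  sms->addSubsession(PassiveServerMediaSubsession::createNew(*audioSink, audioRTCP));
  sms->addSubsession(PassiveServerMediaSubsession::createNew(*videoSink, videoRTCP));

  rtspServer->addServerMediaSession(sms);
}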
I will gladly post code if that helps to clarify the matter but I'm avoiding polluting the mailing list for now ;)
Our group will also gladly release the integration source code once it's complete if anyone's interested.
Best regards,
Ralf