[Live-devel] Stream two subsessions in one ServerMediaSession

Piotr Piwko piotr.piwko at embedded-engineering.pl
Fri Dec 4 11:10:34 PST 2015


Hi,

I have two separate live sources of video (H264) and audio (AAC)
content. I created FramedSource and OnDemandServerMediaSubsession
subclasses in which the corresponding RTPSink objects are created
(H264VideoRTPSink and MPEG4GenericRTPSink). The fPresentationTime
parameter is set to the current time via the gettimeofday() routine in
both FramedSource objects.
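
For context, the timestamping in both sources looks roughly like this
(a minimal sketch; MyLiveSource and its details are hypothetical, only
the gettimeofday() call on fPresentationTime matters):

  #include "FramedSource.hh"
  #include <sys/time.h>

  // Hypothetical live source class; only the timestamping matters here.
  class MyLiveSource : public FramedSource {
  public:
    MyLiveSource(UsageEnvironment& env) : FramedSource(env) {}

  protected:
    virtual void doGetNextFrame() {
      // ... copy the next encoded frame into fTo, set fFrameSize ...

      // Stamp the frame with the current wall-clock time; both the
      // video source and the audio source do exactly this:
      gettimeofday(&fPresentationTime, NULL);
      fDurationInMicroseconds = 0; // live source: no fixed duration

      FramedSource::afterGetting(this);
    }
  };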

If I stream each subsession as a separate stream, everything works
correctly and the receiver (VLC) handles them properly.

The problem begins when I add the video and audio subsessions to one
ServerMediaSession. It looks like those subsessions are not
synchronized: the video's RTP frames have completely different
timestamps than the audio's frames.

I dug in a little and saw that the frame timestamp is calculated in
the RTPSink class, in the convertToRTPTimestamp() method, based on the
fTimestampBase and fTimestampFrequency parameters. fTimestampBase is
set to a random value, so there is no way for both subsessions to have
similar timestamps.
As far as I understand, the fTimestampFrequency values would also need
to match, but the audio has a 44100 Hz sampling frequency while the
video has this value hardcoded to 90000.
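
For reference, the relevant computation is essentially this
(paraphrased from RTPSink.cpp; the preset-timestamp branch is
omitted):

  u_int32_t RTPSink::convertToRTPTimestamp(struct timeval tv) {
    // Convert the wall-clock presentation time to RTP timestamp units:
    u_int32_t timestampIncrement = fTimestampFrequency * tv.tv_sec;
    timestampIncrement
      += (u_int32_t)(fTimestampFrequency * (tv.tv_usec/1000000.0) + 0.5);

    // ... then offset it by the randomly chosen base:
    return fTimestampBase + timestampIncrement;
  }

So with different fTimestampBase values (and different clock
frequencies), the two subsessions' timestamps can never line up.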

As an experiment, I set the fTimestampBase parameters to the same
hardcoded value and the fTimestampFrequency members to 90000. In that
case everything works correctly and the frames have similar
timestamps.
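
Concretely, the modification was along these lines (illustrative only;
the constant is arbitrary, and this patches the library source):

  // In RTPSink's constructor, instead of the randomized base
  // (fTimestampBase = our_random32();), use a fixed value that is
  // identical in every sink:
  fTimestampBase = 0x12345678; // hypothetical constant

  // ... and pass 90000 instead of 44100 as the rtpTimestampFrequency
  // when creating the audio's MPEG4GenericRTPSink, so that both
  // subsessions use the same clock rate.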

How can I achieve the above result without this dirty modification?
Is there any way to have similar timestamps (i.e. use the same
fTimestampBase) in both subsessions?

Thank you in advance for your help.

-- 
Piotr Piwko
http://www.embedded-engineering.pl/
