[Live-devel] MP3 streaming problem

Ignacio Barreto ignacio at radixcast.com
Wed Apr 20 13:19:34 PDT 2016


2016-04-19 16:19 GMT-03:00 Ross Finlayson <finlayson at live555.com>:
>> We have a video subsession, which works fine streaming H.264 video
>> frames.
>> We added another subsession for audio; as you suggested, it returns our
>> own framed source subclass.
>>
>> The interesting thing is that if we do NOT add the video subsession to
>> the ServerMediaSession, the audio plays perfectly well. However, if we
>> add the video subsession as well as the audio subsession, we start to
>> hear some glitches.
>> Any ideas?
>
> Problems like this are usually caused by one of two possible things:
>
> 1/ You are blocking (or ‘spin waiting’) in one or both of your
> “FramedSource” subclasses (implementing your audio and/or video device).
> You should not do this!  Remember that LIVE555-based applications are
> event-based, using a single-threaded event loop for concurrency.  If no
> data is immediately available, you should not block (or ‘spin wait’),
> because that would prevent events (for the other medium) from getting
> handled.  Instead, you should immediately return (to the event loop).

We don't think we are blocking in either subclass. In both cases we capture
in one thread, encode in another thread (which signals deliverFrame()), and
send the stream using live555 in a third thread.
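
In case it helps, our signalling follows the "DeviceSource.cpp" model. A
rough sketch (class and helper names such as EncodedAudioSource and
frameIsQueued() are simplified for illustration, not our exact code):

    // Created once, e.g. in the subclass constructor:
    //   fEventTriggerId = envir().taskScheduler().createEventTrigger(deliverFrame0);

    void EncodedAudioSource::doGetNextFrame() {
      // Never block or spin-wait here: if the encoder hasn't produced a
      // frame yet, just return to the event loop; the encoder thread will
      // wake us up via triggerEvent().
      if (!frameIsQueued()) return;
      deliverFrame();
    }

    // Static trampoline registered with createEventTrigger():
    void EncodedAudioSource::deliverFrame0(void* clientData) {
      ((EncodedAudioSource*)clientData)->deliverFrame();
    }

    // Called from the encoder thread each time a new frame is ready
    // (triggerEvent() is the only scheduler call made outside the event loop):
    //   envir().taskScheduler().triggerEvent(fEventTriggerId, this);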

We also tried using a mutex to guard the producer-consumer hand-off (for
passing data between threads), but that didn't give much improvement either.
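
The hand-off itself is roughly the following (again simplified; fFrameQueue,
fQueueMutex and onEncodedFrame() are illustrative names, and it uses
std::mutex, std::queue and std::vector from the standard library):

    // Encoder thread (producer):
    void EncodedAudioSource::onEncodedFrame(std::vector<unsigned char> frame) {
      {
        std::lock_guard<std::mutex> lock(fQueueMutex);
        fFrameQueue.push(std::move(frame));
      }
      envir().taskScheduler().triggerEvent(fEventTriggerId, this);
    }

    // live555 event-loop thread (consumer), reached via deliverFrame0():
    void EncodedAudioSource::deliverFrame() {
      if (!isCurrentlyAwaitingData()) return; // downstream hasn't asked yet

      std::vector<unsigned char> frame;
      {
        std::lock_guard<std::mutex> lock(fQueueMutex);
        if (fFrameQueue.empty()) return;
        frame = std::move(fFrameQueue.front());
        fFrameQueue.pop();
      }

      fFrameSize = frame.size() <= fMaxSize ? (unsigned)frame.size() : fMaxSize;
      fNumTruncatedBytes = (unsigned)frame.size() - fFrameSize;
      memcpy(fTo, frame.data(), fFrameSize);
      // fPresentationTime is also set here, as described further down.
      FramedSource::afterGetting(this);
    }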

What is clear is that when we add the video subsession, the event loop has
to process not only the audio frames but the video frames as well.

I copy below two links to the tests I have done. Maybe they give you a
better idea of what kind of mistake we are making.

https://drive.google.com/file/d/0B0rFtoVWa4g0VWFNb3luYUhoMmM/view?usp=sharing

https://drive.google.com/file/d/0B0rFtoVWa4g0MzZDLXFjUEZFdW8/view?usp=sharing


> 2/ You are not setting “fPresentationTime” properly in your video or
> audio media - or both.  The “fPresentationTime” values for each medium
> should be in sync, and aligned with ‘wall clock’ time (i.e., the time that
> you’d get by calling “gettimeofday()”).

We are doing exactly this in both the video and audio framed source
subclasses.
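
Concretely, both deliverFrame() implementations stamp the frame like this
just before handing it off (a sketch of the relevant lines only; leaving
fDurationInMicroseconds at 0 is our reading of the DeviceSource.cpp comments
for live sources):

    gettimeofday(&fPresentationTime, NULL); // 'wall clock' time, as you describe
    // fDurationInMicroseconds stays 0 because both sources are live,
    // so data should never arrive 'early' at the sink.
    FramedSource::afterGetting(this);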

> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/