[Live-devel] Live h264 streaming in Android
Daniel Yacouboff
danielya at essence-grp.com
Mon Jan 4 21:41:14 PST 2016
Hi Eric and Ross,
1) I admit I didn't explain myself fully: I use two separate RTSPServer objects with two different stream names, each holding a single ServerMediaSession; one serves only video, the other only audio. I did this precisely because of the issue Eric mentioned he had, except that I decided to skip the timestamp calculations and simply separate the streams.
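For clarity, the setup described above can be sketched roughly like this. This is only an outline of the structure, not my actual code: the ports and stream names are illustrative, and the subsession placeholders stand in for the application's own FramedSource-backed subsessions.

    #include "liveMedia.hh"
    #include "BasicUsageEnvironment.hh"

    int main() {
      TaskScheduler* scheduler = BasicTaskScheduler::createNew();
      UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

      // One RTSPServer per elementary stream, on separate ports
      // (port numbers here are illustrative):
      RTSPServer* videoServer = RTSPServer::createNew(*env, 8554);
      RTSPServer* audioServer = RTSPServer::createNew(*env, 8555);

      // Each server holds exactly one ServerMediaSession with one subsession:
      ServerMediaSession* videoSms =
          ServerMediaSession::createNew(*env, "video", "video", "H.264 video");
      // videoSms->addSubsession(<the app's H.264 video subsession>);
      videoServer->addServerMediaSession(videoSms);

      ServerMediaSession* audioSms =
          ServerMediaSession::createNew(*env, "audio", "audio", "audio");
      // audioSms->addSubsession(<the app's audio subsession>);
      audioServer->addServerMediaSession(audioSms);

      env->taskScheduler().doEventLoop(); // does not return
      return 0;
    }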
2) As Ross noted, it works perfectly fine in VLC: both streams play, keep flowing, and stay synchronized (I open two VLC windows at the same time, one playing the audio and one playing the video).
3) It also works when I play it in VLC's Android application.
4) The video's presentation time is calculated correctly, since I take the exact timestamp straight from my camera's encoder.
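In case it helps rule point 4 in or out: deriving the presentation time from the encoder timestamp amounts to converting the encoder's clock ticks into the struct timeval that live555's FramedSource::fPresentationTime expects. A minimal sketch of that conversion, assuming a 90 kHz encoder clock (the clock rate and function name here are assumptions, not my actual code):

    #include <cassert>
    #include <cstdint>
    #include <sys/time.h>

    // Convert an encoder timestamp in 90 kHz ticks (assumed clock rate)
    // into the struct timeval used for fPresentationTime.
    struct timeval ticksToPresentationTime(uint64_t ticks90k) {
        struct timeval tv;
        tv.tv_sec  = static_cast<time_t>(ticks90k / 90000);
        // Remaining ticks -> microseconds: 1e6 / 90000 = 100 / 9 us per tick.
        tv.tv_usec = static_cast<suseconds_t>((ticks90k % 90000) * 100 / 9);
        return tv;
    }

    int main() {
        struct timeval tv = ticksToPresentationTime(90045); // 1 s + 45 ticks
        assert(tv.tv_sec == 1);
        assert(tv.tv_usec == 500);
        return 0;
    }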
5) I've dug in a bit while waiting for your reply. According to the Android logs, the application sends an RTSP request to the live555 server at startup and gets the correct response, but then it enters a buffering "loop" in which it never receives data: I only get the first 2-3 seconds, which is the buffering it received from the first request; after that it tries to get more data from the server, but none arrives.
Any more ideas, please?
Thanks