Hi,

Currently I'm developing an RTSP client using VLC bindings. However, I had to make a few adjustments to the VLC source code in order to achieve fast rewind. Tests were successful, but there are still a few bugs to be addressed. I have two questions related to this issue.
First, on the vlc-devel list I'm discussing this issue:
>> The problem you mentioned in the forum is kind of a big fail (yes, I'm
>> mundu :). When fast rewinding the time gets stuck and then it doesn't
>> update. If you press play it will only start from the time you issued
>> the setRate(negative), and not at the time where I pressed play again.
>> Do you know where vlc actually updates its time?

> There are several issues with rtsp and time update. In fact the time
> will always update at 1x speed.
> That's because rtsp servers re-calculate timestamps, and vlc thinks it
> receives the pictures at 1x speed.
> Time update is based on timestamps.
It seems that VLC has no way to properly determine the current playback position when, for example, a set_rate(-2) has been issued. Any ideas on how to solve this issue?
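One workaround I'm considering on the client side, since the server-side timestamps always advance at 1x: let the application anchor the media position at every rate change and extrapolate it from a wall clock. A minimal sketch in plain C follows (the tracker names are my own, not part of libVLC):

    #include <stdint.h>
    #include <time.h>

    /* Client-side position tracker: over RTSP, VLC reports time as if
     * it were playing at 1x, so the application extrapolates the real
     * position itself. */
    typedef struct {
        double  rate;          /* current playback rate, e.g. -2.0           */
        int64_t anchor_media;  /* media position (ms) when rate was last set */
        double  anchor_wall;   /* wall-clock seconds at that moment          */
    } pos_tracker;

    static double now_sec(void)
    {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return ts.tv_sec + ts.tv_nsec / 1e9;
    }

    /* Call right after a successful set_rate(): remember where we were. */
    static void tracker_rearm(pos_tracker *t, int64_t media_ms, double rate)
    {
        t->anchor_media = media_ms;
        t->anchor_wall  = now_sec();
        t->rate         = rate;
    }

    /* Estimated current position, independent of what VLC reports. */
    static int64_t tracker_estimate_ms(const pos_tracker *t)
    {
        double elapsed = now_sec() - t->anchor_wall;
        int64_t pos = t->anchor_media + (int64_t)(t->rate * elapsed * 1000.0);
        return pos < 0 ? 0 : pos;  /* clamp at stream start */
    }

When the user presses play again, the client could seek to tracker_estimate_ms() before restoring 1x, instead of trusting the stale position. It's a band-aid on the client side, though; it doesn't fix the time update inside VLC itself.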
Another question relates to TS vs. RTP. As you know, set-top boxes request raw UDP because they don't need the RTP overhead. However, I'm currently trying to understand the benefits of dropping the transport-stream container, as it proves to be the greediest in terms of overhead: RTP uses 12 bytes of header, while 6 TS packets inside one RTP packet add 24 bytes of TS headers, twice the RTP overhead. Not to mention that both carry timestamp information (redundant information). In a video-on-demand application streaming HD content this proves to be too much in long-term use. (And TS is normally meant to multiplex several programs, not just one movie.)
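For reference, the back-of-the-envelope arithmetic behind those numbers (using the same 6-TS-packets-per-RTP-packet layout as above):

    #include <stdio.h>

    int main(void)
    {
        const int rtp_hdr    = 12;   /* RTP fixed header, bytes           */
        const int ts_pkt     = 188;  /* one MPEG-TS packet                */
        const int ts_hdr     = 4;    /* TS header per packet              */
        const int ts_per_rtp = 6;    /* TS packets carried per RTP packet */

        int payload = ts_per_rtp * (ts_pkt - ts_hdr);  /* 1104 bytes   */
        int over_ts = rtp_hdr + ts_per_rtp * ts_hdr;   /* 12 + 24 = 36 */

        printf("RTP+TS : %d header bytes per packet (%.2f%% of the wire)\n",
               over_ts, 100.0 * over_ts / (payload + over_ts));
        printf("RTP raw: %d header bytes for the same payload (%.2f%%)\n",
               rtp_hdr, 100.0 * rtp_hdr / (payload + rtp_hdr));
        return 0;
    }

Per packet it looks small (roughly 3.2% vs. 1.1% of the wire), but at HD bitrates those extra 24 bytes per 1104 payload bytes run continuously for the whole session.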
I raise this question focused only on an Internet service, not TV (STBs and DVB use TS, so no questions there). My question is: what is the downside of using H.264 without a container when it comes to trick play? As far as I know the indexer already searches for H.264 I-frames, am I wrong? But I read in a paper that the computational cost of doing such a thing is too high. Is this true?
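My understanding of the indexing step (this sketch is my own illustration, not VLC's actual indexer): in an Annex-B H.264 elementary stream, finding random-access points is a linear scan for start codes followed by a check of the NAL unit type (5 = IDR slice):

    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Scan an Annex-B H.264 buffer and report the offsets of IDR slices
     * (nal_unit_type == 5), i.e. the points usable for trick play. */
    static void index_idr_frames(const uint8_t *buf, size_t len)
    {
        for (size_t i = 0; i + 3 < len; i++) {
            /* 3-byte start code 00 00 01 (the 4-byte form 00 00 00 01
             * ends the same way, so both are caught). */
            if (buf[i] == 0 && buf[i + 1] == 0 && buf[i + 2] == 1) {
                uint8_t nal_type = buf[i + 3] & 0x1F;  /* low 5 bits */
                if (nal_type == 5)
                    printf("IDR slice at offset %zu\n", i);
                i += 2;  /* jump past the start code */
            }
        }
    }

Emulation-prevention bytes guarantee that 00 00 01 never occurs inside a NAL payload, so this is a single pass over the bytes and the cost per byte seems trivial. My guess is the paper's objection is rather that without the TS layer's random_access_indicator you cannot jump anywhere without scanning, but I may be missing something.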
Thank you for your attention.

Nuno Mota