[Live-devel] H.264 bitstream treated as H.265, and not playable with VLC or FFmpeg

denglikang denglikang at tuputech.com
Sun Feb 17 21:42:53 PST 2019


Hi All,

	I am using the latest version, live.2019.02.03.tar.gz, downloaded from http://www.live555.com/liveMedia/public/. However, my H.264 bitstream is not playable, while H.265 works fine.

ffplay warning:
> [rtsp @ 0x7f7fef834400] Multi-layer HEVC coding is not implemented. Update your FFmpeg version to the newest one from Git. If the problem still occurs, it means that your file has a feature which has not been implemented.

I am actually using the newest FFmpeg version on my macOS, and VLC also complains:


> live555 warning: unsupported NAL type for H265

I captured the response message from the RTSPServer with Wireshark. It seems live555 regards the H.264 bitstream as H.265.


> Session Description Protocol
>       Session Description Protocol Version (v): 0
>       Owner/Creator, Session Id (o): - 1550460541446764 1 IN IP4 0.0.0.0
>       Session Name (s): Session streamed by "RTSPServer"
>       Session Information (i): stream
>       Time Description, active time (t): 0 0
>       Session Attribute (a): tool:LIVE555 Streaming Media v2019.02.03
>       Session Attribute (a): type:broadcast
>       Session Attribute (a): control:*
>       Session Attribute (a): range:npt=0-
>       Session Attribute (a): x-qt-text-nam:Session streamed by "RTSPServer"
>       Session Attribute (a): x-qt-text-inf:stream
>       Media Description, name and address (m): video 0 RTP/AVP 96
>       Connection Information (c): IN IP4 0.0.0.0
>       Bandwidth Information (b): AS:4096
>       Media Attribute (a): rtpmap:96 H265/90000
>       Media Attribute (a): control:track1

From the SDP we can see "Media Attribute (a): rtpmap:96 H265/90000", i.e. payload type 96 is advertised as H.265, not H.264.
I pasted some of my code below, although the problem may well be on my side.
I would much appreciate any advice or a patch.
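
To sanity-check what my encoder actually pushes into the queue, I also dump the first NAL unit header both ways (a rough sketch only; `guessNalCodec` and the raw-buffer arguments are just illustrative helpers, not part of the server code):

```c++
#include <cstddef>
#include <cstdint>
#include <cstdio>

// Interpret the first byte after the Annex-B start code under both specs.
// H.264 NAL header (1 byte):  forbidden_zero_bit(1) | nal_ref_idc(2) | nal_unit_type(5)
// H.265 NAL header (2 bytes): forbidden_zero_bit(1) | nal_unit_type(6) | nuh_layer_id(6) | nuh_temporal_id_plus1(3)
void guessNalCodec(const std::uint8_t *nal, std::size_t len) {
    if (len < 2) return;
    unsigned h264Type = nal[0] & 0x1F;         // low 5 bits
    unsigned h265Type = (nal[0] >> 1) & 0x3F;  // bits 1..6
    std::printf("as H.264: nal_unit_type=%u, as H.265: nal_unit_type=%u\n",
                h264Type, h265Type);
}
```

For an H.264 Annex-B stream the parameter sets should show up as types 7 (SPS) and 8 (PPS) under the H.264 reading, whereas a genuine HEVC stream would start with type 32 (VPS) under the H.265 reading.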

## About my code

I subclass `OnDemandServerMediaSubsession` and implement `createNewRTPSink` and `createNewStreamSource`. I also subclass `FramedSource` and implement `doGetNextFrame` to pull frames from the encoder via a `std::queue`.

Some code:
```c++
void QueueFramedSource::doGetNextFrame() { // derived from `FramedSource`
   auto streamPacket = streamQueue->dequeue();
   fFrameSize = 0;     // reset frame size
   fDurationInMicroseconds = 0; // set to zero for live source, non-zero for file source
   if (streamPacket == nullptr) {
       log_->warningf("[%s#%d]The frame queue is empty!", __func__, __LINE__);
       nextTask() = envir().taskScheduler().scheduleDelayedTask( 3000, reinterpret_cast<TaskFunc*>(FramedSource::afterGetting), this);
       return;
   } else {
       log_->tracef("[%s#%d]seq:%u, pts:%llu, len:%llu", __func__, __LINE__, streamPacket->seq_, streamPacket->pts, streamPacket->len_);
   }

   if(streamPacket->len_ > fMaxSize) {
       fFrameSize = fMaxSize;
       fNumTruncatedBytes = streamPacket->len_ - fMaxSize;
       log_->infof("[%s#%d]packet length(%llu) exceeds fMaxSize(%u), fNumTruncatedBytes(%u)!!!", __func__, __LINE__, streamPacket->len_, fMaxSize, fNumTruncatedBytes);
   } else {
       fFrameSize = streamPacket->len_;
       fNumTruncatedBytes = 0;
   }
   fPresentationTime.tv_sec = streamPacket->pts / 1000000UL;   // update pts
   fPresentationTime.tv_usec = streamPacket->pts % 1000000UL;

   auto offset = start_code_offset(streamPacket->data_, fFrameSize);
   auto startcode_length = start_code_length(streamPacket->data_ + offset, fFrameSize - offset);
   auto total_offset = offset + startcode_length;
   fFrameSize -= total_offset;
   memcpy(fTo, streamPacket->data_ + total_offset, fFrameSize);
   delete [] streamPacket->data_;

   FramedSource::afterGetting(this); 
}


FramedSource *RtspServerMediaSubsession::createNewStreamSource(unsigned clientSessionId,  //derived from OnDemandServerMediaSubsession
                                                                        unsigned &estBitrate) {
   estBitrate = 4096; 	//VBR, max 4096 kbps
   QueueFramedSource *framedSource = QueueFramedSource::createNew(streamQueue, envir());
   if (framedSource == nullptr) {
       return nullptr;
   }

   H264or5VideoStreamDiscreteFramer *discreteFramer = nullptr;
   if (payload_type_ == PT_H264) {
       discreteFramer = H264VideoStreamDiscreteFramer::createNew(envir(), framedSource);
   } else if (payload_type_ == PT_H265) {
       discreteFramer = H265VideoStreamDiscreteFramer::createNew(envir(), framedSource);
   }

   return discreteFramer;
}
RTPSink *RtspServerMediaSubsession::createNewRTPSink(Groupsock *rtpGroupsock,  //derived from OnDemandServerMediaSubsession
                                                              unsigned char rtpPayloadTypeIfDynamic,
                                                              FramedSource *inputSource) {
   H264or5VideoRTPSink *rtpSink = nullptr;
   OutPacketBuffer::increaseMaxSizeTo(max_size_);  /* packet-loss frequently */
   if (payload_type_ == PT_H264) {
       rtpSink = H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
   } else if (payload_type_ == PT_H265) {
       rtpSink = H265VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
   } else {
       rtpSink = nullptr;
   }

   if (nullptr != rtpSink) {
       rtpSink->setPacketSizes(preferredPacketSize, maxPacketSize);
   }
   return rtpSink;
}
```
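
In case it helps, the only extra thing I could think of trying is the same `createNewRTPSink` as above with a trace added at the end (a sketch only; `payload_type_`, `max_size_`, `preferredPacketSize` and `maxPacketSize` are my own members as shown above). Since the `a=rtpmap:96 H265/90000` line comes from whichever `RTPSink` subclass this function returns, the trace should show whether the H.265 branch is being taken for the H.264 stream:

```c++
// Sketch: same createNewRTPSink as above, with trace output added at the end.
RTPSink *RtspServerMediaSubsession::createNewRTPSink(Groupsock *rtpGroupsock,
                                                     unsigned char rtpPayloadTypeIfDynamic,
                                                     FramedSource *inputSource) {
    H264or5VideoRTPSink *rtpSink = nullptr;
    OutPacketBuffer::increaseMaxSizeTo(max_size_);
    if (payload_type_ == PT_H264) {
        rtpSink = H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
    } else if (payload_type_ == PT_H265) {
        rtpSink = H265VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
    }

    if (rtpSink != nullptr) {
        rtpSink->setPacketSizes(preferredPacketSize, maxPacketSize);
        // If this prints "H265" while I expected PT_H264, then payload_type_ is
        // being set wrongly on my side before the subsession is created.
        envir() << "createNewRTPSink: payload_type_=" << (int)payload_type_
                << ", format=" << rtpSink->rtpPayloadFormatName() << "\n";
    }
    return rtpSink;
}
```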