12 comments on “What I learned about H.264 for WebRTC video” (Tim Panton)

  1. Regarding the mark bit, I understand that you forgot to keep the mark bit value when relaying a video packet, so it was always set to 0 in outbound. Is that correct?

  2. Yep, in classic VoIP audio the mark bit only gets used in some variants of in-band DTMF – so my audio-heritage Phono stack didn’t pass it up to the layers above or allow setting it on non-DTMF packets.
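For anyone else tripping over this: the marker is the top bit of the second byte of the RTP header, and it is easy to lose when a relay rebuilds headers on the way out. A minimal sketch of carrying it across (function names are illustrative, not from the post):

```python
MARKER_BIT = 0x80  # top bit of the second RTP header byte

def get_marker(packet: bytes) -> bool:
    """Read the marker bit from a raw RTP packet."""
    return bool(packet[1] & MARKER_BIT)

def set_marker(packet: bytes, marker: bool) -> bytes:
    """Return a copy of the packet with the marker bit set or cleared."""
    out = bytearray(packet)
    if marker:
        out[1] |= MARKER_BIT
    else:
        out[1] &= 0x7F
    return bytes(out)

def relay(packet: bytes) -> bytes:
    # Rewrite whatever the relay needs (SSRC, sequence numbers, ...)
    # then carry the inbound marker across instead of leaving it at 0,
    # which is the bug described above.
    rewritten = bytearray(packet)  # ... header rewriting goes here ...
    return set_marker(bytes(rewritten), get_marker(packet))
```

For H.264 the marker flags the last packet of a frame, so a receiver that trusts it will never see a complete frame if the relay zeroes it.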

  3. I enjoyed your post Tim, nice write-up and deep details. People working at this level of detail may also run into way more than 10 packets per frame when they move to HD etc. – those can be 60+ packets; our team has had a lot of fun in that arena.
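As a rough sanity check on that figure: with FU-A fragmentation and a conservative payload budget, an HD keyframe fragments into dozens of packets. A back-of-envelope sketch (the 1200-byte budget is a common WebRTC choice, assumed here rather than taken from the comment):

```python
MAX_RTP_PAYLOAD = 1200  # conservative payload budget, leaving MTU headroom

def packets_per_frame(frame_bytes: int) -> int:
    """FU-A fragments needed for one NAL unit: ceiling division of the
    frame size by the per-packet payload budget (ignoring the small
    per-fragment FU headers, so this is a lower bound)."""
    return -(-frame_bytes // MAX_RTP_PAYLOAD)
```

An HD keyframe of around 80 kB then needs `packets_per_frame(80_000)` = 67 packets, which lines up with the “60+ packets” observation above.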

  4. A while ago I made a simple node app to stream low-latency video to a browser, and was able to get below 100ms delay (depends on the bw, encoder settings etc). I took a somewhat different approach: I used a regular websocket as transport and used the awesome Broadway decoder and player (uses webgl and the optimized Android h264 decoder compiled to wasm). The result was quite impressive. You can check the lib and example here https://github.com/matijagaspar/ws-avc-player. Downside is that it only supports baseline profile.

    • Matija – that’s quite impressive that you got down to 100ms.
      Going the webRTC route is a little more complex, but you also get NAT traversal (i.e. both the device and the receiver can be behind different NATs) and the H264 decode uses hardware decoders if they are available, which saves on smartphone battery life.

      Since I wrote this post, I’ve been working to add support for bandwidth estimation – Google has done a terrific job, within a few seconds the RTCP messages give you a surprisingly accurate estimate of the available P2P bandwidth so you can adjust the encoder params.

      So overall webRTC is probably worth the extra effort in quite a few situations.
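One simple way to act on such an estimate is to derate it for overhead and clamp it before handing it to the encoder – a sketch with illustrative constants that are assumptions, not numbers from the post:

```python
def target_bitrate(estimate_bps: int,
                   floor_bps: int = 150_000,
                   ceiling_bps: int = 2_500_000,
                   headroom: float = 0.85) -> int:
    """Turn a bandwidth estimate (e.g. derived from RTCP feedback such
    as REMB) into an encoder target bitrate, leaving headroom for
    RTP/RTCP overhead and the occasional retransmission.  The floor
    keeps the encoder from starving; the ceiling caps what the source
    is willing to send."""
    return max(floor_bps, min(int(estimate_bps * headroom), ceiling_bps))
```

Re-running this on every feedback interval and feeding the result to the encoder’s rate control is enough for a basic adaptive sender.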

      • Yes! You are very right, and it’s a valid point, so I will try to play around with webrtc some more. My initial reasoning for going over ws was that webrtc can be quite a pain, and I honestly did not know that h264 was supported by webrtc – I assumed only vp8/9.

  5. Damn good article (I know I’m late to the party).
    Thanks for going into so much detail about your process.

    I’m testing my homebrew (mostly C) SFU on a raspi4 too!
    I too naively thought I could just make an SFU by multiplexing streams and being relatively naïve about the RTP content… *derp*

    I’m seeing promising results with 2 (hd) senders and 4 receivers.

    But I’m hitting the limits of what I have access to in order to measure how well it scales… 😉
