Transcoding fMP4 to HLS while writing on iOS using FFmpeg

Tags: ios, ffmpeg, avfoundation, http-live-streaming, fmp4

iOS Problem Overview


TL;DR

I want to convert fMP4 fragments to TS segments (for HLS) as the fragments are being written using FFmpeg on an iOS device.

Why?

I'm trying to achieve live uploading on iOS while maintaining a seamless, HD copy locally.

What I've tried

  1. Rolling AVAssetWriters where each writes for 8 seconds, then concatenating the MP4s together via FFmpeg.

What went wrong - There are blips in the audio and video at times. I've identified 3 reasons for this.

  1. Priming frames written by the AAC encoder create gaps in the audio.

  2. Since video frames are 33.33 ms long and audio frames roughly 23 ms long (1024 samples at 44.1 kHz), it's possible for them not to line up at the end of a file.

  3. Frame-accurate encoding is available on macOS, but not on iOS. [Details Here][1]

  2. FFmpeg muxing a large, video-only MP4 file together with raw audio into TS segments. The work was based on the [Kickflip SDK][2].

What went wrong - Every once in a while an audio-only file would get uploaded, with no video whatsoever. We were never able to reproduce it in-house, but it was pretty upsetting to our users when they didn't record what they thought they did. There were also issues with accurate seeking in the final segments, almost as if the TS segments were incorrectly timestamped.

What I'm thinking now

Apple was pushing fMP4 at WWDC this year (2016), and I hadn't looked into it much at all before that. Since an fMP4 file can be read and played while it's being written, I thought it should be possible for FFmpeg to transcode the file as it's being written as well, as long as we hold off sending bytes to FFmpeg until each fragment within the file is finished.
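One way to make "hold off until each fragment is finished" concrete is to scan the file's top-level boxes: in an fMP4 file, each fragment is a `moof` box followed by its `mdat`. The sketch below (my own illustration, not from the question; function names are made up) finds the byte ranges of completed fragments, treating a box whose declared size extends past the available bytes as still being written.

```python
import struct

def parse_top_level_boxes(data: bytes):
    """Yield (box_type, offset, size) for each COMPLETE top-level MP4 box."""
    offset = 0
    while offset + 8 <= len(data):
        (size,) = struct.unpack(">I", data[offset:offset + 4])
        box_type = data[offset + 4:offset + 8].decode("ascii", "replace")
        if size == 1:  # 64-bit largesize follows the type field
            if offset + 16 > len(data):
                break
            (size,) = struct.unpack(">Q", data[offset + 8:offset + 16])
        if size < 8 or offset + size > len(data):
            break  # malformed, or not fully written yet; wait for more bytes
        yield box_type, offset, size
        offset += size

def complete_fragments(data: bytes):
    """Return (start, end) byte ranges of finished moof+mdat pairs."""
    ranges, moof_start = [], None
    for box_type, offset, size in parse_top_level_boxes(data):
        if box_type == "moof":
            moof_start = offset
        elif box_type == "mdat" and moof_start is not None:
            ranges.append((moof_start, offset + size))
            moof_start = None
    return ranges
```

Each returned range is a self-contained fragment that can be handed to the transcoder; everything after the last range stays buffered until the writer finishes the next `mdat`.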

However, I'm not familiar enough with the FFmpeg C API; I only used it briefly in attempt #2.

What I need from you

  1. Is this a feasible solution? Is anybody familiar enough with fMP4 to know if I can actually accomplish this?
  2. How will I know that AVFoundation has finished writing a fragment within the file so that I can pipe it into FFmpeg?
  3. How can I take data from a file on disk, chunk at a time, pass it into FFmpeg and have it spit out TS segments?
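For question 3, the data flow can be prototyped on the desktop by feeding a growing file, chunk at a time, into an `ffmpeg` subprocess that reads from stdin and writes HLS/TS segments (on iOS you would drive libavformat through the C API instead, but the hand-off is the same shape). This is a sketch under assumptions: the chunk size, polling interval, and 6-second segment duration are arbitrary, and `ffmpeg` is assumed to be on the PATH.

```python
import subprocess
import time

CHUNK = 64 * 1024  # bytes handed over per read; tune to your fragment size

def feed(src, dst, finished):
    """Copy bytes from a (possibly still growing) source to dst until
    finished() is true and no further bytes arrive. Returns bytes copied."""
    total = 0
    while True:
        chunk = src.read(CHUNK)
        if chunk:
            dst.write(chunk)
            total += len(chunk)
        elif finished():
            return total
        else:
            time.sleep(0.1)  # writer hasn't flushed the next fragment yet

def remux_growing_fmp4(fmp4_path, playlist_path, finished):
    # ffmpeg reads fMP4 from stdin ("pipe:0") and emits an HLS playlist plus
    # TS segments; "-c copy" repackages H.264+AAC without re-encoding.
    proc = subprocess.Popen(
        ["ffmpeg", "-f", "mp4", "-i", "pipe:0",
         "-c", "copy", "-f", "hls", "-hls_time", "6", playlist_path],
        stdin=subprocess.PIPE)
    with open(fmp4_path, "rb") as src:
        feed(src, proc.stdin, finished)
    proc.stdin.close()
    proc.wait()
```

Piping works here only because fMP4 is streamable (the `moov` is written up front); a regular, non-fragmented MP4 could not be consumed from a pipe this way.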

[1]: https://developer.apple.com/reference/avfoundation/avcapturefileoutputdelegate/1388760-captureoutputshouldprovidesample "Details Here"
[2]: https://github.com/Kickflip/kickflip-ios-sdk "Kickflip SDK"

iOS Solutions


Solution 1 - iOS

Strictly speaking, you don't need to transcode the fMP4 if it contains H.264 + AAC; you just need to repackage the sample data as TS (using `ffmpeg -codec copy`, or GPAC).

With regard to alignment (1.2), I suppose this all depends on your encoder settings (frame rate, sample rate, and GOP size). It is certainly possible to make sure that audio and video align exactly at fragment boundaries (see, for example, this table). If you're targeting iOS, I would recommend using HLS protocol version 3 (or 4), which allows timing to be represented more accurately. This also allows you to stream audio and video separately (non-multiplexed).
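The alignment point can be checked numerically: audio and video frames coincide exactly at integer multiples of the least common multiple of their frame durations, taken as rationals. A minimal sketch of that arithmetic (my own, not from the answer's table; assumes 30 fps video and 1024-sample AAC frames, Python 3.9+ for `math.lcm`):

```python
from fractions import Fraction
from math import gcd, lcm

def frame_lcm(a: Fraction, b: Fraction) -> Fraction:
    """Smallest positive duration that is an integer multiple of both frame
    durations: the lcm of two (reduced) rationals, lcm(p/q, r/s) = lcm(p,r)/gcd(q,s)."""
    return Fraction(lcm(a.numerator, b.numerator),
                    gcd(a.denominator, b.denominator))

video_30fps = Fraction(1, 30)     # 33.33 ms video frames
aac_48k = Fraction(1024, 48000)   # one 1024-sample AAC frame at 48 kHz
aac_44k = Fraction(1024, 44100)   # ... and at 44.1 kHz

# At 48 kHz the streams realign every 8/15 s, so 8-second fragments cut cleanly.
print(frame_lcm(video_30fps, aac_48k))  # 8/15
# At 44.1 kHz the first exact alignment is only every 256/15 s (~17.07 s).
print(frame_lcm(video_30fps, aac_44k))  # 256/15
```

This is one reason a 48 kHz sample rate is the friendlier choice when fragment boundaries need to fall on both an audio and a video frame edge.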

I believe FFmpeg should be capable of pushing a live fMP4 stream (i.e. using a long-running HTTP POST), but playout requires the origin software to do something meaningful with it (i.e. stream it out as HLS).

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

| Content Type | Original Author | Original Content on Stackoverflow |
| --- | --- | --- |
| Question | bclymer | View Question on Stackoverflow |
| Solution 1 - iOS | Tijn Porcelijn | View Answer on Stackoverflow |