AVPlayer HLS live stream level meter (Display FFT Data)

Tags: ios, objective-c, avfoundation, media-player, avplayer

iOS Problem Overview


I'm using AVPlayer for a radio app using HTTP Live Streaming. Now I want to implement a level meter for that audio stream. The very best would be a level meter showing the different frequencies, but a simple left/right solution would be a great starting point.

I found several examples using AVAudioPlayer, but I cannot find a solution for getting the required information from AVPlayer.

Can someone think of a solution for my problem?

EDIT: I want to create something like this (but nicer):

(image: a nice-looking level meter)

EDIT II

One suggestion was to use MTAudioProcessingTap to get at the raw audio data. The examples I could find use the [[[_player currentItem] asset] tracks] array, which, in my case, is empty. Another suggestion was to use [[_player currentItem] audioMix], which is null for me.

EDIT III

After all these years, there still doesn't seem to be a solution. I did make some progress, though, so I'm sharing it.

During setup, I'm adding a key-value observer to the playerItem:

[[[self player] currentItem] addObserver:self forKeyPath:@"tracks" options:kNilOptions context:NULL];

//////////////////////////////////////////////////////

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"tracks"] && [[object tracks] count] > 0) {
        for (AVPlayerItemTrack *itemTrack in [object tracks]) {
            AVAssetTrack *track = [itemTrack assetTrack];

            // Attach the processing tap to the first audio track found.
            if ([[track mediaType] isEqualToString:AVMediaTypeAudio]) {
                [self addAudioProcessingTap:track];
                break;
            }
        }
    }
}

- (void)addAudioProcessingTap:(AVAssetTrack *)track {
    MTAudioProcessingTapCallbacks callbacks;
    callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
    callbacks.clientInfo = (__bridge void *)(self);
    callbacks.init = init;
    callbacks.prepare = prepare;
    callbacks.process = process;
    callbacks.unprepare = unprepare;
    callbacks.finalize = finalise;

    // Create the tap itself from the callbacks struct.
    MTAudioProcessingTapRef tap;
    OSStatus status = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                                 kMTAudioProcessingTapCreationFlag_PostEffects, &tap);
    if (status != noErr) {
        return;
    }

    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];

    AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
    [inputParams setAudioTapProcessor:tap];
    [audioMix setInputParameters:@[inputParams]];

    [[[self player] currentItem] setAudioMix:audioMix];

    // The audio mix retains the tap; release our reference.
    CFRelease(tap);
}
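
For reference, the static callbacks assigned above have signatures fixed by MediaToolbox. The following is a minimal, untested sketch of my own (not part of the original setup code) showing how one could compute a per-channel RMS level in the process callback, assuming the default non-interleaved float processing format:

#import &lt;MediaToolbox/MediaToolbox.h&gt;
#import &lt;Accelerate/Accelerate.h&gt;

static void init(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut) {
    // Pass clientInfo (self) through to the other callbacks.
    *tapStorageOut = clientInfo;
}

static void finalise(MTAudioProcessingTapRef tap) {}

static void prepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames,
                    const AudioStreamBasicDescription *processingFormat) {
    // Inspect processingFormat here if you need the sample rate or channel count.
}

static void unprepare(MTAudioProcessingTapRef tap) {}

static void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
                    MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut,
                    CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut) {
    // Pull the source audio into the buffer list, then measure it.
    OSStatus status = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                                         flagsOut, NULL, numberFramesOut);
    if (status != noErr) {
        return;
    }

    // With the non-interleaved float format there is one buffer per channel.
    for (UInt32 i = 0; i < bufferListInOut->mNumberBuffers; i++) {
        AudioBuffer buffer = bufferListInOut->mBuffers[i];
        float rms = 0.0f;
        vDSP_rmsqv((float *)buffer.mData, 1, &rms, buffer.mDataByteSize / sizeof(float));
        // Forward the per-channel RMS value to the level meter UI here.
    }
}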

So far, so good. This all works: I can find the right track and set up the inputParams, audioMix, etc. But unfortunately, the only callback that ever gets called is the init callback; none of the others fire at any point.

I tried different kinds of stream sources, among them an official Apple HLS sample stream: http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8

iOS Solutions


Solution 1 - iOS

Sadly, using an HLS stream with AVFoundation doesn't give you any control over the audio tracks. I ran into the same problem trying to mute an HLS stream, which turned out to be impossible.

The only way you could read audio data would be to tap into the AVAudioSession.

EDIT

You can access the AVAudioSession like this:

[AVAudioSession sharedInstance]

Here's the documentation for AVAudioSession
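
As a minimal sketch of my own (not from the original answer), here's how you could configure and activate the shared session. Note that AVAudioSession only exposes the KVO-observable system outputVolume, not per-sample signal levels, so this alone won't drive a true signal meter:

#import &lt;AVFoundation/AVFoundation.h&gt;

// e.g. in a view controller's setup code
AVAudioSession *session = [AVAudioSession sharedInstance];

NSError *error = nil;
[session setCategory:AVAudioSessionCategoryPlayback error:&error];
[session setActive:YES error:&error];

// outputVolume is KVO-observable, but it reflects the system output
// volume, not the actual signal level of the stream.
[session addObserver:self
          forKeyPath:@"outputVolume"
             options:NSKeyValueObservingOptionNew
             context:NULL];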

Solution 2 - iOS

Measuring audio using AVPlayer appears to be an ongoing issue. That said, I believe a solution can be reached by combining AVPlayer with AVAudioRecorder.

While the two classes have seemingly contradictory purposes, there is a workaround that allows AVAudioRecorder to access AVPlayer's audio output.

## Player / Recorder

As described in this [Stack Overflow answer][1], recording the audio of an AVPlayer is possible if you hook into the audio route change using kAudioSessionProperty_AudioRouteChange.

Note that the audio recording must be started after the audio route change occurs. Use the linked answer as a reference - it includes more details and the necessary code.


Once you have access to the `AVPlayer`'s audio route and are recording, the **measuring** is relatively straightforward.
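
As a rough sketch of that first step (my addition - the linked answer uses the old C-based Audio Session Services API; AVAudioSessionRouteChangeNotification is its modern equivalent, and `startMeteringRecorder` is a hypothetical helper sketched under Audio Levels below):

#import &lt;AVFoundation/AVFoundation.h&gt;

// Listen for route changes; AVAudioSessionRouteChangeNotification is the
// modern replacement for a kAudioSessionProperty_AudioRouteChange listener.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(audioRouteChanged:)
                                             name:AVAudioSessionRouteChangeNotification
                                           object:nil];

- (void)audioRouteChanged:(NSNotification *)notification {
    // Start the recorder only after the route change, as described above.
    [self startMeteringRecorder]; // hypothetical helper, sketched below
}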

## Audio Levels

In my [answer][2] to a Stack Overflow question about measuring microphone input, I describe the steps necessary to access the audio level measurements. Using `AVAudioRecorder` to monitor volume changes is more complex than one might think, so I included a GitHub [project][3] that acts as a template for monitoring audio changes while recording.
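
Here's a minimal sketch of that metering approach (my own, based on the standard AVAudioRecorder metering API rather than the linked project's exact code; it assumes `recorder` and `meterTimer` properties on the class, and `startMeteringRecorder` is the hypothetical helper named above):

// Hypothetical helper: record to a scratch file purely to read level meters.
- (void)startMeteringRecorder {
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"metering.caf"];
    NSDictionary *settings = @{ AVFormatIDKey: @(kAudioFormatAppleLossless),
                                AVSampleRateKey: @44100.0,
                                AVNumberOfChannelsKey: @2 };

    NSError *error = nil;
    self.recorder = [[AVAudioRecorder alloc] initWithURL:[NSURL fileURLWithPath:path]
                                                settings:settings
                                                   error:&error];
    self.recorder.meteringEnabled = YES;
    [self.recorder record];

    // Poll the meters on a timer while recording.
    self.meterTimer = [NSTimer scheduledTimerWithTimeInterval:0.1
                                                       target:self
                                                     selector:@selector(updateMeter)
                                                     userInfo:nil
                                                      repeats:YES];
}

- (void)updateMeter {
    [self.recorder updateMeters];
    // Values are in dBFS: 0 is full scale, silence approaches -160.
    float left  = [self.recorder averagePowerForChannel:0];
    float right = [self.recorder averagePowerForChannel:1];
    // Map the dB values to the level meter's 0...1 range here.
}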

**Please Note**

Combining the two during an HLS live stream is not something I have tested. This answer is strictly theoretical, so it may take a sound understanding of both classes to work out completely.


  [1]: https://stackoverflow.com/questions/8403995/playing-video-with-avplayer-and-recording-sound-with-avaudiorecorder-simultaneou
  [2]: https://stackoverflow.com/questions/45692438/measure-microphone-level-in-webrtc-for-ios/45952242#45952242
  [3]: https://github.com/picciano/iOS-Audio-Recoginzer/blob/master/ARAudioRecognizer.m

Attributions

All content for this solution is sourced from the original question on Stack Overflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

| Content Type | Original Author | Original Content on Stack Overflow |
| --- | --- | --- |
| Question | Julian F. Weinert | View Question on Stack Overflow |
| Solution 1 - iOS | Simon Germain | View Answer on Stack Overflow |
| Solution 2 - iOS | ChrisHaze | View Answer on Stack Overflow |