Maintaining good scroll performance when using AVPlayer

Tags: ios, video, avfoundation, avplayer

iOS Problem Overview


I'm working on an application where there is a collection view, and cells of the collection view can contain video. Right now I'm displaying the video using AVPlayer and AVPlayerLayer. Unfortunately, the scrolling performance is terrible. It seems like AVPlayer, AVPlayerItem, and AVPlayerLayer do a lot of their work on the main thread. They are constantly taking out locks, waiting on semaphores, etc. which is blocking the main thread and causing severe frame drops.

Is there any way to tell AVPlayer to stop doing so many things on the main thread? So far nothing I've tried has solved the problem.

I also tried building a simple video player using AVSampleBufferDisplayLayer. Using that I can make sure that everything happens off the main thread, and I can achieve ~60fps while scrolling and playing video. Unfortunately that method is much lower level, and it doesn't provide things like audio playback and time scrubbing out of the box. Is there any way to get similar performance with AVPlayer? I'd much rather use that.
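For reference, a minimal sketch of the kind of AVSampleBufferDisplayLayer player described above (a rough illustration, not the asker's actual code: the class and queue label are made up, and audio, seeking, and frame timing via the layer's controlTimebase are all omitted):

```swift
import AVFoundation

// Sketch: decode a local asset with AVAssetReader and feed the sample
// buffers to an AVSampleBufferDisplayLayer entirely off the main thread.
// Video frames only - no audio, no scrubbing.
final class SampleBufferVideoView {
    let displayLayer = AVSampleBufferDisplayLayer()
    private let queue = DispatchQueue(label: "video.decode") // hypothetical label

    func play(url: URL) {
        queue.async {
            let asset = AVAsset(url: url)
            guard let track = asset.tracks(withMediaType: .video).first,
                  let reader = try? AVAssetReader(asset: asset) else { return }
            let output = AVAssetReaderTrackOutput(track: track, outputSettings: nil)
            reader.add(output)
            reader.startReading()

            // The layer pulls frames on our queue whenever it can accept more.
            self.displayLayer.requestMediaDataWhenReady(on: self.queue) {
                while self.displayLayer.isReadyForMoreMediaData {
                    guard let buffer = output.copyNextSampleBuffer() else {
                        self.displayLayer.stopRequestingMediaData()
                        reader.cancelReading()
                        return
                    }
                    self.displayLayer.enqueue(buffer)
                }
            }
        }
    }
}
```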

Edit: After looking into this more, it doesn't look like it's possible to achieve good scrolling performance when using AVPlayer. Creating an AVPlayer and associating it with an AVPlayerItem instance kicks off a bunch of work that trampolines onto the main thread, where it then waits on semaphores and tries to acquire a bunch of locks. The amount of time this stalls the main thread increases quite dramatically as the number of videos in the scroll view increases.

AVPlayer dealloc also seems to be a huge problem. Deallocating an AVPlayer also tries to synchronize a bunch of state, and again, this gets extremely bad as you create more players.

This is pretty depressing, and it makes AVPlayer almost unusable for what I'm trying to do. Blocking the main thread like this is such an amateur mistake that it's hard to believe Apple engineers would have made it. Anyway, hopefully they can fix this soon.

iOS Solutions


Solution 1 - iOS

Build your AVPlayerItem in a background queue as much as possible (some operations you have to do on the main thread, but you can do setup operations and waiting for video properties to load on background queues - read the docs very carefully). This involves voodoo dances with KVO and is really not fun.

The hiccups happen while the AVPlayer is waiting for the AVPlayerItem's status to become AVPlayerItemStatusReadyToPlay. To reduce the length of the hiccups, you want to do as much as you can to bring the AVPlayerItem closer to AVPlayerItemStatusReadyToPlay on a background thread before assigning it to the AVPlayer.

It's been a while since I actually implemented this, but IIRC the main thread blocks are caused because the underlying AVURLAsset's properties are lazy-loaded, and if you don't load them yourself, they get loaded synchronously on the main thread when the AVPlayer wants to play.

Check out the AVAsset documentation, especially the stuff around AVAsynchronousKeyValueLoading. I think we needed to load the values for duration and tracks before using the asset on an AVPlayer to minimize the main thread blocks. It's possible we also had to walk through each of the tracks and do AVAsynchronousKeyValueLoading on each of the segments, but I don't remember 100%.
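As a sketch of that idea in modern Swift (the function name is illustrative, and the exact set of keys you need may vary): preload the lazy asset properties with AVAsynchronousKeyValueLoading before the AVPlayer ever sees the item.

```swift
import AVFoundation

// Sketch: bring the asset as close to ready-to-play as possible off the
// main thread before handing the item to an AVPlayer.
func preparePlayerItem(for url: URL, completion: @escaping (AVPlayerItem) -> Void) {
    let asset = AVURLAsset(url: url)
    let keys = ["duration", "tracks"] // properties AVPlayer would otherwise load lazily

    // loadValuesAsynchronously(forKeys:) does the expensive property loading
    // on a background queue, so it no longer stalls scrolling.
    asset.loadValuesAsynchronously(forKeys: keys) {
        for key in keys {
            var error: NSError?
            guard asset.statusOfValue(forKey: key, error: &error) == .loaded else {
                print("Failed to load \(key): \(String(describing: error))")
                return
            }
        }
        // The item is created from an already-loaded asset; hand it to the
        // player on whatever queue your player code expects.
        completion(AVPlayerItem(asset: asset))
    }
}
```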

Solution 2 - iOS

Don't know if this will help – but here's some code I'm using to load videos on a background queue that definitely helps with main-thread blocking (apologies if it doesn't compile 1:1; I abstracted it from a larger code base I'm working on):

func loadSource() {
    self.status = .Unknown

    let operation = NSBlockOperation()
    operation.addExecutionBlock { () -> Void in
        // create the asset
        let asset = AVURLAsset(URL: self.mediaUrl, options: nil)
        // load the values for the asset keys
        let keys = ["tracks", "duration"]
        asset.loadValuesAsynchronouslyForKeys(keys, completionHandler: { () -> Void in
            // loop through and check that the keys loaded
            var keyStatusError: NSError?
            for key in keys {
                var error: NSError?
                let keyStatus: AVKeyValueStatus = asset.statusOfValueForKey(key, error: &error)
                if keyStatus == .Failed {
                    let userInfo = [NSUnderlyingErrorKey : key]
                    keyStatusError = NSError(domain: MovieSourceErrorDomain, code: MovieSourceAssetFailedToLoadKeyValueErrorCode, userInfo: userInfo)
                    println("Failed to load key: \(key), error: \(error)")
                }
                else if keyStatus != .Loaded {
                    println("Warning: Ignoring key status: \(keyStatus), for key: \(key), error: \(error)")
                }
            }
            if keyStatusError == nil {
                if operation.cancelled == false {
                    let composition = self.createCompositionFromAsset(asset)
                    // register notifications
                    let playerItem = AVPlayerItem(asset: composition)
                    self.registerNotificationsForItem(playerItem)
                    self.playerItem = playerItem
                    // create the player
                    let player = AVPlayer(playerItem: playerItem)
                    self.player = player
                }
            }
            else {
                println("Failed to load asset: \(keyStatusError)")
            }
        })
    }

    // add the operation to the queue
    SomeBackgroundQueue.addOperation(operation)
}

func createCompositionFromAsset(asset: AVAsset, repeatCount: UInt8 = 16) -> AVMutableComposition {
    let composition = AVMutableComposition()
    let timescale = asset.duration.timescale
    let duration = asset.duration.value
    let editRange = CMTimeRangeMake(CMTimeMake(0, timescale), CMTimeMake(duration, timescale))
    var error: NSError?
    let success = composition.insertTimeRange(editRange, ofAsset: asset, atTime: composition.duration, error: &error)
    if success {
        for _ in 0 ..< repeatCount - 1 {
            composition.insertTimeRange(editRange, ofAsset: asset, atTime: composition.duration, error: &error)
        }
    }
    return composition
}
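For readers on newer Swift, the looping-composition trick above can be sketched like this (a rough modern rewrite of the same idea, not the answerer's original code):

```swift
import AVFoundation

// Sketch: splice the same asset into an AVMutableComposition repeatCount
// times, so a short clip appears to loop without player-level seeking.
// Assumes the asset's duration and tracks are already loaded.
func makeLoopingComposition(from asset: AVAsset, repeatCount: Int = 16) -> AVMutableComposition {
    let composition = AVMutableComposition()
    let range = CMTimeRange(start: .zero, duration: asset.duration)
    for _ in 0..<repeatCount {
        // Each insertion appends the full clip at the composition's current end.
        try? composition.insertTimeRange(range, of: asset, at: composition.duration)
    }
    return composition
}
```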

Solution 3 - iOS

If you look into Facebook's AsyncDisplayKit (the engine behind the Facebook and Instagram feeds), you can render video for the most part on background threads using their ASVideoNode. If you subnode that into an ASDisplayNode and add the displayNode.view to whatever view you are scrolling (table/collection/scroll), you can achieve perfectly smooth scrolling (just make sure you create the node and assets and all that on a background thread). The only issue is when changing the video item, as this forces itself onto the main thread. If you only have a few videos on that particular view, you are fine to use this method!

        dispatch_async(dispatch_get_global_queue(QOS_CLASS_BACKGROUND, 0), {
            self.mainNode = ASDisplayNode()
            self.videoNode = ASVideoNode()
            self.videoNode!.asset = AVAsset(URL: self.videoUrl!)
            self.videoNode!.frame = CGRectMake(0.0, 0.0, self.bounds.width, self.bounds.height)
            self.videoNode!.gravity = AVLayerVideoGravityResizeAspectFill
            self.videoNode!.shouldAutoplay = true
            self.videoNode!.shouldAutorepeat = true
            self.videoNode!.muted = true
            self.videoNode!.playButton.hidden = true
            
            dispatch_async(dispatch_get_main_queue(), {
                self.mainNode!.addSubnode(self.videoNode!)
                self.addSubview(self.mainNode!.view)
            })
        })

Solution 4 - iOS

Here's a working solution for displaying a "video wall" in a UICollectionView:

  1. Store all of your cells in an NSMapTable (from here on, you will only access a cell object from the NSMapTable):

    self.cellCache = [[NSMapTable alloc] initWithKeyOptions:NSPointerFunctionsWeakMemory
                                               valueOptions:NSPointerFunctionsStrongMemory
                                                   capacity:AppDelegate.sharedAppDelegate.assetsFetchResults.count];
    for (NSInteger i = 0; i < AppDelegate.sharedAppDelegate.assetsFetchResults.count; i++) {
        [self.cellCache setObject:(AssetPickerCollectionViewCell *)[self.collectionView dequeueReusableCellWithReuseIdentifier:CellReuseIdentifier forIndexPath:[NSIndexPath indexPathForItem:i inSection:0]]
                           forKey:[NSIndexPath indexPathForItem:i inSection:0]];
    }

  2. Add this method to your UICollectionViewCell subclass:

    - (void)setupPlayer:(PHAsset *)phAsset {
        typedef void (^player) (void);
        player play = ^{
            NSString __autoreleasing *serialDispatchCellQueueDescription = [NSString stringWithFormat:@"%@ serial cell queue", self];
            dispatch_queue_t __autoreleasing serialDispatchCellQueue = dispatch_queue_create([serialDispatchCellQueueDescription UTF8String], DISPATCH_QUEUE_SERIAL);
            dispatch_async(serialDispatchCellQueue, ^{
                __weak typeof(self) weakSelf = self;
                __weak typeof(PHAsset) *weakPhAsset = phAsset;
                [[PHImageManager defaultManager] requestPlayerItemForVideo:weakPhAsset
                                                                   options:nil
                                                             resultHandler:^(AVPlayerItem * _Nullable playerItem, NSDictionary * _Nullable info) {
                    if (![[info objectForKey:PHImageResultIsInCloudKey] boolValue]) {
                        AVPlayer __autoreleasing *player = [AVPlayer playerWithPlayerItem:playerItem];
                        __block typeof(AVPlayerLayer) *weakPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
                        [weakPlayerLayer setFrame:weakSelf.contentView.bounds];
                        [weakPlayerLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
                        [weakPlayerLayer setBorderWidth:0.25f];
                        [weakPlayerLayer setBorderColor:[UIColor whiteColor].CGColor];
                        [player play];
                        dispatch_async(dispatch_get_main_queue(), ^{
                            [weakSelf.contentView.layer addSublayer:weakPlayerLayer];
                        });
                    }
                }];
            });
        };
        play();
    }

  3. Call the method above from your UICollectionView delegate this way:

    - (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath {
        if ([[self.cellCache objectForKey:indexPath] isKindOfClass:[AssetPickerCollectionViewCell class]])
            [self.cellCache setObject:(AssetPickerCollectionViewCell *)[collectionView dequeueReusableCellWithReuseIdentifier:CellReuseIdentifier forIndexPath:indexPath] forKey:indexPath];

        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
            NSInvocationOperation *invOp = [[NSInvocationOperation alloc] initWithTarget:(AssetPickerCollectionViewCell *)[self.cellCache objectForKey:indexPath]
                                                                                selector:@selector(setupPlayer:)
                                                                                  object:AppDelegate.sharedAppDelegate.assetsFetchResults[indexPath.item]];
            [[NSOperationQueue mainQueue] addOperation:invOp];
        });

        return (AssetPickerCollectionViewCell *)[self.cellCache objectForKey:indexPath];
    }

By the way, here's how you would populate a PHFetchResult collection with all videos in the Videos folder of the Photos app:

// Collect all videos in the Videos folder of the Photos app
- (PHFetchResult *)assetsFetchResults {
    __block PHFetchResult *i = self->_assetsFetchResults;
    if (!i) {
        static dispatch_once_t onceToken;
        dispatch_once(&onceToken, ^{
            PHFetchResult *smartAlbums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeSmartAlbumVideos options:nil];
            PHAssetCollection *collection = smartAlbums.firstObject;
            if (![collection isKindOfClass:[PHAssetCollection class]]) collection = nil;
            PHFetchOptions *allPhotosOptions = [[PHFetchOptions alloc] init];
            allPhotosOptions.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
            i = [PHAsset fetchAssetsInAssetCollection:collection options:allPhotosOptions];
            self->_assetsFetchResults = i;
        });
    }
    NSLog(@"assetsFetchResults (%ld)", self->_assetsFetchResults.count);
    
    return i;
}

If you want to filter out videos that are in iCloud (keeping only local ones), which is what I'd assume, seeing as you're looking for smooth scrolling:

// Keep only videos stored locally (filter out iCloud-only assets)
- (NSArray *)phAssets {
    NSMutableArray *assets = [NSMutableArray arrayWithCapacity:self.assetsFetchResults.count];
    [[self assetsFetchResults] enumerateObjectsUsingBlock:^(PHAsset *asset, NSUInteger idx, BOOL *stop) {
        if (asset.sourceType == PHAssetSourceTypeUserLibrary)
            [assets addObject:asset];
    }];
    
    return [NSArray arrayWithArray:(NSArray *)assets];
}

Solution 5 - iOS

I've played around with all the answers above and found that they hold only up to a point.

The easiest and simplest approach that has worked for me so far is to assign the AVPlayerItem to your AVPlayer instance on a background thread. I noticed that assigning the AVPlayerItem to the player on the main thread (even after the AVPlayerItem object is ready) always takes a toll on your performance and frame rate.

Swift 4

Example:

let mediaUrl: URL = ... // your media URL
let player = AVPlayer()
let playerItem = AVPlayerItem(url: mediaUrl)

DispatchQueue.global(qos: .default).async {
    player.replaceCurrentItem(with: playerItem)
}

Solution 6 - iOS

I managed to create a horizontal feed-like view with an AVPlayer in each cell. I did it like so:

  1. Buffering - create a manager so you can preload (buffer) the videos. The number of AVPlayers you want to buffer depends on the experience you are looking for. In my app I manage only 3 AVPlayers: one player is playing now, and the previous & next players are being buffered. All the buffering manager does is ensure that the correct video is being buffered at any given point.

  2. Reused cells - let the UITableView / UICollectionView reuse the cells in cellForRowAtIndexPath:. All you have to do after you dequeue the cell is pass it its correct player (I just give the buffering manager an indexPath for the cell and it returns the correct one).

  3. AVPlayer KVO - every time the buffering manager gets a call to load a new video to buffer, it creates all of the player's assets and observers, like so:

// player

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
    self.videoContainer.playerLayer.player = self.videoPlayer;
    self.asset = [AVURLAsset assetWithURL:[NSURL URLWithString:self.videoUrl]];
    NSString *tracksKey = @"tracks";
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.asset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler:^{
            dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
                NSError *error;
                AVKeyValueStatus status = [self.asset statusOfValueForKey:tracksKey error:&error];

                if (status == AVKeyValueStatusLoaded) {
                    self.playerItem = [AVPlayerItem playerItemWithAsset:self.asset];
                    // register the notifications we need at run time on the player & item
                    // a notification if the current item's status has changed
                    [self.playerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:contextItemStatus];
                    // a notification if the playing item has not yet started to buffer
                    [self.playerItem addObserver:self forKeyPath:@"playbackBufferEmpty" options:NSKeyValueObservingOptionNew context:contextPlaybackBufferEmpty];
                    // a notification if the playing item has fully buffered
                    [self.playerItem addObserver:self forKeyPath:@"playbackBufferFull" options:NSKeyValueObservingOptionNew context:contextPlaybackBufferFull];
                    // a notification if the playing item is likely to keep up with the current buffering rate
                    [self.playerItem addObserver:self forKeyPath:@"playbackLikelyToKeepUp" options:NSKeyValueObservingOptionNew context:contextPlaybackLikelyToKeepUp];
                    // a notification with the duration of the playing item
                    [self.playerItem addObserver:self forKeyPath:@"duration" options:NSKeyValueObservingOptionNew context:contextDurationUpdate];
                    // a notification for when the video has finished playing
                    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(itemDidFinishedPlaying:) name:AVPlayerItemDidPlayToEndTimeNotification object:self.playerItem];
                    self.didRegisterWhenLoad = YES;

                    self.videoPlayer = [AVPlayer playerWithPlayerItem:self.playerItem];

                    // a notification if the player has changed its rate (play/pause)
                    [self.videoPlayer addObserver:self forKeyPath:@"rate" options:NSKeyValueObservingOptionNew context:contextRateDidChange];
                    // a notification to get the buffering rate of the current playing item
                    [self.videoPlayer addObserver:self forKeyPath:@"currentItem.loadedTimeRanges" options:NSKeyValueObservingOptionNew context:contextTimeRanges];
                }
            });
        }];
    });
});

where videoContainer is the view you want to add the player layer to.
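A minimal sketch of the buffering manager described in step 1 (all names here are hypothetical; real code would also handle play/pause, cell reuse, and the KVO setup from step 3): keep a small window of players alive, the current index plus its neighbours, so scrolling to an adjacent cell finds a player that has already started buffering.

```swift
import AVFoundation

// Sketch of a 3-player preloading window for a video feed.
final class PlayerBufferManager {
    private var players: [Int: AVPlayer] = [:]
    private let urls: [URL]

    init(urls: [URL]) { self.urls = urls }

    /// Call whenever the visible index changes (e.g. from scrollViewDidScroll).
    func focus(on index: Int) {
        let window = Set((index - 1)...(index + 1)).filter { urls.indices.contains($0) }
        // Drop players outside the window so at most three stay alive.
        for key in Array(players.keys) where !window.contains(key) {
            players.removeValue(forKey: key)
        }
        // Create players inside the window; attaching an item starts buffering.
        for i in window where players[i] == nil {
            players[i] = AVPlayer(playerItem: AVPlayerItem(url: urls[i]))
        }
    }

    /// The cell asks for its player after being dequeued (step 2).
    func player(at index: Int) -> AVPlayer? { players[index] }
}
```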

Let me know if you need any help or more explanations

Good luck :)

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | Antonio | View Question on Stackoverflow
Solution 1 - iOS | damian | View Answer on Stackoverflow
Solution 2 - iOS | Andy Poes | View Answer on Stackoverflow
Solution 3 - iOS | Gregg | View Answer on Stackoverflow
Solution 4 - iOS | James Bush | View Answer on Stackoverflow
Solution 5 - iOS | melaka | View Answer on Stackoverflow
Solution 6 - iOS | YYfim | View Answer on Stackoverflow