Alamofire asynchronous completionHandler for JSON request

Swift, Alamofire

Swift Problem Overview


Having used the Alamofire framework I've noticed that the completionHandler is run on the main thread. I'm wondering if the code below is a good practice for creating a Core Data import task within the completion handler:

Alamofire.request(.GET, "http://myWebSite.com", parameters: parameters)
            .responseJSON(options: .MutableContainers) { (_, _, JSON, error) -> Void in
                dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), { () -> Void in
                    if let err = error {
                        println("Error: \(err)")
                        return
                    }

                    if let jsonArray = JSON as? [NSArray] {
                        let importer = CDImporter(incomingArray: jsonArray, entity: "Artist", map: artistEntityMap)
                    }
                })
            }

Swift Solutions


Solution 1 - Swift

This is a really good question. Your approach is perfectly valid. However, Alamofire can actually help you streamline this even more.

Your Example Code Dispatch Queue Breakdown

In your example code, you are jumping between the following dispatch queues:

  1. NSURLSession dispatch queue
  2. TaskDelegate dispatch queue for validation and serializer processing
  3. Main dispatch queue for calling your completion handler
  4. High priority queue for JSON handling
  5. Main dispatch queue to update the user interface (if necessary)

As you can see, you're hopping all over the place. Let's take a look at an alternative approach leveraging a powerful feature inside Alamofire.

Alamofire Response Dispatch Queues

Alamofire has an optimal approach built into its own low-level processing. The single response method that ultimately gets called by all custom response serializers has support for a custom dispatch queue if you choose to use it.

While GCD is amazing at hopping between dispatch queues, you want to avoid jumping to a queue that is busy (e.g. the main thread). By eliminating the jump back to the main thread in the middle of the async processing, you can potentially speed things up considerably. The following example demonstrates how to do this using Alamofire logic straight out of the box.

Alamofire 1.x
let queue = dispatch_queue_create("com.cnoon.manager-response-queue", DISPATCH_QUEUE_CONCURRENT)

let request = Alamofire.request(.GET, "http://httpbin.org/get", parameters: ["foo": "bar"])
request.response(
    queue: queue,
    serializer: Request.JSONResponseSerializer(options: .AllowFragments),
    completionHandler: { _, _, JSON, _ in

        // You are now running on the concurrent `queue` you created earlier.
        println("Parsing JSON on thread: \(NSThread.currentThread()) is main thread: \(NSThread.isMainThread())")
        
        // Validate your JSON response and convert into model objects if necessary
        println(JSON)
        
        // To update anything on the main thread, just jump back on like so.
        dispatch_async(dispatch_get_main_queue()) {
            println("Am I back on the main thread: \(NSThread.isMainThread())")
        }
    }
)
Alamofire 3.x (Swift 2.2 and 2.3)
let queue = dispatch_queue_create("com.cnoon.manager-response-queue", DISPATCH_QUEUE_CONCURRENT)

let request = Alamofire.request(.GET, "http://httpbin.org/get", parameters: ["foo": "bar"])
request.response(
    queue: queue,
    responseSerializer: Request.JSONResponseSerializer(options: .AllowFragments),
    completionHandler: { response in
        // You are now running on the concurrent `queue` you created earlier.
        print("Parsing JSON on thread: \(NSThread.currentThread()) is main thread: \(NSThread.isMainThread())")

        // Validate your JSON response and convert into model objects if necessary
        print(response.result.value)

        // To update anything on the main thread, just jump back on like so.
        dispatch_async(dispatch_get_main_queue()) {
            print("Am I back on the main thread: \(NSThread.isMainThread())")
        }
    }
)
Alamofire 4.x (Swift 3)
let queue = DispatchQueue(label: "com.cnoon.response-queue", qos: .utility, attributes: [.concurrent])

Alamofire.request("http://httpbin.org/get", parameters: ["foo": "bar"])
    .response(
        queue: queue,
        responseSerializer: DataRequest.jsonResponseSerializer(),
        completionHandler: { response in
            // You are now running on the concurrent `queue` you created earlier.
            print("Parsing JSON on thread: \(Thread.current) is main thread: \(Thread.isMainThread)")

            // Validate your JSON response and convert into model objects if necessary
            print(response.result.value)

            // To update anything on the main thread, just jump back on like so.
            DispatchQueue.main.async {
                print("Am I back on the main thread: \(Thread.isMainThread)")
            }
        }
    )
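
For newer projects, the same idea carries over to Alamofire 5, where the response handlers accept a queue parameter directly. Here is a minimal sketch, assuming Alamofire 5's AF.request and responseJSON(queue:) API:

import Alamofire

let queue = DispatchQueue(label: "com.example.response-queue", qos: .utility, attributes: .concurrent)

AF.request("http://httpbin.org/get", parameters: ["foo": "bar"])
    .responseJSON(queue: queue) { response in
        // You are now running on the concurrent `queue` you created earlier.
        print("Parsing JSON on thread: \(Thread.current) is main thread: \(Thread.isMainThread)")

        // Validate your JSON response and convert into model objects if necessary
        print(response.value)

        // To update anything on the main thread, just jump back on like so.
        DispatchQueue.main.async {
            print("Am I back on the main thread: \(Thread.isMainThread)")
        }
    }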

Alamofire Dispatch Queue Breakdown

Here is the breakdown of the different dispatch queues involved with this approach.

  1. NSURLSession dispatch queue
  2. TaskDelegate dispatch queue for validation and serializer processing
  3. Custom manager concurrent dispatch queue for JSON handling
  4. Main dispatch queue to update the user interface (if necessary)

Summary

By eliminating the first hop back to the main dispatch queue, you have removed a potential bottleneck and made your entire request and processing pipeline asynchronous. Awesome!

With that said, I can't stress enough how important it is to get familiar with the internals of how Alamofire really works. You never know when you may find something that can really help you improve your own code.

Solution 2 - Swift

A small update for Swift 3.0 and Alamofire 4.0.1, building on @cnoon's answer:

let queue = DispatchQueue(label: "com.cnoon.manager-response-queue",
                          qos: .userInitiated,
                          attributes: .concurrent)

Alamofire.request(SERVER_URL, method: .post,
                  parameters: ["foo": "bar"],
                  encoding: JSONEncoding.default, // the default
                  headers: ["Content-Type": "application/json; charset=UTF-8"])
    .validate(statusCode: 200..<300) // the default
    .responseJSON(queue: queue, options: .allowFragments,
                  completionHandler: { (response: DataResponse<Any>) in

        switch response.result {
        case .success(_):
            break
        case .failure(_):
            print(response.result.error)
            if response.result.error?._code == NSURLErrorTimedOut {
                // TODO: Show alert view on network connection timeout.
            }
        }
    })
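
One note on the failure branch above: the completion handler runs on the custom concurrent queue, so the alert hinted at in the TODO must be presented from the main queue. A minimal sketch of such a helper, using a hypothetical presentTimeoutAlert(from:) function:

import UIKit

// Hypothetical helper: UI work such as presenting an alert must be dispatched
// back to the main queue, because the response handler runs on `queue`.
func presentTimeoutAlert(from viewController: UIViewController) {
    DispatchQueue.main.async {
        let alert = UIAlertController(title: "Connection timed out",
                                      message: "Please check your network connection and try again.",
                                      preferredStyle: .alert)
        alert.addAction(UIAlertAction(title: "OK", style: .default, handler: nil))
        viewController.present(alert, animated: true, completion: nil)
    }
}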

Solution 3 - Swift

Just complementing the perfect answer from @cnoon: if you, like me, are using ResponseObjectSerializable, you can embed this concurrent behaviour in the Request extension itself:

extension Request {
    public func responseObject<T: ResponseObjectSerializable>(completionHandler: Response<T, NSError> -> Void) -> Self {
        let responseSerializer = ResponseSerializer<T, NSError> { request, response, data, error in
            guard error == nil else { return .Failure(error!) }

            let JSONResponseSerializer = Request.JSONResponseSerializer(options: .AllowFragments)
            let result = JSONResponseSerializer.serializeResponse(request, response, data, error)

            switch result {
            case .Success(let value):
                if let
                    response = response,
                    responseObject = T(response: response, representation: value)
                {
                    return .Success(responseObject)
                } else {
                    let failureReason = "JSON could not be serialized into response object: \(value)"
                    let error = Error.errorWithCode(.JSONSerializationFailed, failureReason: failureReason)
                    return .Failure(error)
                }
            case .Failure(let error):
                return .Failure(error)
            }
        }

        let queue = dispatch_queue_create("my.queue", DISPATCH_QUEUE_CONCURRENT)
        return response(queue: queue, responseSerializer: responseSerializer) { response in
            dispatch_async(dispatch_get_main_queue()) {
                completionHandler(response)
            }
        }
    }
}
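
For completeness, here is a usage sketch for the extension above (Swift 2 / Alamofire 3 era), assuming a hypothetical Artist model and endpoint. The JSON parsing and object mapping run on the concurrent queue, while the completion handler is delivered back on the main queue:

// Hypothetical model adopting the ResponseObjectSerializable protocol.
final class Artist: ResponseObjectSerializable {
    let name: String

    init?(response: NSHTTPURLResponse, representation: AnyObject) {
        guard let name = representation.valueForKeyPath("name") as? String else { return nil }
        self.name = name
    }
}

Alamofire.request(.GET, "http://myWebSite.com/artists")
    .responseObject { (response: Response<Artist, NSError>) in
        // Already back on the main queue here; safe to update the UI.
        switch response.result {
        case .Success(let artist):
            print("Fetched artist: \(artist.name)")
        case .Failure(let error):
            print("Request failed: \(error)")
        }
    }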

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type       | Original Author  | Original Content on Stackoverflow
Question           | TheM00s3         | View Question on Stackoverflow
Solution 1 - Swift | cnoon            | View Answer on Stackoverflow
Solution 2 - Swift | Mike.R           | View Answer on Stackoverflow
Solution 3 - Swift | Raphael Oliveira | View Answer on Stackoverflow