What is the Swift equivalent to Objective-C's "@synchronized"?

Tags: Concurrency, Mutex, Swift

Concurrency Problem Overview


I've searched the Swift book, but can't find the Swift version of @synchronized. How do I do mutual exclusion in Swift?

Concurrency Solutions


Solution 1 - Concurrency

You can use GCD. It is a little more verbose than @synchronized, but works as a replacement:

let serialQueue = DispatchQueue(label: "com.test.mySerialQueue")
serialQueue.sync {
    // code
}
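As a rough sketch (the count below is a made-up piece of shared state), note that sync also forwards its closure's return value, so guarded reads stay one-liners:

var count = 0

func increment() {
    serialQueue.sync { count += 1 }
}

func currentCount() -> Int {
    return serialQueue.sync { count }
}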

Solution 2 - Concurrency

I was looking for this myself and concluded that there's no native construct in Swift for this yet.

I made up this small helper function based on some of the code I've seen from Matt Bridges and others.

func synced(_ lock: Any, closure: () -> ()) {
    objc_sync_enter(lock)
    closure()
    objc_sync_exit(lock)
}

Usage is pretty straightforward:

synced(self) {
    print("This is a synchronized closure")
}

There is one problem I've found with this: passing an array as the lock argument causes a very obscure compiler error (shown below), presumably because objc_sync_enter expects an Objective-C object reference while a Swift Array is a value type. Otherwise, it seems to work as desired.

Bitcast requires both operands to be pointer or neither
  %26 = bitcast i64 %25 to %objc_object*, !dbg !378
LLVM ERROR: Broken function found, compilation aborted!

Solution 3 - Concurrency

I like and use many of the answers here, so I'd suggest you choose whichever works best for you. That said, the method I prefer when I need something like Objective-C's @synchronized uses the defer statement introduced in Swift 2.

{ 
    objc_sync_enter(lock)
    defer { objc_sync_exit(lock) }

    //
    // code of critical section goes here
    //

} // <-- lock released when this block is exited

The nice thing about this method is that your critical section can exit the containing block in any fashion desired (e.g., return, break, continue, throw), and "the statements within the defer statement are executed no matter how program control is transferred."
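As a rough illustration of that point (the class and its storage here are made up), an early return still releases the lock, because the deferred objc_sync_exit runs on every exit path:

import Foundation

final class Cache {
    private let lock = NSObject()              // any Objective-C object can serve as the lock token
    private var storage: [String: Int] = [:]

    func value(for key: String) -> Int? {
        objc_sync_enter(lock)
        defer { objc_sync_exit(lock) }         // runs however this method exits

        if let hit = storage[key] {
            return hit                         // early return: the lock is still released
        }
        return nil
    }
}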

Solution 4 - Concurrency

You can sandwich statements between objc_sync_enter(obj: AnyObject?) and objc_sync_exit(obj: AnyObject?). The @synchronized directive uses those functions under the covers, i.e.

objc_sync_enter(self)
... synchronized code ...
objc_sync_exit(self)

Solution 5 - Concurrency

The Swift analog of Objective-C's @synchronized directive can have an arbitrary return type and nice rethrows behaviour.

// Swift 3
func synchronized<T>(_ lock: AnyObject, _ body: () throws -> T) rethrows -> T {
    objc_sync_enter(lock)
    defer { objc_sync_exit(lock) }
    return try body()
}

The defer statement lets you return a value directly without introducing a temporary variable.
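For example (the Counter class here is made up), a guarded read-modify-write can return its result directly:

final class Counter {
    private var value = 0

    func incrementAndGet() -> Int {
        return synchronized(self) { () -> Int in
            value += 1
            return value
        }
    }
}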


In Swift 2 add the @noescape attribute to the closure to allow more optimisations:

// Swift 2
func synchronized<T>(lock: AnyObject, @noescape _ body: () throws -> T) rethrows -> T {
    objc_sync_enter(lock)
    defer { objc_sync_exit(lock) }
    return try body()
}

Based on the answers from GNewc [1] (where I like arbitrary return type) and Tod Cunningham [2] (where I like defer).

Solution 6 - Concurrency

SWIFT 4

In Swift 4 you can use GCD's dispatch queues to lock resources.

class MyObject {
    private var internalState: Int = 0
    private let internalQueue: DispatchQueue = DispatchQueue(label:"LockingQueue") // Serial by default
    
    var state: Int {
        get {
            return internalQueue.sync { internalState }
        }
        
        set (newState) {
            internalQueue.sync { internalState = newState }
        }
    }
} 
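One caveat worth spelling out: the computed property makes individual reads and writes safe, but a read-modify-write such as myObject.state += 1 still spans two separate sync calls. A sketch of a mutation method (hypothetical name) that could be added to MyObject keeps the whole operation in one block:

func incrementState() {
    internalQueue.sync {
        internalState += 1   // read and write happen inside a single sync call
    }
}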

Solution 7 - Concurrency

To add return functionality, you could do this:

func synchronize<T>(lockObj: AnyObject!, closure: ()->T) -> T
{
  objc_sync_enter(lockObj)
  let retVal: T = closure()
  objc_sync_exit(lockObj)
  return retVal
}

Subsequently, you can call it using:

func importantMethod(...) -> Bool {
  return synchronize(self) {
    if(feelLikeReturningTrue) { return true }
    // do other things
    if(feelLikeReturningTrueNow) { return true }
    // more things
    return whatIFeelLike ? true : false
  }
}

Solution 8 - Concurrency

Building on Bryan McLemore's answer, I extended it to support blocks that throw, safely, using the Swift 2.0 defer ability.

func synchronized( lock:AnyObject, block:() throws -> Void ) rethrows
{
    objc_sync_enter(lock)
    defer {
        objc_sync_exit(lock)
    }
    
    try block()
}
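A rough usage sketch (the error type and state below are made up) showing that a thrown error propagates out while the defer still releases the lock:

enum ValidationError: Error { case empty }

let itemsLock = NSObject()
var items: [String] = []

func validate() throws {
    try synchronized(lock: itemsLock) {
        if items.isEmpty { throw ValidationError.empty }   // lock is released by the defer above
    }
}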

Solution 9 - Concurrency

In modern Swift 5, with return capability:

/**
 Makes sure no other thread can enter the closure until the one currently running has returned.
 */
@discardableResult
public func synchronized<T>(_ lock: AnyObject, closure:() -> T) -> T {
	objc_sync_enter(lock)
	defer { objc_sync_exit(lock) }
	
	return closure()
}

Use it like this to take advantage of the return value capability:

let returnedValue = synchronized(self) { 
     // Your code here
     return yourCode()
}

Or, when no return value is needed:

synchronized(self) { 
     // Your code here
    yourCode()
}

Solution 10 - Concurrency

Swift 3

This code works with asynchronous function calls: after someAsyncFunc() is called, the next closure on the serial queue starts but is blocked by semaphore.wait() until signal() is called. internalQueue.sync shouldn't be used here, as it would block the calling thread if I'm not mistaken.

let internalQueue = DispatchQueue(label: "serialQueue")
let semaphore = DispatchSemaphore(value: 1)

internalQueue.async {

    semaphore.wait()

    // Critical section

    someAsyncFunc() {

        // Do some work here

        semaphore.signal()
    }
}

objc_sync_enter/objc_sync_exit isn't a good idea without error handling.

Solution 11 - Concurrency

In the "Understanding Crashes and Crash Logs" session 414 of WWDC 2018, they show the following way of using DispatchQueue with sync.

In Swift 4 it should look something like the following:

class ImageCache {
    private let queue = DispatchQueue(label: "sync queue")
    private var storage: [String: UIImage] = [:]
    public subscript(key: String) -> UIImage? {
        get {
          return queue.sync {
            return storage[key]
          }
        }
        set {
          queue.sync {
            storage[key] = newValue
          }
        }
    }
}

You can also make reads faster by using a concurrent queue with barriers: sync and async reads are performed concurrently, while writing a new value waits for previous operations to finish.

class ImageCache {
    private let queue = DispatchQueue(label: "with barriers", attributes: .concurrent)
    private var storage: [String: UIImage] = [:]
    
    func get(_ key: String) -> UIImage? {
        return queue.sync { [weak self] in
            guard let self = self else { return nil }
            return self.storage[key]
        }
    }

    func set(_ image: UIImage, for key: String) {
        queue.async(flags: .barrier) { [weak self] in
            guard let self = self else { return }
            self.storage[key] = image
        }
    }
}
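A rough usage sketch (names made up), continuing from the class above: reads from several threads can overlap, while each barrier write briefly has the queue to itself:

let cache = ImageCache()
cache.set(UIImage(), for: "avatar")             // barrier write, returns immediately

DispatchQueue.concurrentPerform(iterations: 4) { _ in
    _ = cache.get("avatar")                     // concurrent reads
}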

Solution 12 - Concurrency

Try NSRecursiveLock:

> A lock that may be acquired multiple times by the same thread without causing a deadlock.

let lock = NSRecursiveLock()

func f() {
    lock.lock()
    //Your Code
    lock.unlock()
}

func f2() {
    lock.lock()
    defer {
        lock.unlock()
    }
    //Your Code
}

> The Objective-C synchronization feature supports recursive and reentrant code. A thread can use a single semaphore several times in a recursive manner; other threads are blocked from using it until the thread releases all the locks obtained with it; that is, every @synchronized() block is exited normally or through an exception. (Source)
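To make the recursive part concrete, a small sketch (function names made up) reusing the lock declared above; one locked function can call another that takes the same lock on the same thread, which would deadlock with a plain NSLock:

func outer() {
    lock.lock()
    defer { lock.unlock() }
    inner()                    // re-acquires the same NSRecursiveLock on the same thread
}

func inner() {
    lock.lock()
    defer { lock.unlock() }
    // critical section
}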

Solution 13 - Concurrency

Use NSLock in Swift 4:

let lock = NSLock()   // note: this should be shared state (e.g. a property), not re-created per call
lock.lock()
defer { lock.unlock() }   // ensures the lock is released even on the early return below
if isRunning == true {
    print("Service IS running ==> please wait")
    return
} else {
    print("Service not running")
}
isRunning = true

> Warning: The NSLock class uses POSIX threads to implement its locking behavior. When sending an unlock message to an NSLock object, you must be sure that message is sent from the same thread that sent the initial lock message. Unlocking a lock from a different thread can result in undefined behavior.

Solution 14 - Concurrency

With Swift's property wrappers, this is what I'm using now:

@propertyWrapper public struct NCCSerialized<Wrapped> {
    private let queue = DispatchQueue(label: "com.nuclearcyborg.NCCSerialized_\(UUID().uuidString)")
    
    private var _wrappedValue: Wrapped
    public var wrappedValue: Wrapped {
        get { queue.sync { _wrappedValue } }
        set { queue.sync { _wrappedValue = newValue } }
    }
    
    public init(wrappedValue: Wrapped) {
        self._wrappedValue = wrappedValue
    }
}

Then you can just do:

@NCCSerialized var foo: Int = 10

or

@NCCSerialized var myData: [SomeStruct] = []

Then access the variable as you normally would.
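For example, with the properties declared above (say, inside some class):

foo = 42              // the write goes through queue.sync inside the wrapper
let current = foo     // so does the read

Note that a compound mutation such as myData.append(...) is still a separate get followed by a set, i.e. two independent sync calls, so it is not atomic as a whole.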

Solution 15 - Concurrency

Figured I'd post my Swift 5 implementation, built off the prior answers. Thanks, guys! I found it helpful to have one that returns a value too, so I have two methods.

First, here is a simple class to wrap it all up:

import Foundation
class Sync {
    public class func synced(_ lock: Any, closure: () -> ()) {
        objc_sync_enter(lock)
        defer { objc_sync_exit(lock) }
        closure()
    }

    public class func syncedReturn(_ lock: Any, closure: () -> (Any?)) -> Any? {
        objc_sync_enter(lock)
        defer { objc_sync_exit(lock) }
        return closure()
    }
}

Then use it like so if you need a return value:

return Sync.syncedReturn(self, closure: {
    // some code here
    return "hello world"
})

Or:

Sync.synced(self, closure: {
    // do some work synchronously
})

Solution 16 - Concurrency

In conclusion, here is a more general way that covers returning a value (or Void) and throwing:

import Foundation

extension NSObject {

    func synchronized<T>(lockObj: AnyObject!, closure: () throws -> T) rethrows -> T {
        objc_sync_enter(lockObj)
        defer {
            objc_sync_exit(lockObj)
        }

        return try closure()
    }
}
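A usage sketch (the class and property here are made up):

final class Account: NSObject {
    private var balance = 0

    func deposit(_ amount: Int) {
        synchronized(lockObj: self) {
            balance += amount
        }
    }
}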

Solution 17 - Concurrency

Details

Xcode 8.3.1, Swift 3.1

Task

Read and write a value from different threads (asynchronously).

Code

class AsyncObject<T>:CustomStringConvertible {
    private var _value: T
    public private(set) var dispatchQueueName: String
   
    let dispatchQueue: DispatchQueue
    
    init (value: T, dispatchQueueName: String) {
        _value = value
        self.dispatchQueueName = dispatchQueueName
        dispatchQueue = DispatchQueue(label: dispatchQueueName)
    }
    
    func setValue(with closure: @escaping (_ currentValue: T)->(T) ) {
        dispatchQueue.sync { [weak self] in
            if let _self = self {
                _self._value = closure(_self._value)
            }
        }
    }
    
    func getValue(with closure: @escaping (_ currentValue: T)->() ) {
        dispatchQueue.sync { [weak self] in
            if let _self = self {
                closure(_self._value)
            }
        }
    }
    
    
    var value: T {
        get {
            return dispatchQueue.sync { _value }
        }
        
        set (newValue) {
            dispatchQueue.sync { _value = newValue }
        }
    }

    var description: String {
        return "\(_value)"
    }
}

Usage

print("Single read/write action")
// Use it when you need to perform a single action
let obj = AsyncObject<Int>(value: 0, dispatchQueueName: "Dispatch0")
obj.value = 100
let x = obj.value
print(x)

print("Write action in block")
// Use it when you need to perform multiple actions
obj.setValue{ (current) -> (Int) in
    let newValue = current*2
    print("previous: \(current), new: \(newValue)")
    return newValue
}

Full Sample

> extension DispatchGroup

extension DispatchGroup {
    
    class func loop(repeatNumber: Int, action: @escaping (_ index: Int)->(), completion: @escaping ()->()) {
        let group = DispatchGroup()
        for index in 0...repeatNumber {
            group.enter()
            DispatchQueue.global(qos: .utility).async {
                action(index)
                group.leave()
            }
        }
        
        group.notify(queue: DispatchQueue.global(qos: .userInitiated)) {
            completion()
        }
    }
}

> class ViewController

import UIKit

class ViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        //sample1()
        sample2()
    }
    
    func sample1() {
        print("=================================================\nsample with variable")
        
        let obj = AsyncObject<Int>(value: 0, dispatchQueueName: "Dispatch1")
        
        DispatchGroup.loop(repeatNumber: 5, action: { index in
            obj.value = index
        }) {
            print("\(obj.value)")
        }
    }
    
    func sample2() {
        print("\n=================================================\nsample with array")
        let arr = AsyncObject<[Int]>(value: [], dispatchQueueName: "Dispatch2")
        DispatchGroup.loop(repeatNumber: 15, action: { index in
            arr.setValue{ (current) -> ([Int]) in
                var array = current
                array.append(index*index)
                print("index: \(index), value \(array[array.count-1])")
                return array
            }
        }) {
            print("\(arr.value)")
        }
    }
}

Solution 18 - Concurrency

You can create a property wrapper, Synchronised.

Here is an example with NSLock under the hood. You can use whatever you want for synchronisation: GCD, POSIX locks, etc.

@propertyWrapper public struct Synchronised<T> {
    private let lock = NSLock()

    private var _wrappedValue: T
    public var wrappedValue: T {
        get {
            lock.lock()
            defer {
                lock.unlock()
            }
            return _wrappedValue
        }
        set {
            lock.lock()
            defer {
                lock.unlock()
            }
            _wrappedValue = newValue
        }
    }

    public init(wrappedValue: T) {
        self._wrappedValue = wrappedValue
    }
}

@Synchronised var example: String = "testing"

Based on @drewster's answer.

Solution 19 - Concurrency

Why make it difficult and hassle with locks? Use Dispatch Barriers.

A dispatch barrier creates a synchronization point within a concurrent queue.

While it’s running, no other block on the queue is allowed to run, even if it’s concurrent and other cores are available.

If that sounds like an exclusive (write) lock, it is. Non-barrier blocks can be thought of as shared (read) locks.

As long as all access to the resource is performed through the queue, barriers provide very cheap synchronization.
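The answer describes the pattern without code; a minimal sketch (names are made up) might look like this:

import Foundation

let isolationQueue = DispatchQueue(label: "com.example.isolation", attributes: .concurrent)
var settings: [String: String] = [:]

func readSetting(_ key: String) -> String? {
    return isolationQueue.sync { settings[key] }                       // shared (read) access
}

func writeSetting(_ value: String, for key: String) {
    isolationQueue.async(flags: .barrier) { settings[key] = value }    // exclusive (write) access
}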

Solution 20 - Concurrency

Based on ɲeuroburɳ's answer, here is a test of a subclass case:

class Foo: NSObject {
    func test() {
        print("1")
        objc_sync_enter(self)
        defer {
            objc_sync_exit(self)
            print("3")
        }
        
        print("2")
    }
}


class Foo2: Foo {
    override func test() {
        super.test()
        
        print("11")
        objc_sync_enter(self)
        defer {
            print("33")
            objc_sync_exit(self)
        }
        
        print("22")
    }
}

let test = Foo2()
test.test()

Output:

1
2
3
11
22
33

Solution 21 - Concurrency

dispatch_barrier_async is the better way, since it does not block the current thread.

dispatch_barrier_async(accessQueue, { dictionary[object.ID] = object })
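For reference, the call above uses the original C-style GCD API; in Swift 3 and later the same write would be expressed as:

accessQueue.async(flags: .barrier) {
    dictionary[object.ID] = object
}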

Solution 22 - Concurrency

What about

final class SpinLock {
    private let lock = NSRecursiveLock()

    func sync<T>(action: () -> T) -> T {
        lock.lock()
        defer { lock.unlock() }
        return action()
    }
}
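A usage sketch (the counter is made up); note that despite the name, this wraps NSRecursiveLock rather than a true spin lock:

let lock = SpinLock()
var total = 0

func add(_ amount: Int) -> Int {
    return lock.sync { () -> Int in
        total += amount
        return total
    }
}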

Solution 23 - Concurrency

Another method is to create a superclass and then inherit from it. This way you can use GCD more directly:

class Lockable {
    let lockableQ:dispatch_queue_t

    init() {
        lockableQ = dispatch_queue_create("com.blah.blah.\(self.dynamicType)", DISPATCH_QUEUE_SERIAL)
    }
    
    func lock(closure: () -> ()) {
        dispatch_sync(lockableQ, closure)
    }
}


class Foo: Lockable {

    func boo() {
        lock {
            // ... do something
        }
    }
}
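The code above uses the Swift 2 GCD API; a rough Swift 3+ translation of the same idea might look like this:

class Lockable {
    private let lockableQueue: DispatchQueue

    init() {
        lockableQueue = DispatchQueue(label: "com.blah.blah.\(type(of: self))")   // serial by default
    }

    func lock(_ closure: () -> ()) {
        lockableQueue.sync(execute: closure)
    }
}

class Foo: Lockable {

    func boo() {
        lock {
            // ... do something
        }
    }
}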

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | Bill | View Question on Stackoverflow
Solution 1 - Concurrency | conmulligan | View Answer on Stackoverflow
Solution 2 - Concurrency | Bryan McLemore | View Answer on Stackoverflow
Solution 3 - Concurrency | ɲeuroburɳ | View Answer on Stackoverflow
Solution 4 - Concurrency | Matt Bridges | View Answer on Stackoverflow
Solution 5 - Concurrency | werediver | View Answer on Stackoverflow
Solution 6 - Concurrency | Sebastian Boldt | View Answer on Stackoverflow
Solution 7 - Concurrency | GNewc | View Answer on Stackoverflow
Solution 8 - Concurrency | Tod Cunningham | View Answer on Stackoverflow
Solution 9 - Concurrency | Stéphane de Luca | View Answer on Stackoverflow
Solution 10 - Concurrency | Hanny | View Answer on Stackoverflow
Solution 11 - Concurrency | rockdaswift | View Answer on Stackoverflow
Solution 12 - Concurrency | Nuzhdin Vladimir | View Answer on Stackoverflow
Solution 13 - Concurrency | DàChún | View Answer on Stackoverflow
Solution 14 - Concurrency | drewster | View Answer on Stackoverflow
Solution 15 - Concurrency | TheJeff | View Answer on Stackoverflow
Solution 16 - Concurrency | Victor Choy | View Answer on Stackoverflow
Solution 17 - Concurrency | Vasily Bodnarchuk | View Answer on Stackoverflow
Solution 18 - Concurrency | Tony Macaren | View Answer on Stackoverflow
Solution 19 - Concurrency | Frederick C. Lee | View Answer on Stackoverflow
Solution 20 - Concurrency | AechoLiu | View Answer on Stackoverflow
Solution 21 - Concurrency | LiangWang | View Answer on Stackoverflow
Solution 22 - Concurrency | pvllnspk | View Answer on Stackoverflow
Solution 23 - Concurrency | Jim | View Answer on Stackoverflow