How to detect vertical planes in ARKit?

Swift · Augmented Reality · ARKit

Swift Problem Overview


How is it possible to implement a vertical plane detection (i.e. for walls)?

let configuration = ARWorldTrackingSessionConfiguration() // early iOS 11 beta name; later renamed ARWorldTrackingConfiguration
configuration.planeDetection = .horizontal // TODO: how to also detect vertical planes (walls)?

Swift Solutions


Solution 1 - Swift

Edit: This is now supported as of ARKit 1.5 (iOS 11.3). Simply use .vertical. I have kept the previous post below for historical purposes.
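For reference, enabling it looks like this (ARKit 1.5 / iOS 11.3 and later, assuming an ARSCNView named sceneView as in the answers below):

let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .vertical   // detect wall-like surfaces
sceneView.session.run(configuration)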


TL;DR

Vertical plane detection is not (yet) a feature that exists in ARKit. The fact that .horizontal is an enum case suggests that other orientations are being worked on and might be added in the future; if plane detection were just a Boolean value, that would suggest the API was final.

Confirmation

This suspicion was confirmed by a conversation that I had with an Apple engineer at WWDC17.

Explanation

You could argue that implementing this would be difficult, as there are infinitely many more orientations for a vertical plane than for a horizontal one, but as rodamn said, this is probably not the case.

From rodamn’s comment: at its simplest, a plane is defined by three coplanar points. You have a surface candidate once sufficient coplanar features are detected along a surface (vertical, horizontal, or at any arbitrary angle). It's just that the normal for a horizontal plane will be along the up/down axis, while a vertical plane's normal will be parallel to the ground plane. The challenge is that unadorned drywall tends to generate few visual features, so plain walls may often go undetected. I strongly suspect that this is why the .vertical feature is not yet released.

However, there is a counter argument to this. See comments from rickster for more information.
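To make rodamn's geometric point concrete, here is a minimal sketch of classifying a candidate plane by its normal (assuming a gravity-aligned frame with Y pointing up; the function names and tolerance are illustrative, not from the answer):

import simd

// Normal of the plane through three non-collinear points.
func planeNormal(_ a: SIMD3<Float>, _ b: SIMD3<Float>, _ c: SIMD3<Float>) -> SIMD3<Float> {
    simd_normalize(simd_cross(b - a, c - a))
}

// A horizontal plane's normal points along the Y (up/down) axis;
// a vertical plane's normal is perpendicular to it, i.e. its Y component is near zero.
func isVertical(_ normal: SIMD3<Float>, tolerance: Float = 0.1) -> Bool {
    abs(normal.y) < tolerance
}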

Solution 2 - Swift

Support for this is coming with iOS 11.3:

> static var vertical: ARWorldTrackingConfiguration.PlaneDetection
>
> The session detects surfaces that are parallel to gravity (regardless of other orientation).

https://developer.apple.com/documentation/arkit/arworldtrackingconfiguration.planedetection
https://developer.apple.com/documentation/arkit/arworldtrackingconfiguration.planedetection/2867271-vertical

Solution 3 - Swift

Apple has released iOS 11.3, which features various updates for AR, including ARKit 1.5. In this update, ARKit gains the ability to recognize and place virtual objects on vertical surfaces like walls and doors.

Vertical plane detection is now supported in ARWorldTrackingConfiguration:

let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]  // detect both orientations
sceneView.session.run(configuration)

Solution 4 - Swift

As the iPhone X features a front-facing depth camera, my suspicion is that the next version will have a rear-facing one, and perhaps the .vertical capability will be deferred until then.

Solution 5 - Swift

I did it with Unity, but I had to do the math myself.

I use Random Sample Consensus (RANSAC) to detect vertical planes from the point cloud returned by ARKit. It works like a loop that randomly picks three points to create a plane, counts the points that match it, and keeps the best candidate.

It works, but because ARKit can't return many feature points when the wall is a plain color, it fails in many situations.
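The same idea can be sketched in Swift against ARKit's raw feature points (ARFrame.rawFeaturePoints). Everything here is illustrative rather than the answer's actual Unity code, and the thresholds are guesses; it assumes the cloud is in ARKit's gravity-aligned world frame (Y up):

import simd

struct Plane {
    var point: SIMD3<Float>
    var normal: SIMD3<Float>   // unit length
}

// RANSAC: repeatedly pick three random points, fit a plane through them,
// and keep the candidate plane with the most inliers.
func detectVerticalPlane(in cloud: [SIMD3<Float>],
                         iterations: Int = 200,
                         inlierDistance: Float = 0.02,   // 2 cm
                         verticalTolerance: Float = 0.1) -> Plane? {
    guard cloud.count >= 3 else { return nil }
    var best: (plane: Plane, inliers: Int)?

    for _ in 0..<iterations {
        // Random sample; duplicate or collinear triples are discarded by the length check.
        let a = cloud.randomElement()!, b = cloud.randomElement()!, c = cloud.randomElement()!
        let cross = simd_cross(b - a, c - a)
        let length = simd_length(cross)
        if length < 1e-6 { continue }
        let normal = cross / length

        // Keep only near-vertical candidates: normal roughly perpendicular to gravity (world Y).
        if abs(normal.y) > verticalTolerance { continue }

        // Count points lying within inlierDistance of the candidate plane.
        let inliers = cloud.reduce(0) { count, p in
            abs(simd_dot(p - a, normal)) < inlierDistance ? count + 1 : count
        }
        if inliers > (best?.inliers ?? 0) {
            best = (Plane(point: a, normal: normal), inliers)
        }
    }
    // Demand minimum support; sparse clouds (plain walls) fail here, as noted above.
    guard let result = best, result.inliers >= max(20, cloud.count / 10) else { return nil }
    return result.plane
}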

Solution 6 - Swift

>In ARKit 1.0 there was just the .horizontal enum case for detecting horizontal surfaces like a table or a floor. In ARKit 1.5 and higher there are .horizontal and .vertical type properties on the PlaneDetection struct, which conforms to the OptionSet protocol.

To implement vertical plane detection in ARKit 2.0, use the following code:

configuration.planeDetection = ARWorldTrackingConfiguration.PlaneDetection.vertical

Or you can use detection for both types of planes:

private func configureSceneView(_ sceneView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = [.horizontal, .vertical]    // both plane types
    configuration.isLightEstimationEnabled = true
    sceneView.session.run(configuration)
}

You can also add an extension to ARSceneManager to handle the delegate calls:

extension ARSceneManager: ARSCNViewDelegate {

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { 
            return 
        }
        print("Found plane: \(planeAnchor)")
    }
}
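
If you enable both orientations, you can tell them apart inside the delegate callback via ARPlaneAnchor's alignment property (available since iOS 11.3). A sketch of a variant of the callback above:

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
    switch planeAnchor.alignment {   // .horizontal or .vertical
    case .vertical:
        print("Found a vertical plane (e.g. a wall): \(planeAnchor)")
    case .horizontal:
        print("Found a horizontal plane (e.g. a floor or table): \(planeAnchor)")
    @unknown default:
        break
    }
}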

Solution 7 - Swift

Apple is said to be working on extra AR capabilities for the new iPhone, i.e. extra sensors for the camera. Maybe this will become a feature once those device capabilities are known. Some speculation here:

http://uk.businessinsider.com/apple-iphone-8-rumors-3d-laser-camera-augmented-reality-2017-7
https://www.fastcompany.com/40440342/apple-is-working-hard-on-an-iphone-8-rear-facing-3d-laser-for-ar-and-autofocus-source

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | Jan F. | View Question on Stackoverflow
Solution 1 - Swift | Zack | View Answer on Stackoverflow
Solution 2 - Swift | janpio | View Answer on Stackoverflow
Solution 3 - Swift | shri | View Answer on Stackoverflow
Solution 4 - Swift | Mike M | View Answer on Stackoverflow
Solution 5 - Swift | Sunny Chow | View Answer on Stackoverflow
Solution 6 - Swift | Andy Jazz | View Answer on Stackoverflow
Solution 7 - Swift | Alex McPherson | View Answer on Stackoverflow