Depth data from ARKit

Can I access AVDepthData from ARKit? ARFrame contains the camera image, but no depth information.

I tried to create a separate AVCaptureSession to get AVDepthData, but I cannot run an AVCaptureSession at the same time as ARKit. Either the ARSCNView updates or the AVCaptureDepthDataOutputDelegate callbacks fire, but not both.

4 answers

As mentioned in this thread and in this video, no: ARKit does not provide AVDepthData while you are in world tracking mode. The only time AVDepthData is provided is during face tracking on iPhone X.
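To illustrate the face-tracking case: ARFrame exposes an optional capturedDepthData property that is populated only when running an ARFaceTrackingConfiguration. A minimal sketch (class and property names other than the ARKit APIs are illustrative):

```swift
import ARKit

class FaceDepthViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self
        // Face tracking requires a TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // capturedDepthData is AVDepthData? and is non-nil only during
        // face tracking, and only on frames where depth was captured
        // (the depth camera runs at a lower rate than the color camera).
        if let depthData = frame.capturedDepthData {
            print(depthData.depthDataMap) // CVPixelBuffer of depth values
        }
    }
}
```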


ARKit also uses AVCaptureSession internally, so it is currently not possible to use an ARSCNView while your own AVCaptureSession is running: starting one interrupts the other, and the delegate method - (void)sessionWasInterrupted:(ARSession *)session; is called. The only option is to record the ARKit screen instead (e.g. with ReplayKit).


Starting with iOS 13 you can use frameSemantics:

let configuration = ARWorldTrackingConfiguration()
configuration.frameSemantics = .personSegmentationWithDepth
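Not every device supports these semantics, so it is worth guarding the assignment. A hedged sketch using ARKit's supportsFrameSemantics check:

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()
// Person segmentation with depth requires recent hardware, so check
// support before enabling it; otherwise run() will throw an error.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    configuration.frameSemantics = .personSegmentationWithDepth
}
```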

Then implement ARSessionDelegate and read the depth data from each ARFrame:

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    let estimatedDepthData = frame.estimatedDepthData
    // ...
}

https://developer.apple.com/documentation/arkit/arframe/3152989-estimateddepthdata

https://developer.apple.com/documentation/arkit/arframe/2984226-segmentationbuffer
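For reference, estimatedDepthData is a CVPixelBuffer (nil unless personSegmentationWithDepth is enabled), and to my understanding it holds Float32 depth values; a sketch of reading the center value under that assumption:

```swift
import ARKit

// Sketch: read the depth value at the center of the estimatedDepthData
// buffer. Assumes the buffer holds Float32 values; returns nil when the
// frame semantics are not enabled or the buffer is unavailable.
func centerDepth(of frame: ARFrame) -> Float? {
    guard let depthBuffer = frame.estimatedDepthData else { return nil }
    CVPixelBufferLockBaseAddress(depthBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthBuffer, .readOnly) }

    let width = CVPixelBufferGetWidth(depthBuffer)
    let height = CVPixelBufferGetHeight(depthBuffer)
    let rowBytes = CVPixelBufferGetBytesPerRow(depthBuffer)
    guard let base = CVPixelBufferGetBaseAddress(depthBuffer) else { return nil }

    let centerRow = base.advanced(by: (height / 2) * rowBytes)
    return centerRow.assumingMemoryBound(to: Float32.self)[width / 2]
}
```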


As a workaround, you can approximate a depth map yourself with hitTest. For every pixel, hit-test against ARKit's feature points and planes, and color the pixel according to the distance of each result:

let height = Int(arView.frame.height)
let width = Int(arView.frame.width)


UIGraphicsBeginImageContextWithOptions(CGSize(width: width, height: height), true, 0)
var y = 0
while y < height {
  var x = 0
  while x < width {
    let location = CGPoint(x: x, y: y)
    let results = arView.hitTest(location, types: [.featurePoint, .existingPlane, .estimatedVerticalPlane, .estimatedHorizontalPlane, .existingPlaneUsingGeometry, .existingPlaneUsingExtent])
    let alpha = results.isEmpty ? 1.0 : (1.0 / CGFloat(results.count))
    for result in results {
      let value = 1.0 / (result.distance + 1.0)
      switch result.type {
        case .featurePoint:
          UIColor(red: value, green: 0, blue: 0, alpha: alpha).setFill()
        case .existingPlane:
          UIColor(red: 0, green: 1, blue: 0, alpha: alpha).setFill()
        case .estimatedVerticalPlane:
          UIColor(red: 0, green: 0, blue: value, alpha: alpha).setFill()
        case .estimatedHorizontalPlane:
          UIColor(red: 0, green: 0, blue: value, alpha: alpha).setFill()
        case .existingPlaneUsingGeometry:
          UIColor(red: value, green: value, blue: value, alpha: alpha).setFill()
        case .existingPlaneUsingExtent:
          UIColor(red: value, green: value, blue: value, alpha: alpha).setFill()
        default:
          UIColor.black.setFill()
      }
      UIRectFill(CGRect(x: x, y: y, width: 1, height: 1))
    }

    x += 1
  }

  y += 1

}

let image = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()

imgView.image = image
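Hit-testing every single pixel is very slow; a coarser stride (the step value below is a made-up tuning parameter, not from the original answer) gives a usable preview at a fraction of the cost:

```swift
import ARKit

// Sketch: coarse depth preview that hit-tests every `step` pixels
// instead of every pixel, filling step-by-step blocks of the image.
func coarseDepthImage(from arView: ARSCNView, step: Int = 4) -> UIImage? {
    let width = Int(arView.frame.width)
    let height = Int(arView.frame.height)
    UIGraphicsBeginImageContextWithOptions(CGSize(width: width, height: height), true, 0)
    defer { UIGraphicsEndImageContext() }

    var y = 0
    while y < height {
        var x = 0
        while x < width {
            if let result = arView.hitTest(CGPoint(x: x, y: y), types: .featurePoint).first {
                // Nearer hits map to brighter gray values.
                let value = 1.0 / (result.distance + 1.0)
                UIColor(red: value, green: value, blue: value, alpha: 1).setFill()
                UIRectFill(CGRect(x: x, y: y, width: step, height: step))
            }
            x += step
        }
        y += step
    }
    return UIGraphicsGetImageFromCurrentImageContext()
}
```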

This is very slow, since it performs a hit test for every pixel, and the quality of the result depends on how much of the scene ARKit has already mapped.


Source: https://habr.com/ru/post/1683780/
