
[ARKit3] How does the world / face track at the same time? : iOSProgramming



Hello everyone,

I'm currently playing with ARKit 3, and I'm kind of stuck on simultaneous world and face tracking.

I've done some things with ARKit before, but I'm totally lost on how to run both tracking configurations at once.

My goal is to show a "face" in the real world (back camera), tracked with the front camera.

This is the function I call in viewDidLoad():

private func setupFaceTracking() {
    guard ARFaceTrackingConfiguration.isSupported else { return }

    // World tracking on the back camera, with face tracking from the front camera enabled.
    let configuration = ARWorldTrackingConfiguration()
    configuration.isLightEstimationEnabled = true
    configuration.userFaceTrackingEnabled = true

    arView.session.run(configuration, options: [])
}
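My rough understanding (untested, so please correct me) is that once the session runs with userFaceTrackingEnabled, it should also deliver ARFaceAnchor updates from the front camera, so a session delegate would be the place to pick up the face data and use it on content placed in the world-tracked scene. Something like this sketch, where ViewController stands for whatever class owns arView above, and I'd also set arView.session.delegate = self before run():

import ARKit
import UIKit

// Untested sketch: listen for ARFaceAnchor updates while the session runs an
// ARWorldTrackingConfiguration with userFaceTrackingEnabled = true.
extension ViewController: ARSessionDelegate {

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // Front-camera face data; blend shapes could, for example, drive a
            // model that sits in the world-tracked (back camera) scene.
            let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
            print("jawOpen:", jawOpen)
        }
    }
}

Is that roughly the right direction, or am I misunderstanding how the face data comes back in this mode?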

I couldn't find any tutorial or example of how to get this done.

Hope someone can help me. 🙂
