You can achieve this by detecting a nodding motion from an ARFaceAnchor. ARFaceAnchor is the anchor ARKit uses to track the position and orientation of a user's face in the AR scene. Its blendShapes dictionary describes facial expressions such as jaw and eye movement; a nod, however, is a head movement rather than an expression, so it is detected more reliably from the anchor's transform, whose pitch tells you how far the head is tilted up or down.
Here's how you can print "nodded" when a person nods using ARKit's ARFaceAnchor in Swift:
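First, a minimal view controller sketch that runs a face-tracking session. The sceneView outlet name is illustrative; it assumes an ARSCNView wired up in the storyboard and a device with a TrueDepth front camera:

```swift
import UIKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {

    // Assumes a full-screen ARSCNView wired up in the storyboard.
    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Face tracking requires a device with a TrueDepth front camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```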
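Next, a sketch of the nod detection itself, added inside the same ViewController. The pitch formula and thresholds here are assumptions: pitch is approximated from the y component of the face's local z-axis, which works when the device is held roughly upright, and the threshold values will need tuning for your use case:

```swift
// Add inside ViewController. Thresholds (in radians) are rough guesses.
private var isPitchedDown = false
private let downThreshold: Float = -0.25  // head tilted roughly 15° down
private let upThreshold: Float = -0.08    // head back near level

func renderer(_ renderer: SCNSceneRenderer,
              didUpdate node: SCNNode,
              for anchor: ARAnchor) {
    guard let faceAnchor = anchor as? ARFaceAnchor else { return }

    // The third column of the anchor's transform is the face's local
    // z-axis; its y component drops below zero as the head pitches down.
    // This approximation assumes the device is held roughly upright.
    let pitch = asin(faceAnchor.transform.columns.2.y)

    if !isPitchedDown, pitch < downThreshold {
        // The head dipped past the "down" threshold.
        isPitchedDown = true
    } else if isPitchedDown, pitch > upThreshold {
        // The head came back up: count that as one nod.
        isPitchedDown = false
        print("nodded")
    }
}
```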
The renderer(_:didUpdate:for:) method gets called whenever ARKit updates an anchor, which happens many times per second while a face is tracked. You cast the ARAnchor parameter to ARFaceAnchor, estimate the head's pitch from its transform, and treat a dip past the down threshold followed by a return toward level as a single nod. Note that blend shapes such as .jawOpen only describe facial expressions (in that case, how far open the mouth is), so they are not a reliable signal for head nodding.
That's it! Now you should see "nodded" printed in the console every time the user nods.