I’m currently building an AR application in Unity for iOS, and I can’t get ARKit and Affectiva to work at the same time. When Affectiva is using the camera to detect emotions, ARKit stops working: ARKit’s GetARVideoTextureHandles() holds onto the camera feed, which prevents Affectiva from running until it’s done. Is there a way for ARKit and Affectiva to share the iPhone’s front camera?
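
One approach I’ve been considering is letting ARKit keep exclusive ownership of the camera and pushing the rendered frames into Affectiva’s frame-based (non-camera) detector, so only one SDK ever opens the device camera. Here is a rough sketch of that idea; note that `Affdex.Detector.ProcessFrame`, the `Affdex.Frame` constructor signature, and the field names are my assumptions from the Affdex Unity asset and may not match the SDK version I’m on:

```csharp
// Hypothetical sketch, not verified against either SDK's current API.
// Idea: ARKit owns the camera; we copy its rendered output each frame and
// feed the pixels to Affectiva's frame-processing detector instead of
// letting Affectiva open the camera itself.
using UnityEngine;

public class ARKitToAffectivaBridge : MonoBehaviour
{
    public Affdex.Detector detector; // assumed frame-based detector (not CameraInput)
    public Camera arCamera;          // the Unity camera rendering the ARKit feed

    private Texture2D frameTexture;

    void LateUpdate()
    {
        // Render the ARKit camera view into a temporary RenderTexture.
        RenderTexture rt = RenderTexture.GetTemporary(Screen.width, Screen.height);
        arCamera.targetTexture = rt;
        arCamera.Render();
        arCamera.targetTexture = null;

        // Read the pixels back to the CPU so Affectiva can process them.
        RenderTexture.active = rt;
        if (frameTexture == null)
            frameTexture = new Texture2D(rt.width, rt.height, TextureFormat.RGBA32, false);
        frameTexture.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
        frameTexture.Apply();
        RenderTexture.active = null;
        RenderTexture.ReleaseTemporary(rt);

        // Hand the frame to Affectiva; constructor arguments here are assumed.
        Affdex.Frame frame = new Affdex.Frame(
            frameTexture.GetPixels32(),
            frameTexture.width,
            frameTexture.height,
            Time.timeSinceLevelLoad);
        detector.ProcessFrame(frame);
    }
}
```

Would something along these lines work, or does Affectiva strictly require direct camera access? (`ReadPixels` every frame is also expensive, so I’d expect to throttle it in practice.)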