In ARKit I can get the left-eye transform, but strangely this is not supported in visionOS.
When rendering with Metal (Compositor Services) I can get leftView.transform from the views provided by LayerRenderer.Drawable:
func renderDonut(drawable: LayerRenderer.Drawable,
                 renderEncoder: MTLRenderCommandEncoder,
                 simdDeviceAnchor: simd_float4x4) {
    // views[0] is the left eye; its transform is expressed relative to the device anchor.
    let leftView = drawable.views[0]
    let leftViewMatrix = (simdDeviceAnchor * leftView.transform).inverse
    let leftProjectionMatrix = drawable.computeProjection(viewIndex: 0)
    ...
}
There does not seem to be anything in the RealityKit API on visionOS to get this transform. I can get the Camera Position in a Reality Composer Pro material shader, but is this the position of the left or the right eye, depending on which eye's frame is being rendered?
We can get the deviceAnchor and fetch its transform, which gives the position and orientation of the sensor located roughly at the user's forehead. But there is nothing provided in RealityKit to get the position of the left or right eye relative to this.
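For reference, this is roughly how I fetch that single head transform today via ARKit's WorldTrackingProvider (a minimal sketch; the startTracking/deviceTransform wrappers are just illustrative, queryDeviceAnchor(atTimestamp:) and originFromAnchorTransform are the actual API):

import ARKit
import QuartzCore
import simd

let session = ARKitSession()
let worldTracking = WorldTrackingProvider()

func startTracking() async throws {
    // Start world tracking so device-anchor queries become available.
    try await session.run([worldTracking])
}

func deviceTransform() -> simd_float4x4? {
    // DeviceAnchor gives one transform for the headset itself,
    // not separate left/right-eye transforms.
    guard let anchor = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) else {
        return nil
    }
    return anchor.originFromAnchorTransform
}

What I am missing is the per-eye offset from this transform, which the Metal path exposes through LayerRenderer.Drawable.views but RealityKit does not seem to.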
This is related to a similar question I asked earlier.