VIRTUAL CAMERA
When we set out to develop the virtual camera tool for our app, there were basically only two apps that recorded both video and 3D camera data. But we felt they were both missing some features that Apple exposes through the ARKit API, so we decided to give it our best.
DISCLAIMER: Our virtual camera tool is still in development.
So, what were we missing, you might ask?
For starters, we wanted 4K support; at the time no other app was offering this, even though ARKit had supported it for well over a year. To be fair, Byplay has since added it to the paid tier of their big 2.0 release as well.
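For reference, here's a minimal sketch of how a 4K video format can be requested from ARKit on iOS 16+. This is illustrative, not our exact capture code, and availability depends on the device:

```swift
import ARKit

// Sketch only: opt into ARKit's recommended 4K video format (iOS 16+).
// On devices without a 4K-capable format, the property returns nil and
// the configuration falls back to its default format.
let configuration = ARWorldTrackingConfiguration()
if let fourKFormat = ARWorldTrackingConfiguration.recommendedVideoFormatFor4KResolution {
    configuration.videoFormat = fourKFormat
}
```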
Next, we wanted to add multi-device support, so that when multiple angles of a scene are filmed, VFX can be added stress free. For this we use ARKit’s ARImageAnchor to detect a shared marker image and set the world origin to its position. After that, we use our Syncing protocol as described here to start and stop all devices at once. In post-production it’s as easy as importing all .dae files into your scene and ta-da: position- and time-synchronized cameras are ready to go!
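To illustrate the idea (not our exact implementation), re-anchoring the world origin when the shared marker image is detected could look like the sketch below; the "SharedOrigin" resource group name is just a placeholder:

```swift
import ARKit

// Sketch: once every device detects the same printed marker image,
// move its world origin to the marker so all devices share one coordinate system.
final class MultiDeviceSessionDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors {
            guard let imageAnchor = anchor as? ARImageAnchor else { continue }
            // Re-anchor the ARKit world origin at the detected image's transform.
            session.setWorldOrigin(relativeTransform: imageAnchor.transform)
        }
    }
}

func makeConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()
    // "SharedOrigin" is an assumed asset-catalog group containing the marker image.
    if let markerImages = ARReferenceImage.referenceImages(inGroupNamed: "SharedOrigin", bundle: nil) {
        configuration.detectionImages = markerImages
    }
    return configuration
}
```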
During development we found that Apple exposes depth maps on LiDAR-enabled devices, so these can also be recorded for quick roto previews or other depth-related effects.
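As a rough illustration (the actual recording pipeline is omitted), reading the depth map per frame looks roughly like this; DepthRecorder and handleDepth are placeholder names, and the .sceneDepth frame semantic has to be enabled on the configuration first:

```swift
import ARKit

// Sketch: grab the LiDAR depth map each frame via the ARSessionDelegate callback.
// Requires configuration.frameSemantics.insert(.sceneDepth) on a supported device.
final class DepthRecorder: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let sceneDepth = frame.sceneDepth else { return }
        let depthMap: CVPixelBuffer = sceneDepth.depthMap   // 256×192, 32-bit float metres
        handleDepth(depthMap, at: frame.timestamp)
    }

    private func handleDepth(_ depthMap: CVPixelBuffer, at timestamp: TimeInterval) {
        // Writing the buffer out (e.g. as a depth video track) is left out of this sketch.
    }
}
```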
Another quirk we found is that ARKit doesn’t deliver continuous real-world rotations; angles are kept within a -180° to +180° range, and when the camera crosses that limit, ARKit simply flips to the other side. In post, this means random smeary frames if motion blur is enabled during rendering, because the camera appears to rotate by almost 360° from one frame to the next. Sadly, many other apps don’t take this into account.
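For anyone hitting this in their own pipeline, a generic angle-unwrapping pass along these lines removes the flips. This is a general sketch, not necessarily how we handle it internally:

```swift
// Sketch: unwrap a sequence of exported Euler angles (in degrees) so consecutive
// samples never jump by more than 180°, avoiding the near-360° flips that smear motion blur.
func unwrapDegrees(_ angles: [Double]) -> [Double] {
    var result: [Double] = []
    var offset = 0.0
    for (index, angle) in angles.enumerated() {
        if index > 0 {
            let delta = angle - angles[index - 1]
            if delta > 180 { offset -= 360 }    // wrapped from +180 side to -180 side
            if delta < -180 { offset += 360 }   // wrapped from -180 side to +180 side
        }
        result.append(angle + offset)
    }
    return result
}

// Example: [170°, -175°] becomes [170°, 185°], a smooth 15° step instead of a 345° flip.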
Finally, we wanted image stabilization, something ARKit doesn’t offer because it needs the raw camera feed together with sensor readouts for tracking. Luckily, we found a workaround by adding GyroFlow support: every recording comes with a sidecar .gcsv file containing accelerometer, gyroscope and magnetometer data. After rendering out all your 3D elements, you can use this data to stabilize everything to your liking.
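For illustration, collecting raw IMU samples with Core Motion might look like the sketch below. The exact .gcsv header and column layout follow the GyroFlow spec; the columns here are simplified placeholders (magnetometer omitted for brevity):

```swift
import CoreMotion

// Illustrative sketch: log raw IMU samples alongside a recording.
// The real .gcsv format adds a spec-defined header and scaling; these rows are placeholders.
final class MotionLogger {
    private let manager = CMMotionManager()
    private(set) var rows: [String] = ["t,gx,gy,gz,ax,ay,az"]

    func start() {
        manager.deviceMotionUpdateInterval = 1.0 / 200.0   // 200 Hz, if the hardware allows it
        manager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let motion = motion else { return }
            let r = motion.rotationRate         // rad/s
            let a = motion.userAcceleration     // in g
            self?.rows.append("\(motion.timestamp),\(r.x),\(r.y),\(r.z),\(a.x),\(a.y),\(a.z)")
        }
    }

    func stop() { manager.stopDeviceMotionUpdates() }
}
```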
Other quality-of-life features include H.265 support for smaller file sizes, HDR video and LiDAR-reconstructed meshes.
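For completeness, a rough sketch of the related configuration flags, with availability guards (the HDR properties require iOS 16+; H.265 itself is chosen when writing the file, e.g. AVVideoCodecType.hevc with AVAssetWriter):

```swift
import ARKit

// Sketch only: enable LiDAR scene reconstruction and HDR capture where supported.
let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    configuration.sceneReconstruction = .mesh     // LiDAR mesh anchors
}
if configuration.videoFormat.isVideoHDRSupported {
    configuration.videoHDRAllowed = true          // HDR video (iOS 16+)
}
```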
Sadly, some things are out of our hands, as Apple only allows limited control over the AVCaptureSession that handles audio and video capture during an ARSession.
Things we hope will improve with future iOS releases:
- Audio quality
- LiDAR mesh fidelity
- LiDAR depth maps (limited to 256×192)
- Inconsistent sensor readouts on some OS versions
Current bugs that we are looking into:
- Excessive autofocus bursts in sub-optimal lighting
- Choppy video when the device is under heavy load
- Random sensor readout dropouts