ARDK 1.0.1 Release Notes 2021-11-03

What’s New

  • [AR Scanning] Updated the Meshing.unity example to add more controls for the session and the meshing features; moved the mesh file saving functionality to a new separate sample scene: MeshSaver.unity; added a function to delete saved meshes.

  • [Contextual Awareness] Added a release-level warning for when meshing is enabled but depth is not. Since meshing now explicitly requires depth, depth is automatically enabled if ARSession.Run is called with an ARWorldTrackingConfiguration that enables meshing but not depth.

  • [Contextual Awareness] Added an ARSessionRunOptions value for clearing the mesh. Calling ARMeshManager.ClearMeshObjects (both on device and in the Editor) will re-run an active ARSession with the ARSessionRunOptions.RemoveExistingMesh flag set.
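The two Contextual Awareness changes above can be sketched as follows. This is illustrative only: the factory call and the `_session`/`_meshManager` references are assumed from a typical ARDK setup, and exact namespaces are omitted.

```csharp
// Sketch of the behavior described above (assumes the usual ARDK
// namespaces are imported; _session and _meshManager are illustrative).

// Meshing without depth: ARDK now logs a warning and enables depth
// automatically when ARSession.Run is called with this configuration.
var config = ARWorldTrackingConfigurationFactory.Create();
config.IsMeshingEnabled = true;   // depth intentionally left disabled
_session.Run(config);

// Clearing mesh objects re-runs the active session with the
// ARSessionRunOptions.RemoveExistingMesh flag set.
_meshManager.ClearMeshObjects();
```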

Known Issues

  • [Contextual Awareness] Voyage example, Snowball Fight Multiplayer Mode - scene text disappears/is not visible on Android devices once multiple players join the session.

  • [Contextual Awareness] Preloading example - ‘Dbow’ status changes to ‘Invalid’ when the ‘contextawareness’ download is cancelled a second time.

  • [Contextual Awareness] Meshing example - the ‘Loading contextual awareness’ text freezes on a black background when the user taps the ‘Stop AR’ button.

  • [Contextual Awareness] Depth example - ‘Show current point cloud’ fails to function when the user taps it.

  • [Contextual Awareness] Awareness buffers are distorted in RemoteConnection.

  • [Developer Tools] Mock awareness buffers are not sensor-aligned. In Unity, mock awareness buffers will be portrait-aligned, even when Unity is in landscape orientation.

  • [Multiplayer] When eight players join a session at the same time, peers end up in two sessions instead of one.

  • [Multiplayer] MarkerSync example - the red rectangle overlay continues to appear on screen even after tapping Reset and starting a new session.

  • [Multiplayer] Pong example - the ‘Join’ button remains inactive when the user re-enters the scene.

Improvements

  • [Basic AR] ARSessionManager methods can no longer be mistakenly subscribed to CapabilityChecker.Success as persistent listeners. The ARSessionManager already creates an ARSession after the capability check passes, so separately subscribing it to the CapabilityChecker.Success event caused bugs.

  • [Contextual Awareness] Improved and refactored the rendering and occlusion systems. A new single-pass ARRenderingManager pipeline supports z-buffering as an occlusion technique.

  • [Contextual Awareness] Added a new DefaultBackProjectionDistance value representing the default normalized distance of the back-projection plane. All managers now use it as their default.

  • [Contextual Awareness] The suppression texture now uses an array of semantic channel names instead of a single channel index.

  • [Developer Tools] Modified the ARSession/NetworkingSession/ARNetworking managers so that their session-lifecycle control methods match the UnityLifecycleDriver’s methods.

  • [Developer Platform] All IArdkEventArgs constructors are now public, allowing for easier mocking of interfaces and events.

Bug Fixes

  • [Basic AR] ImageAnchors on Android devices now always return a valid transform.

  • [Contextual Awareness] Fixed a minor computational error in the DepthPointCloudGenerator.

  • [Contextual Awareness] Fixed a bug where in-Editor semantic buffers used CamelCase channel names instead of the lowercase names used on device. Updated the semantics documentation page to accurately list the semantic channel names.

  • [Contextual Awareness] Fixed an occasional crash when recording and meshing at the same time.

  • [Developer Tools] Fixed a bug where RemoteConnection messages sometimes weren’t sent in order when they needed to be.

  • [Developer Tools] Fixed Virtual Studio Mock Mode bugs and documentation issues.

Breaking Changes

  • [Contextual Awareness] The IARMesh interface now exposes the parsed collection of mesh blocks, each with vertex, normal, and triangle arrays.

  • [Contextual Awareness] Depth is now enabled through IARWorldTrackingConfiguration.IsDepthEnabled, separate from point clouds, which are now enabled through IARWorldTrackingConfiguration.DepthPointCloudSettings.IsEnabled.

  • [Developer Tools] To enable more flexibility when testing how your app reacts to ARNetworking’s PeerState updates, mock updates can now be triggered either through the Virtual Studio Editor window or through the MockPlayer.SetPeerState method. The MockMap component has been simplified to accommodate this change and now only affects the local player’s state.
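As a sketch of the new testing hook, an in-Editor test might drive a mock peer’s state like this. How the MockPlayer reference is obtained is omitted, and PeerState.Stable is an assumed enum value; verify both against the API reference.

```csharp
// Illustrative only: mockPlayer is assumed to be a MockPlayer obtained
// from Virtual Studio, and PeerState.Stable is an assumed enum value.
// This simulates the mock peer finishing localization; local ARNetworking
// subscribers should observe the update just as they would on device.
mockPlayer.SetPeerState(PeerState.Stable);
```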

Upgrade Guide

Depth API Updates

Depth is now enabled through the IARWorldTrackingConfiguration.IsDepthEnabled boolean property instead of the DepthFeatures enum property.

Similarly, the depth point cloud can now be enabled through the IARWorldTrackingConfiguration.DepthPointCloudSettings.IsEnabled property.
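Under the new API, enabling depth and the depth point cloud looks roughly like the following sketch. The factory call and `_session` reference are assumed from a typical ARDK setup.

```csharp
// Sketch of the new depth configuration (ARDK namespaces omitted;
// ARWorldTrackingConfigurationFactory is assumed as the usual way to
// create a configuration, and _session is an initialized IARSession).
var config = ARWorldTrackingConfigurationFactory.Create();

// Previously enabled via the DepthFeatures enum; now a simple boolean.
config.IsDepthEnabled = true;

// Previously bundled with DepthFeatures; now its own setting.
config.DepthPointCloudSettings.IsEnabled = true;

_session.Run(config);
```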

ARMesh API Updates

If you were only using the ARMeshManager, this API change likely requires very little work to integrate.

  • Instead of the AreBlockUpdatesPaused value, use the GenerateUnityMeshes property (also settable through the inspector) to control whether Unity meshes are updated or not. The underlying ARMesh will continue to update unless meshing is disabled in the ARSession.

  • Meshing now explicitly requires depth to also be enabled. This was always a requirement, but previous releases silently enabled depth in the background; ARDK now logs a warning in addition to enabling depth when needed. For more fine-grained control over how depth is configured, use the ARDepthManager or set the other configuration values in a script.

If you were working with the IARMesh interface, note the following:

  • Access to lower-level ARMesh data is no longer supported; the IARMesh interface now surfaces the data pre-processed into blocks. This lets ARDK manage memory more safely, and gives developers who want to manage the mesh themselves, rather than through the ARMeshManager, a way to access block-level data directly.

    • Each MeshBlock consists of some metadata as well as the vertices, triangles, and normals arrays describing a cube-shaped portion of the larger mesh.

  • Get notified of mesh updates by subscribing to the IARSession.Mesh.MeshBlocksUpdated and MeshBlocksCleared events instead of the IARSession.MeshUpdated event. You can continue to subscribe to ARMeshManager.MeshObjectsUpdated and MeshObjectsCleared to be notified when the Unity meshes created from the ARMesh’s data are updated.
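Subscribing to the new block-level events might look like the sketch below. The event names come from the changes above, but the args types and exact signatures are assumptions to check against the API reference.

```csharp
// Sketch: react to mesh block updates through the new events
// (handler args types and signatures assumed; verify in the API docs).
void OnSessionInitialized(IARSession session)
{
  session.Mesh.MeshBlocksUpdated += args =>
  {
    // Each updated block carries its own vertex, normal, and
    // triangle arrays describing one cube-shaped region of the mesh.
  };

  session.Mesh.MeshBlocksCleared += args =>
  {
    // All previously surfaced blocks are now invalid.
  };
}
```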

Rendering and Occlusion Pipeline Updates

The ARDK rendering and occlusion pipeline has been updated to produce improved occlusions with reduced GPU and memory impact. To access these improvements, use the updated ARDK depth manager.

General Changes

  • AR Rendering Manager replaces AR Camera Rendering Helper.

  • The Depth and Semantic managers now return the raw buffers, untranslated and unaligned to the screen. If you use these awareness buffers in a shader, you also need to pass in a SamplerTransform to correctly sample the texture on the GPU.

  • Within the Depth and Semantics Managers there are Processor objects which can be used to interrogate the buffers for per-pixel depth or semantic information on the CPU.

For a step-by-step walkthrough of working with the new managers, please refer to the tutorials, in particular the Depth Textures Tutorial and the Semantic Segmentation Textures Tutorial.

Depth Changes

Update your Unity scene camera by removing the AR Camera Rendering Helper and replacing it with an AR Rendering Manager.

Change any OnDepthBufferUpdated handler methods to take ContextAwarenessArgs args:

// New callback
private void OnDepthBufferUpdated(ContextAwarenessArgs<IDepthBuffer> args)

Update how you access the buffer from the args:

// access the current buffer
IDepthBuffer depthBuffer = args.Sender.AwarenessBuffer;

Note that ARDepthManager.DisparityTexture has been deprecated. If you need a depth texture for debugging purposes, use ARDepthManager.ToggleDebugVisualization.

If you need access to a depth texture, DepthBuffer.CreateOrUpdateTexture has been deprecated. You can get the raw, unaligned texture via DepthBuffer.CreateOrUpdateTextureRFloat or DepthBuffer.CreateOrUpdateTextureARGB32:

// Get raw float depth buffer
depthBuffer.CreateOrUpdateTextureRFloat(ref _depthTexture);

// Or if you need an ARGB texture you can do this instead.
float maxDisp = depthBuffer.NearDistance;
float minDisp = depthBuffer.FarDistance;
depthBuffer.CreateOrUpdateTextureARGB32(
	ref _depthTexture,
	FilterMode.Point,
	// normalize the texture between near and far plane
	depth => (depth - minDisp) / (maxDisp - minDisp)
);

And then access the ARDepthManager.DepthBufferProcessor.SamplerTransform and use this in a custom shader:

_depthManager.DepthBufferProcessor.SamplerTransform

See Depth Textures Tutorial and Rendering in ARDK for more details on rendering the depth texture using a custom shader.

The DepthBufferProcessor has helper functions for CPU-side querying of the depth, position, and normal of any point on the screen.
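As a sketch, a CPU-side query through the processor might look like this; the method names (`GetDepth`, `GetWorldPosition`) are assumptions to verify against the API reference.

```csharp
// Sketch: query per-pixel scene information on the CPU via the
// DepthBufferProcessor (method names assumed; see the API reference).
var processor = _depthManager.DepthBufferProcessor;

// Screen-space coordinates of, for example, the point the user tapped.
int x = (int)touchPosition.x;
int y = (int)touchPosition.y;

float eyeDepth = processor.GetDepth(x, y);             // metric depth
Vector3 worldPoint = processor.GetWorldPosition(x, y); // 3D position
```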

Semantics Changes

Update your Unity scene camera by removing the AR Camera Rendering Helper and replacing it with an AR Rendering Manager.

Change any OnSemanticBufferUpdated handler methods to take ContextAwarenessArgs args:

private void OnSemanticBufferUpdated(ContextAwarenessArgs<ISemanticBuffer> args)

Update how you access the buffer from the args:

ISemanticBuffer semanticBuffer = args.Sender.AwarenessBuffer;

For accessing the semantic buffer, you can use ISemanticBuffer.CreateOrUpdateTextureARGB32 to get the raw, unaligned buffer:

int channel = semanticBuffer.GetChannelIndex("sky");
semanticBuffer.CreateOrUpdateTextureARGB32(ref _semanticTexture, channel);

And then access the SamplerTransform from the SemanticBufferProcessor and use this in a custom shader:

_semanticManager.SemanticBufferProcessor.SamplerTransform

See the Semantic Segmentation Textures Tutorial for more info on how to use the transform in your shaders.

Released: November 03, 2021