Use the sample code to quickly build and see examples of ARDK features. Samples come either as complete Unity projects or Unity packages you can add to your Unity project.
Download samples at ARDK Downloads.
This Unity package provides several examples for many of the ARDK features. You can import this package into your Unity project and build and run any of the following included scenes:
The ARDK-Examples package has not been updated to support the Unity Input System package. Avoid importing ARDK-Examples in projects that have the Unity Input System enabled, as this will cause build problems.
This example demonstrates how to set anchors without needing to generate planes first.
This example is a minimal networked augmented reality session. Clients and hosts map out the environment in order to sync and learn each other's positions, which are indicated by a red rectangular phone with a gray "antenna". Once synced, you should be able to see this digital depiction of one phone by looking at it with another phone synced in the same session.
Does each device need to be held while syncing?
Yes, each device has to be held and viewing the same object. The camera frame is used to map the world and determine where each phone is relative to the others.
Does the wait increase per device that is added to the network?
No, each phone does its mapping and syncing independently. The only caveat is that the first phone in the session is the "host" and sends the authoritative map to all other phones, so the host is the bottleneck.
Are there any complications with the type of device?
The better the device, the faster the sync. An iPhone 11 generally syncs in 5-10 seconds, while a first-generation iPhone SE can take from 40 seconds to several minutes.
How should I move to sync effectively?
It is better to pan around an object (moving left and right) than to stay still, since it takes multiple views to triangulate a point well.
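To illustrate why panning helps, here is a plain-Python sketch (not ARDK code) of two-view triangulation in 2D: two phones at known positions each measure a bearing to the same point, and the point is recovered by intersecting the two rays. When the views come from the same direction, the rays are parallel and depth cannot be fixed.

```python
import math

def triangulate(p1, theta1, p2, theta2):
    """Intersect two bearing rays from known positions p1 and p2.

    Each ray is p_i + t * (cos(theta_i), sin(theta_i)). Returns the
    intersection point, or None if the rays are (near-)parallel --
    the degenerate case of viewing from a single direction.
    """
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    denom = d1[0] * d2[1] - d1[1] * d2[0]   # 2D cross product of directions
    if abs(denom) < 1e-9:
        return None  # parallel rays: one viewpoint cannot fix depth
    # Solve p1 + t*d1 = p2 + s*d2 for t via the cross-product trick.
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Two phones one metre apart, both sighting the same point:
print(triangulate((0, 0), math.atan2(1, 1), (1, 0), math.atan2(1, -1)))
```

The wider the baseline between the two viewpoints, the larger `denom` is and the more stable the solved position, which is why moving side to side syncs faster than holding still.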
Located in the ContextAwareness folder, this example demonstrates how to generate depth data to create immersive AR experiences. For more guidance on usage please see ARDK’s Depth documentation.
Located in the ContextAwareness folder, this example demonstrates how to query the environment for object placement, procedural gameplay and character navigation through an intermediate system called Gameboard. The Gameboard is a high level component that finds playable areas in the environment and provides a developer-friendly interface for gameplay. For more guidance on usage please see ARDK’s Gameboard documentation.
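The core idea behind a Gameboard-style system (finding connected playable regions in the scanned environment) can be sketched in plain Python; this is an illustration of the concept, not ARDK's implementation, and the grid, cell values, and `min_cells` threshold are assumptions for the example.

```python
from collections import deque

def playable_regions(grid, min_cells=3):
    """Flood-fill an occupancy grid (1 = flat walkable cell, 0 = blocked)
    and return the connected regions large enough for gameplay."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    regions = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1 and not seen[r][c]:
                region, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    region.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and grid[ny][nx] == 1 and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(region) >= min_cells:   # discard areas too small to play on
                    regions.append(region)
    return regions

grid = [[1, 1, 0, 1],
        [1, 0, 0, 1],
        [0, 0, 1, 1]]
print([len(r) for r in playable_regions(grid)])  # sizes of playable areas
```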
This example demonstrates how to search for specified images in the real world and surface them as image anchors. These anchors can then be used in a variety of ways to spawn AR content (for example, creating AR effects that overlay a real-world mural). For more guidance on usage please see ARDK's ImageDetection documentation.
This example demonstrates sending messages between multiple devices in the same networked session. Devices that join the same session (via a user-entered Session ID) receive notifications when other peers join, and can send each other messages by pressing the "Send Message" button.
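The join-and-broadcast flow described above can be modeled in a few lines of plain Python; this is a toy in-memory sketch of the behavior, not ARDK's networking code, and the `Session`/`Peer` names are invented for the illustration.

```python
class Peer:
    def __init__(self, name):
        self.name, self.inbox = name, []

class Session:
    """Toy session: joining notifies existing peers; a message is
    delivered to every peer except its sender."""
    def __init__(self, session_id):
        self.session_id = session_id   # e.g. the user-entered Session ID
        self.peers = []

    def join(self, peer):
        for other in self.peers:               # notify existing peers
            other.inbox.append(("joined", peer.name))
        self.peers.append(peer)

    def send(self, sender, text):
        for peer in self.peers:
            if peer is not sender:             # sender does not echo to itself
                peer.inbox.append(("message", sender.name, text))

session = Session("ABC123")
alice, bob = Peer("alice"), Peer("bob")
session.join(alice)
session.join(bob)              # alice is notified that bob joined
session.send(bob, "hello")     # the "Send Message" button
print(alice.inbox)
```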
There is no AR visual asset included with this sample.
This example uses a QR code to quickly join and sync multiple players into a multiplayer AR session.
Located in the ContextAwareness/Meshing folder, this example is a good starting point for exploring how to set up and configure Meshing. For more guidance on usage please see ARDK’s Meshing documentation.
Located in the ContextAwareness/Meshing folder, this example demonstrates how to use the MeshSaver class to create a binary file clone of the current generated AR mesh. The saved mesh can then be loaded in the Unity editor through the MockMesh system.
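Saving a mesh to a binary file boils down to serializing vertex and triangle buffers with a fixed layout. The sketch below shows the general pattern in plain Python; the layout (counts, then float3 vertices, then uint3 triangle indices) is an assumption for illustration, not MeshSaver's actual on-disk format.

```python
import io
import struct

def save_mesh(stream, vertices, triangles):
    """Write counts, then 3 little-endian floats per vertex,
    then 3 little-endian uint32 indices per triangle."""
    stream.write(struct.pack("<II", len(vertices), len(triangles)))
    for v in vertices:
        stream.write(struct.pack("<3f", *v))
    for t in triangles:
        stream.write(struct.pack("<3I", *t))

def load_mesh(stream):
    """Inverse of save_mesh: read the counts, then the buffers."""
    n_v, n_t = struct.unpack("<II", stream.read(8))
    vertices = [struct.unpack("<3f", stream.read(12)) for _ in range(n_v)]
    triangles = [struct.unpack("<3I", stream.read(12)) for _ in range(n_t)]
    return vertices, triangles

# Round-trip a one-triangle mesh through an in-memory "file":
buf = io.BytesIO()
save_mesh(buf, [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)], [(0, 1, 2)])
buf.seek(0)
print(load_mesh(buf))
```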
Located in the ContextAwareness folder, this example runs Semantic Segmentation, Occlusion, and Meshing simultaneously, demonstrating the performance impact of running all systems together.
Located in the ContextAwareness folder, this example demonstrates how to use generated depth data to place or move content behind real world objects without breaking immersion. For more guidance on usage please see ARDK’s Depth Occlusion documentation.
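At its core, depth occlusion is a per-pixel comparison: the virtual fragment is drawn only where it is closer to the camera than the real-world depth sample. This plain-Python sketch (not ARDK code; the tiny 1x2 "buffers" are invented for the example) shows that test:

```python
def composite(real_depth, virtual_depth, virtual_color, background):
    """Per pixel, keep the virtual color where the virtual content is
    closer than the real world; otherwise show the camera feed."""
    out = []
    for row_r, row_v, row_c, row_b in zip(real_depth, virtual_depth,
                                          virtual_color, background):
        out.append([c if v < r else b
                    for r, v, c, b in zip(row_r, row_v, row_c, row_b)])
    return out

real    = [[2.0, 0.5]]          # real surfaces at 2 m and 0.5 m
virtual = [[1.0, 1.0]]          # virtual content 1 m away everywhere
color   = [["V", "V"]]          # virtual render
camera  = [["R", "R"]]          # real camera feed
print(composite(real, virtual, color, camera))  # [['V', 'R']]
```

In the second pixel the real object at 0.5 m is in front of the 1 m virtual content, so the camera feed wins and the virtual object appears to pass behind it.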
This example demonstrates storing data in the server-side persistent key-value store associated with each networked session. By tapping the screen, the device attempts to store the value representing the upper circle's color (locally set) on the server side. Upon receiving an updated value (stored by any device in the same session), the bottom circle (only modified through server messages) changes to the new color.
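The set-then-notify pattern the scene relies on can be modeled in plain Python; this is a local toy stand-in for the server-side store, and the class and key names are invented for the illustration.

```python
class SessionKeyValueStore:
    """Toy session-scoped key-value store: any peer can set a key,
    and every subscriber is notified of the new value."""
    def __init__(self):
        self._data = {}
        self._listeners = []

    def subscribe(self, callback):
        self._listeners.append(callback)

    def set(self, key, value):
        self._data[key] = value
        for cb in self._listeners:     # the "server" echoes updates to peers
            cb(key, value)

store = SessionKeyValueStore()
received = {}
# The bottom circle only changes via updates echoed back from the store:
store.subscribe(lambda k, v: received.update({k: v}))
store.set("circle_color", "blue")      # tapping the screen stores a color
print(received)  # {'circle_color': 'blue'}
```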
A basic example demonstrating how to set up a minimal AR scene. This includes detecting and displaying planes, displaying feature points, and hit testing against planes.
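The math behind hit testing against a plane is a ray-plane intersection. The following plain-Python sketch (an illustration of the geometry, not ARDK's hit-test API) intersects a tap ray with a detected plane given in point-normal form:

```python
def hit_test(ray_origin, ray_dir, plane_point, plane_normal, eps=1e-9):
    """Return the point where the ray meets the plane, or None if the
    ray is parallel to the plane or the plane is behind the camera."""
    denom = sum(d * n for d, n in zip(ray_dir, plane_normal))
    if abs(denom) < eps:
        return None                    # ray parallel to the plane
    diff = [p - o for p, o in zip(plane_point, ray_origin)]
    t = sum(d * n for d, n in zip(diff, plane_normal)) / denom
    if t < 0:
        return None                    # intersection behind the ray origin
    return tuple(o + t * d for o, d in zip(ray_origin, ray_dir))

# Camera 1.5 m above a horizontal ground plane, looking straight down:
print(hit_test((0, 1.5, 0), (0, -1, 0), (0, 0, 0), (0, 1, 0)))
```

A real hit test additionally checks that the intersection lies within the detected plane's extents; this sketch only handles the infinite plane.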
This is an example Unity project in which ARDK's features are used to create an AR Multiplayer version of Pong. Pong has a fully documented tutorial covering all of the Unity steps as well as C# script implementations needed to make the project work. Please note that this implementation uses low-level messages to send data between players. Another version of this project, PongHLAPI, demonstrates the setup and use of High Level API (HLAPI) objects to streamline the message sending and receiving process for synchronizing players.
This version of Pong uses the same base as the basic Pong AR Multiplayer demo (low level version), but makes some changes to take advantage of ARDK’s High Level Networking API (HLAPI). The first difference is the lack of a MessagingManager. PongHLAPI has a fully documented tutorial covering all of the Unity steps as well as C# script implementations in order to make the project work.
This example demonstrates how to use FeaturePreloadManager to pre-download AR assets. It also demonstrates how to stop in-progress downloads and clear the downloaded cache.
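The three operations the example exercises (start downloads, cancel in-progress ones, clear the cache) follow a common preload-manager pattern, sketched below in plain Python. This is a synchronous toy model, not FeaturePreloadManager's API; the class, method, and feature names are invented for the illustration.

```python
class PreloadManager:
    """Toy feature pre-downloader: queue downloads, cancel pending
    ones, and clear the cache of completed ones."""
    def __init__(self, fetch):
        self._fetch = fetch            # callable: feature name -> bytes
        self._cache = {}
        self._in_progress = set()

    def start(self, feature):
        self._in_progress.add(feature)

    def step(self):
        # A real client downloads asynchronously; here one call
        # completes every pending download.
        for feature in list(self._in_progress):
            self._cache[feature] = self._fetch(feature)
            self._in_progress.discard(feature)

    def cancel_all(self):
        self._in_progress.clear()      # pending downloads are abandoned

    def clear_cache(self):
        self._cache.clear()            # completed downloads are discarded

    def is_ready(self, feature):
        return feature in self._cache

mgr = PreloadManager(lambda name: b"model-" + name.encode())
mgr.start("depth"); mgr.start("semantics")
mgr.cancel_all()                       # nothing had finished downloading
mgr.start("depth"); mgr.step()
print(mgr.is_ready("depth"), mgr.is_ready("semantics"))  # True False
```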
Located in the ContextAwareness folder, this example demonstrates how to access the semantic data in the scene and identify real world features like ground and sky. For more guidance on usage please see ARDK’s Semantic Segmentation documentation.
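A common way to expose per-pixel semantics is a bitmask with one bit per channel, which a query then decodes. The sketch below shows that decoding idea in plain Python; the channel list and ordering are assumptions for the example, not ARDK's actual channel set.

```python
# Hypothetical channel ordering; one bit per semantic channel.
CHANNELS = ["sky", "ground", "building", "foliage"]

def channels_at(semantic_pixel):
    """Decode a per-pixel bitmask into the channel names it contains."""
    return [name for bit, name in enumerate(CHANNELS)
            if semantic_pixel & (1 << bit)]

# A pixel flagged as both ground and foliage (e.g. grass):
pixel = (1 << CHANNELS.index("ground")) | (1 << CHANNELS.index("foliage"))
print(channels_at(pixel))  # ['ground', 'foliage']
```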
Virtual Studio - ARDK Remote Feed App
Built and installed on a mobile device, this scene can be used to pipe real world AR data from the device to the Unity Editor. For guidance, see ARDK’s Virtual Studio (Remote) documentation.
Virtual Studio - ExampleMockPlaythrough
Mock AR assets can be used to simulate receiving environmental AR data. For more guidance on usage please see ARDK’s Virtual Studio (Mock) documentation.
Located in the VpsCoverage folder, VpsCoverageExample demonstrates how to use the VPS Coverage API to get information for Coverage Areas and Localization Targets.
Also located in the VpsCoverage folder, VpsCoverageListExample demonstrates how to use the VPS Coverage API to get Coverage Area and Localization Target information, and how to present this information to the user in a list view with details such as the Localization Target name and "hint image".
Located in the WayspotAnchors folder, the WayspotAnchors example demonstrates how to use the VPS Wayspot Anchors API and the WayspotAnchorsService interface to localize with a VPS-activated Wayspot, and place and restore Wayspot Anchors.
For more details on VPS and the VPS Wayspot Anchors API, see Lightship VPS and VPS Wayspot Anchors API.
This package contains example meshes from scans of various real-world locations. Import this package into your Unity project to use these meshes as mock meshes, as described in Advanced Tutorial: Meshes in the Unity Editor.
This package contains mock environment prefabs you can use to test AR features in your Unity project. Import this package into your Unity project to use these mock environments in Mock mode, as described in Playing in Mock Mode. Mock environments from this package are used in many of the tutorials.