Open the MultiplayerSample scene.
Log in to the Developer Portal and download your map file of a premapped location.
The scene is very similar to the Multimap sample. The difference is the added networking components.
These game objects are provided by AR Foundation and supply the basic functionality for the AR session. You can find more about them in Unity's documentation.
Basic settings for our SDK.
Developer Token is where you would insert your token, found in our Developer Portal. Since this sample uses an embedded map file, there is no need for a token (the token is used for server connection).
Target Frame Rate for the app. iOS can manage 60fps, but most Android devices are locked to 30fps.
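As an illustration of what a frame rate cap does in Unity, the built-in `Application.targetFrameRate` API can be used like this. This is a generic sketch, not the SDK's actual code; the platform split mirrors the iOS/Android difference described above:

```csharp
using UnityEngine;

public class FrameRateExample : MonoBehaviour
{
    void Start()
    {
        // Cap the frame rate: iOS devices can typically sustain 60 fps,
        // while many Android devices are limited to 30 fps.
#if UNITY_IOS
        Application.targetFrameRate = 60;
#else
        Application.targetFrameRate = 30;
#endif
    }
}
```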
Android Resolution specifies the image size used by our SDK's computer vision. Some older Android devices have performance issues with larger image sizes, so you can lower this if needed.
iOS Resolution is the same setting, but for iOS devices. You can leave this at default, which for most devices is the maximum resolution.
AR Localizer is the script that uses our SDK to find the device pose within the map.
Localization Interval sets the interval between attempted localizations. At the app start, the script has an automatic burst mode to find the first ten or so poses as fast as possible.
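The interval-plus-burst behavior could be sketched as a Unity coroutine. This is a hypothetical illustration, not the SDK's ARLocalizer implementation; `TryLocalize`, `burstCount`, and `localizationInterval` are assumed names:

```csharp
using System.Collections;
using UnityEngine;

public class LocalizerSketch : MonoBehaviour
{
    public float localizationInterval = 2f; // seconds between attempts
    public int burstCount = 10;             // first poses are found as fast as possible

    int posesFound;

    IEnumerator Start()
    {
        while (true)
        {
            if (TryLocalize())
                posesFound++;

            // Burst mode: no delay until the first few poses are found.
            if (posesFound >= burstCount)
                yield return new WaitForSeconds(localizationInterval);
            else
                yield return null; // retry next frame
        }
    }

    bool TryLocalize()
    {
        // Placeholder for the SDK's localization call.
        return false;
    }
}
```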
Downsample reduces the image size for improved performance.
Debug Text is an optional reference to a TextMeshProUGUI object to display localization attempt information.
These are components of Unity's UI system. The debug text displays the successful and total localization attempts.
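A minimal sketch of feeding such counters into a TextMeshProUGUI label; the counter fields and the `ReportAttempt` hook are assumptions for illustration, not the sample's actual script:

```csharp
using TMPro;
using UnityEngine;

public class LocalizationDebugText : MonoBehaviour
{
    public TextMeshProUGUI debugText; // optional UI reference
    int successfulAttempts;
    int totalAttempts;

    // Call this after each localization attempt (hypothetical hook).
    public void ReportAttempt(bool success)
    {
        totalAttempts++;
        if (success) successfulAttempts++;
        if (debugText != null)
            debugText.text = $"Localizations: {successfulAttempts}/{totalAttempts}";
    }
}
```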
The Pose Indicator is a prefab that gives an estimate of the current localization status. The prefab displays a red icon when no poses have been found or when a long time has passed since the last found pose. It also resets when AR Foundation loses tracking. When the SDK finds multiple poses, the indicator turns green.
The Pose Indicator also exposes events, such as OnPoseLost, that can be used to trigger actions in an AR app.
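One way to react to these events is to hook public methods to them in the Inspector. The following is a sketch under the assumption that the events are standard UnityEvents; the method and field names are illustrative:

```csharp
using UnityEngine;

public class PoseEventActions : MonoBehaviour
{
    // Hook these methods to the Pose Indicator's events in the Inspector.
    public GameObject arContent;

    public void OnPoseFound() => arContent.SetActive(true);
    public void OnPoseLost() => arContent.SetActive(false);
}
```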
These game objects are used by our SDK to transform the AR content to match the real world.
When a pose is found by ARLocalizer.cs, the AR Space is transformed so that the AR content visually matches the real world. The AR Camera in the scene is not transformed; instead, the AR Space is brought to the camera.
Technically the content can be placed under the AR Maps themselves if that makes organizing the scene easier.
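Because the whole AR Space transform moves when a pose is found, any content parented under it (or under an AR Map) follows along automatically. A minimal sketch of parenting content at runtime, with assumed object names:

```csharp
using UnityEngine;

public class ContentPlacer : MonoBehaviour
{
    public Transform arSpace;   // the AR Space object in the scene
    public GameObject content;  // the AR content to anchor

    void Start()
    {
        // Parent the content under AR Space so it is transformed
        // together with the localized map.
        content.transform.SetParent(arSpace, worldPositionStays: false);
    }
}
```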
Map files in AR Maps are used to move the parent AR Space object. Every AR Map needs to have a parent AR Space object. If one does not exist, it will be created at runtime.
Since this sample uses an embedded map file, you should place the previously downloaded map into the Map File slot.
Map File is where you place your .bytes map data files, available from the Developer Portal.
Color changes the color of the preview point cloud.
Render Mode sets the visibility of the preview point cloud. You can disable or render it in Editor and/or at runtime.
Network Manager has network settings and player settings such as model for avatars.
UIController enables and disables UI elements depending on network state.
Place your downloaded map file into the AR Map game object's map file slots in the inspector.
Open Build Settings (Ctrl + Shift + B) and make sure only the MultiplayerSample is included in the Scenes in Build.
Change your Player Settings if necessary. It's a good idea to change the Product Name and Package Name so the app will not overwrite a previously installed app on your device.
Build the app and install it on your devices.
When you start the app and look around at the location you mapped, the devices should localize and find poses in just seconds.
You should now see the Axes Avatar objects rendered on top of the other devices.