Open the NavigationSampleNavMesh scene.
Log in to the Developer Portal and download your map file of a premapped location.
The scene is very similar to the Multimap sample. The difference is the added navigation features.
These game objects are provided by AR Foundation and supply the basic functionality for the AR session. You can find more about their features in Unity's documentation.
These are the basic settings for our SDK. Developer Token is where you would insert your token found in our Developer Portal. Since this sample uses an embedded map file, there is no need for a token (it is used for the server connection).
Target Frame Rate sets the target frame rate for the app. iOS can manage 60 fps, but most Android devices are locked to 30 fps.
Android Resolution specifies the image size used by our SDK's computer vision processing. Some older Android devices have performance issues with larger image sizes, so you can lower this if needed.
iOS Resolution is the same setting, but for iOS devices. You can leave this at default, which for most devices is the maximum resolution.
AR Localizer is the script that uses our SDK to find the device pose within the map.
Localization Interval sets the interval between attempted localizations. At the app start, the script has an automatic burst mode to find the first ten or so poses as fast as possible.
Downsample reduces the image size for improved performance.
Debug Text is an optional reference to a TextMeshProUGUI object to display localization attempt information.
These are components of Unity's UI system. The debug text displays the successful and total localization attempts.
The Pose Indicator is a prefab that gives an estimate of the current localization status. The prefab displays a red icon when no poses have been found or when a long time has passed since the last found pose. It also resets when AR Foundation loses tracking. When the SDK finds multiple poses, the indicator turns green.
The Pose Indicator also has events, such as OnPoseLost, that can be used to trigger actions in an AR app.
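Reacting to such an event could look roughly like this. This is a minimal sketch: the `PoseIndicator` component name and the exact shape of the `OnPoseLost` event are assumptions based on the description above, not the SDK's exact API.

```csharp
using UnityEngine;

// Sketch: trigger an action when the Pose Indicator reports a lost pose.
// "PoseIndicator" and the UnityEvent-style "OnPoseLost" are assumed names.
public class PoseLostHandler : MonoBehaviour
{
    [SerializeField] private PoseIndicator poseIndicator; // assumed component type

    private void OnEnable()
    {
        poseIndicator.OnPoseLost.AddListener(HandlePoseLost);
    }

    private void OnDisable()
    {
        poseIndicator.OnPoseLost.RemoveListener(HandlePoseLost);
    }

    private void HandlePoseLost()
    {
        // For example: hide AR content until the device re-localizes.
        Debug.Log("Pose lost; hiding AR content until re-localization.");
    }
}
```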
These game objects are used by our SDK to transform the AR content to match the real world.
When a pose is found by ARLocalizer.cs, the AR Space is transformed so that the AR content visually matches the real world. The AR Camera in the scene is not transformed; instead, the AR Space is brought to the camera.
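Conceptually, the alignment works like the sketch below: given the device pose expressed in map (AR Space local) coordinates, the AR Space root is rotated and translated so that the mapped pose coincides with the AR camera. This is a simplified illustration of the idea, not the SDK's actual implementation.

```csharp
using UnityEngine;

// Simplified sketch: move the AR Space root so that the localized pose
// (given in map/local coordinates) lines up with the AR camera.
// NOT the SDK's actual code; method and parameter names are illustrative.
public static class ArSpaceAlignment
{
    public static void Align(Transform arSpace, Transform arCamera,
                             Vector3 mapSpacePosition, Quaternion mapSpaceRotation)
    {
        // Rotate the AR Space so the map-space rotation matches the camera.
        // Target: arSpace.rotation * mapSpaceRotation == arCamera.rotation
        arSpace.rotation = arCamera.rotation * Quaternion.Inverse(mapSpaceRotation);

        // Then translate so the map-space position lands on the camera.
        Vector3 poseInWorld = arSpace.TransformPoint(mapSpacePosition);
        arSpace.position += arCamera.position - poseInWorld;
    }
}
```

In practice the SDK also filters and smooths incoming poses, so the real update is more involved than this single snap-to-pose step.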
Technically the content can be placed under the AR Maps themselves if that makes organizing the scene easier.
Map files in AR Maps are used to move the parent AR Space object. Every AR Map needs to have a parent AR Space object. If one does not exist, it will be created at runtime.
Since this sample uses an embedded map file, you should place the previously downloaded map into the Map File slot.
Map File is where you place your .bytes map data files, available from the Developer Portal.
Color changes the color of the preview point cloud.
Render Mode sets the visibility of the preview point cloud. You can disable or render it in Editor and/or at runtime.
There are a couple of objects placed under the AR Space object.
The example_mesh is a geometry representation of the real-world location's floor area. It is used to create the NavMeshSurface used in pathfinding.
Then there are two Navigation Targets in the scene.
The Generate NavMeshSurface Here object has a NavMeshSurface.cs script, and we need to bake the NavMeshSurface data with the script before building the app.
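Baking is normally done in the Editor via the Bake button, but NavMeshSurface can also build its data at runtime, which is useful if the floor geometry is loaded dynamically. A minimal sketch (the namespace depends on your package version: `Unity.AI.Navigation` in the current com.unity.ai.navigation package, `UnityEngine.AI` in older NavMeshComponents releases):

```csharp
using UnityEngine;
using Unity.AI.Navigation; // or UnityEngine.AI in older NavMeshComponents

// Optional: rebuild the NavMeshSurface at runtime instead of baking
// in the Editor, e.g. after loading floor geometry dynamically.
public class RuntimeNavMeshBaker : MonoBehaviour
{
    private void Start()
    {
        var surface = GetComponent<NavMeshSurface>();
        // Collects geometry in the configured layers and bakes the NavMesh.
        surface.BuildNavMesh();
    }
}
```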
Unity's NavMeshComponents uses 3D geometry to create a NavMesh for pathfinding.
You need to build the geometry in Unity Editor based on the point cloud preview using Unity's ProBuilder and ProGrids packages.
For this option, you can download the sparse point cloud .ply file from the Developer Portal. It is identical to the point cloud preview in Unity Editor.
This help page covers the workflow using Blender.
Import the .ply file into Blender.
You'll notice that the point cloud is oriented sideways. This is because we generate the point cloud for a right-handed Y-up coordinate system, while Blender uses a right-handed system with Z-up.
In other software with matching coordinate systems, such as Maya, Houdini or Modo, the point cloud would be oriented correctly. But you may have to rotate the point cloud depending on your software of choice.
Rotate the point cloud 90 degrees on the X-axis to fix the point cloud orientation.
Now we can use the point cloud as a reference for building our scene. For navigation, we only need to create a mesh for areas where the user can move. In this case, the floor.
Use the basic modelling tools to build the geometry and align it to the point cloud. Make sure the floor level matches the floor level in the point cloud.
Export the geometry you created as an .fbx file. The default settings work just fine, but be sure to check "Selected Objects" to export only the floor geometry.
Import the floor geometry into Unity and place it into the scene under the AR Content game object.
The point cloud preview of your downloaded map and floor geometry should align.
Put the floor geometry into the Navigation layer in the inspector.
The Generate NavMeshSurface Here is set up to include all geometry in the Navigation layer for baking.
Select the Generate NavMeshSurface Here Game Object and click Bake in the inspector.
You should now see the NavMeshSurface data overlaid on top of the floor with a blue color.
Navigation is all set up, now we only need Navigation Targets.
A Navigation Target in this sample is simply a Game Object with a script attached. The script has options for setting up the target's Category, Name, and Icon that will appear in the Target List Menu later.
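The shape of such a component could look like the sketch below. The field names follow the options described in the text; the sample's actual script may organize them differently.

```csharp
using UnityEngine;

// Minimal sketch of a navigation-target component. Field names mirror
// the options described in the text (Category, Name, Icon); the sample's
// real script may differ.
public class NavigationTargetSketch : MonoBehaviour
{
    public string category;   // grouping shown in the Target List Menu
    public string targetName; // display name in the menu
    public Sprite icon;       // icon shown next to the name
}
```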
Place the Navigation Targets on top of the NavMeshSurface.
Place your downloaded map files into the AR Map game object's map file slots in the inspector and align the maps.
Open Build Settings (Ctrl + Shift + B) and make sure only the NavigationSample is included in the Scenes in Build.
Change your Player Settings if necessary. It's a good idea to change the Product Name and Package Name so the app will not overwrite a previously installed app on your device.
Build the app and install it to your device.
When you start the app and look around at the location you mapped, the device should localize and find its pose in just seconds.
If you set the AR Map Render Mode options to Editor and Runtime, you should see the point cloud previews of the maps align to the real world as poses are found.
After successfully localizing, the floor geometry should align to the real world. Press Show Navigation Targets and pick a target to test the navigation.
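Under the hood, the navigation step amounts to querying the baked NavMesh for a path between the user and the chosen target. A minimal sketch using Unity's standard `NavMesh.CalculatePath` API (the field wiring and `LineRenderer` visualization are assumptions; the sample may draw its path differently):

```csharp
using UnityEngine;
using UnityEngine.AI;

// Sketch: query the baked NavMesh for a path from the user's position
// to the selected Navigation Target and draw it with a LineRenderer.
public class PathDrawer : MonoBehaviour
{
    [SerializeField] private Transform user;    // e.g. the AR Camera
    [SerializeField] private Transform target;  // the chosen Navigation Target
    [SerializeField] private LineRenderer line; // visualizes the path corners

    private readonly NavMeshPath path = new NavMeshPath();

    private void Update()
    {
        if (NavMesh.CalculatePath(user.position, target.position,
                                  NavMesh.AllAreas, path))
        {
            line.positionCount = path.corners.Length;
            line.SetPositions(path.corners);
        }
    }
}
```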