Using Apple ARKit for Unity

In June 2017, at the Worldwide Developers Conference (WWDC17), Apple announced its first plunge into augmented reality with the introduction of ARKit, a new iOS framework for creating AR experiences on iPhone and iPad. Although new, it is being rapidly adopted and promises to be a leading platform for augmented reality, albeit one limited to Apple devices and not suited to image-target or marker-based applications.

For some of your projects, you can use the Apple ARKit for iOS instead of a general-purpose toolkit, such as Vuforia or ARToolkit. While Vuforia, for example, is best at target-based AR, ARKit is especially good at anchoring virtual objects in real-world 3D space and recognizing surfaces in the real world. In that sense, it shares similar use cases with Microsoft HoloLens.

Like any iOS development, developing with ARKit requires using a macOS computer and a current version of the Xcode development environment.

At the time of this writing, ARKit is in beta and requires iOS 11. This means that it is available to developers and limited to specific iOS devices. It also means that the system requirements, API, and SDK are subject to change. Based on past experience, we expect that by the time ARKit is released and you read this, its interface will not differ much from the version we are working with, but there is that possibility.

The Unity-ARKit-Plugin project, maintained by Unity Technologies, provides a thin wrapper around the native ARKit SDK. It provides Unity scripts, components, and prefabs you can use. It also includes several example scenes.

Unity-ARKit-Plugin is an open source project hosted on Bitbucket at https://bitbucket.org/Unity-Technologies/unity-arkit-plugin. If you want the latest and greatest version, go there. But, there is also an Asset Store package, which is regularly kept up to date, and we recommend using it.

To install Unity-ARKit-Plugin, perform the following steps:

  1. In Unity, if the Asset Store tab is not visible, go to Window | Asset Store.
  2. Enter ARKit in the search box to find the current package.
  3. Select Download, then click on Import to import it into your project.

Review the folders and files installed in your Assets folder. The Assets/UnityARKitPlugin folder contains the example scenes along with various supporting subfolders. The ARKit plugin's actual assets reside in the Assets/UnityARKitPlugin/Plugins/iOS/UnityARKit/ folder.

The project's Asset folders are shown in the following screenshot:

Once installed, you can try ARKit by opening one of the example scenes. UnityARKitScene is a basic scene with a simple cube that demonstrates the core functionality of ARKit. Open the scene as follows:

  1. In the Project window, navigate to Assets/UnityARKitPlugin/.
  2. Double-click on UnityARKitScene.

You will notice that the scene's Hierarchy contains the following objects, which are basic to any ARKit scene:

  • Main Camera, parented by an empty CameraParent object, has a UnityARVideo component for rendering the live video feed and handling device orientation.
  • ARCameraManager, with the UnityARCameraManager component, interfaces the camera with the native ARKit SDK, including the current camera pose transform.
  • Directional Light, with an optional UnityARAmbient component, adjusts the scene lighting based on ARKit's detection of ambient lighting conditions in the real world.
  • GeneratePlanes, with the UnityARGeneratePlane component, generates Unity objects in the scene for planes detected by ARKit. You can supply prefabs to render on these planes, for example, occlusion geometry or shadows cast by your virtual objects onto real-world surfaces.
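Behind components such as UnityARGeneratePlane, the plugin exposes static C# events that fire as ARKit adds, updates, and removes plane anchors. The following is a minimal sketch of listening to those events; PlaneLogger is a hypothetical name, and it assumes the plugin's UnityEngine.XR.iOS namespace and UnityARSessionNativeInterface event API as shipped in the package at the time of writing:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS; // namespace used by the Unity-ARKit-Plugin

// Hypothetical example: log plane anchors as ARKit detects them.
public class PlaneLogger : MonoBehaviour
{
    void OnEnable()
    {
        // Static events exposed by the plugin's native interface wrapper
        UnityARSessionNativeInterface.ARAnchorAddedEvent += OnPlaneAdded;
        UnityARSessionNativeInterface.ARAnchorRemovedEvent += OnPlaneRemoved;
    }

    void OnDisable()
    {
        // Always unsubscribe to avoid dangling handlers
        UnityARSessionNativeInterface.ARAnchorAddedEvent -= OnPlaneAdded;
        UnityARSessionNativeInterface.ARAnchorRemovedEvent -= OnPlaneRemoved;
    }

    void OnPlaneAdded(ARPlaneAnchor anchor)
    {
        Debug.Log("Plane detected: " + anchor.identifier
                  + ", extent: " + anchor.extent);
    }

    void OnPlaneRemoved(ARPlaneAnchor anchor)
    {
        Debug.Log("Plane removed: " + anchor.identifier);
    }
}
```

Attach a script like this to any object in the scene to observe plane detection in the console while you move the device around.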

The UnityARKitScene hierarchy is shown in the following screenshot:

The other objects are examples specific to this demo application scene. Although subject to change as the plugin matures, the current scene includes the following:

  • RandomCube: This is a cube with a checkerboard pattern, rendered one meter in front of you when the app starts.
  • HitCube (parented by HitCubeParent) with UnityARHitTestExample: This is an example of using screen touch input to place cubes in the AR scene.
  • PointCloudParticleExample: This object, with the PointCloudParticleExample component, renders ARKit's current scanning results as fuzzy particles. As ARKit scans your environment, it generates a point cloud of key points in 3D space on depth surfaces it has detected.
  • ARKitControl: This object, with the ARKitControl component, provides a simple GUI for controlling the AR session, with buttons for start, stop, and other options.
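The hit-test technique that HitCube demonstrates can be sketched as follows. This is a hypothetical, simplified variant of the plugin's UnityARHitTestExample, assuming the plugin's UnityEngine.XR.iOS namespace and its HitTest API; it moves the object it is attached to onto the first detected plane under a screen tap:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.iOS; // namespace used by the Unity-ARKit-Plugin

// Hypothetical example: on a screen tap, move this object to the
// nearest plane ARKit reports under the touch point.
public class TapToPlace : MonoBehaviour
{
    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // ARKit hit tests take normalized screen coordinates (0..1)
        var point = new ARPoint
        {
            x = touch.position.x / Screen.width,
            y = touch.position.y / Screen.height
        };

        List<ARHitTestResult> results =
            UnityARSessionNativeInterface.GetARSessionNativeInterface()
                .HitTest(point,
                    ARHitTestResultType.ARHitTestResultTypeExistingPlaneUsingExtent);

        if (results.Count > 0)
        {
            // worldTransform is a 4x4 matrix; the plugin's helper
            // extracts position and rotation from it
            transform.position = UnityARMatrixOps.GetPosition(results[0].worldTransform);
            transform.rotation = UnityARMatrixOps.GetRotation(results[0].worldTransform);
        }
    }
}
```

Compare this sketch with the UnityARHitTestExample script in the installed package, which handles additional hit-test result types and a configurable target transform.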

In preparation for building the scene, perform the following steps:

  1. From the main menu, navigate to File | Build Settings.
  2. Click on Add Open Scenes if needed, and uncheck all the scenes in Scenes In Build except UnityARKitScene.
  3. In the Platform pane, ensure iOS is selected and click on Switch Platform, if needed.

The Build Settings box is shown in the following screenshot:

There are build and player settings you should consider. If you load ARKit into a new Unity project, these settings (or similar) may already be set up. Let's take a look at the following steps:

  1. Click on Player Settings....
  2. In the Inspector, uncheck the Auto Graphics API checkbox and ensure Metal is listed under Graphics APIs.
  3. Enter a valid Bundle Identifier (in the form of com.company.product), as described previously.
  4. Fill in Camera Usage Description, as described earlier (such as augmented reality).
  5. Optionally, for a better user experience, check the Use Animated Autorotation and Render Extra Frame on Pause checkboxes.

We can also optimize the quality settings. The default High settings are closest to those used in the ARKit example projects; so, if you're starting a new project, configure them as follows:

  1. Go to Edit | Project Settings | Quality.
  2. In the Levels table, select the High row.
  3. For Shadows, click on Hard Shadows Only.
  4. For Shadows Projection, click on Close Fit.
  5. Set Shadow Distance to 20.
  6. For Shadowmask Mode, select Shadowmask.
  7. Set Shadow Near Plane Offset to 2.

  8. Now, in the Levels column for iOS, at the bottom, select the down triangle and choose High as the default quality setting, as shown in the following screenshot:

When you're ready, go ahead and click on Build And Run. As described earlier, like any iOS app, Unity will generate Xcode project files and launch into Xcode, which in turn will do the heavy lifting of compiling and building the actual iOS code.

At the present time, there is no emulator for running and testing ARKit apps from within Unity. However, there is a remote option that lets you use play mode in Unity to test your app without the need to build and run it each time. With the remote app running on your attached device, the Unity project can access the video and sensor data on the device and behave as if it's running on the device, but it plays in the Unity Editor's Game window. This remote app comes as a scene in the ARKit package we just installed and can be found at Assets/UnityARKitPlugin/ARKitRemote/UnityARKitRemote.unity. To use it, add another game object to your scene with the ARKitRemoteConnection component added. Instructions are included in the readme file in that folder. For more information, refer to https://blogs.unity3d.com/2017/08/03/introducing-the-unity-arkit-remote/.
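If you prefer to do the setup step in code rather than by hand in the Hierarchy, the following sketch shows the idea; RemoteSetup is a hypothetical name, and it assumes the ARKitRemoteConnection component lives in the plugin's UnityEngine.XR.iOS namespace:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS; // namespace used by the Unity-ARKit-Plugin

// Hypothetical example: equivalent to creating an empty game object
// in the Hierarchy and adding the ARKitRemoteConnection component.
public class RemoteSetup : MonoBehaviour
{
    void Awake()
    {
        var go = new GameObject("ARKitRemoteConnection");
        go.AddComponent<ARKitRemoteConnection>();
    }
}
```

Either way, when you enter play mode with the remote app running on the connected device, the Game window prompts you to connect to the device's session.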