Design a Responsive 3D Cursor for ARKit Apps in Unity

Hit Testing is one of my favorite features in ARKit on iOS. It lets you point at an arbitrary surface or object in the real world and get an accurate 3D coordinate for the point you're aiming at. As simple as it is, this means you can place and manipulate content at very precise locations in your environment, which opens up a huge spectrum of design possibilities for your AR app. For example, the GIF below shows how Apple uses the Hit Test to let users measure distances on flat surfaces.

At Placenote, we use Hit Testing in many of our sample apps to enable interfaces that either place a 3D model on the floor, or add a "virtual note" at a point in your environment.

Let's take a look at the Placenote Unity SDK Hello World sample project, where we use Hit Testing to let the user pick a location to place a 3D model. The GIF below shows what this looks like. You'll notice that the reticle glides along the floor and jumps to the couch when you point at it (i.e. when a new flat surface is detected).

How does Hit Testing work?

Hit Testing works using a simple concept known as Ray Casting: we take a point on the screen as a starting point and extend a "ray" (a line perpendicular to the screen) outwards until it collides with the first item in its path. In the case of planar surface Hit Tests, as in the GIFs above, the Hit Test returns the point of intersection between this ray and a detected plane. The figure below illustrates this concept.

A Ray can be extended from any point on the screen to find its Hit Point on a Plane
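To make this concrete, here's a minimal sketch of the same idea in Unity. Camera.ScreenPointToRay is Unity's standard way to turn a screen point into a ray; ARKit performs the equivalent step internally when you run a Hit Test from a screen position.

// A minimal sketch of the Ray Casting idea using Unity's built-in API.
// ScreenPointToRay converts a 2D screen point into a 3D ray that starts
// at the camera and extends outwards, perpendicular to the screen.
Vector2 screenCenter = new Vector2(Screen.width / 2f, Screen.height / 2f);
Ray ray = Camera.main.ScreenPointToRay(screenCenter);
// A Hit Test walks along this ray and returns where it first
// intersects a detected surface.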

What about surfaces that aren't flat?

Now, in some cases you may want to Hit Test on a surface that isn't planar, such as a pipe, a kitchen appliance, or a part under the hood of your car. Especially in apps like "Placenote: Annotate the world" (video below), we need to Hit Test and add virtual sticky notes on objects that will most likely have no ARKit-detected planes on them. In the video, you'll see that we're still able to move a cursor along non-planar surfaces. So how does this work?

Luckily for us, planar surfaces aren't the only kind of Hit Test Apple allows. Here's a list of all the Hit Test result types available. Notice that you don't actually need a plane to intersect with: you can get a Hit Result on any surface where ARKit has detected feature points by using the FeaturePoint result type. In the rest of this article, I'm going to dive into how to work with this option. I'll be focusing on Unity and C# as the dev platform, but if you're interested in the Swift version, please feel free to reach out to us on our Slack channel.

You can get a Hit Test Result on any non-planar surface on which ARKit has detected feature points using the Hit Test Result Type - FeaturePoint.

How does Feature Point Hit Testing work?

Here's a description of Feature Point based Hit Testing from Apple's documentation.

During a world-tracking AR session, ARKit builds a coarse point cloud representing its rough understanding of the 3D world around the user (see rawFeaturePoints). Individual feature points represent parts of the camera image likely to be part of a real-world surface, but not necessarily a planar surface.

When you search using this hit-test option, ARKit finds the feature point nearest to the hit-test ray (the extension of the 2D hit-test point into 3D world space), then returns the point on the ray nearest to that feature point.
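In geometric terms, that last step is a point-to-ray projection. Here's a minimal sketch of the math, written as a standalone helper and assuming the ray direction is normalized; this illustrates what Apple describes, not ARKit's actual implementation.

// Sketch of "the point on the ray nearest to that feature point".
// Assumes rayDirection is a unit vector.
Vector3 ClosestPointOnRay(Vector3 rayOrigin, Vector3 rayDirection, Vector3 featurePoint)
{
    // Project the feature point onto the ray; clamp t so the result
    // can't land behind the ray's origin (i.e. behind the camera).
    float t = Mathf.Max(0f, Vector3.Dot(featurePoint - rayOrigin, rayDirection));
    return rayOrigin + t * rayDirection;
}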

The problem with Feature Point Hit Testing

So... this means we now have a completely flexible Hit Testing system that lets us add AR content anywhere around us... right? Not quite! The very nature of this Hit Test mechanism makes it vulnerable to noisy results that make your cursor or reticle jitter and appear unstable. And there's really nothing more frustrating to a user than poor visual feedback.

It's time to dive into a code example! Let's investigate the noisy results in Feature Point Hit Tests with some code.

We're going to use the Placenote SDK Unity sample project for this test. To get the project, download the Unity SDK and build the Hello World sample scene to your iOS device, following the instructions in the link below. Feel free to browse this article without downloading any code, but following along with the code examples might help you understand the concepts better.

Download the Unity SDK and try out the sample apps

Building a Feature Point based ARKit Reticle

Once you open the project, you'll find a script called ReticleController.cs in the HelloWorld example folder under Assets/Placenote/Examples/1_HelloWorld.

In this script, the coroutine IEnumerator ContinuousHittest() continually runs a Hit Test from the center of the screen and moves the reticle to the Hit Result point, creating the effect of the reticle "dragging" along flat surfaces in the environment. Notice below that we're using TrackableType.Planes.

private IEnumerator ContinuousHittest() {
    while (true) {
        // getting screen point
        var screenPosition = new Vector2(Screen.width / 2, Screen.height / 2);
        // ARKit Hit Test - Type Planes
        if (m_RaycastManager.Raycast(screenPosition, s_Hits, TrackableType.Planes)) {
            // get the closest hit.
            var targetPose = s_Hits[0].pose;

            // move reticle to the hit test point
            mReticle.transform.position = targetPose.position;

            // Show reticle if there's an active hit result
            mReticle.SetActive(true);
        }
        else {
            // hide reticle if there's no hit test result
            mReticle.SetActive(false);
        }
        
        // go to next frame
        yield return null;
    }
}

Now let's change this to a feature point based Hit Test and see what happens. We'll change TrackableType.Planes to TrackableType.FeaturePoint and make the reticle "LookAt" the camera as it moves along (instead of lying flush with the planar surface). To do this, we'll add mReticle.transform.LookAt(Camera.main.transform); right under the mReticle.SetActive(true); line. Here's a video of our changes in action.
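For reference, here's what the modified section of ContinuousHittest() looks like after those two edits:

// ARKit Hit Test - Type FeaturePoint instead of Planes
if (m_RaycastManager.Raycast(screenPosition, s_Hits, TrackableType.FeaturePoint)) {
    // get the closest hit.
    var targetPose = s_Hits[0].pose;

    // move reticle to the hit test point
    mReticle.transform.position = targetPose.position;

    // Show reticle if there's an active hit result
    mReticle.SetActive(true);

    // Make the reticle face the camera instead of lying flush with a plane
    mReticle.transform.LookAt(Camera.main.transform);
}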

Notice that the Feature Point Hit Test is a lot more sensitive to the structure of the environment than the Plane Hit Test. The downside of this sensitivity is that, because it uses the ARKit point cloud to find the closest intersection point with the ray, it is vulnerable to variations in the quality of the feature points detected in the environment.

It's important to understand why. Feature points are visual features in the environment that ARKit identifies as the camera moves around. If the camera is stationary, goes out of focus, or is simply looking at a plain, textureless white area, the number of detected points drops drastically. This degrades the Hit Test and causes the noisy jitter in the reticle motion that you see in the video.

Reducing Jitter in Feature Point Hit Tests

Through some trial and error, the design team at Placenote was able to reduce the amount of jitter in the feature point hit test and create a much more stable and smooth reticle motion effect. Before I dive into how we did this, you can try the result in our notes sample app to see how it works. In the next section, we'll recreate this effect in the Hello World example code.

To download the notes app, open the camera on your iPhone or iPad and point it at this QR code. Alternatively, use the download link.

Building the Reticle Smoothing Controller

The quality of a Feature Point Hit Test depends squarely on the quality of the point cloud ARKit detects in each camera frame. That means we can measure a few metrics that let us peek into the status of the point cloud and filter the Hit Tests using this information. Here's a list of the data we can check (a sketch of these checks follows the list):

  1. Number of tracked points in the current frame: If there are fewer than 30 features, ignore the result. Well-tracked areas usually have between 100 and 400 features in each frame.
  2. Last point cloud update time: ARKit sends a callback whenever the point cloud is updated, so we can use a delegate function to record when the point cloud was last updated.
  3. Distance to the Hit Point: Continuously measure the distance to the hit surface and ignore results farther than ~1.5 meters away. Also ignore results where the cursor has moved less than 3-4 cm; we don't want to react to small vibrations or accidental shaking.
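Here's a hedged sketch of what these three checks might look like rolled into a single validation helper. The variable names match the ones we'll set up in the next section, and the thresholds follow the numbers above, but treat the exact constants as starting points to tune rather than canonical values from our shipping code.

// Sketch of the three point-cloud checks above, using the variables
// introduced in the next section. The 0.5 s staleness window is an
// assumed value; tune all thresholds for your own app.
private bool IsHitResultValid(Vector3 hitPoint)
{
    // 1. Too few tracked features means an unreliable point cloud
    if (mCurrentTrackedFeatureCount < 30)
        return false;

    // 2. A point cloud that hasn't updated recently is stale
    if (Time.time - timeOfLastPtCloudUpdate > 0.5f)
        return false;

    // 3a. Ignore hit surfaces farther than ~1.5 m from the camera
    if (Vector3.Distance(hitPoint, Camera.main.transform.position) > 1.5f)
        return false;

    // 3b. Ignore movements smaller than 3-4 cm (vibration, shaking)
    if (Vector3.Distance(hitPoint, lastCursorPosition) < 0.04f)
        return false;

    return true;
}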

Let's now implement these checks and create a new ReticleController script that monitors point cloud status and smooths out the hit test results. To start, we need to configure our project for point cloud monitoring.

Configure the project for Point Cloud monitoring

Add AR Point Cloud Manager to the PNSessionOrigin object in your scene. If you're building a basic ARKit app without Placenote, you need to add the point cloud manager to the ARSessionOrigin object.

Add AR Point Cloud Manager to the PNSessionOrigin object in your HelloWorld scene

Next, initialize an ARPointCloudManager variable in the ReticleController class and drag the AR Point Cloud Manager component into the serialized field in the Inspector.

[SerializeField] ARPointCloudManager mPointCloudManager;

The updated Reticle Controller script

Refer to the updated ReticleController script in this GitHub gist. You can copy it and completely replace the contents of your ReticleController.cs script. In the code snippets below, I'll walk through the main sections of the script to explain how it works. Open it now and use it as a reference as you continue reading.

First, we initialize a few necessary variables in the ReticleController class.

private Vector3 lastCursorPosition;
private int mCurrentTrackedFeatureCount;
private float timeOfLastPtCloudUpdate;
Add to the ReticleController class

Then, we add a delegate function that monitors the status of the ARKit point cloud and save the current tracked feature count and time with each update.

void OnPointCloudChanged(ARPointCloudChangedEventArgs eventargs) {
    // ARKit maintains a single point cloud, so we expect one updated entry
    if (eventargs.updated.Count == 1) {
        // Save the number of features tracked in the current frame
        foreach (var ptcloud in eventargs.updated) {
            mCurrentTrackedFeatureCount = ptcloud.positions.Length;
        }

        // Record when the point cloud was last updated
        timeOfLastPtCloudUpdate = Time.time;
    }
}

In your Start() function, subscribe to the delegate function above and set an initial lastCursorPosition.

lastCursorPosition = new Vector3(-200f, -200f, -200f);
mPointCloudManager.pointCloudsChanged += OnPointCloudChanged;
Add to Start() function

Now, in the IEnumerator ContinuousHittest() function, we perform a series of checks on the tracked feature count, the time of the last point cloud update, the change in cursor position, and the distance between the reticle and the camera; you can read through these in the ReticleController script. We also add some hysteresis to filter high-frequency noise in the hit test results.
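The gist is the authoritative version of these checks, but to illustrate the hysteresis idea: one simple form is to require several consecutive valid hits before committing to a move, so a single noisy frame can't yank the reticle around. The names and the threshold below are illustrative assumptions, not the exact gist implementation.

// Illustrative hysteresis: accept a new target only after the hit has
// passed the validity checks for a few consecutive frames.
// mConsecutiveValidHits and kRequiredValidHits are our own names here.
private int mConsecutiveValidHits = 0;
private const int kRequiredValidHits = 3;

private bool PassesHysteresis(bool hitIsValid)
{
    mConsecutiveValidHits = hitIsValid ? mConsecutiveValidHits + 1 : 0;
    return mConsecutiveValidHits >= kRequiredValidHits;
}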

Finally, we create a function called GoToTarget() that animates the cursor moving towards the hit result point to create the fluid motion effect you see in apps like Apple's AR Measure app.

IEnumerator GoToTarget(Vector3 destination)
{
    float distance = (destination - mReticle.transform.position).magnitude;
    while (distance > 0.001f)
    {
        // Step is proportional to the remaining distance: fast when far
        // away, easing out as the reticle approaches the target.
        float step = distance * Time.deltaTime / 0.2f;

        // Move our position a step closer to the target.
        mReticle.transform.position = Vector3.MoveTowards(mReticle.transform.position, destination, step);

        // Update the remaining distance for the next iteration
        distance = (destination - mReticle.transform.position).magnitude;

        yield return null;
    }

    // Snap the last sub-millimeter of travel so the loop terminates cleanly
    mReticle.transform.position = destination;
}
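One detail worth noting: since GoToTarget() is a coroutine, ContinuousHittest() has to launch it with StartCoroutine() rather than setting the reticle's position directly, and should cancel any animation already in flight. Here's a sketch of that call site, where mGoToTargetCoroutine is an assumed field for the stored handle:

// Sketch of the call site inside ContinuousHittest().
// mGoToTargetCoroutine is our own name for the stored coroutine handle.
if (mGoToTargetCoroutine != null)
    StopCoroutine(mGoToTargetCoroutine);
mGoToTargetCoroutine = StartCoroutine(GoToTarget(targetPose.position));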

Putting this all together, you can copy and paste the new ReticleController script into your ReticleController.cs file, replacing everything in the file.

The final result looks like the GIF below. It's not only smooth, fluid, and robust; it also drastically reduces the likelihood of a user accidentally placing content at an unintended location!

Our smooth reticle controller

I hope this is useful to you as you build your own AR apps. Given that mobile AR is still a very new technology, AR developers need to design their apps carefully to ensure that their interfaces are robust and easy to use, even for the uninitiated.

If there's one takeaway from this article, it should be this:

Don't assume that ARKit or ARCore features will work out of the box. Stress test every platform you use and accept that there is no perfect solution. Limit functionality in ways that make your app as foolproof as you possibly can. In other words, make your AR experience nearly impossible to break!

At Placenote, we're building the easiest-to-use AR SDK for apps that persistently link AR content with the real world. To learn more about Placenote, try the SDK for free and get in touch with us if you'd like to chat!
