ARKit Developer Tutorial: How To Build a Measuring App with Apple’s Augmented Reality SDK

In this post, we are going to discuss the basic settings, commands and tools for Apple’s new augmented reality SDK, “ARKit,” available for iOS 11. In order to explain all the steps to build a basic app and use its functionality in ARKit, we will be creating a “shoe measuring app” that will measure the length of a shoe and get its size. But first, let’s talk a little bit about ARKit.

ARKit is a new framework that allows you to easily implement augmented reality experiences on the iPhone and iPad. If you are not already familiar with augmented reality, it is essentially a blending of virtual elements with the real world that allows one to “place” 2D or 3D objects in the real world using the device’s camera.

ARKit includes features that make it easy to implement, such as World Tracking (which tracks the objects in the scene), Light Estimation (used to place realistic shadows on virtual objects), and Surface Detection (for realistic object placement). There are a few different rendering options for your virtual objects, including SceneKit (ARSCNView), SpriteKit (ARSKView), and Metal. Metal is a bit more involved and more suitable if you want to build your own rendering engine (or integrate it with a third-party engine). For the simple needs of our project, though, we decided to use SceneKit.

Much inspiration for this post can be found here. You can also visit Apple’s ARKit developer site for more information.

Let’s dive into the project.

ARKit Shoe Measuring App

Creating an ARKit Project

Open Xcode and create a new project using the “Augmented Reality App” template.


Project name: ARkitApp.

Language: Swift.

Content Technology: SceneKit (for its 3D and 2D features)

 

Note that in the Navigator Area there is a new file group called “art.scnassets.”

Here you’ll find a ship.scn file and a texture.png file; this group is where you add all the 3D assets and textures for your SceneKit objects.

*For our example, please delete those files along with the boilerplate code in the ViewController.swift file.

NSCameraUsageDescription in Info.plist

Selecting the Augmented Reality template in the new project wizard adds NSCameraUsageDescription to the target’s Info.plist. Since augmented reality requires the camera, you need to provide a usage description that will be displayed to the user before the app is granted access to it.


MeasureShoeViewController.swift

Create a new Swift file called “MeasureShoeViewController” and clean out all the objects in the main.storyboard.

In the MeasureShoeViewController file, add the following imports along with the ARSCNViewDelegate.  

*ARSCNViewDelegate: Implement this protocol to provide SceneKit content corresponding to ARAnchor objects tracked by the view’s AR session, or to manage the view’s automatic updating of such content. The protocol extends the ARSessionObserver protocol, so your delegate can also implement those methods to respond to changes in session status.
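The original screenshots are not reproduced here, but under the naming used throughout this post (a class called MeasureShoeViewController with a sceneView outlet), the skeleton might look like this:

```swift
import UIKit
import SceneKit
import ARKit

class MeasureShoeViewController: UIViewController, ARSCNViewDelegate {
    // Connected to the ARSCNView we add to the storyboard in the next step.
    @IBOutlet weak var sceneView: ARSCNView!
}
```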

Adding the scene view to the storyboard.

1. In the main.storyboard, add a new View Controller.

2. Set the custom class of the new View Controller to MeasureShoeViewController, then add an ARKit SceneKit View (ARSCNView) object to it.

Setting up the SceneView

 

1. Add the IBOutlet to the SceneView object in the storyboard file.

2. Create a setup scene function and set up the ARSCNViewDelegate.

3. Call the setupScene function from viewDidLoad()

  • Please notice the debug options. This will help us understand how the camera detects surfaces and the objects in world tracking.
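A minimal sketch of these three steps (the function name setupScene comes from the text; the outlet name sceneView is an assumption):

```swift
// Configure the scene view: set the delegate and enable debug options
// so we can see detected feature points and the world origin.
func setupScene() {
    sceneView.delegate = self
    sceneView.debugOptions = [ARSCNDebugOptions.showFeaturePoints,
                              ARSCNDebugOptions.showWorldOrigin]
}

override func viewDidLoad() {
    super.viewDidLoad()
    setupScene()
}
```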

 

Setting up the ARSession

1. Add a setup function in which you initialize an ARSession. Next, create a world-tracking session with horizontal plane detection, set the configuration to the scnView variable, and then call that function from viewWillAppear.

  2. Run the project.

Note the yellow points in the camera feed; these feature points represent the surfaces and objects ARKit has detected (in this case, via world tracking).

  • The colored axis rendered in the scene marks the session’s world origin.


  • It is also important to note that the camera and the session take some time to get ready. The delegate function func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) could help us inform the user when the app is ready.

3. Implement the ARSCNViewDelegate function: session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera). To check the states, we can print the camera state to the console.
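A sketch of steps 1 and 3, again assuming the outlet is named sceneView (the function name setupARSession is illustrative):

```swift
// Step 1: world-tracking configuration with horizontal plane detection,
// started when the view appears.
func setupARSession() {
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = .horizontal
    sceneView.session.run(configuration)
}

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    setupARSession()
}

// Step 3: ARSessionObserver callback (inherited via ARSCNViewDelegate).
// Printing the state tells us when tracking is ready for measuring.
func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
    switch camera.trackingState {
    case .notAvailable:        print("Tracking not available")
    case .limited(let reason): print("Tracking limited: \(reason)")
    case .normal:              print("Tracking ready")
    }
}
```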

 

Creating a Sphere

1. Create a new file called: SCNSphere+Init.swift

1.1. Create a new extension of SCNSphere.

1.2. In the extension, add a convenience initializer for setting the color and radius.

We will use this new initializer to create a sphere node of a specific color and radius further down in this post.
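The exact signature is an assumption, but the initializer could look like this:

```swift
import SceneKit
import UIKit

extension SCNSphere {
    // Convenience initializer: a sphere of a given color and radius.
    convenience init(color: UIColor, radius: CGFloat) {
        self.init(radius: radius)
        let material = SCNMaterial()
        material.diffuse.contents = color
        materials = [material]
    }
}
```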

Measuring the Distance between Two Points

1. Add a tap gesture recognizer to the SceneView so that taps can start and finish the measurement.

2. Add an extension to SCNVector3 for calculating the distance between two vectors.
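One way to write this extension (the method name is an assumption):

```swift
import SceneKit

extension SCNVector3 {
    // Euclidean distance between two 3D points, in meters
    // (ARKit's world-tracking unit).
    func distance(to vector: SCNVector3) -> Float {
        let dx = x - vector.x
        let dy = y - vector.y
        let dz = z - vector.z
        return sqrtf(dx * dx + dy * dy + dz * dz)
    }
}
```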

3. Create a new array of nodes in the MeasureShoeViewController class.

3.1. Add the code to place a sphere in the Scene View when the user taps on the Scene view

3.2. The hitTestResults variable holds the position in 3D space derived from world tracking; using it, we create an object and place it at the correct position in the scene view (by adding a child node).
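A hypothetical tap handler for steps 3–3.2: hit-test against feature points and drop a colored sphere (using the SCNSphere convenience initializer from the previous section) at the resulting world position:

```swift
// Spheres placed so far; the post keeps them in a nodes array.
var nodes: [SCNNode] = []

@objc func handleTap(sender: UITapGestureRecognizer) {
    let location = sender.location(in: sceneView)
    let hitTestResults = sceneView.hitTest(location, types: .featurePoint)
    guard let result = hitTestResults.first else { return }

    // Column 3 of the world transform holds the translation.
    let t = result.worldTransform
    let position = SCNVector3(t.columns.3.x, t.columns.3.y, t.columns.3.z)

    let sphere = SCNSphere(color: .red, radius: 0.005)
    let node = SCNNode(geometry: sphere)
    node.position = position
    sceneView.scene.rootNode.addChildNode(node)
    nodes.append(node)
}
```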

4. Run the app and check to see how the spheres were added to the sceneView.

5. Clean the nodes in the SceneView by creating a function to remove all the nodes in the sceneView.
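A cleanup helper might be as simple as (the name cleanAllNodes appears later in the post; `nodes` is the array from step 3):

```swift
// Remove every placed node from the scene and reset the array.
func cleanAllNodes() {
    for node in nodes {
        node.removeFromParentNode()
    }
    nodes.removeAll()
}
```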

6. Calculate the distance between two points and draw a line.

6.1. Add the extension to the SCNNode with the functions:

  • static func createLineNode(fromNode: SCNNode, toNode: SCNNode) -> SCNNode
  • static func lineFrom(vector vector1: SCNVector3, toVector vector2: SCNVector3) -> SCNGeometry
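One way to implement these two functions is a two-vertex geometry with a .line primitive (the material color is an assumption):

```swift
import SceneKit
import UIKit

extension SCNNode {
    // Build a line geometry between two points.
    static func lineFrom(vector vector1: SCNVector3, toVector vector2: SCNVector3) -> SCNGeometry {
        let source = SCNGeometrySource(vertices: [vector1, vector2])
        let indices: [Int32] = [0, 1]
        let element = SCNGeometryElement(indices: indices, primitiveType: .line)
        return SCNGeometry(sources: [source], elements: [element])
    }

    // Wrap the line geometry in a node spanning the two given nodes.
    static func createLineNode(fromNode: SCNNode, toNode: SCNNode) -> SCNNode {
        let line = lineFrom(vector: fromNode.position, toVector: toNode.position)
        line.firstMaterial?.diffuse.contents = UIColor.white
        return SCNNode(geometry: line)
    }
}
```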

6.2.  Add the code to calculate the distance in the tapHandler function.  

6.3. Add the code to delete the measureLines in the scene.

6.4. Run the project and check the console.


6.5. Add the UI to show the distance and an image to recognize the center of the screen.

6.5.1. Add an image with a target or bullseye and set it in the middle of the view.


Presenting a TextNode

The distance will be presented as a node in the SceneView; for that, we are going to create a new SCNNode class.

1. Create a new Swift file called TextNode.swift.

2. Create the TextNode class that inherits from the SCNNode class.
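A minimal TextNode sketch; the font, scale, and color defaults below are assumptions:

```swift
import SceneKit
import UIKit

class TextNode: SCNNode {
    init(text: String, color: UIColor = .white) {
        super.init()
        let textGeometry = SCNText(string: text, extrusionDepth: 0.1)
        textGeometry.font = UIFont.systemFont(ofSize: 10)
        textGeometry.firstMaterial?.diffuse.contents = color
        geometry = textGeometry
        // SCNText is modeled in points; shrink it down to scene scale (meters).
        scale = SCNVector3(0.002, 0.002, 0.002)
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}
```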

3. Add the TextNode to the SceneView.

3.1. We are going to create a new function in the MeasureShoeViewController class called: func presentShoeSizes(distance: Double). The main thing here is to calculate the position in which to put the text node; in this case, between the two sphereNodes.
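A hypothetical implementation: place the text at the midpoint between the two sphere nodes, slightly above the measured line (the small vertical offset is an assumption):

```swift
func presentShoeSizes(distance: Double) {
    guard nodes.count >= 2 else { return }
    let start = nodes[0].position
    let end = nodes[1].position

    // For now, show the raw distance; the shoe-size lookup comes later.
    let textNode = TextNode(text: String(format: "%.1f cm", distance * 100))
    textNode.position = SCNVector3((start.x + end.x) / 2,
                                   max(start.y, end.y) + 0.01,
                                   (start.z + end.z) / 2)
    sceneView.scene.rootNode.addChildNode(textNode)
}
```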

 

3.2. Call the presentShoeSizes function after getting two nodes in func handleTap(sender: UITapGestureRecognizer).

3.2.1. At this moment, we are not showing the shoe size based on the distance, but we will do that soon.

 

Getting the Shoe Size

There are several ways to calculate the size of a shoe, such as a formula or a comparative table. The problem here is that formulas don’t always return an accurate measurement. To find a relatively accurate formula, this Wikipedia article could help you: https://en.wikipedia.org/wiki/Shoe_size. In this example, I created a plist file with all the sizes for men, women, and kids. This is an example of the file:  


Based on this file, we can load the sizes and write simple conditions to determine the shoe size from the measured length.

1. Create a new file called: ShoeSizeCalculatorHelper.swift and set two enums to get the region and type of shoe.

2. Now create the ShoeSizeCalculatorHelper class. In the Init, load the array of sizes and filter the sizes for men, women, and kids. The arrays also need to be sorted.

 

3. Now create a function to get the size depending on the region and the shoe type.
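The real post loads the size table from the plist shown above, so the hardcoded entries and type names below are illustrative assumptions, not the actual data or API:

```swift
import Foundation

enum ShoeSizeRegion { case us, eu, uk }
enum ShoeType { case men, women, kids }

struct ShoeSize {
    let lengthInCm: Double   // foot-length threshold for this size
    let usSize: Double
    let euSize: Double
}

class ShoeSizeCalculatorHelper {
    private let menSizes: [ShoeSize]

    init() {
        // In the post this data comes from the plist; here we hardcode a few
        // sample men's entries and sort them by length.
        menSizes = [
            ShoeSize(lengthInCm: 24.8, usSize: 7,  euSize: 40),
            ShoeSize(lengthInCm: 25.7, usSize: 8,  euSize: 41),
            ShoeSize(lengthInCm: 26.5, usSize: 9,  euSize: 42.5),
            ShoeSize(lengthInCm: 27.3, usSize: 10, euSize: 44),
        ].sorted { $0.lengthInCm < $1.lengthInCm }
    }

    // Return the first size whose length threshold covers the measurement.
    func size(forLengthInCm length: Double, region: ShoeSizeRegion, type: ShoeType) -> Double? {
        guard type == .men else { return nil } // women/kids omitted for brevity
        guard let match = menSizes.first(where: { $0.lengthInCm >= length }) else { return nil }
        switch region {
        case .us: return match.usSize
        case .eu: return match.euSize
        case .uk: return nil // omitted for brevity
        }
    }
}
```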

4. Add two variables to the MeasureShoeViewController class to hold the region and the shoe type, and initialize them in viewDidLoad to .us and .men.

5. Call the function to determine the shoe size given the distance, region, and type; store the result in a constant and set it as the text of our TextNode, positioned between the two nodes as calculated earlier.


How do you create a LineNode and draw it in real time?

As with the TextNode and the SphereNode, a line requires the same process: create a geometry and a material and put them in an SCNNode. The only trick here is to calculate the positions: the line starts at one SphereNode and ends at another.

1. Create a new Swift file called: LineNode and set the class as LineNode: SCNNode.

2. Create a new Init function with two SCNVector3 parameters (start position and final position), along with the color of the line.
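A LineNode sketch reusing the two-vertex .line geometry approach; the parameter labels are assumptions:

```swift
import SceneKit
import UIKit

class LineNode: SCNNode {
    init(from startPosition: SCNVector3, to endPosition: SCNVector3, color: UIColor) {
        super.init()
        // Two vertices joined by a single .line primitive.
        let source = SCNGeometrySource(vertices: [startPosition, endPosition])
        let indices: [Int32] = [0, 1]
        let element = SCNGeometryElement(indices: indices, primitiveType: .line)
        let line = SCNGeometry(sources: [source], elements: [element])
        line.firstMaterial?.diffuse.contents = color
        geometry = line
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}
```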

3. In the handleTap(sender: UITapGestureRecognizer) function, replace the line-creation code with the new LineNode.

4. We only need two SphereNodes at the same time to measure the distance between two points. For this reason, add this condition after getting the hitTestResult.first.

5. To draw a line in real time, we will need the Delegate function: func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval).

6. We also need a function to get the world position at the center of the screen and return it.

7. In the Delegate function, add a LineNode from the start node to the current position. That will draw the line. The second time the function is called, we need to delete the line and redraw it.   
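Steps 5–7 could be sketched as follows; startNode (the first sphere placed by the tap handler) and currentLine are assumed properties on the view controller:

```swift
var currentLine: LineNode?

// Step 6: hit-test at the screen center and return the world position.
func centerWorldPosition() -> SCNVector3? {
    let center = CGPoint(x: sceneView.bounds.midX, y: sceneView.bounds.midY)
    guard let result = sceneView.hitTest(center, types: .featurePoint).first else { return nil }
    let t = result.worldTransform
    return SCNVector3(t.columns.3.x, t.columns.3.y, t.columns.3.z)
}

// Step 7: every frame, delete the previous line and redraw it from the
// start sphere to the current center position.
func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    DispatchQueue.main.async {
        guard let startNode = self.startNode,
              let endPosition = self.centerWorldPosition() else { return }
        self.currentLine?.removeFromParentNode()
        let line = LineNode(from: startNode.position, to: endPosition, color: .green)
        self.sceneView.scene.rootNode.addChildNode(line)
        self.currentLine = line
    }
}
```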


Final Details: Changing the Size Region and the Shoe Type (Men, Women, Kids)

For the final touches in our app, we need an interface for selecting and changing the shoe-measuring settings. For this, we are going to present a simple view controller and return the selection using a delegate function.

1. In the storyboard file, add a view with a label to the MeasureShoeViewController scene to display the current selection.

2. Add a new ViewController to the storyboard with navigation, and add two UIPickerViews for selecting the options.


3. Create a new Swift file for the new view controller, ShoeCategoriesViewController, and adopt the UIPickerView delegate and data source protocols. To return the selection to MeasureShoeViewController, create your own delegate protocol.

4. To populate the UIPickers, you could use the enums ShoeSizeRegion and ShoeType. The code below explains how to do that.
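The original code screenshot is not reproduced here, but one way to back the pickers with the enums looks like this. It assumes outlets named regionPicker and typePicker, and that the enums conform to CaseIterable (Swift 4.2+; on older Swift, define a static allCases array manually):

```swift
extension ShoeCategoriesViewController: UIPickerViewDataSource, UIPickerViewDelegate {
    func numberOfComponents(in pickerView: UIPickerView) -> Int {
        return 1
    }

    func pickerView(_ pickerView: UIPickerView, numberOfRowsInComponent component: Int) -> Int {
        return pickerView == regionPicker ? ShoeSizeRegion.allCases.count
                                          : ShoeType.allCases.count
    }

    func pickerView(_ pickerView: UIPickerView, titleForRow row: Int, forComponent component: Int) -> String? {
        return pickerView == regionPicker ? "\(ShoeSizeRegion.allCases[row])"
                                          : "\(ShoeType.allCases[row])"
    }
}
```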

5. Set the selectedShoeType and selectedRegion in func prepare(for segue: UIStoryboardSegue, sender: Any?) in MeasureShoeViewController.

6. Finally, implement the Delegate function: dismissViewController.

7. Don’t forget to include all the nodes in the cleanAllNodes function: the LineNode, the TextNode, Nodes Array (SphereNodes) and the startNode.

 

Conclusions

ARKit is a great option for creating an augmented reality application. The integration with SceneKit, SpriteKit, and Metal makes it simple to get started. You could also save a lot of money by implementing your own solution rather than buying a third-party one.

While there is a lot to like about ARKit, there are some limitations. World tracking is not always reliable, and sometimes ARKit loses the surfaces or objects it is tracking, forcing the user to move the camera around until they are detected again. This is a problem when you are trying to measure something precisely. The other issue is device compatibility, since ARKit only runs on devices with an A9 processor or newer. Even with its shortcomings, though, ARKit is a great 1.0 that will enable a whole new world of applications.

Interested in more “How To” posts? Check out our IoT Success Series!

 


Jorge Mendoza
Jorge, a former Gorilla, is a Senior iOS Developer. He has over 10 years of experience in software development and has focused on iOS development for the past 6. Jorge received his Bachelor's in Computer Science from the University of Costa Rica (UCR).
