iOS 11 ARKit

Tutorial: Build an Augmented Reality (AR) App for iOS using ARKit – LESSON 2

ARKit is coming to consumer iOS devices this Fall. This second short tutorial demonstrates how to build your first AR app using the new ARKit with 3D assets.

Not sure what AR is? Augmented Reality (AR) is a view of the real physical world augmented with computer-generated input such as sound, video, and graphics.

You can find the previous article using 2D assets here. The full source code for the tutorial can be found here.

Device Configurations

To complete this tutorial prior to the release of iOS 11, you will need to set up your developer machine and iOS device with beta versions of Xcode 9 and iOS 11. You can download the files here. Note: You will need an active Apple Developer account to do so.

Xcode 9 Beta

You can download and install the beta version on your machine alongside Xcode 8 without any additional effort. Just run the downloaded installer.

iOS Device

You will need an iOS device running iOS 11 because you will need access to the camera. To install the beta build on your device, visit the above link from the device you wish to upgrade and follow the installation instructions presented once the download has completed.

Create an ARKit Project

1. Open Xcode 9 and choose to start a new project. Choose “Augmented Reality App”.

Start ARKit Project

2. Your project settings should look similar to the screenshot below, with your own information filled in.

3. The default app should be ready to go. Let’s try it out. Choose your device and run the app. You should see a screen similar to the one below. As you move around, you should be able to walk around the object!

What is this App doing?

Now that the base project is running, let’s look into some key parts and see what they are doing.


There is an ARSCNView outlet called “sceneView”; this is where all of your settings and nodes will be set.

The ViewController conforms to ARSCNViewDelegate so that we can add more functionality to the view.

sceneView.delegate = self
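For example, ARSCNViewDelegate (together with the session callbacks it inherits) lets you react to new anchors or session errors. Here is a minimal sketch of two optional methods you might implement; the print statements are just illustrative:

// Called when ARKit creates a SceneKit node for a newly added anchor.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    print("Added a node for anchor \(anchor.identifier)")
}

// Called if the AR session fails, e.g. when camera access is denied.
func session(_ session: ARSession, didFailWithError error: Error) {
    print("AR session failed: \(error.localizedDescription)")
}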

The frames per second (FPS) display and other statistics that you saw at the bottom of the view are shown with the showsStatistics property.

sceneView.showsStatistics = true
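If you want even more debugging feedback, ARSCNView also has a debugOptions property. This is optional and not part of the template, but it can help you see what ARKit is tracking:

// Optional: visualize the detected feature points and the world origin.
sceneView.debugOptions = [ARSCNDebugOptions.showFeaturePoints,
                          ARSCNDebugOptions.showWorldOrigin]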

Creating an ARWorldTrackingSessionConfiguration and running the ARSCNView's session with it is what starts the view.

let configuration = ARWorldTrackingSessionConfiguration()
sceneView.session.run(configuration)
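In the default template these lines live in viewWillAppear, and the session is paused again when the view goes away, along these lines:

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)

    // Pause the session so camera capture and tracking stop while the view is hidden.
    sceneView.session.pause()
}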

These two lines load the default 3D asset scene (the ship) and set it as our scene view's scene.

let scene = SCNScene(named: "art.scnassets/ship.scn")!
sceneView.scene = scene

Using a Custom Asset

First, let’s get rid of the default scene. Click the “art.scnassets” folder and select Move to Trash:

Next, we are going to import a Collada file (*.dae). If your asset is not already in that format, you should be able to export it from most 3D creation software. If you do not have software already, I recommend the free Blender (download here).

Add the Collada file into the project.

One thing to note: most 3D creation apps like Blender treat the Z-axis as up, while SceneKit on iOS is Y-up. If your assets are Z-up, you will need to rotate them before exporting/importing, or else you will see something similar to the screenshot below.
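If you would rather not re-export the asset, a quick alternative is to rotate the loaded node in code. This is just a sketch, assuming node is the SCNNode you built from the .dae file:

// Rotate -90 degrees around X so the asset's Z-up becomes SceneKit's Y-up.
node.eulerAngles.x = -Float.pi / 2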

Now we are going to replace the loaded scene with an empty one and make a call to add our asset (a milk can in this case):

    let scene = SCNScene()
    sceneView.scene = scene
    addMilkCan()

Now create a function to create the node from the file, set its position, and add the node to our scene:

  private func addMilkCan() {
    let milkcan = ViewController.colladaToSCNNode(filepath: "milkcan.dae")
    // Place the can 0.2 meters in front of the camera's starting position.
    milkcan.position = SCNVector3(0, 0, -0.2)
    sceneView.scene.rootNode.addChildNode(milkcan)
  }

To grab the node correctly from the Collada file, I have included this helper method:

class func colladaToSCNNode(filepath: String) -> SCNNode {
    let node = SCNNode()
    let scene = SCNScene(named: filepath)
    let nodeArray = scene!.rootNode.childNodes
    for childNode in nodeArray {
        node.addChildNode(childNode)
    }
    return node
}
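Keep in mind that SCNScene(named:) returns nil if the file cannot be found, so the force-unwrap above will crash on a missing or misnamed file. A slightly safer variant (my own sketch, not from the original project) could look like this:

class func colladaToSCNNode(filepath: String) -> SCNNode {
    let node = SCNNode()
    // Bail out gracefully instead of force-unwrapping a missing .dae file.
    guard let scene = SCNScene(named: filepath) else {
        print("Could not load Collada file: \(filepath)")
        return node
    }
    for childNode in scene.rootNode.childNodes {
        node.addChildNode(childNode)
    }
    return node
}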

That’s it! Hit run, and see how everything looks. (Sorry, I messed up the lighting in my final version.)


I hope that you have enjoyed creating your first 3D Augmented Reality application with ARKit. If you would like to give me feedback on what I had to say, or have questions, please reach out over Twitter (@matt_ridley) or through our contact form. Thanks!


Matthew Ridley is a mobile developer and principal of Milk Can. Milk Can designs and builds mobile solutions for local, national, and international clients. He has been developing professionally for nearly 15 years and has worked across a variety of platforms and languages. He started his career building applications and websites and has now shifted his focus to mobile app solutions.

