This demo application shows how to place an object in augmented reality and how to interact with your virtual objects using gestures and hit testing.
Xcode installed on your Mac
iOS 11.3 or higher installed on your iPhone
iPhone 6s or newer
Open Xcode and create a new project. Choose “Augmented Reality App” and fill in the required details.
Apple provides several options for Content Technology: SceneKit, SpriteKit, and Metal. Here we will choose SceneKit. If you want to place a 3D model, Xcode needs to read that model file in a SceneKit-supported format (.scn or .dae).
Ensure that "Include Unit Tests" and "Include UI Tests" are unchecked as we will not be needing them. We will be coding in Swift and using Apple's SceneKit for creating content.
The first thing we are going to do is go into the storyboard by clicking "Main.storyboard" in the dropdown menu on the left side. We will now be taken to a page presenting the "ARSCNView". This is the main view onto which all virtual objects are rendered. This is what makes Augmented Reality possible.
Now in here, we need to add a UILabel to the scene. This infoLabel will be used to inform the user about the AR session state and any node updates. Follow this example to add the label to the scene visually and logically. Don't forget to add the constraints so that it stays pinned to the top of the device!
NOTE: The "Green Circle" indicates that I am holding "Control" as I click and drag from the label.
Now open ViewController.swift. For debugging purposes we can set sceneView.debugOptions = .showFeaturePoints to see how ARKit detects surfaces. When you run the app, you should see a lot of yellow dots in the scene. These are feature points, which ARKit uses to estimate properties like the position and orientation of physical objects in the environment. The more feature points in the area, the better the chance ARKit can determine and track the environment.
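One way to enable this is in viewDidLoad, a sketch assuming the template's sceneView outlet (the showWorldOrigin option is an extra I've added for convenience):

```swift
override func viewDidLoad() {
    super.viewDidLoad()

    sceneView.delegate = self

    // Render yellow feature points (and the world-origin axes) for debugging.
    sceneView.debugOptions = [ARSCNDebugOptions.showFeaturePoints,
                              ARSCNDebugOptions.showWorldOrigin]
}
```

Remember to remove these debug options before shipping the app.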
Now it’s time to set up a world-tracking session with horizontal plane detection. As you can see, in your viewWillAppear method a session has already been created and set to run.
So now your method will look like this:
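A minimal version, assuming the standard ARKit template's sceneView outlet:

```swift
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    // Create a world-tracking configuration that looks for horizontal planes.
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = .horizontal

    // Run the view's AR session with that configuration.
    sceneView.session.run(configuration)
}
```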
Detect plane and place object
When ARKit detects a surface, it provides an ARPlaneAnchor object. An ARPlaneAnchor basically contains information about the position and orientation of a detected real-world surface.
To know when surfaces are detected, updated, or removed, use the ARSCNViewDelegate methods (which work like magic in ARKit). Implement the following ARSCNViewDelegate methods so you will be notified whenever the scene view updates.
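A sketch of the three callbacks; here they only log, and the print messages are placeholders you can replace with your own handling:

```swift
// MARK: - ARSCNViewDelegate

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    // Called when ARKit adds a node for a newly detected anchor.
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
    print("Plane detected at \(planeAnchor.center)")
}

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    // Called when ARKit refines an existing plane's extent or position.
    guard anchor is ARPlaneAnchor else { return }
    print("Plane updated")
}

func renderer(_ renderer: SCNSceneRenderer, didRemove node: SCNNode, for anchor: ARAnchor) {
    // Called when ARKit discards an anchor (e.g. planes merged).
    guard anchor is ARPlaneAnchor else { return }
    print("Plane removed")
}
```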
The ARSessionDelegate protocol provides the current tracking state of the camera, so you can tell whether your app is ready to detect planes. Once you get a normal state, you are ready to detect a plane. To do that, implement these delegate methods (you may have to replace some of the functions if they are already preloaded into the file).
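A sketch of the tracking-state callback, updating the infoLabel added earlier (the label text is just a suggestion):

```swift
func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
    // Hop to the main queue before touching UIKit.
    DispatchQueue.main.async {
        switch camera.trackingState {
        case .normal:
            self.infoLabel.text = "Tracking is normal. Move around to detect a plane."
        case .notAvailable:
            self.infoLabel.text = "Tracking is not available."
        case .limited(let reason):
            self.infoLabel.text = "Tracking limited: \(reason)"
        }
    }
}
```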
When a plane has been detected, add the object onto it. Here we are going to add the 3D model named “ship.scn”.
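One way to do this in the renderer(_:didAdd:for:) callback — the "art.scnassets/ship.scn" path and the "ship" node name come from the Xcode AR template; adjust them if your asset differs:

```swift
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    // Only respond to newly detected planes.
    guard anchor is ARPlaneAnchor else { return }

    // Load the template's ship model and pull out its root node.
    guard let shipScene = SCNScene(named: "art.scnassets/ship.scn"),
          let shipNode = shipScene.rootNode.childNode(withName: "ship", recursively: true)
    else { return }

    // Attach the ship to the plane's node so it sits on the detected surface.
    node.addChildNode(shipNode)
}
```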
Now build and run your app. You will notice that some surfaces show more feature points, and in some areas you get much worse results than in others. Surfaces that are shiny or a single flat color make it difficult for ARKit to obtain strong reference points for plane detection and to determine unique points in the environment. If you are unable to see many feature points, move your device around the area and try different objects or surfaces. Once ARKit has detected a plane, your object will be added onto it.
When you find a plane, you should see your object in AR. Walk around and notice how it stays pinned in its position.
Change position of object to tap location with UITapGestureRecognizer
To place an object on tap, first add a UITapGestureRecognizer to the scene view.
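For example, in viewDidLoad (the handleTap(_:) selector name is my own; it's the handler we write next):

```swift
// Register a single-tap recognizer on the AR scene view.
let tapGesture = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
sceneView.addGestureRecognizer(tapGesture)
```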
Then, in the tap gesture handler, move the node to the tap position. A node represents the position and coordinates of an object in 3D space. Here we set the node's position to the tapped location.
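A sketch of the handler, assuming the ship node added earlier is named "ship"; it relies on the worldTransform.translation extension this article adds in a moment:

```swift
@objc func handleTap(_ gesture: UITapGestureRecognizer) {
    let location = gesture.location(in: sceneView)

    // Hit-test the tap against already-detected planes.
    let results = sceneView.hitTest(location, types: .existingPlaneUsingExtent)
    guard let result = results.first,
          let ship = sceneView.scene.rootNode.childNode(withName: "ship", recursively: true)
    else { return }

    // Move the ship to the real-world point that was tapped.
    let t = result.worldTransform.translation
    ship.position = SCNVector3(t.x, t.y, t.z)
}
```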
Xcode is probably giving you an error like: Value of type 'simd_float4x4' has no member 'translation'.
To get the translation component of worldTransform, add this extension to the bottom of the ViewController.swift file (outside all the brackets).
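A minimal version of that extension — it simply reads the fourth column of the transform matrix, which holds the position:

```swift
extension simd_float4x4 {
    /// The translation (position) component of a 4x4 transform matrix.
    var translation: SIMD3<Float> {
        let t = columns.3
        return SIMD3<Float>(t.x, t.y, t.z)
    }
}
```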
Build and run the app again! Notice you can tap to reposition the ship! NOTE: The movement may look a bit funky because of how the default ship.scn is modeled. Don't worry if the ship doesn't land exactly under your finger; the code is executing properly, but the ship.scn model reacts differently... anyway...
Scaling object with UIPinchGestureRecognizer
To zoom the 3D object in and out, we change the object's scale while the user pinches. To recognize a pinch on the scene view, add a UIPinchGestureRecognizer.
Here we set a maximum scale of 2 (200% of the original size) and a minimum scale of 0.5 (50% of the original). You can play around with these values!
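A sketch of the whole gesture, again assuming the ship node is named "ship" and using a handlePinch(_:) selector name of my choosing. Register the recognizer in viewDidLoad:

```swift
let pinchGesture = UIPinchGestureRecognizer(target: self, action: #selector(handlePinch(_:)))
sceneView.addGestureRecognizer(pinchGesture)
```

Then handle it by scaling the node, clamped to the 0.5–2 range:

```swift
@objc func handlePinch(_ gesture: UIPinchGestureRecognizer) {
    guard gesture.state == .changed,
          let ship = sceneView.scene.rootNode.childNode(withName: "ship", recursively: true)
    else { return }

    // Apply the incremental pinch factor, clamped between 0.5x and 2x.
    let newScale = min(max(ship.scale.x * Float(gesture.scale), 0.5), 2.0)
    ship.scale = SCNVector3(newScale, newScale, newScale)

    // Reset so the next callback reports only the change since this one.
    gesture.scale = 1
}
```

Resetting gesture.scale each time keeps the scaling incremental instead of compounding.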
Rotate object using UIRotationGestureRecognizer
To rotate an object, a UIRotationGestureRecognizer is useful. It recognizes a rotation performed with two fingers.
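A sketch following the same pattern as the other gestures (the handleRotation(_:) name and "ship" node name are assumptions). Register it in viewDidLoad:

```swift
let rotationGesture = UIRotationGestureRecognizer(target: self, action: #selector(handleRotation(_:)))
sceneView.addGestureRecognizer(rotationGesture)
```

Then spin the node around its vertical axis:

```swift
@objc func handleRotation(_ gesture: UIRotationGestureRecognizer) {
    guard gesture.state == .changed,
          let ship = sceneView.scene.rootNode.childNode(withName: "ship", recursively: true)
    else { return }

    // Rotate around the y axis by the incremental two-finger rotation.
    ship.eulerAngles.y -= Float(gesture.rotation)

    // Reset so the rotation is applied incrementally.
    gesture.rotation = 0
}
```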