Local Planar Marker v.2.6

This tutorial will guide you through the creation of a simple AR application that detects and tracks an image marker stored in the app assets and draws a simple 3D model on it. Our SDK is designed for use with Xcode (version 7.3 or later) and supports iOS SDK 9.0 or later.

First, create a new application from the Single View Application template in Xcode and click Next.

Write your Product Name (e.g. MyARApplication) and Organization Name, and choose your Organization Identifier. Leave the Language setting as "Swift" and the Devices setting as "Universal". Click Next.

Click Create to save the Xcode project.

After a short while Xcode will show you a new window with your app skeleton and the base classes created by the Single View Application template.

Now it's time to set up your app for development with Pikkart's AR SDK, in a very similar way to that described in the Getting Started section.

Disable the Bitcode flag in the Build Settings tab:

Add pikkartAR.framework as a linked framework inside the Xcode project.

If you have already purchased a license, please set the bundle identifier to the package name you provided when buying your Pikkart's AR license.

Copy the license file (license.spz) we provided to you into your app's main bundle. Also copy the markers.bundle we provide in our sample package (<pikkart_sample_dir>/sample/).

Now it's time to enable Pikkart AR SDK functionality! In order to use our framework (written in Objective-C) in a Swift project, we have to add a bridging header file. First of all, choose File -> New -> File and add a Header File to the Xcode project. Click Next.

Name the file <MyARApplication>-bridging-header.h and create it in the Xcode project root folder. If Xcode does not offer to configure it automatically, point the "Objective-C Bridging Header" build setting at this file.

Open the new bridging header file in the Xcode editor and add the following line:

 #import <pikkartAR/PKTRecognition.h>

From the Xcode project Build Settings, add the following settings to "Other Linker Flags":

Add the libz.tbd and libsqlite3.tbd shared libraries under Linked Frameworks and Libraries.

Open ViewController.swift and make the class a subclass of our PKTCameraController:

class ViewController: PKTCameraController { ...

If you used the Xcode template project, remove from the storyboard's ViewController Scene the View associated with ViewController.

All is set; now you just have to start the recognition process. You can do it (as an example) in the viewDidLoad method by calling the StartRecognition method:

 override func viewDidLoad() {
     super.viewDidLoad()
     // Do any additional setup after loading the view, typically from a nib.
     let authInfo: PKTCloudRecognitionInfo = PKTCloudRecognitionInfo(databaseName: "")
     let options: PKTRecognitionOptions = PKTRecognitionOptions(recognitionStorage: .PKTLOCAL, andMode: .PKTRECOGNITION_TAP_TO_SCAN, andCloudAuthInfo: authInfo)
     // Start local recognition, with self receiving the PKTIRecognitionListener callbacks
     self.StartRecognition(options, andRecognitionCallback: self)
 }

PKTCameraController exposes, via the PKTIRecognitionListener protocol, a set of methods that can be overridden. We can use them to trace the recognition process step by step.

In the self.StartRecognition() call, we set self as the recognition callback object. We can then implement the PKTIRecognitionListener protocol methods in ViewController.swift:

  • This function is called every time a cloud image search is initiated. In this tutorial it will never be called, as we are working in LOCAL mode only.
    override func executingCloudSearch() {
        print("executingCloudSearch called\n")
    }
  • This function is called when the cloud image search fails to find a target image in the camera frame. In this tutorial it will never be called, as we are working in PKTRecognitionStorage.LOCAL mode only.
    override func cloudMarkerNotFound() {
        print("cloudMarkerNotFound called\n")
    }
  • This function is called when the app is set to perform cloud operations or image searches (PKTRecognitionStorage.GLOBAL) but an internet connection is absent.
    override func internetConnectionNeeded() {
        print("internetConnectionNeeded called\n")
    }
  • This function is called every time a marker is found and tracking starts. In this tutorial we show an alert view on the UI.
    override func markerFound(markerId: String) {
        NSOperationQueue.mainQueue().addOperationWithBlock({
            let alertController = UIAlertController(title: "Marker Found", message: "marker found with id " + markerId, preferredStyle: UIAlertControllerStyle.Alert)
            let okAction = UIAlertAction(title: "OK", style: UIAlertActionStyle.Default) { (result: UIAlertAction) -> Void in
                print("OK button touched")
            }
            alertController.addAction(okAction)
            self.presentViewController(alertController, animated: true, completion: nil)
        })
    }
  • This function is called every time a marker is not found (after searching through all local markers, or after a failed tap-to-scan).
    override func markerNotFound() {
        print("markerNotFound called\n")
    }
  • This function is called as soon as tracking of the current marker is lost. In this tutorial we show an alert view on the UI.
    override func markerTrackingLost(markerId: String) {
        NSOperationQueue.mainQueue().addOperationWithBlock({
            let alertController = UIAlertController(title: "Lost Marker", message: "marker lost with id " + markerId, preferredStyle: UIAlertControllerStyle.Alert)
            let okAction = UIAlertAction(title: "OK", style: UIAlertActionStyle.Default) { (result: UIAlertAction) -> Void in
                print("OK button touched")
            }
            alertController.addAction(okAction)
            self.presentViewController(alertController, animated: true, completion: nil)
        })
    }

You can run the application now. Print one of the test marker images (<pikkart_sample_dir>/markers_printable/), then compile and run the app on a device and try it.

The app currently doesn't show anything; it just samples images from the camera and does its magic in the background. It's time to add some visuals. In order to do that we need to create an OpenGL view, set it up, and render the camera view plus additional augmented content. We have created some helper classes to help you set up the rendering process. To use them, first copy the content of the folder <pikkart_sample_dir>/sample/classes/rendering/ of our sample package into the group folder <your_Xcode_project_root>/MyARApplication/ExtraRendering/. Also copy the texture.png and monkey.json files from <pikkart_sample_dir>/sample/media.bundle into the same group folder. You will find the new Swift classes and .m files in your app project as in the following image:

Also copy the media.bundle from <pikkart_sample_dir>/sample/.

In order to render into an OpenGL view, our PKTCameraController is a subclass of GLKViewController. In this way, our view controller already has an OpenGL animation loop where we can draw our 3D objects. In ViewController.swift, we override the glkView(view: GLKView, drawInRect rect: CGRect) method, which is called from the GLKit animation loop:

//MARK: GLView rendering callback
override func glkView(view: GLKView, drawInRect rect: CGRect) {
    if (!self.isActive()) { return }
    glClear(GLbitfield(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT))
    self.RenderCameraWithViewPortSize(CGSize(width: _ViewportWidth, height: _ViewportHeight), andAngle: Int32(_Angle))
    // Call our native function to render content
    var mvpMatrix = [Float](count: 16, repeatedValue: 0)
    if (self.computeModelViewProjectionMatrix(mvpMatrix)) {
        _monkeyMesh!.DrawMesh(&mvpMatrix)
        RenderUtils.checkGLError()
    }
    glFinish()
}
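The callback above references a few members (_monkeyMesh, _ViewportWidth, _ViewportHeight, _Angle) that the sample's ViewController declares elsewhere. As a rough sketch, their declarations could look like the following; the Mesh class name and how the mesh gets loaded from monkey.json and texture.png are assumptions, so check the rendering classes you copied for the actual API:

 // A sketch only: names and types are inferred from the calls in the render
 // callback above; the Mesh helper class comes from the copied rendering folder.
 var _monkeyMesh: Mesh? = nil      // the 3D model drawn on top of the marker
 var _ViewportWidth: CGFloat = 0   // set from the view size before rendering
 var _ViewportHeight: CGFloat = 0
 var _Angle: Int = 0               // rotation passed to RenderCameraWithViewPortSize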

In our implementation we have added a couple of support functions, such as internal func computeModelViewProjectionMatrix(mvpMatrix: [Float]) -> Bool, which computes the model-view-projection matrix to be used by the OpenGL renderer, starting from the projection and marker attitude/position matrices obtained from Pikkart's AR SDK.

We make use of three important functions of our PKTCameraController class (a sketch of the matrix helper built on top of them follows the list):

  • public func getCurrentProjectionMatrix(matrixPointer: UnsafeMutablePointer< UnsafeMutablePointer<Float>>)
  • public func getCurrentModelViewMatrix(matrixPointer: UnsafeMutablePointer<UnsafeMutablePointer<Float>>)
  • public func isTracking() -> Bool
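
As an illustration, here is a minimal sketch of how such a helper can combine these functions. It assumes the two matrix accessors hand back pointers to 16-element, column-major 4x4 float matrices, and it is written with an inout parameter so the array can be filled in place, which makes the call site self.computeModelViewProjectionMatrix(&mvpMatrix); the actual implementation shipped with the sample classes may differ:

 // A sketch only, not the sample's actual implementation.
 internal func computeModelViewProjectionMatrix(inout mvpMatrix: [Float]) -> Bool {
     // No marker is currently tracked: nothing to draw.
     if (!self.isTracking()) { return false }

     // Fetch the projection and model-view matrices from the SDK.
     var projection: UnsafeMutablePointer<Float> = nil
     var modelView: UnsafeMutablePointer<Float> = nil
     self.getCurrentProjectionMatrix(&projection)
     self.getCurrentModelViewMatrix(&modelView)
     if (projection == nil || modelView == nil) { return false }

     // mvp = projection * modelView; column-major: element (r,c) at index c*4+r.
     for c in 0..<4 {
         for r in 0..<4 {
             var sum: Float = 0
             for k in 0..<4 {
                 sum += projection[k * 4 + r] * modelView[c * 4 + k]
             }
             mvpMatrix[c * 4 + r] = sum
         }
     }
     return true
 }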

Other important functions are:

  • public func getCurrentMarker() -> PKTMarker!
  • public func RenderCameraWithViewPortSize(viewPortSize: CGSize, andAngle angle: Int32)

The first return the currently tracked Marker. The Marker class contains information about the tracked image, more importantly its associated custom data (this will be explained in the next tutorial). The RenderCameraWithViewPortSize function render the last processed camera image, it's the function that enables the app to render the actual camera view, this function must be called inside your OpenGL context rendering cycle.
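
As a hedged example, the currently tracked marker could be queried like this from inside your rendering or callback code (PKTMarker's accessors are not listed here; its custom data is the subject of the next tutorial):

 // A sketch only: query the marker while tracking is active.
 if self.isTracking() {
     if let marker = self.getCurrentMarker() {
         // Use the marker, e.g. to decide which 3D content to draw on it.
         print("currently tracking marker: \(marker)")
     }
 }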