Getting Started: iOS SDK

Welcome to Pikkart's Augmented Reality iOS SDK. This document will help you get started with augmented reality on iOS using our SDK.
You will need iOS SDK 10.2 or later and Xcode 8.2 or later. This SDK supports the arm64 architecture, but it does not support the x86_64 platform (iOS Simulators).
You will also need to download our SDK from Pikkart's web page.

Pikkart's AR iOS SDK is written in Objective-C/C++, but you can use it in a Swift project as an imported library. If you are not familiar with mixed Swift/Objective-C environments, refer to Apple's guide: Using Swift with Cocoa and Objective-C. In the example below (and in the rest of the documentation) we will be using Swift, as it is now the standard language for iOS development.

If you are an iOS developer and have already set up your iOS development environment, skip ahead to the Installing Pikkart AR iOS SDK section; otherwise, keep reading.

Enjoy!

1. Setting up iOS Developing Environment

Important: Pikkart's AR iOS applications MUST be deployed on an actual iOS device. This means that, unfortunately, you cannot run the examples on the iOS Simulator.
To deploy applications on an iOS device you must enroll in the iOS Developer Program. Once you have done that, you will have access to the iOS Dev Center.
Go there (or use the Mac App Store) to download Xcode, the integrated development environment used to create iOS and macOS apps. The download includes the latest version of the iOS SDK.

It is beyond the scope of this document to teach you how to use Apple's development site features, but we will list what you can (and will need to) do there as part of the app development process:

  • Create an App ID
  • Register an iOS device for development
  • Create a development certificate
  • Create a development provisioning profile to sign iOS applications

Most of the above tasks can be performed directly from inside Xcode, as long as you have logged into your iOS developer account from there.

2. Installing Pikkart AR iOS SDK

You can either run and examine our iOSSDKSampleProject on GitHub, or follow the steps below to create your first Pikkart AR Xcode project in minutes!
Note: the GitHub example provides both Objective-C and Swift app targets, while here we will use the Swift language exclusively.
Important for demo license users: since you are limited to a single bundle identifier for your apps (com.pikkart.trial, as explained below), it is advisable to delete any app previously created with the Pikkart SDK before installing a new one, and to do the same when switching markers.

  1. Create a new Swift iOS project. The preselected, single-view project is fine.
  2. Disable the bitcode flag in the Build Settings tab of the main target in the project settings.
    Note: you may need to click the "All" button to see the Enable Bitcode entry, as it is not a basic setting.
  3. Add the following option to the "Other Linker Flags" setting in the same window:
  4. Decompress the SDK zip file you downloaded earlier and copy the pikkartAR.framework file from it into your Xcode project directory. Add it to the project as an existing file.
  5. Go to the "General" tab and add a few items to the "Linked Frameworks and Libraries" section:
    Note: only the pikkartAR.framework file is provided in the zip file; all the other frameworks are already included with Xcode.
  6. In the main target of your project, set the product bundle identifier to the package name you were given when buying/downloading your Pikkart AR license. At the time of this writing, the trial license bundle identifier is "com.pikkart.trial".
  7. Copy the license file (license.spz) you downloaded earlier (or were sent) into your app's main bundle.
  8. Using the macOS Finder, open the pikkartAR.framework package and copy the pikkartARLocalization.bundle from there into your app's main directory. Add this file to your project so that Pikkart's AR SDK can access the i18n resources it needs.
  9. Create a bridging header file that will let you use our C++/Objective-C bindings in your Swift code:
    • In Xcode, select File > New > File… from the menu, then select Header File.
    • Press Next and enter the name of your bridging header, for example: MyProjectName-Bridging-Header.h (replace MyProjectName with the name of your project, of course).
    • Add all relevant targets (when in doubt, just add them all) and press the Create button to continue.
    • Replace the contents of the newly created file with the following:
      #ifndef MyProjectName_Bridging_Header_h
      #define MyProjectName_Bridging_Header_h
      #import <pikkartAR/PKTRecognitionController.h>
      #import <pikkartAR/PKTMarker.h>
      #endif
    • Add the following lines to the import section of the ViewController.swift file (GLKit is needed for the GLKView code we will add shortly):
      import Foundation
      import GLKit
    • Go to the project's main target build settings and, in the Swift Compiler - General section, set the Objective-C Bridging Header entry to the path of the file you just created (for example, MyProjectName/MyProjectName-Bridging-Header.h).
  10. Open up the ViewController.swift file and change the class declaration as follows:
    class ViewController: PKTRecognitionController
  11. Add the following property inside the ViewController.swift class definition:
        var context: EAGLContext?
    
  12. Add the camera privacy permission (Privacy - Camera Usage Description, i.e. the NSCameraUsageDescription key) with a short explanatory string in the main project target, "Info" tab, "Custom iOS Target Properties" section:
  13. If your project has a Main storyboard (this is the case if you created a new project using the default Xcode options), remove the View associated with the ViewController.
    NOTE: only the View needs to be removed. Everything else stays the same.
  14. Add the following methods to the ViewController Swift class (modify them if they are already present):

        override func loadView() {
            super.loadView()
            // Create an OpenGL ES 3 context and a GLKView for the SDK to render
            // the camera stream and AR content into.
            self.context = EAGLContext(api: EAGLRenderingAPI.openGLES3)
            let textureView: GLKView = GLKView(frame: UIScreen.main.bounds, context: self.context!)
            textureView.drawableColorFormat = GLKViewDrawableColorFormat.RGB565
            textureView.drawableDepthFormat = GLKViewDrawableDepthFormat.format24
            textureView.drawableStencilFormat = GLKViewDrawableStencilFormat.format8
            textureView.drawableMultisample = GLKViewDrawableMultisample.multisampleNone
            self.view = textureView
        }

        override func viewDidLoad() {
            super.viewDidLoad()
            // Configure local (on-device) marker recognition in continuous-scan
            // mode, then start the recognition engine.
            let authInfo: PKTCloudRecognitionInfo = PKTCloudRecognitionInfo(databaseName: "")
            let options: PKTRecognitionOptions = PKTRecognitionOptions(recognitionStorage: .PKTLOCAL,
                                                                       andMode: .PKTRECOGNITION_CONTINUOS_SCAN,
                                                                       andCloudAuthInfo: authInfo)
            startRecognition(options, andRecognitionCallback: self)
        }

        override func markerFound(_ marker: PKTMarker) {
            // Show a simple alert with the id of the marker that was recognized.
            let alert = UIAlertController(title: "Success!",
                                          message: "Marker found: " + marker.markerId,
                                          preferredStyle: UIAlertControllerStyle.alert)
            alert.addAction(UIAlertAction(title: "Ok", style: .default, handler: nil))
            self.present(alert, animated: true, completion: nil)
        }

Please note: in the previous listing we set up local marker recognition (PKTLOCAL), as opposed to cloud-based recognition, and PKTRECOGNITION_CONTINUOS_SCAN as the recognition mode. We will explain what this last option means in a later tutorial.

Also note that it is not mandatory to start the recognition process in the viewDidLoad method; it was just a sensible choice for this example. The cloud authentication information parameter of the PKTRecognitionOptions is not necessary when working locally (it could be nil), but we chose to introduce it here because we will see it often from now on.
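
For comparison, here is a minimal sketch of how the same call could be configured for cloud-based recognition instead. The .PKTGLOBAL storage case and the database name are assumptions based on the local example above and on the callback documentation below; check the SDK reference for the exact names:

    // Sketch only: cloud-based recognition configuration (names partly assumed).
    let cloudAuthInfo = PKTCloudRecognitionInfo(databaseName: "myCloudDatabase") // placeholder database name
    let cloudOptions = PKTRecognitionOptions(recognitionStorage: .PKTGLOBAL,     // assumed counterpart of .PKTLOCAL
                                             andMode: .PKTRECOGNITION_CONTINUOS_SCAN,
                                             andCloudAuthInfo: cloudAuthInfo)
    startRecognition(cloudOptions, andRecognitionCallback: self)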

Right now, all we need to make everything work is to add a marker file to the project, so that the SDK knows what to look for. We will also need a JPEG image of that marker, to be shown on a computer display so that the app can recognize it when we point the camera at it.

Fetch the markers.bundle file from [here], add it to the project directory and to the project as an existing file.

Download the 001_small.jpg image from [here] and open it on your computer with whatever app you have available.

Now fire up the app. It will only show a black screen (this is OK for now), but it will actually be actively searching for the marker through the phone/tablet camera. Point it in the general direction of the marker on screen, trying not to hold the phone too close to the display, or the camera won't be able to see the marker properly. Shortly, a small dialog window on your phone will let you know that the marker has been found. Success!

Note: we need to pass an instance of a class implementing the PKTIRecognitionListener protocol as the second parameter of the startRecognition() method.
In this example we did so by passing the ViewController itself, which then needs to implement the following callback functions in order to respond to SDK events:

// The executingCloudSearch function is called every time a cloud image search is initiated. In this tutorial it's never called as we are working locally.
override func executingCloudSearch() {} 

// cloudMarkerNotFound is called when the cloud image search fails to find a target image in the camera frame. 
// In this tutorial it will never be called as we are working locally.
override func cloudMarkerNotFound() {}

// internetConnectionNeeded is called when the app is set to perform cloud operations or image searches
// (.PKTGLOBAL storage) but no internet connection is available. It is merely a warning that the engine cannot work as intended.
override func internetConnectionNeeded() {}

// markerFound is called every time a marker is found and its tracking has been started.
override func markerFound(_ marker: PKTMarker) {}

// markerNotFound is called while the engine is searching for a marker but it has not been found yet 
// (after searching through all local markers, or after a tap-to-scan failed). 
// The event is fired about once per second.
override func markerNotFound() {}

// markerTrackingLost is called as soon as the current marker tracking is lost.
override func markerTrackingLost(_ markerId: String!) {}

// ARLogoFound is an advanced recognition feature of our SDK, which will be explained in a later tutorial.
// For now, suffice it to say that it enables you to distinguish different markers that look identical to the user.
override func ARLogoFound(_ markerId: String!, withCode code: NSNumber!) {}

As you may have noticed, we have already implemented the markerFound event listener, which is the one displaying the small dialog window when the marker has been found.
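
As an illustration, here is a minimal sketch of how two of the other callbacks could be filled in. The bodies are our own example code, not part of the SDK:

    override func markerTrackingLost(_ markerId: String!) {
        // Example only: dismiss the "marker found" alert, if one is still
        // visible, as soon as tracking of the current marker is lost.
        if self.presentedViewController != nil {
            self.dismiss(animated: true, completion: nil)
        }
        print("Tracking lost for marker \(markerId ?? "unknown")")
    }

    override func markerNotFound() {
        // Example only: this fires about once per second while no marker is
        // recognized, so keep any work done here lightweight.
        print("Still searching for a marker...")
    }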

Happy coding!