Local Planar Marker for iOS

This tutorial will guide you through the creation of a simple AR application that detects and tracks an image marker stored in the app assets and draws a simple 3D model on it. 

Note: Our SDK is designed for use with iOS SDK 10.2+ and requires an actual device to run. You can't use simulators. This is important!
A complete Visual Studio solution can be downloaded from GitHub. The following explanations are based on the code found there.
An important note for those using the demo license: since you are limited to a single bundle identifier for your apps (com.pikkart.trial), it is advisable to delete any app previously built with the Pikkart SDK from the device before installing a new one.

Before we start with the code analysis, let's see what needs to be done with regard to profiles and licensing when you're using the demo license.
A couple of notes:

  • The demo license is already provided in the project. It's the "license.spz" file in the "Resources" folder. You may want to keep it there for this example, but you will have to replace it with a valid license once you plan to ship your own app.
  • You may need to install Fastlane, if you haven't already done so, in order to configure code signing in Visual Studio. Please refer to this page for more information.


There are two issues at play here:
1) the demo license only allows the creation of apps whose Bundle Identifier is "com.pikkart.trial", and
2) that Bundle Identifier, which must be globally unique, is already registered by Pikkart, so you won't be able to register it for your own apps.

In layman's terms, this means you can't set the Bundle Identifier to "com.pikkart.trial" and at the same time choose a Provisioning Profile that belongs to your own Apple developer profile and is linked to that Bundle Identifier. We're going to solve this by using a Wildcard provisioning profile.

  • Open the project options and, in the "iOS Bundle Signing" section, choose your own signing identity and the Wildcard provisioning profile. Close the options window.
  • Open the Info.plist file. Under the "Identity" section check that the Bundle Identifier is "com.pikkart.trial", then under "Signing" make sure that automatic signing is turned off and your own team signing identity is selected.

An unrelated step that must be performed before you can run the example is adding Pikkart's AR libraries to the solution:

  • Right click on the project (not the solution!) icon and choose "Add" --> "Add NuGet Packages..."
  • Find the "Pikkart.ArSdk" package and install it. You will be asked to accept the SDK license.

We're finally ready to look into the existing sample code!

This is a fairly simple example; everything we need to look at resides in the RecognitionViewController.cs file.
The first thing to note is that the class inherits from PKTRecognitionController. This is necessary because the base class provides the OpenGL hooks needed to render 3D content in the window.
In addition, we implement the IPKTIRecognitionListener interface, which lets us respond to recognition events such as "marker found" and "marker lost".
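To make that structure concrete, the class declaration in RecognitionViewController.cs looks roughly like the following sketch (constructors and exact modifiers are omitted here and may differ in the actual sample):

        // Sketch: the view controller extends the SDK's PKTRecognitionController
        // and also acts as its recognition listener.
        public class RecognitionViewController : PKTRecognitionController, IPKTIRecognitionListener
        {
            // ViewDidLoad (shown below), the IPKTIRecognitionListener callbacks
            // and the OpenGL rendering code all live in this class.
        }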

The most important bit of the example is probably the recognition initialization, in the ViewDidLoad method:


            string[] dbNames={""};
            PKTCloudRecognitionInfo info = new PKTCloudRecognitionInfo(dbNames);
            PKTRecognitionOptions options = new PKTRecognitionOptions(PKTRecognitionStorage.PKTLOCAL, 
                                                                      PKTRecognitionMode.PKTRECOGNITION_CONTINUOS_SCAN,
                                                                      info);

            ApplyCameraGlOrientation(UIApplication.SharedApplication.StatusBarOrientation);
            StartRecognition(options,this);

The method that kickstarts the recognition process is, easily enough, StartRecognition. It takes two parameters: the first is a set of options specifying how the recognition should be performed; the second is the listener that will receive recognition events (this, since our controller implements IPKTIRecognitionListener).
In this example we are requesting that recognition be performed locally (on the device, as opposed to on a server in the cloud) by passing PKTRecognitionStorage.PKTLOCAL as the first argument of the PKTRecognitionOptions constructor.
The second argument (PKTRecognitionMode.PKTRECOGNITION_CONTINUOS_SCAN) tells the SDK that we want a continuous scan, as opposed to a scan triggered by a user tap on the screen. The latter is preferable when recognition is performed remotely, as it generates less network traffic, but in this example we're working locally, so we have requested continuous scanning.
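For comparison, a cloud-based setup (not used in this sample) might look roughly like the sketch below, where one or more cloud database names are passed together with the global storage option mentioned later in this tutorial. The database name is purely hypothetical, and the enum member names should be verified against the installed package:

            // Hypothetical cloud variant (not used in this sample): search a named
            // cloud marker database instead of the local Markers.bundle.
            string[] cloudDbNames = { "my_marker_database" };  // hypothetical database name
            PKTCloudRecognitionInfo cloudInfo = new PKTCloudRecognitionInfo(cloudDbNames);
            PKTRecognitionOptions cloudOptions = new PKTRecognitionOptions(PKTRecognitionStorage.GLOBAL,
                                                                           PKTRecognitionMode.PKTRECOGNITION_CONTINUOS_SCAN,
                                                                           cloudInfo);
            StartRecognition(cloudOptions, this);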

The rest of the code is either the implementation of the IPKTIRecognitionListener interface or code that implements the OpenGL rendering of a little monkey that will pop up on the test marker (provided with the sample).

The IPKTIRecognitionListener interface provides the following listener methods:

        void ExecutingCloudSearch()

Called while waiting for a cloud-based search to be performed.

        void CloudMarkerNotFound() 

Called while the engine is searching for a marker but has not found one yet (after searching through all cloud-based markers, or after a tap-to-scan has failed).

        void InternetConnectionNeeded()

This method is called when the app is set to perform cloud operations or image searches (PKTRecognitionStorage.GLOBAL) but no internet connection is available. It's merely a warning that the engine cannot work as intended.

        void MarkerFound(PKTMarker marker)

MarkerFound is called every time a marker is found and its tracking has started.

        void MarkerNotFound()

Called while the engine is searching for a local marker but has not found one yet (after searching through all local markers, or after a tap-to-scan has failed).

        void MarkerTrackingLost(string markerId)

MarkerTrackingLost is called as soon as tracking of the current marker is lost.

        void ARLogoFound(string markerId, NSNumber code)

This is an advanced recognition feature of our SDK, which will be explained in a later tutorial. For now, suffice it to say that it enables you to recognize different markers that look identical to the user.

        void MarkerEngineToUpdate(string markerId)

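To show how these callbacks fit together, here is a minimal, hypothetical sketch of how a few of them (MarkerFound, MarkerNotFound, MarkerTrackingLost) might be implemented inside RecognitionViewController; the actual sample uses them to drive the OpenGL rendering, while the bodies below only log the events:

        // Hypothetical sketch: log recognition events. In the real sample these
        // callbacks control the rendering of the 3D model on the tracked marker.
        public void MarkerFound(PKTMarker marker)
        {
            Console.WriteLine("Marker found, tracking started");
        }

        public void MarkerNotFound()
        {
            Console.WriteLine("Still searching for a local marker...");
        }

        public void MarkerTrackingLost(string markerId)
        {
            Console.WriteLine("Tracking lost for marker " + markerId);
        }
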
We are not going to explain in detail the rest of the code as it is not relevant to recognition, but rather to drawing 3D objects with OpenGL.
The last important bit of information is how to make everything work. In the Resources folder you will find another folder called Markers.bundle: this is where the SDK looks for information about the images it should recognize. We have already provided a sample marker there.
In the Resources/markerImages folder you will find a JPG image that corresponds to the above-mentioned marker data. Either print it or open it in an image viewer, then launch the app on a device and point the camera at the marker.

At the end of the recognition process, you should see something like this:

Success!