AR Logo for iOS
This tutorial starts from the project built in the planar marker tutorial and shows how to create an app that reacts to ARLogo prints.
ARLogo is a Pikkart proprietary technology that lets you create multiple "versions" of the same image (marker) by embedding information that our SDK can detect. This allows developers to build several augmented reality experiences on top of the same image. ARLogo detection runs on-device and therefore does not need an internet connection. For more details, visit the ARLogo page.
The complete Xcode project can be downloaded from GitHub at this link. We will walk through the code found there rather than starting from scratch, and we kindly ask you to read the planar marker tutorial first, because most of what is required to run the ARLogo example is already covered there.
The main difference between the two tutorials is that here we receive one additional event handler call: recognition of the ARLogo code starts right after a marker is detected, and the arLogoFound() callback of the PKTIRecognitionListener protocol is called when an ARLogo code is recognized in the image. This means there is a short delay (usually less than a second) between detection of the marker and recognition of the ARLogo code.
To give the user an immediate response, we show placeholder content during this interval.
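A rough sketch of this idea (assuming markerFound has the same shape as the other callbacks in this project, and that monkeyID 0 is the gray placeholder used later in this tutorial) could look like this:

func markerFound(_ markerId: String!) {
    // The marker is detected, but its ARLogo code has not been decoded
    // yet: show the neutral gray monkey as placeholder content.
    mainCtrl!.selectMonkey(monkeyID: 0)
}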
In this example we provide predefined markers and the corresponding printed images, so you can try the technology right away.
To use the ARLogo technology in your own app, with a marker you created on the CMS, make sure to check the "enable ARLogo" checkbox when creating the marker, then download the resulting .dat marker file and bundle it in your app.
For this tutorial we have already created the ARLogo marker and three ARLogo prints, which can be found in the project itself.
Looking at the RootViewController class, we can see that the ARLogo recognition code lives in the arLogoFound method:
func arLogoFound(_ markerId: String!, withCode code: NSNumber!) {
    print("arLogo called with code = \(code!)")
    // Map each ARLogo code to one of the three monkey variants.
    switch code.intValue {
    case 598073875:
        mainCtrl!.selectMonkey(monkeyID: 1)
    case 84895:
        mainCtrl!.selectMonkey(monkeyID: 2)
    case 65266:
        mainCtrl!.selectMonkey(monkeyID: 3)
    default:
        // Unrecognized code: fall back to the first monkey.
        mainCtrl!.selectMonkey(monkeyID: 1)
    }
}
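If you add more prints, the switch grows one case at a time; an equivalent variation (a sketch with the same behavior, just a data-driven lookup built on this sample's codes) keeps the mapping in one place:

let monkeyForCode: [Int: Int] = [598073875: 1, 84895: 2, 65266: 3]

func arLogoFound(_ markerId: String!, withCode code: NSNumber!) {
    // Unknown codes fall back to monkey 1, matching the default above.
    mainCtrl!.selectMonkey(monkeyID: monkeyForCode[code.intValue] ?? 1)
}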
The arLogoFound callback was introduced in the very first tutorials, but we never had occasion to show it until now. Like the other callbacks (markerFound, markerNotFound, etc.), it is invoked after you register for recognition events with a call to the startRecognition method.
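For reference, here is a sketch of that wiring (the callback and protocol names come from this project; we assume RootViewController adopts the protocol directly, and the exact signatures of the other callbacks and of startRecognition follow the planar marker tutorial):

// Sketch only: the view controller adopts PKTIRecognitionListener and
// implements the callbacks the SDK fires once startRecognition(...) has
// been called with this listener.
//
//   class RootViewController: UIViewController, PKTIRecognitionListener {
//       func markerFound(...) { ... }     // marker detected
//       func markerNotFound(...) { ... }  // nothing recognized in this frame
//       func arLogoFound(_ markerId: String!, withCode code: NSNumber!) { ... }
//       func markerTrackingLost(_ markerId: String) { ... }
//   }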
Run the app and point the camera at the images in the markerImages folder inside the main project folder (1836_65266.png, 1836_84895.png, 1836_97703.png).
You should see several logs in the Xcode debug console reporting ARLogo detections, and the color of the monkey rendered in the glkView method should change each time you point the camera at a different one of the three images.
The last thing we do is switch the monkey back to the gray texture when tracking of a marker is lost:
func markerTrackingLost(_ markerId: String) {
    print("markerTrackingLost called")
    // Revert to the gray placeholder monkey (ID 0).
    mainCtrl!.selectMonkey(monkeyID: 0)
}
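selectMonkey itself belongs to the sample's rendering code. As a hypothetical sketch of the underlying idea (illustrative names, not the project's actual implementation), the renderer can simply store the selected ID and bind the matching texture on each frame:

import GLKit

// Hypothetical sketch: remember the chosen monkey ID and let the render
// loop bind the matching texture. Index 0 is the gray placeholder,
// 1...3 are the three ARLogo variants.
final class MonkeySelector {
    // Texture handles loaded elsewhere (e.g. with GLKTextureLoader).
    var textureIDs: [GLuint] = [0, 0, 0, 0]
    private(set) var currentMonkeyID = 0

    func selectMonkey(monkeyID: Int) {
        // Unknown IDs fall back to the gray placeholder.
        currentMonkeyID = textureIDs.indices.contains(monkeyID) ? monkeyID : 0
    }

    // Called from glkView(_:drawIn:) before drawing the monkey mesh.
    func bindCurrentTexture() {
        glBindTexture(GLenum(GL_TEXTURE_2D), textureIDs[currentMonkeyID])
    }
}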
Now run the app again and watch the monkey change color as you frame the different ARLogo prints; this is the expected behavior of the app.