
Importing the SDK

Remotely

Starting the installation

  pod init

Our SDK can be imported using CocoaPods.

SDK: ZaigIosFaceRecon
Current version: pod 'ZaigIosFaceRecon', '~> 6.1.1'
iOS Minimum Deployment Target: 15.5

Using simulators on MacBooks with arm64 chip

Currently, our FaceRecon SDK for iOS does not support being compiled for simulators running on Macs with arm64 (Apple silicon) chips (M1/M2/M3/M4), unless Rosetta is used to run the x86_64 build on the arm64 architecture.

To start the installation, run the pod init command shown above in your project's root folder.

Adding the source to podfile

source 'https://github.com/ZaigCoding/iOS.git'
source 'https://cdn.cocoapods.org/'

The next step is to add the QI Tech source to your Podfile.

Adding the pod to podfile

  pod 'ZaigIosFaceRecon', '~> <version>'

Finally, add the pod in the format shown above, replacing <version> with the version you want to install.

Attention:

Architecture Change (v6.0.0+): Starting from version 6.0.0, the SDK is distributed exclusively as a static framework. In your Podfile, you must use the :linkage => :static configuration.

Podfile example (Version 6.0.0 or higher)

source 'https://github.com/ZaigCoding/iOS.git'
source 'https://cdn.cocoapods.org/'

target 'ExampleApp' do
  use_frameworks! :linkage => :static
  pod 'ZaigIosFaceRecon', '~> 6.0.0'
end

post_install do |installer|
  installer.pods_project.targets.each do |target|
    if ['DatadogCore', 'DatadogInternal', 'DatadogCrashReporting', 'DatadogLogs'].include?(target.name)
      target.build_configurations.each do |config|
        config.build_settings['IPHONEOS_DEPLOYMENT_TARGET'] = '15.5'
        config.build_settings['BUILD_LIBRARY_FOR_DISTRIBUTION'] = 'YES'
      end
    end
  end
end

Podfile example (Previous Versions)

source 'https://github.com/ZaigCoding/iOS.git'
source 'https://cdn.cocoapods.org/'

target 'ExampleApp' do
  use_frameworks!
  pod 'ZaigIosFaceRecon', '~> 5.0.0'
end

post_install do |installer|
  installer.pods_project.targets.each do |target|
    if ['DatadogCore', 'DatadogInternal', 'DatadogCrashReporting', 'DatadogLogs'].include?(target.name)
      target.build_configurations.each do |config|
        config.build_settings['IPHONEOS_DEPLOYMENT_TARGET'] = '15.5'
        config.build_settings['BUILD_LIBRARY_FOR_DISTRIBUTION'] = 'YES'
      end
    end
  end
end
Attention

When integrating dependencies on iOS, you may need to link some libraries statically and others dynamically. This configuration matters for ensuring compatibility, avoiding build errors, and optimizing project performance.

Hybrid Dependency Linking (if necessary)

Hybrid linking is needed because libraries have different requirements: some must be linked statically to avoid internal conflicts and duplicated symbols, while others are designed for modularity and sharing between projects and therefore need dynamic linking.

Differences Between Static and Dynamic Linking

  • Static (static_framework): The library code is directly incorporated into the final binary, reducing runtime load time and eliminating external dependencies during execution.
  • Dynamic (dynamic_framework): The library is loaded at runtime as a separate file. This reduces the final binary size and facilitates independent updates/modifications.

Configuring hybrid linking in Podfile

...

use_frameworks! :linkage => :dynamic # CONFIGURING THE DEFAULT LINKING MODE TO DYNAMIC

...

# INCLUDE ALL DEPENDENCIES THAT NEED TO BE LINKED STATICALLY
static_frameworks = ['framework_1', 'framework_2', ...]

pre_install do |installer|
  installer.pod_targets.each do |pod|
    if static_frameworks.include?(pod.name)
      def pod.static_framework?
        true
      end
      def pod.build_type
        Pod::BuildType.static_framework
      end
    end
  end
end

Installing dependencies

  pod install

Finally, run the pod install command to download and install the dependencies.

Necessary Permissions

For the SDK to access device resources to collect the user's selfie, it is necessary to request permissions from the user.

In the Info.plist file, add the permission below:

Permission: Privacy - Camera Usage Description
Reason: Access to the camera to capture the user's selfie.
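In Info.plist source form, this entry corresponds to the NSCameraUsageDescription key; the description string below is only an example and should explain to your users why the camera is needed:

```xml
<key>NSCameraUsageDescription</key>
<string>Access to the camera to capture the user's selfie.</string>
```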

Starting the SDK

Important Warning!

Starting from version 5.0.0, the authentication system has been updated to use clientSessionKey instead of mobileToken. In addition, new configuration options have been added for feedback screens.

Obtaining the Client Session Key

Before configuring the SDK, you must generate a temporary clientSessionKey through a server-to-server request to our face recognition API.

Endpoint

Sandbox: https://api.sandbox.zaig.com.br/face_recognition/client_session
Production: https://api.zaig.com.br/face_recognition/client_session

Request

Method: POST

Headers:

{
  "Authorization": "YOUR_FACE_RECON_API_KEY"
}

Body (Optional, but recommended):

{
  "user_id": "unique_user_identifier"
}

Important: The user_id field is highly recommended for security and anti-fraud measures. Use a unique identifier for your application's user.

Response

The successful response will contain the client_session_key that should be passed to the SDK configuration.

{
  "client_session_key": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9..."
}
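As a sketch, the request above can be issued from your server with curl against the sandbox environment. YOUR_FACE_RECON_API_KEY and the user_id value are placeholders, and the Content-Type header is an assumption for the JSON body:

```shell
# Server-to-server request: never embed YOUR_FACE_RECON_API_KEY in the mobile app
curl -X POST 'https://api.sandbox.zaig.com.br/face_recognition/client_session' \
  -H 'Authorization: YOUR_FACE_RECON_API_KEY' \
  -H 'Content-Type: application/json' \
  -d '{"user_id": "unique_user_identifier"}'
```

Extract the client_session_key field from the JSON response and hand it to your app, which passes it to the SDK configuration.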

SDK initialization example


import ZaigIosFaceRecognition

class ViewController: UIViewController, ZaigIosFaceRecognitionControllerDelegate {

    var faceRecognitionConfig: ZaigIosFaceRecognitionConfiguration?

    override func viewDidLoad() {
        super.viewDidLoad()
        self.setupFaceRecognition()
    }

    func setupFaceRecognition() {
        // The environment can be 'Sandbox' or 'Production'
        let environment = ZaigIosFaceRecognitionEnvironment.Sandbox

        // ClientSessionKey is the key obtained via the Face Recognition request.
        // Each environment requires a different API_KEY.
        // fetchClientSessionKey() is a placeholder for your own retrieval logic.
        let clientSessionKey = fetchClientSessionKey()

        self.faceRecognitionConfig = ZaigIosFaceRecognitionConfiguration(environment: environment,
                                                                         clientSessionKey: clientSessionKey,
                                                                         sessionId: "UNIQUE_SESSION_ID",
                                                                         backgroundColor: "#000000",
                                                                         fontColor: "#FFFFFF",
                                                                         fontFamily: .open_sans,
                                                                         showIntroductionScreens: true,
                                                                         showSuccessScreen: false,
                                                                         showInvalidTokenScreen: true,
                                                                         activeFaceLiveness: true,
                                                                         audioConfiguration: AudioConfiguration.Enable,
                                                                         logLevel: .debug)
    }

    // Event where you intend to present the QI Tech FaceRecognition view controller -
    // in this example, when the user presses the 'next' button
    @IBAction func pressNext(_ sender: Any) {
        let zaigFaceRecognitionController = ZaigIosFaceRecognitionController(faceRecognitionConfiguration: self.faceRecognitionConfig)
        let zaigFaceRecognitionViewController = zaigFaceRecognitionController.getViewController()
        zaigFaceRecognitionViewController.delegate = self
        present(zaigFaceRecognitionViewController, animated: true, completion: nil)
    }

    // Called when QI Tech FaceRecognition's SDK successfully collected the picture
    func zaigIosFaceRecognitionController(_ faceRecognitionViewController: ZaigIosFaceRecognitionController, didFinishWithResults results: ZaigIosFaceRecognitionControllerResponse) {

    }

    // Called when QI Tech FaceRecognition's SDK found an error while collecting the picture
    func zaigIosFaceRecognitionController(_ faceRecognitionViewController: ZaigIosFaceRecognitionController, didFailWithError error: ZaigIosFaceRecognitionControllerError) {

    }

    // Called if the user cancelled the picture collection at any step
    func zaigIosFaceRecognitionControllerDidCancel(_ faceRecognitionViewController: ZaigIosFaceRecognitionController) {

    }
}

To incorporate the SDK into your application, configure the capture flow through the ZaigIosFaceRecognitionConfiguration class, then instantiate the ZaigIosFaceRecognitionController, passing the custom configuration as an argument.

To start the face analysis process, call the present function to display the QI Tech view controller that performs the selfie capture.

It is important to implement the delegate responsible for receiving the callbacks for success, for errors, and for the case where the user interrupts the journey at any stage of validation.

The code above is a complete example of the implementation.

Previous Versions

SDK initialization


import ZaigIosFaceRecognition

class ViewController: UIViewController, ZaigIosFaceRecognitionControllerDelegate {

    var faceRecognitionConfig: ZaigIosFaceRecognitionConfiguration?

    override func viewDidLoad() {
        super.viewDidLoad()
        self.setupFaceRecognition()
    }

    func setupFaceRecognition() {
        // The environment can be 'Sandbox' or 'Production'
        let environment = ZaigIosFaceRecognitionEnvironment.Sandbox

        // MobileToken is the key sent to you by QI Tech.
        // Each environment requires a different MobileToken.
        let mobileToken = "YOUR_MOBILE_TOKEN_SENT_BY_QITECH"

        self.faceRecognitionConfig = ZaigIosFaceRecognitionConfiguration(environment: environment,
                                                                         mobileToken: mobileToken,
                                                                         sessionId: "UNIQUE_SESSION_ID",
                                                                         backgroundColor: "#000000",
                                                                         fontColor: "#FFFFFF",
                                                                         fontFamily: .open_sans,
                                                                         showIntroductionScreens: true,
                                                                         activeFaceLiveness: true,
                                                                         audioConfiguration: AudioConfiguration.Enable,
                                                                         logLevel: .debug)
    }

    // Event where you intend to present the QI Tech FaceRecognition view controller -
    // in this example, when the user presses the 'next' button
    @IBAction func pressNext(_ sender: Any) {
        let zaigFaceRecognitionController = ZaigIosFaceRecognitionController(faceRecognitionConfiguration: self.faceRecognitionConfig)
        let zaigFaceRecognitionViewController = zaigFaceRecognitionController.getViewController()
        zaigFaceRecognitionViewController.delegate = self
        present(zaigFaceRecognitionViewController, animated: true, completion: nil)
    }

    // Called when QI Tech FaceRecognition's SDK successfully collected the picture
    func zaigIosFaceRecognitionController(_ faceRecognitionViewController: ZaigIosFaceRecognitionController, didFinishWithResults results: ZaigIosFaceRecognitionControllerResponse) {

    }

    // Called when QI Tech FaceRecognition's SDK found an error while collecting the picture
    func zaigIosFaceRecognitionController(_ faceRecognitionViewController: ZaigIosFaceRecognitionController, didFailWithError error: ZaigIosFaceRecognitionControllerError) {

    }

    // Called if the user cancelled the picture collection at any step
    func zaigIosFaceRecognitionControllerDidCancel(_ faceRecognitionViewController: ZaigIosFaceRecognitionController) {

    }
}

Mobile Token

We use a Mobile Token to allow authenticated access from your application to our API. It has probably already been sent to you by email. If you have not yet received your token, send an email to suporte.caas@qitech.com.br.

Our API expects to receive the Mobile Token in all requests made by the SDK to our server; therefore, it must be included as a configuration parameter through the method shown above.

Attention

You must replace "YOUR_MOBILE_TOKEN_SENT_BY_QITECH" with the Mobile Token received from support.