DOT iOS Face library
v5.2.0
Introduction
DOT iOS Face, as a part of the DOT iOS libraries family, provides components for the digital onboarding process using the latest Innovatrics IFace image processing library. It wraps the core functionality of the IFace library into a higher-level module which is easy to integrate into an iOS application.
Requirements
DOT iOS Face has the following requirements:
Xcode 14+
iOS 11.0+
Swift or Objective-C
CocoaPods or Swift Package Manager
Distribution
Modularization
DOT iOS Face is divided into a core module and optional feature modules. This enables you to reduce the size of the library by including only the modules that are actually used in your use case.
DOT iOS Face is divided into the following modules:
- dot-face-core (Required) - provides API for all the features and functionalities.
- dot-face-detection-fast (Optional) - enables the fast face detection feature.
- dot-face-detection-balanced (Optional) - enables the balanced face detection feature.
- dot-face-verification (Optional) - enables the template extraction and verification feature.
- dot-face-eye-gaze-liveness (Optional) - enables the eye gaze liveness feature.
- dot-face-passive-liveness (Optional) - enables the passive liveness feature.
- dot-face-background-uniformity (Optional) - enables the background uniformity evaluation feature.
- dot-face-expression-neutral (Optional) - enables the face expression evaluation feature.
Each feature module can have other modules as its dependencies and cannot be used without them; see the table below. The modules dot-face-detection-fast and dot-face-detection-balanced belong to the same category, therefore only one of them can be activated.
Module | Dependency |
dot-face-detection-fast | - |
dot-face-detection-balanced | - |
dot-face-verification | dot-face-detection-* |
dot-face-eye-gaze-liveness | dot-face-detection-* |
dot-face-passive-liveness | dot-face-detection-* |
dot-face-background-uniformity | dot-face-detection-* |
dot-face-expression-neutral | dot-face-detection-* |
dot-face-detection-* stands for either dot-face-detection-fast or dot-face-detection-balanced.
For example, if you want to use Eye Gaze Liveness, you will have to use these three modules: dot-face-eye-gaze-liveness, dot-face-detection-* (required by dot-face-eye-gaze-liveness) and dot-face-core (always required).
DOT iOS Face is distributed as a set of XCFramework packages. Each module is distributed as a single XCFramework package, see the table below.
Module | Swift module | XCFramework |
dot-face-core | DotFaceCore | DotFaceCore.xcframework |
dot-face-detection-fast | DotFaceDetectionFast | DotFaceDetectionFast.xcframework |
dot-face-detection-balanced | DotFaceDetectionBalanced | DotFaceDetectionBalanced.xcframework |
dot-face-verification | DotFaceVerification | DotFaceVerification.xcframework |
dot-face-eye-gaze-liveness | DotFaceEyeGazeLiveness | DotFaceEyeGazeLiveness.xcframework |
dot-face-passive-liveness | DotFacePassiveLiveness | DotFacePassiveLiveness.xcframework |
dot-face-background-uniformity | DotFaceBackgroundUniformity | DotFaceBackgroundUniformity.xcframework |
dot-face-expression-neutral | DotFaceExpressionNeutral | DotFaceExpressionNeutral.xcframework |
Swift Package Manager
DOT iOS Face can be easily integrated into an Xcode project via Project → Package Dependencies. Use the https://github.com/innovatrics/dot-ios-sdk-spm.git repository and choose the version you want to use. There you can select the set of DotFace* packages you want to use. All the required dependencies will be downloaded with the selected package set.
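If you manage dependencies through a Package.swift manifest rather than the Xcode UI, the same repository can be referenced there. This is a sketch only: the product names are assumed to match the Swift module names listed above, and the package identity is assumed to be the repository name, so verify both against the package manifest.

```swift
// Package.swift fragment (sketch) – product and package names are assumptions.
dependencies: [
    .package(url: "https://github.com/innovatrics/dot-ios-sdk-spm.git", from: "5.2.0")
],
targets: [
    .target(
        name: "YourApp",
        dependencies: [
            // Include only the modules your use case needs.
            .product(name: "DotFaceCore", package: "dot-ios-sdk-spm"),
            .product(name: "DotFaceDetectionFast", package: "dot-ios-sdk-spm")
        ]
    )
]
```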
CocoaPods
In order to integrate DOT iOS Face into your project, the first step is to insert the following line at the top of your Podfile.
source 'https://github.com/innovatrics/innovatrics-podspecs'
Then, add the module(s) which you want to use to your Podfile. You also need to add the dependencies of the module(s) you want to use.
The following Podfile shows how to use all modules:
source 'https://github.com/innovatrics/innovatrics-podspecs'
use_frameworks!
target 'YOUR_TARGET' do
pod 'dot-face-core'
pod 'dot-face-detection-fast'
pod 'dot-face-verification'
pod 'dot-face-eye-gaze-liveness'
pod 'dot-face-passive-liveness'
pod 'dot-face-background-uniformity'
pod 'dot-face-expression-neutral'
end
The following Podfile shows how to use only the dot-face-detection-fast module:
source 'https://github.com/innovatrics/innovatrics-podspecs'
use_frameworks!
target 'YOUR_TARGET' do
pod 'dot-face-core'
pod 'dot-face-detection-fast'
end
Supported Architectures
DOT iOS Face provides all supported architectures in the distributed XCFramework package.
Device binary contains: arm64
.
Simulator binary contains: x86_64
, arm64
.
Licensing
In order to use DOT iOS Face in your iOS application, it must be licensed. The license can be compiled into the application as it is bound to the Bundle Identifier specified in the General tab in Xcode.
The license ID can be retrieved as follows (required only once, for license generation):
import DotFaceCore
...
print("LicenseId: " + DotFaceLibrary.shared.licenseId)
...
In order to obtain the license, please contact your Innovatrics representative and specify the Bundle Identifier and the License ID. After you have obtained your license file, add it to your Xcode project and use it during the DOT iOS Face initialization, as shown below.
Permissions
Set the following permission in Info.plist
:
<key>NSCameraUsageDescription</key>
<string>Your usage description</string>
Basic Setup
Initialization
Before using any of the DOT iOS Face components, you need to initialize the library with the license and the list of feature modules you want to use. Each module can be accessed through its singleton *Module class.
The following code snippet shows how to initialize DOT iOS Face with all feature modules. If you want to handle the initialization and deinitialization events of the DotFaceLibrary class, you need to implement DotFaceLibraryDelegate.
import DotFaceCore
import DotFaceDetectionFast
import DotFacePassiveLiveness
import DotFaceVerification
import DotFaceEyeGazeLiveness
import DotFaceBackgroundUniformity
import DotFaceExpressionNeutral
if let url = Bundle.main.url(forResource: "iengine", withExtension: "lic") {
do {
let license = try Data(contentsOf: url)
let configuration = DotFaceLibraryConfiguration(license: license,
modules: [
DotFaceDetectionFastModule.shared,
DotFacePassiveLivenessModule.shared,
DotFaceVerificationModule.shared,
DotFaceEyeGazeLivenessModule.shared,
DotFaceBackgroundUniformityModule.shared,
DotFaceExpressionNeutralModule.shared])
DotFaceLibrary.shared.setDelegate(self)
DotFaceLibrary.shared.initialize(configuration: configuration)
} catch {
print(error.localizedDescription)
}
}
After you have successfully finished the initialization, you can use all added features by importing only the DotFaceCore Swift module in your source files. Keep in mind that if you try to use any feature which was not added during initialization, DOT iOS Face will generate a fatal error.
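If you implement DotFaceLibraryDelegate, the initialization-finished callback is the natural place to start a UI component. A minimal sketch: the method name dotFaceLibraryInitializationFinished() comes from the start-flow description later in this document, but its exact signature (whether it carries a result parameter) is an assumption to verify against the protocol.

```swift
import DotFaceCore

extension ViewController: DotFaceLibraryDelegate {

    // Assumed signature, based on the
    // DotFaceLibraryDelegate.dotFaceLibraryInitializationFinished() callback
    // referenced in the start-flow steps below.
    func dotFaceLibraryInitializationFinished() {
        // It is now safe to call start() on a DOT iOS Face UI component.
    }
}
```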
DOT Face Configuration
You can configure DotFaceLibrary
using DotFaceLibraryConfiguration
class.
let configuration = try DotFaceLibraryConfiguration(license: license,
modules: modules,
faceDetectionConfidenceThreshold: 0.06)
Face Detection Confidence Threshold (faceDetectionConfidenceThreshold
)
The interval of the confidence score is [0.0, 1.0]
and the default value of the threshold is 0.06
. Faces with a confidence score lower than this value are ignored.
Deinitialization
When you have finished using DOT iOS Face, it is usually good practice to close it in order to free memory. You can close DOT iOS Face only after the complete process is finished, not within the life cycle of individual components. This can be performed using the DotFaceLibrary.shared.deinitialize() method. If you want to use the DOT iOS Face components again, you need to call DotFaceLibrary.shared.initialize() again.
The following code snippet shows how to deinitialize DOT iOS Face:
DotFaceLibrary.shared.deinitialize()
Logging
DOT iOS Face supports logging using a global Logger
class. You can set the log level as follows:
import DotFaceCore
Logger.logLevel = .debug
Log levels:
info
debug
warning
error
none
Each log message contains the dot-face tag. Keep in mind that logging should be used only for debugging purposes.
Components
Overview
DOT iOS Face provides both non-UI and UI components. Non-UI components are aimed at developers who want to build their own UI using the DOT iOS Face functionality. UI components are built on top of non-UI components. Components having UI are available as UIViewController classes and can be embedded into the application’s existing UI or presented using the standard methods.
List of Non-UI Components
- FACE DETECTOR
A component for performing face detection on an image, creating templates and evaluating face attributes.
- TEMPLATE MATCHER
A component for performing template matching.
- FACE MATCHER
A component for performing face matching.
List of UI Components
- FACE AUTO CAPTURE
A visual component for capturing good quality face photos and creating templates suitable for matching.
- FACE SIMPLE CAPTURE
A visual component for capturing face photos and creating templates suitable for matching without considering photo quality requirements.
- EYE GAZE LIVENESS
A visual component which performs the liveness check based on object tracking. An object is shown on the screen and the user is instructed to follow the movement of this object with her/his eyes.
- SMILE LIVENESS
A visual component which performs the liveness check based on the changes in the face expression.
Non-UI Components
Face Detector
The FaceDetector
class provides the face detection functionality. Face detection stops when maximumFaces
is reached.
This component requires dot-face-detection-*
module.
Create FaceDetector
:
let faceDetector = FaceDetector()
To perform detection, call the following method on the background thread:
let detectedFaces = faceDetector.detect(faceImage: faceImage, maximumFaces: 10)
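Since detection is a blocking call, a common pattern is to dispatch it off the main thread and hop back for UI updates. A sketch under stated assumptions: how a FaceImage is constructed from camera or gallery input is not covered here, and the [DetectedFace] return type is inferred from the detect call above.

```swift
import DotFaceCore

// faceImage: a FaceImage obtained elsewhere (e.g. from a captured photo).
func detectFaces(in faceImage: FaceImage,
                 completion: @escaping ([DetectedFace]) -> Void) {
    DispatchQueue.global(qos: .userInitiated).async {
        let faceDetector = FaceDetector()
        // Blocking call – keep it off the main thread.
        let detectedFaces = faceDetector.detect(faceImage: faceImage, maximumFaces: 10)
        DispatchQueue.main.async {
            // Back on the main thread for UI updates.
            completion(detectedFaces)
        }
    }
}
```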
Template Matcher
In order to match face templates (1:1), use the TemplateMatcher
class. The recommended approach is to create face templates using FaceDetector
or Face Auto Capture component and use only templates for matching.
This component requires dot-face-verification
module.
Create TemplateMatcher
:
let templateMatcher = TemplateMatcher()
To perform matching, call the following method on the background thread:
let result = try? templateMatcher.match(referenceTemplate: referenceTemplate, probeTemplate: probeTemplate)
Face Matcher
In order to match face images (1:1), use the FaceMatcher
class. It is also possible to match a face image against a template, which is a recommended approach if you already have an available reference template.
This component requires dot-face-verification
module.
Create FaceMatcher
:
let faceMatcher = FaceMatcher()
To perform matching, call one of the following methods on the background thread:
let result = try? faceMatcher.match(referenceFaceImage: referenceImage, probeFaceImage: probeImage)
let result = try? faceMatcher.match(referenceTemplate: referenceTemplate, probeFaceImage: probeImage)
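As with detection, matching is a blocking call that belongs on a background queue. A sketch of the recommended template-against-image variant; the type names Template and FaceImage are assumptions for illustration, and the shape of the match result is not shown in this document, so only its handling location is indicated.

```swift
import DotFaceCore

// referenceTemplate: a template created earlier, e.g. by FaceDetector
// or the Face Auto Capture component.
// probeImage: a FaceImage of the face being verified.
func verify(referenceTemplate: Template, probeImage: FaceImage) {
    DispatchQueue.global(qos: .userInitiated).async {
        let faceMatcher = FaceMatcher()
        // Matching a probe image against an existing template is the
        // recommended approach when a reference template is available.
        let result = try? faceMatcher.match(referenceTemplate: referenceTemplate,
                                            probeFaceImage: probeImage)
        DispatchQueue.main.async {
            // Inspect `result` (e.g. its matching score) and update the UI here.
        }
    }
}
```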
UI Components
View Controller Configuration
Components containing UI are embedded into the application as view controllers. All view controllers can be embedded into your own view controller or presented directly. Each view controller can be configured using its *Configuration class, and its appearance can be customized using its *Style class.
To present view controller:
let controller = FaceAutoCaptureViewController.create(configuration: .init(), style: .init())
controller.delegate = self
navigationController?.pushViewController(controller, animated: true)
To embed view controller into your view controller:
override func viewDidLoad() {
super.viewDidLoad()
addChild(viewController)
view.addSubview(viewController.view)
viewController.view.translatesAutoresizingMaskIntoConstraints = false
viewController.didMove(toParent: self)
NSLayoutConstraint.activate([
viewController.view.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor),
viewController.view.leadingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.leadingAnchor),
viewController.view.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor),
viewController.view.trailingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.trailingAnchor)
])
}
Safe Area
DOT iOS Face view controllers ignore the safe area layout guide when laying out their subviews. Therefore, if you push a DOT iOS Face view controller using UINavigationController, you will get an incorrect layout. If you want to respect the safe area layout guide, embed the DOT iOS Face view controller in a container view controller and set up the layout constraints accordingly.
Face Auto Capture
The view controller with instructions for obtaining quality face images suitable for matching.
This component requires dot-face-detection-*
module. If you want to evaluate background uniformity during the face auto capture process, you will also need dot-face-background-uniformity
module. If you want to evaluate face mask during the face auto capture process, you will need dot-face-detection-balanced
module.
The following properties are available in FaceAutoCaptureConfiguration
:
- (Optional) [CameraFacing.front] cameraFacing: CameraFacing – Camera facing. Possible values: CameraFacing.front, CameraFacing.back.
- (Optional) [CameraPreviewScaleType.fit] cameraPreviewScaleType: CameraPreviewScaleType – The camera preview scale type. Possible values: CameraPreviewScaleType.fit, CameraPreviewScaleType.fill.
- (Optional) [0.10] minFaceSizeRatio: Double – The minimum ratio of the face size to the shorter side of the image. This value must be equal to or greater than the minimum valid face size ratio.
- (Optional) [0.30] maxFaceSizeRatio: Double – The maximum ratio of the face size to the shorter side of the image.
- (Optional) [false] isCheckAnimationEnabled: Bool – Shows a checkmark animation after enrollment.
- (Optional) qualityAttributes: Set<QualityAttribute> – The required quality attributes of the output image.
The following properties are available in FaceAutoCaptureStyle
:
- (Optional) [UIColor.black] backgroundColor: UIColor - Background color of the top level view.
- (Optional) [UIColor(red: 19.0/255.0, green: 19.0/255.0, blue: 19.0/255.0, alpha: 0.5)] backgroundOverlayColor: UIColor - Background color of the overlay view.
- (Optional) [UIColor.white] circleOutlineColor: UIColor - Color of the circle in the overlay view.
- (Optional) [UIColor.white] trackingCircleColor: UIColor - Tracking circle color.
- (Optional) [UIColor(red: 0, green: 191.0/255.0, blue: 178.0/255.0, alpha: 1)] circleOutlineColorStayStill: UIColor - Stay still phase color of the circle in the overlay view and color of the animated tick.
- (Optional) [UIFont.systemFont(ofSize: 16, weight: .semibold)] hintFont: UIFont - Hint label font.
- (Optional) [UIColor(red: 2.0/255.0, green: 27.0/255.0, blue: 65.0/255.0, alpha: 1.0)] hintTextColor: UIColor - Hint label text color.
- (Optional) [UIColor(red: 248.0/255.0, green: 251.0/255.0, blue: 251.0/255.0, alpha: 1.0)] hintBackgroundColor: UIColor - Hint view background color.
If a face present in an image has a face size out of the minimum or maximum face size interval, it won’t be detected. Please note that a wider face size interval results in lower performance (detection FPS).
You can handle the FaceAutoCaptureViewController
events using its delegate FaceAutoCaptureViewControllerDelegate
.
@objc(DOTFaceAutoCaptureViewControllerDelegate) public protocol FaceAutoCaptureViewControllerDelegate: AnyObject {
@objc optional func faceAutoCaptureViewControllerViewDidLoad(_ viewController: FaceAutoCaptureViewController)
@objc optional func faceAutoCaptureViewControllerViewDidLayoutSubviews(_ viewController: FaceAutoCaptureViewController)
@objc optional func faceAutoCaptureViewControllerViewWillAppear(_ viewController: FaceAutoCaptureViewController)
@objc optional func faceAutoCaptureViewControllerViewDidAppear(_ viewController: FaceAutoCaptureViewController)
@objc optional func faceAutoCaptureViewControllerViewWillDisappear(_ viewController: FaceAutoCaptureViewController)
@objc optional func faceAutoCaptureViewControllerViewDidDisappear(_ viewController: FaceAutoCaptureViewController)
@objc optional func faceAutoCaptureViewControllerViewWillTransition(_ viewController: FaceAutoCaptureViewController)
/// Tells the delegate that you have no permission for camera usage.
@objc optional func faceAutoCaptureViewControllerNoCameraPermission(_ viewController: FaceAutoCaptureViewController)
/// Tells the delegate that the component has stopped.
@objc optional func faceAutoCaptureViewControllerStopped(_ viewController: FaceAutoCaptureViewController)
/// Tells the delegate that the capture step has changed.
@objc func faceAutoCaptureViewController(_ viewController: FaceAutoCaptureViewController, stepChanged captureStepId: CaptureStepId, with detectedFace: DetectedFace?)
/// Tells the delegate that the face was captured. This callback is called after the component has stopped.
@objc func faceAutoCaptureViewController(_ viewController: FaceAutoCaptureViewController, captured detectedFace: DetectedFace)
}
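A sketch of a delegate implementation covering the two required callbacks. FaceCaptureCoordinator is a hypothetical host type; what you do with the captured face (here, just popping the controller) is application specific.

```swift
import DotFaceCore
import UIKit

extension FaceCaptureCoordinator: FaceAutoCaptureViewControllerDelegate {

    func faceAutoCaptureViewController(_ viewController: FaceAutoCaptureViewController,
                                       stepChanged captureStepId: CaptureStepId,
                                       with detectedFace: DetectedFace?) {
        // Track progress through the capture steps, e.g. for custom hints
        // or analytics.
    }

    func faceAutoCaptureViewController(_ viewController: FaceAutoCaptureViewController,
                                       captured detectedFace: DetectedFace) {
        // Called after the component has stopped – the result is final here.
        // Use `detectedFace` to create a template or image for matching.
        viewController.navigationController?.popViewController(animated: true)
    }
}
```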
Start the face auto capture process:
Check whether DOT iOS Face is initialized.
If DOT iOS Face is initialized, you can call the start() method immediately. If not, you need to initialize DOT iOS Face and call start() in the DotFaceLibraryDelegate.dotFaceLibraryInitializationFinished() callback.
When the face auto capture process finishes successfully, the result will be returned via the faceAutoCaptureViewController(_:captured:)
callback.
Call the restart() method in order to start the face auto capture process over. You can also call restart() to stop and restart an ongoing process.
In case you want to stop the face auto capture process prematurely, call the stopAsync()
method. The faceAutoCaptureViewControllerStopped()
callback indicates that the processing is over.
Once the face auto capture process has started, it is not safe to deinitialize the DOT iOS Face until one of these callbacks is called:
faceAutoCaptureViewController(_:captured:)
faceAutoCaptureViewControllerStopped()
CaptureStepId
events are emitted when the user enters each step.
presence
position
proximity
glassStatus
backgroundUniformity
pitchAngle
yawAngle
eyeStatus
mouthStatus
mask
light
Quality Attributes of the Output Image
You may adjust the quality requirements for the output image. To do this, you can use one of the QualityProvider implementations with recommended values and pass this configuration via FaceAutoCaptureConfiguration by setting qualityAttributes. You can also provide your own implementation according to your needs.
For example, if you wish to capture an image suitable for matching but you also want to make sure a user doesn’t wear glasses, you can use the following implementation:
let registry = DefaultQualityAttributeRegistry()
let provider = MatchingQualityProvider()
var customSet = provider.getQualityAttributes()
customSet.insert(registry.findBy(qualityAttributeId: .glassStatus))
let configuration = try! FaceAutoCaptureConfiguration(qualityAttributes: customSet)
let controller = FaceAutoCaptureViewController.create(configuration: configuration, style: .init())
controller.delegate = self
navigationController?.pushViewController(controller, animated: true)
See DefaultQualityAttributeRegistry
for default values and all available quality attributes.
Available quality providers:
MatchingQualityProvider
– The resulting image suitable for matching.PassiveLivenessQualityProvider
– The resulting image suitable for evaluation of the passive liveness.IcaoQualityProvider
– The resulting image passing ICAO checks.
Face Simple Capture
The view controller for obtaining images for matching without considering any photo quality requirements. This component requires dot-face-detection-*
module.
The following properties are available in FaceSimpleCaptureConfiguration
:
- (Optional) [CameraFacing.front] cameraFacing: CameraFacing – Camera facing. Possible values: CameraFacing.front, CameraFacing.back.
- (Optional) [CameraPreviewScaleType.fit] cameraPreviewScaleType: CameraPreviewScaleType – The camera preview scale type. Possible values: CameraPreviewScaleType.fit, CameraPreviewScaleType.fill.
- (Optional) [0.10] minFaceSizeRatio: Double – The minimum ratio of the face size to the shorter side of the image. This value must be equal to or greater than the minimum valid face size ratio.
- (Optional) [0.30] maxFaceSizeRatio: Double – The maximum ratio of the face size to the shorter side of the image.
The following properties are available in FaceSimpleCaptureStyle
:
- (Optional) [UIColor.black] backgroundColor: UIColor - Background color of the top level view.
- (Optional) [UIColor.white] trackingCircleColor: UIColor - Tracking circle color.
If a face present in an image has a face size out of the minimum or maximum face size interval, it won’t be detected. Please note that a wider face size interval results in lower performance (detection FPS).
You can handle the FaceSimpleCaptureViewController
events using its delegate FaceSimpleCaptureViewControllerDelegate
.
@objc(DOTFaceSimpleCaptureViewControllerDelegate) public protocol FaceSimpleCaptureViewControllerDelegate: AnyObject {
@objc optional func faceSimpleCaptureViewControllerViewDidLoad(_ viewController: FaceSimpleCaptureViewController)
@objc optional func faceSimpleCaptureViewControllerViewDidLayoutSubviews(_ viewController: FaceSimpleCaptureViewController)
@objc optional func faceSimpleCaptureViewControllerViewWillAppear(_ viewController: FaceSimpleCaptureViewController)
@objc optional func faceSimpleCaptureViewControllerViewDidAppear(_ viewController: FaceSimpleCaptureViewController)
@objc optional func faceSimpleCaptureViewControllerViewWillDisappear(_ viewController: FaceSimpleCaptureViewController)
@objc optional func faceSimpleCaptureViewControllerViewDidDisappear(_ viewController: FaceSimpleCaptureViewController)
@objc optional func faceSimpleCaptureViewControllerViewWillTransition(_ viewController: FaceSimpleCaptureViewController)
/// Tells the delegate that you have no permission for camera usage.
@objc optional func faceSimpleCaptureViewControllerNoCameraPermission(_ viewController: FaceSimpleCaptureViewController)
/// Tells the delegate that the component has stopped.
@objc optional func faceSimpleCaptureViewControllerStopped(_ viewController: FaceSimpleCaptureViewController)
/// Tells the delegate that the face was captured.
@objc func faceSimpleCaptureViewController(_ viewController: FaceSimpleCaptureViewController, captured detectedFace: DetectedFace)
}
Start the face simple capture process:
Check whether DOT iOS Face is initialized.
If DOT iOS Face is initialized, you can call the start() method immediately. If not, you need to initialize DOT iOS Face and call start() in the DotFaceLibraryDelegate.dotFaceLibraryInitializationFinished() callback.
You need to call the requestCapture() method in order to request a capture. The result will be returned via the faceSimpleCaptureViewController(_:captured:) callback.
In case you want to stop the face simple capture process prematurely, call the stopAsync()
method. The faceSimpleCaptureViewControllerStopped()
callback indicates that the processing is over.
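Unlike Face Auto Capture, this component captures only on demand, typically from a shutter button action. A sketch; the stored controller property is a hypothetical name for illustration.

```swift
import DotFaceCore
import UIKit

// `faceSimpleCaptureViewController` is a hypothetical property holding the
// embedded FaceSimpleCaptureViewController instance.
@objc func shutterButtonTapped() {
    // Requests a single capture; the result arrives via the
    // faceSimpleCaptureViewController(_:captured:) delegate callback.
    faceSimpleCaptureViewController.requestCapture()
}
```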
Once the face simple capture process has started, it is not safe to deinitialize the DOT iOS Face until one of these callbacks is called:
faceSimpleCaptureViewControllerStopped()
Eye Gaze Liveness
The view controller with a moving or fading object on the screen. This component requires dot-face-eye-gaze-liveness
module.
The following properties are available in EyeGazeLivenessConfiguration
:
- (Required) [-] segments: [Segment] – Array of segments for the object animation.
- (Optional) [0.10] minFaceSizeRatio: Double – The minimum ratio of the face size to the shorter side of the image. This value must be equal to or greater than the minimum valid face size ratio.
- (Optional) [0.30] maxFaceSizeRatio: Double – The maximum ratio of the face size to the shorter side of the image.
- (Optional) [0.5] proximityTolerance: Double – The tolerance of the face size ratio (the tolerance of the distance between the face and the camera). A value greater than 1.0 disables the proximity check.
- (Optional) [4] minValidSegmentCount: Int – The minimum number of valid captured segments. The value can be within the interval [4, 7].
- (Optional) [TransitionType.move] transitionType: TransitionType – The transition type used for the liveness check object animation. Possible values: TransitionType.move, TransitionType.fade.
- (Optional) [-] objectImage: UIImage? – The moving object image.
- (Optional) [CGSize(width: 50, height: 50)] objectImageSize: CGSize – Size of the moving object.
The following properties are available in EyeGazeLivenessStyle
:
- (Optional) [UIColor.white] backgroundColor: UIColor - Background color of the top level view.
- (Optional) [UIColor.black] objectColor: UIColor - Moving object color.
- (Optional) [UIFont.systemFont(ofSize: 16, weight: .semibold)] hintFont: UIFont - Hint label font.
- (Optional) [UIColor(red: 2.0/255.0, green: 27.0/255.0, blue: 65.0/255.0, alpha: 1.0)] hintTextColor: UIColor - Hint label text color.
- (Optional) [UIColor(red: 248.0/255.0, green: 251.0/255.0, blue: 251.0/255.0, alpha: 1.0)] hintBackgroundColor: UIColor - Hint view background color.
You can customize the color of the default objectImage, or replace the default objectImage with a custom image.
You can handle the EyeGazeLivenessViewController
events using its delegate EyeGazeLivenessViewControllerDelegate
.
@objc(DOTEyeGazeLivenessViewControllerDelegate) public protocol EyeGazeLivenessViewControllerDelegate: AnyObject {
@objc optional func eyeGazeLivenessViewControllerViewDidLoad(_ viewController: EyeGazeLivenessViewController)
@objc optional func eyeGazeLivenessViewControllerViewDidLayoutSubviews(_ viewController: EyeGazeLivenessViewController)
@objc optional func eyeGazeLivenessViewControllerViewWillAppear(_ viewController: EyeGazeLivenessViewController)
@objc optional func eyeGazeLivenessViewControllerViewDidAppear(_ viewController: EyeGazeLivenessViewController)
@objc optional func eyeGazeLivenessViewControllerViewWillDisappear(_ viewController: EyeGazeLivenessViewController)
@objc optional func eyeGazeLivenessViewControllerViewDidDisappear(_ viewController: EyeGazeLivenessViewController)
@objc optional func eyeGazeLivenessViewControllerViewWillTransition(_ viewController: EyeGazeLivenessViewController)
/// Tells the delegate that you have no permission for camera usage.
@objc optional func eyeGazeLivenessViewControllerNoCameraPermission(_ viewController: EyeGazeLivenessViewController)
/// Tells the delegate that the component has stopped.
@objc optional func eyeGazeLivenessViewControllerStopped(_ viewController: EyeGazeLivenessViewController)
/// Tells the delegate that the eye gaze liveness state has changed.
@objc func eyeGazeLivenessViewController(_ viewController: EyeGazeLivenessViewController, stateChanged state: EyeGazeLivenessState)
/// Tells the delegate that eye gaze liveness has finished with score and captured segments. This callback is called after the component has stopped.
@objc func eyeGazeLivenessViewController(_ viewController: EyeGazeLivenessViewController, finished score: Float, with segmentImages: [SegmentImage])
/// Tells the delegate that eye gaze liveness cannot continue because there are no segments left. This callback is called after the component has stopped.
@objc func eyeGazeLivenessViewControllerNoMoreSegments(_ viewController: EyeGazeLivenessViewController)
/// Tells the delegate that eye gaze liveness failed, because no eyes were detected. This callback is called after the component has stopped.
@objc func eyeGazeLivenessViewControllerEyesNotDetected(_ viewController: EyeGazeLivenessViewController)
/// Tells the delegate that face tracking has failed. This callback is called after the component has stopped.
@objc func eyeGazeLivenessViewControllerFaceTrackingFailed(_ viewController: EyeGazeLivenessViewController)
}
Start the eye gaze liveness process:
Check whether DOT iOS Face is initialized.
If DOT iOS Face is initialized, you can call the start() method immediately. If not, you need to initialize DOT iOS Face and call start() in the DotFaceLibraryDelegate.dotFaceLibraryInitializationFinished() callback.
The liveness check follows segments: [Segment] and renders an object in the specified corners of the screen. For the best accuracy, it is recommended to display the object in at least three different corners.
If the user’s eyes can’t be detected in the first segment, the process will be terminated with the eyeGazeLivenessViewControllerEyesNotDetected(_:)
callback.
The process is automatically finished when the number of valid items in segmentImages: [SegmentImage]
reaches minValidSegmentCount
. After that, eyeGazeLivenessViewController(_:finished:with:)
callback is called and the score can be evaluated.
The process fails with the eyeGazeLivenessViewControllerNoMoreSegments(_:)
callback when all the segments in segments: [Segment]
were displayed but it wasn’t possible to collect a number of valid images specified in minValidSegmentCount
. You can use segmentImages: [SegmentImage]
items for matching purposes, even when the eyes weren’t detected in a segment.
For a better user experience, it is recommended to give the user more attempts, so the size of segments: [Segment] should be greater than minValidSegmentCount. However, this should be limited, as it is better to terminate the process if the user fails in many segments. The recommended way of generating segments is to use a RandomSegmentsGenerator:
let generator = RandomSegmentsGenerator()
let segmentCount = 8
let segmentDurationMillis = 1000
let segments = generator.generate(segmentCount: segmentCount, segmentDurationMillis: segmentDurationMillis)
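Putting the pieces together: generated segments feed the configuration, and the view controller is presented the same way as the other UI components. Whether the configuration initializer throws (as FaceAutoCaptureConfiguration does) is an assumption to verify against the API.

```swift
import DotFaceCore

let generator = RandomSegmentsGenerator()
// 8 segments of 1000 ms each gives the user spare attempts beyond the
// default minValidSegmentCount of 4.
let segments = generator.generate(segmentCount: 8, segmentDurationMillis: 1000)

// If this initializer throws (mirroring FaceAutoCaptureConfiguration),
// wrap it in try/catch.
let configuration = EyeGazeLivenessConfiguration(segments: segments)
let controller = EyeGazeLivenessViewController.create(configuration: configuration,
                                                      style: .init())
controller.delegate = self
navigationController?.pushViewController(controller, animated: true)
```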
If you want to perform a server-side validation of the liveness check, please follow this recommended approach:
The object movement is generated on your server and then rendered on the device using segments: [Segment]. When the process finishes successfully, segmentImages: [SegmentImage] is transferred to the server to evaluate the liveness check. Please note that segments: [Segment] is no longer transferred, so you should store it in the server session.
You can evaluate the liveness check by combining the corresponding segmentImages: [SegmentImage]
with segments: [Segment]
and sending the request to DOT Core Server. If the user could finish the process without using all segments, the remaining items of segments: [Segment]
should be dropped to match the number of items in segmentImages: [SegmentImage]
.
In case you want to stop the eye gaze liveness process prematurely, call the stopAsync()
method. The eyeGazeLivenessViewControllerStopped()
callback indicates that the processing is over.
Once the eye gaze liveness process has started, it is not safe to deinitialize the DOT iOS Face until one of these callbacks is called:
eyeGazeLivenessViewController(_:finished:with:)
eyeGazeLivenessViewControllerNoMoreSegments()
eyeGazeLivenessViewControllerEyesNotDetected()
eyeGazeLivenessViewControllerFaceTrackingFailed()
eyeGazeLivenessViewControllerStopped()
Smile Liveness
The view controller which performs the liveness check based on the changes in the face expression. This component requires dot-face-expression-neutral
module.
The following properties are available in SmileLivenessConfiguration
:
- (Optional) [CameraFacing.front] cameraFacing: CameraFacing – Camera facing. Possible values: CameraFacing.front, CameraFacing.back.
- (Optional) [CameraPreviewScaleType.fit] cameraPreviewScaleType: CameraPreviewScaleType – The camera preview scale type. Possible values: CameraPreviewScaleType.fit, CameraPreviewScaleType.fill.
- (Optional) [0.10] minFaceSizeRatio: Double – The minimum ratio of the face size to the shorter side of the image. This value must be equal to or greater than the minimum valid face size ratio.
- (Optional) [0.30] maxFaceSizeRatio: Double – The maximum ratio of the face size to the shorter side of the image.
- (Optional) [false] isDetectionLayerVisible: Bool – Use this flag to show or hide the detection circle during the smile liveness process.
The following properties are available in SmileLivenessStyle:
- (Optional) backgroundColor: UIColor [UIColor.black] – Background color of the top level view.
- (Optional) instructionFont: UIFont [UIFont.systemFont(ofSize: 16, weight: .semibold)] – Instruction label font.
- (Optional) instructionTextColor: UIColor [UIColor(red: 2.0/255.0, green: 27.0/255.0, blue: 65.0/255.0, alpha: 1.0)] – Instruction label text color.
- (Optional) instructionBackgroundColor: UIColor [UIColor(red: 248.0/255.0, green: 251.0/255.0, blue: 251.0/255.0, alpha: 1.0)] – Instruction background color.
- (Optional) placeholderColor: UIColor [UIColor.white] – Placeholder color.
- (Optional) detectionLayerColor: UIColor [UIColor.white] – Detection layer color.
- (Optional) overlayColor: UIColor [UIColor(red: 19.0/255.0, green: 19.0/255.0, blue: 19.0/255.0, alpha: 0.5)] – Overlay color; a semi-transparent color is recommended.
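The defaults above can be expressed in code. This is a minimal sketch: whether SmileLivenessConfiguration takes these parameter labels in an initializer, and whether SmileLivenessStyle exposes mutable properties, are assumptions to verify against the API reference for your SDK version.

```swift
import UIKit
import DotFaceCore

// Configuration matching the documented defaults, with the detection
// circle enabled (parameter labels are assumed).
let configuration = SmileLivenessConfiguration(
    cameraFacing: .front,
    cameraPreviewScaleType: .fit,
    minFaceSizeRatio: 0.10,
    maxFaceSizeRatio: 0.30,
    isDetectionLayerVisible: true
)

// Style keeping the documented default colors (mutable properties assumed).
var style = SmileLivenessStyle()
style.instructionTextColor = UIColor(red: 2.0/255.0, green: 27.0/255.0, blue: 65.0/255.0, alpha: 1.0)
style.overlayColor = UIColor(red: 19.0/255.0, green: 19.0/255.0, blue: 19.0/255.0, alpha: 0.5)
```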
You can handle the SmileLivenessViewController
events using its delegate SmileLivenessViewControllerDelegate.
@objc(DOTFSmileLivenessViewControllerDelegate) public protocol SmileLivenessViewControllerDelegate: AnyObject {
@objc optional func smileLivenessViewControllerViewDidLoad(_ viewController: SmileLivenessViewController)
@objc optional func smileLivenessViewControllerViewDidLayoutSubviews(_ viewController: SmileLivenessViewController)
@objc optional func smileLivenessViewControllerViewWillAppear(_ viewController: SmileLivenessViewController)
@objc optional func smileLivenessViewControllerViewDidAppear(_ viewController: SmileLivenessViewController)
@objc optional func smileLivenessViewControllerViewWillDisappear(_ viewController: SmileLivenessViewController)
@objc optional func smileLivenessViewControllerViewDidDisappear(_ viewController: SmileLivenessViewController)
@objc optional func smileLivenessViewControllerViewWillTransition(_ viewController: SmileLivenessViewController)
/// Tells the delegate that you have no permission for camera usage.
@objc optional func smileLivenessViewControllerNoCameraPermission(_ viewController: SmileLivenessViewController)
/// Tells the delegate that the component has stopped.
@objc optional func smileLivenessViewControllerStopped(_ viewController: SmileLivenessViewController)
/// Tells the delegate that the face was not detected during the critical part of the smile liveness process.
@objc optional func smileLivenessViewControllerCriticalFacePresenceLost(_ viewController: SmileLivenessViewController)
/// Tells the delegate that the new detection was processed.
@objc optional func smileLivenessViewController(_ viewController: SmileLivenessViewController, processed detection: FaceAutoCaptureDetection)
/// Tells the delegate that the smile liveness process has finished with result.
@objc func smileLivenessViewController(_ viewController: SmileLivenessViewController, finished result: SmileLivenessResult)
}
Start the smile liveness process:
1. Check whether DOT iOS Face is initialized.
2. If DOT iOS Face is initialized, you can call the start() method immediately. If not, initialize DOT iOS Face first and call start() in the DotFaceLibraryDelegate.dotFaceLibraryInitializationFinished() callback.
The process finishes automatically when both the neutral and the smile face expression images are captured. After that, the smileLivenessViewController(_:finished:)
callback is called and passive liveness and face matching can be evaluated.
If the face presence is lost after the neutral face expression image is captured, the smileLivenessViewControllerCriticalFacePresenceLost(_:)
callback is called. If continuous face presence between the neutral and the smile face expression images is required, call the restart()
method. Otherwise the callback can be ignored.
In case you want to stop the smile liveness process prematurely, call the stopAsync()
method. The smileLivenessViewControllerStopped(_:)
callback indicates that the processing is over.
Once the smile liveness process has started, it is not safe to deinitialize the DOT iOS Face until one of these callbacks is called:
smileLivenessViewController(_:finished:)
smileLivenessViewControllerStopped(_:)
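The lifecycle rules above can be sketched as a delegate implementation. A hedged sketch: `HostViewController` is an illustrative name and the presentation details are omitted, while the delegate methods, `start()`, `restart()`, and `stopAsync()` follow the API shown in this section.

```swift
import UIKit
import DotFaceCore

final class HostViewController: UIViewController, SmileLivenessViewControllerDelegate {

    // Call once DOT iOS Face is initialized (e.g. from
    // DotFaceLibraryDelegate.dotFaceLibraryInitializationFinished()).
    func startLiveness(_ viewController: SmileLivenessViewController) {
        viewController.start()
    }

    // Both the neutral and the smile images were captured; passive liveness
    // and face matching can now be evaluated from the result.
    func smileLivenessViewController(_ viewController: SmileLivenessViewController,
                                     finished result: SmileLivenessResult) {
        // Handle the result; deinitializing DOT iOS Face is safe from here.
    }

    // Face presence was lost after the neutral image. Restart if continuous
    // face presence is required; otherwise this callback can be ignored.
    func smileLivenessViewControllerCriticalFacePresenceLost(_ viewController: SmileLivenessViewController) {
        viewController.restart()
    }

    // A premature stop requested via stopAsync() has completed.
    func smileLivenessViewControllerStopped(_ viewController: SmileLivenessViewController) {
        // Processing is over; deinitializing DOT iOS Face is safe from here.
    }
}
```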
Customization of UI Components
Localization
String resources can be overridden in your application, and alternative strings for supported languages can be provided, following these two steps:
1. Add your own Localizable.strings file to your project using the standard iOS localization mechanism. To change a specific text, override the corresponding key in this Localizable.strings file.
2. Set the localization bundle to the bundle of your application (preferably during the application launch in your AppDelegate).
Use this setup if you want to use the standard iOS localization mechanism, which means your iOS application uses the system-defined locale.
import DotFaceCore
Localization.bundle = .main
Custom Localization
You can override the standard iOS localization mechanism by providing your own translation dictionary and setting the Localization.useLocalizationDictionary
flag to true
. Use this setup if you do not want to use the standard iOS localization mechanism, which means your iOS application ignores the system-defined locale and uses its own custom locale.
import DotFaceCore
guard let localizableUrl = Bundle.main.url(forResource: "Localizable", withExtension: "strings", subdirectory: nil, localization: "de"),
let dictionary = NSDictionary(contentsOf: localizableUrl) as? [String: String]
else { return }
Localization.useLocalizationDictionary = true
Localization.localizationDictionary = dictionary
"dot.face_auto_capture.instruction.face_not_present" = "Position your face into the circle";
"dot.face_auto_capture.instruction.face_centering" = "Center your face";
"dot.face_auto_capture.instruction.face_too_close" = "Move back";
"dot.face_auto_capture.instruction.face_too_far" = "Move closer";
"dot.face_auto_capture.instruction.lighting" = "Turn towards light";
"dot.face_auto_capture.instruction.glasses_present" = "Remove glasses";
"dot.face_auto_capture.instruction.background_nonuniform" = "Plain background required";
"dot.face_auto_capture.instruction.expression_neutral_too_high" = "Smile :)";
"dot.face_auto_capture.instruction.expression_neutral_too_low" = "Keep neutral expression";
"dot.face_auto_capture.instruction.pitch_too_high" = "Lower your chin";
"dot.face_auto_capture.instruction.pitch_too_low" = "Lift your chin";
"dot.face_auto_capture.instruction.yaw_too_right" = "Look left";
"dot.face_auto_capture.instruction.yaw_too_left" = "Look right";
"dot.face_auto_capture.instruction.eye_status_low" = "Open your eyes";
"dot.face_auto_capture.instruction.mouth_status_low" = "Close your mouth";
"dot.face_auto_capture.instruction.mask_present" = "Remove mask";
"dot.face_auto_capture.instruction.capturing" = "Stay still!";
"dot.eye_gaze_liveness.instruction.watch_object" = "Watch the object";
"dot.eye_gaze_liveness.instruction.lighting" = "Turn towards light";
"dot.eye_gaze_liveness.instruction.face_not_present" = "Look straight";
"dot.eye_gaze_liveness.instruction.face_too_close" = "Move back";
"dot.eye_gaze_liveness.instruction.face_too_far" = "Move closer";
Common Classes
ImageSize
Class which represents a size of an image. To create an instance:
let imageSize = ImageSize(width: 100, height: 100)
BgrRawImage
Class which represents an image.
To create an instance from CGImage
:
let bgrRawImage = BgrRawImageFactory.create(cgImage: cgImage)
To create CGImage
from BgrRawImage
:
let cgImage = CGImageFactory.create(bgrRawImage: bgrRawImage)
FaceImage
Class which represents a face image and can be used for face detection and matching. To create an instance:
let faceImage = try? FaceImageFactory.create(image: bgrRawImage, minFaceSizeRatio: 0.05, maxFaceSizeRatio: 0.3)
Both minFaceSizeRatio
and maxFaceSizeRatio
(commonly referred to as the face size ratio) must be equal to or greater than the minimum valid face size ratio.
DetectedFace
This class represents the face detection result. The following properties and methods are available:
image: BgrRawImage
– Get a full (original) image of the face.confidence: Double
- The confidence score of the face detection. It also represents the quality of the detected face.createFullFrontalImage() → BgrRawImage
- Creates a ICAO full frontal image of a face. If boundaries of the normalized image leak outside of the original image, a white background is applied.createTemplate() throws → Template
- The face template which can be used for matching. This method requiresdot-face-verification
module.evaluateFaceAspects() throws → FaceAspects
- Evaluates face aspects.evaluateFaceQuality() throws → FaceQuality
- Evaluates face attributes that can be used for a detailed face quality assessment.evaluateFaceQuality(faceQualityQuery: FaceQualityQuery) throws → FaceQuality
- Evaluates only specific face attributes that can be used for a detailed face quality assessment. This is the recommended way for face quality evaluation due to performance reasons.evaluatePassiveLiveness() throws → FaceAttribute
- Evaluates passive liveness. This method requiresdot-face-passive-liveness
module.
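A short usage sketch of the API above. Error handling is illustrative, and the sketch assumes the optional dot-face-verification and dot-face-passive-liveness modules are activated; only properties and methods listed in this section are used.

```swift
import DotFaceCore

// A hedged sketch: evaluate a detection result obtained from face detection.
func process(detectedFace: DetectedFace) {
    // Detection confidence doubles as the quality of the detected face.
    print("Confidence: \(detectedFace.confidence)")

    // ICAO full frontal crop; a white background fills any area that
    // leaks outside the original image.
    let fullFrontal = detectedFace.createFullFrontalImage()
    _ = fullFrontal

    do {
        // Requires the dot-face-verification module.
        let template = try detectedFace.createTemplate()
        _ = template

        // Requires the dot-face-passive-liveness module.
        let passiveLiveness = try detectedFace.evaluatePassiveLiveness()
        _ = passiveLiveness
    } catch {
        print("Evaluation failed: \(error)")
    }
}
```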
Appendix
Changelog
5.2.0 - 2023-03-06
Technical release. No changes.
5.1.1 - 2023-02-21
Added
support for Swift Package Manager
5.1.0 - 2023-02-08
Added
- shared dependency DotCore
- shared dependency DotCamera
Changed
- update IFace to 5.0.3
- types moved to DotCore: BgrRawImage, BgrRawImageFactory, CGImageFactory, CIImageFactory, ImageSize, RectangleDouble, WrappedDouble, PointDouble, IntervalFloat, IntervalDouble
- types moved to DotCamera: CameraFacing, CameraPreviewScaleType
5.0.0 - 2023-01-27
Changed
New SDK versioning: All libraries (DOT Document, DOT Face, DOT Face Lite and DOT NFC) are released simultaneously with a single version name. Libraries with the same version name work correctly at build time and at run time.
- text of localization key dot.face_auto_capture.instruction.face_not_present to "Position your face into the circle"
Removed
- deprecated DotFace, DotFaceDelegate, DotFaceConfiguration
Fixed
- @objc prefix pattern to DOTF*
4.11.1 - 2023-01-18
Changed
all UI components will return images with aspect ratio 4:3
minimal required version to Xcode 14+
Fixed
maximal resolution of images returned from all UI components was too high
4.11.0 - 2022-12-15
Changed
update IFace to 4.21.0 - improved passive liveness algorithm.
all UI components will use 4:3 aspect ratio when front camera is selected, for back camera 16:9 aspect ratio will be used
4.10.1 - 2022-08-17
Fixed
crash when camera device is not available
camera session lifecycle
camera permission issue
4.10.0 - 2022-07-28
Changed
update IFace to 4.18.3 - improved passive liveness algorithm
4.9.0 - 2022-07-11
Added
- DotFaceLibrary to replace DotFace
- DotFaceLibraryConfiguration to replace DotFaceConfiguration
- DotFaceLibraryDelegate to replace DotFaceDelegate
- FaceAutoCaptureViewController.restart()
Changed
- deprecated DotFace, use DotFaceLibrary instead
- deprecated DotFaceConfiguration, use DotFaceLibraryConfiguration instead
- deprecated DotFaceDelegate, use DotFaceLibraryDelegate instead
4.8.2 - 2022-05-30
Fixed
camera permission issue
4.8.1 - 2022-05-19
Fixed
API visibility issue
4.8.0 - 2022-05-16
Added
- Smile Liveness UI component (SmileLivenessViewController, SmileLivenessViewControllerDelegate, SmileLivenessStyle, SmileLivenessConfiguration, SmileLivenessResult)
- DotFaceExpressionNeutral module and DotFaceExpressionNeutralModule class. This module is required if the face expression quality attribute is evaluated and for the Smile Liveness component.
- localization key dot.face_auto_capture.instruction.face_not_present
- localization key dot.face_auto_capture.instruction.expression_neutral_too_high
- localization key dot.face_auto_capture.instruction.expression_neutral_too_low
- FaceAutoCaptureStyle.circleOutlineColorStayStill
Changed
design of Face Auto Capture UI component
design of Eye Gaze Liveness UI component
Removed
FaceAutoCaptureStyle.progressValidColor
FaceAutoCaptureStyle.progressIntermediateColor
FaceAutoCaptureStyle.progressInvalidColor
FaceAutoCaptureStyle.tickColor
4.7.1 - 2022-04-05
Fixed
instruction text wrapping in UI components
4.7.0 - 2022-03-22
Added
FaceAutoCaptureViewController.start()
FaceAutoCaptureViewController.stopAsync()
FaceAutoCaptureViewControllerDelegate.faceAutoCaptureViewControllerStopped()
FaceSimpleCaptureViewController.start()
FaceSimpleCaptureViewController.stopAsync()
FaceSimpleCaptureViewControllerDelegate.faceSimpleCaptureViewControllerStopped()
EyeGazeLivenessViewController.stopAsync()
EyeGazeLivenessViewControllerDelegate.eyeGazeLivenessViewControllerStopped()
Changed
- FaceAutoCaptureViewController is no longer started implicitly and has to be started explicitly by .start()
- FaceSimpleCaptureViewController is no longer started implicitly and has to be started explicitly by .start()
- updated sharpness range in DefaultQualityAttributeRegistry
4.6.0 - 2022-03-09
Changed
DotFaceConfiguration.faceDetectionConfidenceThreshold default value to 0.06
4.5.0 - 2022-02-17
Changed
mouth status range in DefaultQualityAttributeRegistry
4.4.0 - 2022-02-15
Changed
update IFace to 4.15.0 - minor improvements
4.3.1 - 2022-01-31
Added
CameraPreviewScaleType.fill
to support full screen camera preview
4.3.0 - 2021-12-16
Changed
update IFace to 4.14.0 - minor improvements
4.2.0 - 2021-11-25
Added
- DotFaceBackgroundUniformity module and DotFaceBackgroundUniformityModule class. This module is required only if the background uniformity quality attribute is evaluated.
- DotFaceDetectionBalanced module and DotFaceDetectionBalancedModule class
- QualityAttributeId.mask (requires DotFaceDetectionBalanced module)
- FaceImageFactory to replace FaceImage.init()
Changed
- renamed module DotFaceDetection to DotFaceDetectionFast
- removed required quality attributes validation from FaceAutoCaptureConfiguration, FaceAutoCaptureConfiguration.init() no longer throws
- creation of FaceImage fails if minFaceSizeRatio is not valid
Removed
FaceImage.init()
4.1.1 - 2021-10-25
Fixed
eye gaze liveness segment evaluation in incorrect order
4.1.0 - 2021-10-14
Changed
- update IFace to 4.13.0 - improved passive liveness algorithm
- DotFaceConfiguration.faceDetectionConfidenceThreshold default value to 0.1
- update sharpness range in DefaultQualityAttributeRegistry, IcaoQualityProvider, MatchingQualityProvider and PassiveLivenessQualityProvider
Fixed
- incorrect implementation of DetectedFace.createFullFrontalImage()
4.0.0 - 2021-09-28
Added
- DotFaceDetection module and DotFaceDetectionModule class
- DotFaceVerification module and DotFaceVerificationModule class
- DotFacePassiveLiveness module and DotFacePassiveLivenessModule class
- DotFaceEyeGazeLiveness module and DotFaceEyeGazeLivenessModule class
- Logger class to improve logging mechanism
- DotFaceDelegate to handle DotFace events
- DotFaceConfiguration to wrap all configuration options of DotFace into a single object
- BgrRawImage to represent an image suitable for biometric operations
- BgrRawImageConverter and CGImageConverter to convert between CGImage and BgrRawImage
- protocol SegmentsGenerator and class RandomSegmentsGenerator
- FaceQuality and FaceQualityQuery
- FaceImageQuality and FaceImageQualityQuery
- HeadPose and HeadPoseQuery
- HeadPoseAttribute
- Wearables and WearablesQuery
- Glasses
- Expression and ExpressionQuery
- EyesExpression and EyesExpressionQuery
- FaceAttribute
- FaceAspects
Changed
- minimal required iOS version to iOS 11.0
- DOT iOS Face is split into multiple iOS libraries, see sections Distribution and Initialization in the integration manual
- renamed module DOT to DotFaceCore
- renamed DOTHandler to DotFace
- DotFace to singleton
- DotFaceHandler.initialize(with license: License? = nil, faceDetectionConfidenceThreshold: Int = 600) to .initialize(configuration: DotFaceConfiguration)
- localization keys
- renamed DotFaceLocalization to Localization
- component "Liveness Check" to "Eye Gaze Liveness" and all related API
- component "Face Capture" to "Face Auto Capture" and all related API
- component "Face Capture Simple" to "Face Simple Capture" and all related API
- FaceImageVerifier to FaceMatcher
- TemplateVerifer to TemplateMatcher
- matching scores, face confidence, face attribute scores and attribute quality values are in interval [0.0, 1.0]
- FaceImage now uses BgrRawImage
- DetectedFace now uses BgrRawImage
- renamed QualityAttributeConfiguration to QualityAttribute
- renamed DOTRange to ValueRange
- renamed QualityAttributeConfigurationProvider to DefaultQualityAttributeRegistry
- renamed VerificationQualityProvider to MatchingQualityProvider
- renamed DOTSegment to Segment
- renamed DotPosition to Corner
- default value of FaceAutoCaptureConfiguration.isCheckAnimationEnabled to false
Removed
- component "Liveness Check 2" and all related API
- DOTHandler.authorizeCamera(), use native API instead
- DOTHandler.logLevel, use Logger.logLevel instead
- License, use native Data type to represent license file
- DOTCamera and all related API, camera can be configured using "Configuration" classes of UI components
- Face, all API was moved to DetectedFace
- CaptureCandidate, use DetectedFace instead
- FaceAttributeScore
- IcaoAttribute and IcaoRangeStatus
3.8.2 - 2021-07-23
Fixed
fixed issue with active liveness making it more seamless
3.8.1 - 2021-06-24
Added
support for interface orientation portraitUpsideDown in all UI components
3.8.0 - 2021-06-17
Changed
updated IFace to 4.10.0 - improved background uniformity algorithm
Removed
FaceAttributeId.yaw, .roll, .pitch; use .yawAngle, .rollAngle, .pitchAngle instead
3.7.1 - 2021-05-10
Fixed
updated IFace to 4.9.1 - minor issue
updated glass status range in
QualityAttributeConfigurationRegistry
3.7.0 - 2021-05-03
Changed
updated IFace to 4.9.0 - improved glass status evaluation
3.6.0 - 2021-04-13
Changed
updated IFace to 4.8.0 - improved Passive Liveness algorithm
3.5.1 - 2021-03-19
Added
FaceCaptureStyle.hintTextColor
and.hintBackgroundColor
Changed
renamed style properties to be consistent across all UI components
added 'Color' suffix to name of style properties which represent
UIColor
3.5.0 - 2021-03-17
Added
- DotFaceLocalization class to improve localization mechanism
- CaptureCandidate.init() to initialize with DetectedFace
- public access to CaptureCandidate.detectedFace
Changed
- updated IFace to 4.4.0
- renamed Attribute to FaceAttributeId
- renamed Feature to FaceFeature
- range of eyeStatus in QualityAttributeConfigurationRegistry
- removed DOTHandler.localizationBundle, use DotFaceLocalization.bundle instead
- liveness localization keys
- CaptureState.yawStep and .pitchStep to .yawAngleStep and .pitchAngleStep
- QualityAttribute.yaw and .pitch to .yawAngle and .pitchAngle
- ICAO attributes now have yawAngle, pitchAngle and rollAngle instead of yaw, pitch and roll
3.4.2 - 2020-12-16
Added
support for iOS Simulator arm64 architecture
3.4.1 - 2020-11-25
Fixed
FaceCaptureController
user interface issues
3.4.0 - 2020-09-03
Changed
- updated IFace to 3.13.1
- CaptureCandidate.glassStatusDependenciesFulfilled to CaptureCandidate.glassStatusConditionFulfilled
- CaptureCandidate.passiveLivenessDependenciesFulfilled to CaptureCandidate.passiveLivenessConditionFulfilled
- removed Face.attributeIsDependencyFulfilled, added Face.evaluateAttributeCondition
3.3.1 - 2020-08-18
Fixed
FaceCaptureController
layout warnings
3.3.0 - 2020-08-14
Fixed
Make sure all background tasks are stopped when
LivenessCheckController.stopLivenessCheck()
is called
3.2.2 - 2020-08-11
Fixed
improved interface of
DOTCamera
3.2.1 - 2020-08-06
Fixed
crash in
DOTImage
ifCGImage
is nil
Changed
- init DOTImage with CGImage instead of UIImage
- updated eye status QualityAttributeConfiguration ranges
3.2.0 - 2020-07-30
Changed
on screen messages during face capture remain shown longer to minimize instruction flickering
changed ranges of
QualityAttributeConfigurationRegistry
removed detected face indicator after face capture finished
3.1.0 - 2020-07-10
Added
DOTRange
QualityAttribute
QualityAttributeConfiguration
QualityAttributeConfigurationRegistry
QualityAttributePreset
VerificationQualityProvider
ICAOQualityProvider
PassiveLivenessQualityProvider
Changed
- removed useAlternativeInstructions, requestFullImage, requestCropImage, requestTemplate, lightScoreThreshold from FaceCaptureConfiguration
- added qualityAttributeConfigurations: Set&lt;QualityAttributeConfiguration&gt; to FaceCaptureConfiguration
- added static func validate(configuration: FaceCaptureConfiguration) to FaceCaptureConfiguration
- removed requestFullImage, requestCropImage, requestTemplate from FaceCaptureSimpleConfiguration
- changed func faceCapture(_ controller: FaceCaptureController, stateChanged state: FaceCaptureState) to func faceCapture(_ controller: FaceCaptureController, stateChanged state: CaptureState, withImage image: DOTImage?) in FaceCaptureControllerDelegate
- changed func livenessCheck2(_ controller: LivenessCheck2Controller, captureStateChanged captureState: FaceCaptureState, withImage image: DOTImage?) to func livenessCheck2(_ controller: LivenessCheck2Controller, stateChanged state: CaptureState, withImage image: DOTImage?) in LivenessCheck2ControllerDelegate
3.0.1 - 2020-07-02
Fixed
draw circle around face during face capture
face capture hint label not updating correctly
3.0.0 - 2020-06-15
Changed
Update IFace to 3.10.0
FaceCaptureControllerDelegate
returnsCaptureCandidate
instead ofFaceCaptureImage
FaceCaptureSimpleControllerDelegate
returnsCaptureCandidate
instead ofFaceCaptureImage