DOT Android Face library

v4.1.0

Introduction

DOT Android Face, as a part of the DOT Android libraries family, provides components for the digital onboarding process using the latest Innovatrics IFace image processing library. It wraps the core functionality of the IFace library into a higher-level module which is easy to integrate into an Android application.

Requirements

DOT Android Face has the following requirements:

  • Android API level 21

Distribution

Modularization

DOT Android Face is divided into a core module and optional feature modules. This enables you to reduce the size of the library by including only the modules that are actually used in your use case.

DOT Android Face is divided into the following modules:

  • dot-face-core (Required) - provides the API for all features and functionalities.

  • dot-face-detection (Optional) - enables the face detection feature.

  • dot-face-verification (Optional) - enables template extraction and face and template matching features.

  • dot-face-eye-gaze-liveness (Optional) - enables the eye gaze liveness feature.

  • dot-face-passive-liveness (Optional) - enables the passive liveness feature.

Each feature module can depend on other modules and cannot be used without them; see the table below.

Table 1. Module dependencies

Module                        Dependency
dot-face-detection            dot-face-core
dot-face-verification        dot-face-detection
dot-face-eye-gaze-liveness    dot-face-detection
dot-face-passive-liveness     dot-face-detection

For example, if you want to use Eye Gaze Liveness, you will have to use these three modules: dot-face-core (always required), dot-face-eye-gaze-liveness, and dot-face-detection (required by dot-face-eye-gaze-liveness).

Maven Repository

DOT Android Face is distributed as a set of Android libraries (.aar packages) stored in the Innovatrics maven repository. Each library represents a single module.

In order to integrate DOT Android Face into your project, the first step is to include the Innovatrics Maven repository and the Google repository in your top-level build.gradle file.

build.gradle
allprojects {
    repositories {
        jcenter()
        google()
        maven {
            url 'https://maven.innovatrics.com/releases'
        }
    }
}

Then, specify the dependencies of the DOT Android Face libraries in the app's build.gradle file. The dependencies of these libraries will be downloaded alongside them. The version x.y.z must be replaced with the current version of the library.

build.gradle
dependencies {
    //…
    implementation 'com.innovatrics.dot:dot-face-core:x.y.z'
    implementation 'com.innovatrics.dot:dot-face-detection:x.y.z'
    implementation 'com.innovatrics.dot:dot-face-verification:x.y.z'
    implementation 'com.innovatrics.dot:dot-face-eye-gaze-liveness:x.y.z'
    implementation 'com.innovatrics.dot:dot-face-passive-liveness:x.y.z'
    //…
}

Supported Architectures

DOT Android Face provides binaries for these architectures:

  • armeabi-v7a

  • arm64-v8a

  • x86

  • x86_64

If your target application format is APK and not Android App Bundle, and APK splits are not specified, the generated APK file will contain binaries for all available architectures. Therefore, we recommend using APK splits. For example, to generate an arm64-v8a APK, add the following section to your module's build.gradle:

build.gradle
splits {
    abi {
        enable true
        reset()
        include 'arm64-v8a'
        universalApk false
    }
}

If you do not specify this section, the resulting APK can become unnecessarily large.

Licensing

In order to use DOT Android Face in your application, it must be licensed. The license can be compiled into the application, as it is bound to the application ID specified in build.gradle:

build.gradle
defaultConfig {
    applicationId "com.innovatrics.dot.sample"
    …
}

The license ID, which is required only once for license generation, can be retrieved as follows:

Log.i(TAG, "LicenseId: " + DotFace.getInstance().getLicenseId());

In order to obtain the license, please contact your Innovatrics representative and specify the license ID. If the application uses build flavors with different application IDs, each flavor must contain a separate license.
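
How the license file is bundled and loaded is up to the application. The following is a minimal sketch, assuming the license is shipped as a raw resource (the R.raw.dot_license name is illustrative) and that the raw bytes are what the configuration expects as the license:

private byte[] readLicenseBytes(Context context) throws IOException {
    // Read the bundled license file into a byte array.
    try (InputStream inputStream = context.getResources().openRawResource(R.raw.dot_license);
         ByteArrayOutputStream buffer = new ByteArrayOutputStream()) {
        byte[] chunk = new byte[4096];
        int length;
        while ((length = inputStream.read(chunk)) != -1) {
            buffer.write(chunk, 0, length);
        }
        return buffer.toByteArray();
    }
}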

Permissions

DOT Android Face declares the following permission in AndroidManifest.xml:

AndroidManifest.xml
<uses-permission android:name="android.permission.CAMERA" />

Proguard

For applications that use Proguard, add the following rules to the Proguard configuration file:

proguard-rules.pro
-dontwarn com.sun.jna.**
-dontwarn com.innovatrics.commons.pc.**

# JNA
-keep class com.sun.jna.** { *; }

# DOT
-keep class com.innovatrics.dot.face.*.*Module { *; }
-keep class com.innovatrics.iface.** { *; }

Basic Setup

Initialization

Before using any of the DOT Android Face components, you need to initialize the library with the license and the list of feature modules you want to use. Each feature module is activated by its *Module class; see the table below.

Table 2. DOT Android Face feature modules

Feature module                Class
dot-face-detection            DotFaceDetectionModule
dot-face-verification        DotFaceVerificationModule
dot-face-eye-gaze-liveness    DotFaceEyeGazeLivenessModule
dot-face-passive-liveness     DotFacePassiveLivenessModule

The following code snippet shows how to initialize DOT Android Face with all feature modules:

private void initialize() {
    List<DotFaceModule> modules = createModules();
    DotFaceConfiguration configuration = new DotFaceConfiguration.Builder(context, license, modules).build();
    DotFace.getInstance().initializeAsync(configuration, listener);
}

private List<DotFaceModule> createModules() {
    return Arrays.asList(
            DotFaceDetectionModule.of(),
            DotFaceVerificationModule.of(),
            DotFaceEyeGazeLivenessModule.of(),
            DotFacePassiveLivenessModule.of()
    );
}

As a result of the initialization, a dot folder is created under the application's files folder.

Keep in mind that if you try to use any feature which was not added during initialization, DOT Android Face will throw an exception.

DOT Face Configuration

You can configure DotFace using the DotFaceConfiguration DTO and its Builder. Here is an example of building such an object:

DotFaceConfiguration configuration = new DotFaceConfiguration.Builder(context, license, modules)
    .faceDetectionConfidenceThreshold(0.1d)
    .build();

Face detection confidence threshold (faceDetectionConfidenceThreshold)

The interval of the confidence score is [0.0, 1.0] and the default value of the threshold is 0.1. Faces with a confidence score lower than this value are ignored.

Deinitialization

When a process (e.g. onboarding) using DOT Android Face has been completed, it is usually good practice to free the resources it uses.

You can perform this by calling DotFace.deinitializeAsync(). If you want to use the DOT Android Face components again after that point, you need to call DotFace.initializeAsync() again. This shouldn’t be performed within the lifecycle of individual Android components.

The following code snippet shows how to deinitialize DOT Android Face:

DotFace.getInstance().deinitializeAsync(listener);

Logging

By default, logging is disabled. You can enable it by using the following method from the com.innovatrics.android.commons.Logger class.

Logger.setLoggingEnabled(true);

The appropriate place for this call is within the onCreate() method of your subclass of android.app.Application. Each tag of a log message starts with the dot-face: prefix.
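
For example (the DemoApplication class name is illustrative):

public class DemoApplication extends Application {

    @Override
    public void onCreate() {
        super.onCreate();
        // Enable DOT logging only in debug builds.
        Logger.setLoggingEnabled(BuildConfig.DEBUG);
    }
}

Gating the call on BuildConfig.DEBUG keeps logging disabled in release builds, in line with the note below.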

This setting enables logging for all DOT Android libraries.
Please note that logging should be used just for debugging purposes as it might produce a lot of log messages.

Components

Overview

DOT Android Face provides both non-UI and UI components. Non-UI components are intended for developers who want to build their own UI on top of the DOT Android Face functionality. UI components are built on top of non-UI components. They are available as abstract fragments which can be extended and embedded into the application's existing activity, providing more control.

List of Non-UI Components

FACE DETECTOR

A component for performing face detection on an image, creating templates and evaluating face attributes.

TEMPLATE MATCHER

A component for performing template matching.

FACE MATCHER

A component for performing face matching.

List of UI Components

FACE AUTO CAPTURE

A visual component for capturing good quality face photos and creating templates suitable for matching.

FACE SIMPLE CAPTURE

A visual component for capturing face photos and creating templates suitable for matching without considering photo quality requirements.

EYE GAZE LIVENESS

A visual component which performs liveness detection based on object tracking. An object is shown on the screen and the user is instructed to follow its movement with their eyes.

Non-UI Components

Face Detector

The FaceDetector interface provides the face detection functionality. Face detection stops when maximumFaces is reached. This component requires the dot-face-detection module.

Create a FaceDetector:

FaceDetector faceDetector = FaceDetectorFactory.create();

To perform detection, call the following method on the background thread:

List<DetectedFace> detectedFaces = faceDetector.detect(faceImage, maximumFaces);
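
For example, a minimal sketch of detecting a face in a Bitmap off the main thread (the single-thread executor and the maximumFaces value of 1 are application choices, not library requirements):

ExecutorService executor = Executors.newSingleThreadExecutor();
executor.execute(() -> {
    // Convert the Bitmap to the DOT image representation.
    BgrRawImage bgrRawImage = BgrRawImageFactory.create(bitmap);
    FaceImage faceImage = FaceImage.of(bgrRawImage);

    // Detect at most one face.
    List<DetectedFace> detectedFaces = faceDetector.detect(faceImage, 1);

    // Post the result back to the main thread for further processing.
});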

Template Matcher

In order to match face templates (1:1), use the TemplateMatcher interface. The recommended approach is to create face templates using the FaceDetector or the Face Auto Capture component and to use only templates for matching. This component requires the dot-face-verification module.

Create a TemplateMatcher:

TemplateMatcher templateMatcher = TemplateMatcherFactory.create();

To perform matching, call the following method on the background thread:

TemplateMatcher.Result result = templateMatcher.match(referenceTemplate, probeTemplate);
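
For example, a minimal sketch which creates templates from two detection results (referenceFace and probeFace are DetectedFace instances obtained earlier) and matches them on a background thread:

// Templates created via DetectedFace.createTemplate() require the dot-face-verification module.
Template referenceTemplate = referenceFace.createTemplate();
Template probeTemplate = probeFace.createTemplate();

TemplateMatcher.Result result = templateMatcher.match(referenceTemplate, probeTemplate);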

Face Matcher

In order to match face images (1:1), use the FaceMatcher interface. It is also possible to match a face image against a template (this is the recommended approach if you already have a reference template available). This component requires the dot-face-detection and dot-face-verification modules.

Create a FaceMatcher:

FaceMatcher faceMatcher = FaceMatcherFactory.create();

To perform matching, call one of the following methods on the background thread:

FaceMatcher.Result result = faceMatcher.match(referenceFaceImage, probeFaceImage);
FaceMatcher.Result result = faceMatcher.match(referenceTemplate, probeFaceImage);
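
For example, a minimal sketch of the recommended approach, matching a stored reference template against a probe face image created from a Bitmap (referenceTemplate and probeBitmap are assumed to be available), on a background thread:

FaceImage probeFaceImage = FaceImage.of(BgrRawImageFactory.create(probeBitmap));
FaceMatcher.Result result = faceMatcher.match(referenceTemplate, probeFaceImage);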

UI Components

Fragment Configuration

Components containing UI are embedded into the application as fragments from the Android Support Library. All fragments are abstract: they must be subclassed and their abstract methods overridden.

Fragments requiring runtime interaction provide public methods, for example start().

public class DemoEyeGazeLivenessFragment extends EyeGazeLivenessFragment {

    @Override
    public void onCreate(@Nullable Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        start();
    }

    …
}

For configuration not intended to be changed at runtime, fragment arguments are available.

FaceAutoCaptureConfiguration faceAutoCaptureConfiguration = new FaceAutoCaptureConfiguration.Builder().build();

Bundle arguments = new Bundle();
arguments.putSerializable(FaceAutoCaptureFragment.CONFIGURATION, faceAutoCaptureConfiguration);

Fragment fragment = new DemoFaceAutoCaptureFragment();
fragment.setArguments(arguments);

getSupportFragmentManager()
        .beginTransaction()
        .replace(android.R.id.content, fragment)
        .commit();

Configuration parameters are wrapped in a *Configuration DTO, which you must put as a Serializable under the CONFIGURATION key into the fragment's arguments.

The Builder.build() method throws an IllegalArgumentException if any of the arguments is not valid. Keep in mind to handle this exception, as shown in the sketch below.
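
A minimal sketch of handling the exception when building a configuration:

FaceAutoCaptureConfiguration configuration;
try {
    configuration = new FaceAutoCaptureConfiguration.Builder()
            // Set configuration parameters here.
            .build();
} catch (IllegalArgumentException e) {
    // One of the arguments is not valid; report it and do not show the fragment.
    Log.e(TAG, "Invalid configuration", e);
    return;
}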

Orientation Change

In order to handle the orientation change in multi-window mode correctly, configure the activity in your AndroidManifest.xml file as follows:

<activity
    android:name=".MyActivity"
    android:configChanges="screenSize|smallestScreenSize|screenLayout|orientation" />

Face Auto Capture

A fragment which guides the user to capture quality face images suitable for matching. This component requires the dot-face-detection module. In order to configure the behavior of FaceAutoCaptureFragment, use FaceAutoCaptureConfiguration (see Fragment Configuration).

The following arguments are wrapped in FaceAutoCaptureConfiguration:

  • (Optional) [CameraFacing.FRONT] CameraFacing cameraFacing – Camera facing

    • CameraFacing.FRONT

    • CameraFacing.BACK

  • (Optional) [CameraPreviewScaleType.FIT] CameraPreviewScaleType cameraPreviewScaleType – The camera preview scale type

    • CameraPreviewScaleType.FIT

    • CameraPreviewScaleType.FILL

  • (Optional) [0.10] double minFaceSizeRatio – The minimum ratio of the face size to the width of the shorter side of the image

  • (Optional) [0.30] double maxFaceSizeRatio – The maximum ratio of the face size to the width of the shorter side of the image

  • (Optional) [false] boolean checkAnimationEnabled – Shows a checkmark animation after enrollment (or a static icon on devices which don’t support animation)

  • (Optional) Set<QualityAttribute> qualityAttributes – Sets the required quality attributes, which the output image must meet

If the size of a face in the image is outside the interval given by the minimum and maximum face size ratios, the face won't be detected. Please note that a wider face size interval results in lower performance (detection FPS). A configuration sketch is shown below.
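
A minimal configuration sketch; the builder methods are assumed to be named after the documented parameters, which is an assumption rather than a confirmed API:

FaceAutoCaptureConfiguration faceAutoCaptureConfiguration = new FaceAutoCaptureConfiguration.Builder()
        .cameraFacing(CameraFacing.FRONT)    // assumed setter name
        .minFaceSizeRatio(0.10)              // assumed setter name
        .maxFaceSizeRatio(0.30)              // assumed setter name
        .checkAnimationEnabled(true)         // assumed setter name
        .build();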

To use the fragment, create a subclass of FaceAutoCaptureFragment and override appropriate callbacks:

public class DemoFaceAutoCaptureFragment extends FaceAutoCaptureFragment {

    @Override
    protected void onNoCameraPermission() {
        // Callback implementation
    }

    @Override
    protected void onStepChanged(@NonNull CaptureStepId captureStepId, @Nullable DetectedFace detectedFace) {
        // Callback implementation
    }

    @Override
    protected void onCaptured(@NonNull DetectedFace detectedFace) {
        // Callback implementation
    }
}

CaptureStepId events are emitted when the user enters each step.

  • PRESENCE

  • PROXIMITY

  • POSITION

  • BACKGROUND_UNIFORMITY

  • PITCH_ANGLE

  • YAW_ANGLE

  • EYE_STATUS

  • GLASS_STATUS

  • MOUTH_STATUS

  • LIGHT

Quality Attributes of the Output Image

You may adjust the quality requirements for the output image. To do this, use one of the QualityProvider implementations with recommended values and pass this configuration via FaceAutoCaptureConfiguration by setting qualityAttributes. You can also extend the default implementations according to your needs.

For example, if you wish to capture an image suitable for matching but you also want to make sure the user is not wearing glasses, you can use the following implementation:

public class MatchingWithGlassesStatusQualityProvider extends MatchingQualityProvider {

    public MatchingWithGlassesStatusQualityProvider() {
        qualityAttributes.add(DefaultQualityAttributeRegistry.findById(QualityAttributeId.GLASS_STATUS));
    }
}

See DefaultQualityAttributeRegistry for default values and all available quality attributes.

Available quality providers:

  • MatchingQualityProvider – The resulting image suitable for matching.

  • PassiveLivenessQualityProvider – The resulting image suitable for evaluation of the passive liveness.

  • IcaoQualityProvider – The resulting image passing ICAO checks.
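
A minimal sketch of passing a provider's attributes to the configuration; the getQualityAttributes() accessor and the qualityAttributes(...) builder method are assumptions named after the documented parameter:

// Both the accessor and the builder method names are assumptions, not a confirmed API.
Set<QualityAttribute> qualityAttributes = new PassiveLivenessQualityProvider().getQualityAttributes();

FaceAutoCaptureConfiguration faceAutoCaptureConfiguration = new FaceAutoCaptureConfiguration.Builder()
        .qualityAttributes(qualityAttributes)
        .build();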

Face Simple Capture

A fragment for obtaining images for matching without considering any photo quality requirements. This component requires the dot-face-detection module.

In order to configure the behavior of FaceSimpleCaptureFragment, use FaceSimpleCaptureConfiguration (see Fragment Configuration).

The following arguments are wrapped in FaceSimpleCaptureConfiguration:

  • (Optional) [CameraFacing.FRONT] CameraFacing cameraFacing – Camera facing

    • CameraFacing.FRONT

    • CameraFacing.BACK

  • (Optional) [CameraPreviewScaleType.FIT] CameraPreviewScaleType cameraPreviewScaleType – The camera preview scale type

    • CameraPreviewScaleType.FIT

    • CameraPreviewScaleType.FILL

  • (Optional) [0.10] double minFaceSizeRatio – The minimum ratio of the face size to the width of the shorter side of the image

  • (Optional) [0.30] double maxFaceSizeRatio – The maximum ratio of the face size to the width of the shorter side of the image

If the size of a face in the image is outside the interval given by the minimum and maximum face size ratios, the face won't be detected. Please note that a wider face size interval results in lower performance (detection FPS).

To use the fragment, create a subclass of FaceSimpleCaptureFragment and override the appropriate callbacks:

public class DemoFaceSimpleCaptureFragment extends FaceSimpleCaptureFragment {

    @Override
    public void onCreate(@Nullable Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        requestCapture();
    }

    @Override
    protected void onNoCameraPermission() {
        // Callback implementation
    }

    @Override
    protected void onCaptured(@NonNull DetectedFace detectedFace) {
        // Callback implementation
    }
}

You need to call the requestCapture() method in order to request a capture. In the example above, the component will capture a face as soon as possible.

Eye Gaze Liveness

A fragment which displays a moving or fading object on the screen. This component requires the dot-face-detection and dot-face-eye-gaze-liveness modules.

In order to configure the behavior of EyeGazeLivenessFragment, use EyeGazeLivenessConfiguration (see Fragment Configuration).

The following arguments are wrapped in EyeGazeLivenessConfiguration:

  • (Required) [-] List<Segment> segments – List of segments for the object animation

  • (Optional) [0.10] double minFaceSizeRatio – The minimum ratio of the face size to the width of the shorter side of the image

  • (Optional) [0.30] double maxFaceSizeRatio – The maximum ratio of the face size to the width of the shorter side of the image

  • (Optional) [0.5] double proximityTolerance – The tolerance of the face size ratio (i.e. of the distance between the face and the camera). A value greater than 1.0 disables the proximity check

  • (Optional) [4] int minValidSegmentCount – The minimum number of valid captured segments. The value can be within the interval [4, 7].

  • (Optional) [MOVE] EyeGazeLivenessConfiguration.TransitionType transitionType – The transition type used for the liveness detection object animation

    • MOVE

    • FADE

To start the liveness detection process, call the start() method. To use the fragment, create a subclass of EyeGazeLivenessFragment and override the appropriate callbacks:

public class DemoEyeGazeLivenessFragment extends EyeGazeLivenessFragment {

    @Override
    public void onCreate(@Nullable Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        start();
    }

    @Override
    protected void onStateChanged(@NonNull EyeGazeLivenessState state) {
        // Callback implementation
    }

    @Override
    protected void onFinished(float score, @NonNull List<SegmentImage> segmentImages) {
        // Callback implementation
    }

    @Override
    protected void onNoMoreSegments() {
        // Callback implementation
    }

    @Override
    protected void onEyesNotDetected() {
        // Callback implementation
    }

    @Override
    protected void onFaceTrackingFailed() {
        // Callback implementation
    }

    @Override
    protected void onNoCameraPermission() {
        // Callback implementation
    }
}

The liveness detection follows List<Segment> segments and renders an object in the specified corners of the screen. For the best accuracy it is recommended to display the object in at least three different corners.

If the user’s eyes can’t be detected in the first segment, the process will be terminated with the onEyesNotDetected() callback.

The process finishes automatically when the number of valid items in segmentImages reaches minValidSegmentCount. After that, the onFinished() callback is called and the score can be evaluated, for example as sketched below.
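
A minimal sketch of evaluating the score in the callback; the 0.9 threshold is an application-chosen value, not a library default:

@Override
protected void onFinished(float score, @NonNull List<SegmentImage> segmentImages) {
    // The threshold is an application choice; tune it for your use case.
    boolean livenessPassed = score >= 0.9f;
    // Continue the onboarding flow, or ask the user to retry.
}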

The process fails with the onNoMoreSegments() callback when all segments in List<Segment> segments have been displayed but it wasn't possible to collect the number of valid images specified by minValidSegmentCount. You can still use the SegmentImage items for matching purposes, even when the eyes weren't detected in a segment.

For a better user experience, it is recommended to give the user more attempts, so the size of List<Segment> segments should be greater than minValidSegmentCount. However, this should be limited, as it is better to terminate the process if the user fails in many segments. The recommended way to generate segments is to use a RandomSegmentsGenerator:

SegmentsGenerator segmentsGenerator = new RandomSegmentsGenerator();
int segmentCount = 8;
int segmentDurationMillis = 800;
List<Segment> segments = segmentsGenerator.generate(segmentCount, segmentDurationMillis);
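
The generated segments are then passed to the configuration. A minimal sketch; the segments(...), minValidSegmentCount(...) and transitionType(...) builder methods are assumptions named after the documented parameters, and the EyeGazeLivenessFragment.CONFIGURATION key is assumed to follow the pattern shown in Fragment Configuration:

EyeGazeLivenessConfiguration eyeGazeLivenessConfiguration = new EyeGazeLivenessConfiguration.Builder()
        .segments(segments)                                                 // assumed setter name
        .minValidSegmentCount(5)                                            // assumed setter name
        .transitionType(EyeGazeLivenessConfiguration.TransitionType.MOVE)   // assumed setter name
        .build();

Bundle arguments = new Bundle();
arguments.putSerializable(EyeGazeLivenessFragment.CONFIGURATION, eyeGazeLivenessConfiguration);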

If you want to perform a server side validation of the liveness detection, please follow this recommended approach:

Generate the object movement on your server and render it on the device using List<Segment> segments. When the process finishes successfully, transfer the List<SegmentImage> segmentImages to the server to evaluate the liveness detection. Please note that segments is not transferred back, so you should store it in the server session. You can evaluate the liveness detection by combining the corresponding segmentImages with segments and sending the request to DOT Core Server. If the user finished the process without using all segments, drop the remaining items of segments so that their number matches the number of items in segmentImages.

Custom object on the screen

The moving or fading object on the screen is the drawable resource res/drawable/eye_gaze_liveness_object.xml. It is a circle filled with the ?attr/colorAccent color. If you want to use a custom object, override this resource.

Customization of UI components

Strings

You can override the string resources in your application and provide alternative strings for supported languages using the standard Android localization mechanism.

<!-- Face Auto Capture -->
<string name="dot_face_auto_capture_instruction_background_nonuniform">Plain background required</string>
<string name="dot_face_auto_capture_instruction_candidate_selection">Stay still&#8230;</string>
<string name="dot_face_auto_capture_instruction_eye_status_low">Open your eyes</string>
<string name="dot_face_auto_capture_instruction_face_centering">Center your face</string>
<string name="dot_face_auto_capture_instruction_face_too_close">Move back</string>
<string name="dot_face_auto_capture_instruction_face_too_far">Move closer</string>
<string name="dot_face_auto_capture_instruction_glasses_present">Remove glasses</string>
<string name="dot_face_auto_capture_instruction_lighting">Turn towards light</string>
<string name="dot_face_auto_capture_instruction_mouth_status_low">Close your mouth</string>
<string name="dot_face_auto_capture_instruction_pitch_too_high">Lower your chin</string>
<string name="dot_face_auto_capture_instruction_pitch_too_low">Lift your chin</string>
<string name="dot_face_auto_capture_instruction_yaw_too_left">Look right</string>
<string name="dot_face_auto_capture_instruction_yaw_too_right">Look left</string>

<!-- Eye Gaze Liveness -->
<string name="dot_eye_gaze_liveness_instruction_face_not_present">Look straight</string>
<string name="dot_eye_gaze_liveness_instruction_face_too_close">Move back</string>
<string name="dot_eye_gaze_liveness_instruction_face_too_far">Move closer</string>
<string name="dot_eye_gaze_liveness_instruction_lighting">Move towards light</string>
<string name="dot_eye_gaze_liveness_instruction_watch_object">Watch the object</string>

Colors

You may customize the colors used by DOT Android Face in your application. To use custom colors, override the specific color.

<!-- Face Auto Capture -->
<color name="dot_face_auto_capture_background_overlay">#e1ffffff</color>
<color name="dot_face_auto_capture_circle_outline">#ffffff</color>
<color name="dot_face_auto_capture_tracking_circle_outline">#1e000000</color>
<color name="dot_face_auto_capture_tracking_circle_background">#78ffffff</color>
<color name="dot_face_auto_capture_progress_valid">#88b661</color>
<color name="dot_face_auto_capture_progress_intermediate">#ed8500</color>
<color name="dot_face_auto_capture_progress_invalid">#dc4232</color>
<color name="dot_face_auto_capture_instruction_text">#ff000000</color>
<color name="dot_face_auto_capture_instruction_text_background">#ffffffff</color>
<color name="dot_face_auto_capture_instruction_text_stay_still">#ffffffff</color>
<color name="dot_face_auto_capture_instruction_text_background_stay_still">#88b661</color>

<!-- Eye Gaze Liveness -->
<color name="dot_eye_gaze_liveness_background">#ffffffff</color>
<color name="dot_eye_gaze_liveness_instruction_text">#ff000000</color>
<color name="dot_eye_gaze_liveness_instruction_text_background">#ffffffff</color>

Styles

You can style the text views and buttons by overriding the parent style in the application. The default style is AppCompat.

<style name="TextAppearance.Dot.Medium" parent="TextAppearance.AppCompat.Medium" />

Common Classes

ImageSize

DTO which represents the size of an image. To create an instance:

ImageSize imageSize = ImageSize.of(width, height);

BgrRawImage

DTO which represents an image. To create an instance:

BgrRawImage bgrRawImage = BgrRawImage.of(size, bytes);

To create an instance from Bitmap:

BgrRawImage bgrRawImage = BgrRawImageFactory.create(bitmap);

FaceImage

DTO which represents a face image and can be used for face detection and matching. To create an instance:

FaceImage faceImage = FaceImage.of(bgrRawImage);
FaceImage faceImage = FaceImage.of(bgrRawImage, minFaceSizeRatio, maxFaceSizeRatio);

DetectedFace

This interface represents the face detection result. The following methods are available:

  • @NonNull BgrRawImage getImage(); – Gets the full (original) image of the face.

  • float getConfidence(); – Gets the confidence score of the face detection. It also represents the quality of the detected face.

  • @NonNull BgrRawImage createFullFrontalImage(); – Creates an ICAO full frontal image of the face. If the boundaries of the normalized image leak outside of the original image, a white background is applied.

  • @NonNull Template createTemplate(); – Creates a face template which can be used for matching. This method requires the dot-face-verification module.

  • @NonNull FaceAspects evaluateFaceAspects(); – Evaluates face aspects.

  • @NonNull FaceQuality evaluateFaceQuality(); – Evaluates face attributes that can be used for a detailed face quality assessment.

  • @NonNull FaceQuality evaluateFaceQuality(@NonNull FaceQualityQuery faceQualityQuery); – Evaluates only specific face attributes that can be used for a detailed face quality assessment. This is the recommended way to evaluate face quality, for performance reasons.

  • @NonNull FaceAttribute evaluatePassiveLiveness(); – Evaluates passive liveness. This method requires the dot-face-passive-liveness module.
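
A minimal sketch of working with a detection result on a background thread, using only the methods listed above:

DetectedFace detectedFace = detectedFaces.get(0);

float confidence = detectedFace.getConfidence();

// Requires the dot-face-verification module.
Template template = detectedFace.createTemplate();

// Requires the dot-face-passive-liveness module.
FaceAttribute passiveLiveness = detectedFace.evaluatePassiveLiveness();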

Appendix


Changelog

4.1.0 - 2021-10-14

Changed
  • Update IFace to 4.13.0 - improved passive liveness algorithm.

  • DotFaceConfiguration.faceDetectionConfidenceThreshold default value to 0.1.

  • Update sharpness range in DefaultQualityAttributeRegistry, IcaoQualityProvider, MatchingQualityProvider and PassiveLivenessQualityProvider.

4.0.1 - 2021-10-06

Fixed
  • Face detection after onCaptured() callback in Face Simple Capture component.

  • Minor issues.

4.0.0 - 2021-09-28

Added
  • Class BgrRawImage.

  • Class BgrRawImageFactory.

  • Class BitmapFactory.

  • Class DotFaceDetectionModule.

  • Class DotFaceVerificationModule.

  • Class DotFacePassiveLivenessModule.

  • Class DotFaceEyeGazeLivenessModule.

  • Class FaceDetectorFactory.

  • Class RandomSegmentsGenerator.

  • Interface SegmentsGenerator.

  • Class Template.

  • Class Expression.

  • Class ExpressionQuery.

  • Class EyesExpression.

  • Class EyesExpressionQuery.

  • Class FaceAspects.

  • Class FaceAttribute.

  • Class FaceImageQuality.

  • Class FaceImageQualityQuery.

  • Class FaceQuality.

  • Class FaceQualityQuery.

  • Class Glasses.

  • Class HeadPose.

  • Class HeadPoseQuery.

  • Class HeadPoseAttribute.

  • Class Wearables.

  • Class WearablesQuery.

  • Class FaceMatcherFactory.

  • Class TemplateMatcherFactory.

Changed
  • groupId 'com.innovatrics.android' to 'com.innovatrics.dot'.

  • Minimum Android API level 21.

  • DOT Android Face is split into multiple android libraries. See sections Distribution and Initialization in the integration manual.

  • Class DotFaceParameters to DotFaceConfiguration.

  • Method DotFace.initAsync() to DotFace.initializeAsync().

  • Method DotFace.closeAsync() to DotFace.deinitializeAsync().

  • Component "Face Capture" to "Face Auto Capture" and all related API.

  • Component "Face Capture Simple" to "Face Simple Capture" and all related API.

  • Component "Liveness Detection" to "Eye Gaze Liveness" and all related API.

  • Class QualityAttributeConfiguration to QualityAttribute.

  • Class ComplianceRange to ValueRange.

  • Class DefaultQualityRegistry to DefaultQualityAttributeRegistry.

  • Class VerificationQualityProvider to MatchingQualityProvider.

  • Class DetectedFace to a new interface DetectedFace.

  • Class FaceDetector to a new interface FaceDetector.

  • Class FaceImage contains BgrRawImage instead of Bitmap.

  • Class SegmentConfiguration to Segment.

  • Class SegmentPhoto to SegmentImage.

  • Enum DotPosition to Corner.

  • Class FaceImageVerifier to a new interface FaceMatcher.

  • Class TemplateVerifier to a new interface TemplateMatcher.

  • Renamed resource identifiers to match new component names.

  • Face confidence, matching score, face attributes and attribute quality value ranges are in interval [0.0, 1.0].

Removed
  • Component "Liveness Detection 2" and all related API.

  • Class FaceAttribute.

  • Class IcaoAttribute.

  • Enum IcaoAttributeId.

  • Class LicenseUtils.

3.8.0 - 2021-06-17

Changed
  • Update IFace to 4.10.0 - improved background uniformity algorithm.

Fixed
  • Requesting camera permission if it is already denied.

3.7.1 - 2021-05-10

Fixed
  • Update IFace to 4.9.1 - minor issue.

  • Update glass status range in DefaultQualityRegistry.

3.7.0 - 2021-05-03

Changed
  • Update IFace to 4.9.0 - improved glass status evaluation.

3.6.0 - 2021-04-12

Changed
  • Update IFace to 4.8.0 - improved passive liveness algorithm.

3.5.0 - 2021-03-17

Added
  • DotFaceParameters DTO.

  • DotFace.InitializationException exception.

Changed
  • Update IFace to 4.4.0 - face templates are incompatible and must be regenerated.

  • Signature of DotFace.initAsync() method.

  • Signature of DotFace.closeAsync() method.

  • DotFace.Listener to DotFace.InitializationListener and DotFace.CloseListener.

  • Ranges of DefaultQualityRegistry.

  • CaptureStepId.PITCH to CaptureStepId.PITCH_ANGLE.

  • CaptureStepId.YAW to CaptureStepId.YAW_ANGLE.

  • IcaoAttributeId.PITCH to IcaoAttributeId.PITCH_ANGLE.

  • IcaoAttributeId.ROLL to IcaoAttributeId.ROLL_ANGLE.

  • IcaoAttributeId.YAW to IcaoAttributeId.YAW_ANGLE.

  • QualityAttributeId.PITCH to QualityAttributeId.PITCH_ANGLE.

  • QualityAttributeId.YAW to QualityAttributeId.YAW_ANGLE.

Fixed
  • DotFace.initAsync() behavior when DOT Android Face is already initialized.

  • DotFace.closeAsync() behavior when DOT Android Face is not initialized.

3.4.0 - 2021-02-01

Changed
  • Update target Android SDK version to 30 (Android 11).

  • FaceCaptureArguments: change cameraFacing to cameraId.

  • FaceCaptureSimpleArguments: change cameraFacing to cameraId.

  • LivenessCheckArguments: change cameraFacing to cameraId.

  • LivenessCheck2Arguments: change cameraFacing to cameraId.

3.3.1 - 2020-09-23

Fixed
  • Animations not working in rare cases for active liveness.

3.3.0 - 2020-09-04

Changed
  • Adjusted default ranges for quality providers.

  • Update IFace to 3.13.1 - face templates are incompatible and must be regenerated.

  • Background uniformity calculation improved and added to IcaoQualityProvider.

3.2.2 - 2020-08-04

Fixed
  • QualityProvider and QualityAttributeId added to public API.

3.2.1 - 2020-07-31

Added
  • Add stay still instruction color configuration.

Fixed
  • Stay still indicator not colored during capture.

3.2.0 - 2020-07-30

Changed
  • On screen messages during face capture remain shown longer to minimize instruction flickering.

  • Changed ranges of DefaultQualityRegistry and made it public.

  • Removed detected face indicator in FaceCaptureFragment during animation if showCheckAnimation is set.

Fixed
  • Fix camera preview freezing.

3.1.1 - 2020-07-13

Added
  • New FaceAttributes section to documentation.

  • On device passive liveness evaluation provided by FaceAttributes. Artifact dot-face-passive-liveness must be used for this functionality.

  • QualityProvider implementations - VerificationQualityProvider, PassiveLivenessQualityProvider, IcaoQualityProvider which can be used by FaceCaptureFragment.

  • New CaptureStepId events available for FaceCaptureFragment - PITCH, YAW, EYE_STATUS, GLASS_STATUS and MOUTH_STATUS. These events are added by specific QualityProvider and instructions for these steps can be customized, see documentation for details.

Changed
  • Removed alternative instructions for FaceCaptureFragment.

Fixed
  • Crash in Liveness Detection when track is called without init.

  • Crash during premature finish of Liveness Detection 2.

  • Bug which prevented liveness detection from completing when animations are disabled.

  • Rare crash during face capture.

3.0.0 - 2020-06-02

Changed
  • New major release: DOT Android Kit becomes DOT Android Face - library focused on facial recognition.

  • Update IFace to 3.10.1 - face templates are incompatible and must be regenerated.

  • Removed onCaptureFail() in FaceCaptureFragment and onFaceCaptureFail() in LivenessCheck2Fragment. Need for these callbacks was eliminated by internal rework.

  • Calculate min and max face size ratio from width of the image in FaceDetector. Keep calculation from shorter side (height) in landscape mode for UI components.

Fixed
  • Rare sudden change of dot direction in dot tracking liveness detection.

  • Crash during premature finish of Liveness Detection 2.