# Getting started
## Installation

```shell
npm install ng-naat-liveness
```
## Dependencies

Add the folder provided by the technical team to your project's `assets` directory.
## Import

In `app.module.ts`, import the module and add it to the `imports` array. The module name `NgNaatLivenessModule` below is an assumption; use the name actually exported by the package:

```typescript
import { NgNaatLivenessModule } from 'ng-naat-liveness'; // assumed export name

@NgModule({
  imports: [
    // ...
    NgNaatLivenessModule,
  ],
})
export class AppModule {}
```
## Usage
### HTML

Add the component selector to a template and configure the input parameters:
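A minimal template sketch. The selector name `ng-naat-liveness`, the handler names, and the asset paths are assumptions for illustration; the input and output names come from the tables below:

```html
<!-- Selector name and paths are assumptions; check the package documentation. -->
<ng-naat-liveness
  [gestures]="['LEFT', 'RIGHT']"
  [modelPath]="'assets/liveness/model.json'"
  [workerPath]="'assets/liveness/worker'"
  [timer]="30"
  [mask]="true"
  [videoMirror]="true"
  (detectionReady)="onDetectionReady()"
  (calibrationPercentage)="onCalibrationPercentage($event)"
  (calibrationReady)="onCalibrationReady()"
  (actionComplete)="onActionComplete($event)"
  (gestureComplete)="onGestureComplete($event)"
  (processComplete)="onProcessComplete($event)"
  (onerror)="onError($event)">
</ng-naat-liveness>
```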
### TypeScript
Listen to the events:

```typescript
// Handler sketches in the host component; names follow the component
// outputs, and exact signatures may differ from the library's types.
detectionReady(): void { /* dependencies loaded, process can start */ }

calibrationPercentage(progressPercentage: number): void { /* calibration progress */ }

calibrationReady(): void { /* face calibration completed */ }

actionComplete($event: ResponseAction): void { /* an action finished */ }

gestureComplete($event: ResponseAction): void { /* a gesture finished */ }

processComplete(image: string): void { /* base64 image of the captured face */ }

onerror($event: ResponseError): void { /* handle errors */ }
```
## Gestures allowed
| Name | Actions | Description |
|---|---|---|
| LEFT | LEFT, FRONT | Turn left |
| RIGHT | RIGHT, FRONT | Turn right |
## Inputs
| Name | Type | Required | Default | Description |
|---|---|---|---|---|
| gestures | string[] | true | null | Array of gestures to execute |
| modelPath | string | true | null | Path to `model.json`; this file is included in the dependencies folder |
| workerPath | string | true | null | Path to the worker folder; this folder is included in the dependencies folder |
| timer | number | false | null | Time in seconds to complete all gestures |
| mask | boolean | false | true | Overlay drawn on the video (transparent background with an ellipse) |
| videoMirror | boolean | false | true | Show the camera feed mirrored (rotated 180 degrees) |
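The inputs above can be grouped into a typed configuration object before binding them in the template. A sketch only: the `LivenessConfig` interface name and the example asset paths are assumptions, not part of the library's API:

```typescript
// Hypothetical shape for the component's inputs; the interface name
// and the example paths are assumptions for illustration only.
interface LivenessConfig {
  gestures: string[];    // required: gestures to execute
  modelPath: string;     // required: path to model.json
  workerPath: string;    // required: path to the worker folder
  timer?: number;        // optional: seconds to complete all gestures
  mask?: boolean;        // optional: overlay on the video (default true)
  videoMirror?: boolean; // optional: mirrored camera feed (default true)
}

const config: LivenessConfig = {
  gestures: ['LEFT', 'RIGHT'],
  modelPath: 'assets/liveness/model.json', // assumed location under assets
  workerPath: 'assets/liveness/worker',    // assumed location under assets
  timer: 30,
};

console.log(config.gestures.length); // → 2
```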
## Outputs
| Name | Return | Description |
|---|---|---|
| detectionReady | void | Fires when the browser has loaded the dependencies needed to start the process |
| calibrationReady | void | Fires when face calibration is successfully completed |
| calibrationPercentage | number | Emits the percentage of calibration progress |
| actionComplete | ResponseAction | Fires when an action is completed |
| gestureComplete | ResponseAction | Fires when a gesture is completed |
| processComplete | string | Fires when all gestures have been completed; returns the captured face as a base64 image |
| onerror | ResponseError | Fires when an error occurs |
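Since `processComplete` returns the captured face as a base64 string, a small helper can wrap it in a `data:` URL for display in an `<img>` tag. This sketch assumes the emitted string is raw base64 without a `data:` prefix and that the image is PNG; both are assumptions, not documented behavior:

```typescript
// Wrap a raw base64 string in a data URL so it can be bound to <img [src]>.
// The "image/png" MIME type is an assumption; adjust it if the library
// documents a different image format.
function toDataUrl(base64Image: string, mimeType: string = 'image/png'): string {
  // If the string already carries a data URI prefix, return it unchanged.
  if (base64Image.startsWith('data:')) {
    return base64Image;
  }
  return `data:${mimeType};base64,${base64Image}`;
}

console.log(toDataUrl('iVBORw0KGgo=')); // → data:image/png;base64,iVBORw0KGgo=
```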