Phases

A description of the EyeScan Solo SDK user interface flow.

When adding EyeScan Solo SDK to your app, the user experience has two parts:

  • The views implemented by your app (also referred to as the "host app") that introduce the scan and display the results.
  • The EyeScan Solo SDK flow, which is entirely handled by SighticView.
```mermaid
---
config:
  theme: base
  themeVariables:
    actorLineColor: "#444444"
---
sequenceDiagram
  participant A as Host app
  participant B as SighticView
  participant C as Sightic Analytics server
  Note over B: Handles the test flow
  A ->> B: Presents
  alt Success
    B -->> A: SighticRecording
  else Failure
    B -->> A: SighticError
  end
  A ->> C: performInference()
  alt Success
    C -->> A: SighticInference
  else Failure
    C -->> A: SighticError
  end
```

An example implementation is available in the EyeScan Solo SDK Quickstart project.
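The flow above can be sketched in SwiftUI. This is a minimal, hypothetical example: the module name `SighticAnalytics` and the exact `SighticView` initializer (here assumed to take a completion closure receiving a `Result<SighticRecording, SighticError>`) are assumptions based on the result types in the diagram; consult the Quickstart project for the actual API.

```swift
import SwiftUI
import SighticAnalytics // assumed module name

struct ScanScreen: View {
    @State private var recording: SighticRecording?
    @State private var scanError: SighticError?

    var body: some View {
        // SighticView handles the entire alignment and scan flow.
        // Initializer shape is an assumption; see the SDK reference.
        SighticView { result in
            switch result {
            case .success(let recording):
                // Recorded data, ready to send for analysis.
                self.recording = recording
            case .failure(let error):
                // e.g. the user cancelled, or alignment failed repeatedly.
                self.scanError = error
            }
        }
    }
}
```

The host app only presents `SighticView` and waits for the completion result; all intermediate views (alignment, scan) are owned by the SDK.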

Alignment

The alignment view helps the user position their face correctly in front of the device.

There is also a button to cancel the alignment and return control to the host app.

Scan

A moving dot is shown to the app user during the scan. The user should follow the dot with their eyes.

The scan returns to the alignment view if the user becomes misaligned during the scan, for example by moving the device too far from their face. A hint explains how to fix the misalignment, and the scan restarts from the beginning once the user is correctly positioned again. If the user fails to stay sufficiently aligned three times, control is returned to the host app with an AlignmentError result.
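The host app may want to distinguish the alignment-failure outcome from other errors, for example to offer a retry. A sketch of that branching follows; the case name `.alignmentError` is an assumption based on the result described above, so check the SighticError reference for the exact shape.

```swift
// Hypothetical error handling; SighticError's cases are assumed here.
func handle(_ error: SighticError) {
    switch error {
    case .alignmentError:
        // The user misaligned three times during the scan.
        print("Please hold the device steady and try the scan again.")
    default:
        print("The scan could not be completed.")
    }
}
```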

Analysis

When the user has completed the scan, the SighticView completion callback provides the host app with the recorded data as a SighticRecording object.

The host app sends the recorded data to the Sightic Analytics server for analysis by calling SighticRecording/performInference(). The data sent to the server contains device motion data and features extracted from the user's face. It does not contain a video stream that could be used to identify the user.

The app receives a SighticInference with the scan result, or a SighticError if an error occurs.
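The analysis step can be sketched as an async call on the recording. The `async throws` shape of `performInference()` is an assumption; only the method name and the SighticInference/SighticError result types come from this document.

```swift
// Hypothetical call shape; the SDK's actual signature may differ.
func analyze(_ recording: SighticRecording) async {
    do {
        // Sends the recorded data to the Sightic Analytics server.
        let inference = try await recording.performInference()
        // Present the scan result to the user in a host-app view.
        print("Received inference: \(inference)")
    } catch let error as SighticError {
        // Network or server-side failure; offer to retry.
        print("Analysis failed: \(error)")
    } catch {
        print("Unexpected error: \(error)")
    }
}
```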