Racial Bias

  • Race, Age & Gender

Racial bias means that a system performs differently depending on the race of the person using it, either penalizing or rewarding users of a particular race.

For FACEKI, this would mean that liveness detection works better or worse for users of a particular race. FACEKI’s AI/ML modules were trained on large amounts of image data, that is, many, many selfies. By ensuring that all races are represented in that training data, we are able to perform nearly equally across all races.

The remaining differences are small error percentages that depend on the data under test, which is a finite set of images. FACEKI evaluates the performance of each new release across race, age, and gender.
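The per-release evaluation described above boils down to disaggregating an error rate by demographic group. FACEKI's actual evaluation pipeline is not described here, so the following is only a minimal sketch of the idea: the function name, group labels, and sample data are all hypothetical.

```python
from collections import defaultdict

def error_rates_by_group(results):
    """Compute the error rate per demographic group.

    `results` is an iterable of (group, correct) pairs, where `correct`
    indicates whether the system's decision was right for that sample.
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, correct in results:
        totals[group] += 1
        if not correct:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical evaluation data: (demographic group, decision correct?)
sample = [
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", True), ("group_b", True),
]
print(error_rates_by_group(sample))
```

Comparing the resulting per-group rates release over release is one straightforward way to check that no group's error rate drifts apart from the others.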


Want to learn more? Book a meeting with our team.

Streamline your identity proofing

Save time, save money & make your customers happy