Recently, there has been a wave of public and academic concern regarding systemic bias in automated decision systems, including biometrics. Most prominently, face recognition algorithms have often been labelled as “racist” or “biased” by the media, non-governmental organisations, and researchers alike. A collaboration of scientists from Hochschule Darmstadt (P. Drozdowski, C. Rathgeb, C. Busch), Inria (A. Dantcheva), and Fraunhofer IGD (N. Damer) investigated this topic and wrote an overview article entitled “Demographic Bias in Biometrics: A Survey on an Emerging Challenge”, aimed at both expert and lay audiences.
The main contributions of the article are: (1) an overview of the topic of algorithmic bias in the context of biometrics, (2) a comprehensive survey of the existing literature on the estimation and mitigation of biometric bias, (3) a discussion of the pertinent technical and social issues, and (4) an outline of the remaining challenges and future work, from both technological and social points of view.
The work was sponsored by the “Next Generation Biometric Systems” mission of the National Research Center for Applied Cybersecurity ATHENE in Germany and by the National Research Agency in France. The article is being published under an open-access license in the IEEE Transactions on Technology and Society journal. The early access version is freely available at: https://ieeexplore.ieee.org/document/9086771