Facial recognition has improved significantly in recent years. This opens up new possibilities and applications, but also carries the risk of misuse. People should therefore be able to protect their faces from unauthorised analysis and thus preserve their privacy.
One way to address this problem is facial anonymisation. Anonymisation methods should provide reliable protection for all people, regardless of demographic factors such as sex, skin tone, or age. However, such demographic factors have been shown to influence the outcome of automated decision-making systems and biometric systems. This thesis therefore investigates whether facial anonymisation methods are also affected by such demographic differentials.
To this end, several face anonymisation methods were examined for demographic differentials. In addition, the quality of the anonymised images and the strength of the protection they provide were assessed. The experiments were conducted on popular face image datasets using state-of-the-art face recognition systems, and they show that the anonymisation methods are indeed affected by such demographic differentials.