Authors
Surbhi Mittal, Kartik Thakral, Puspita Majumdar, Mayank Vatsa, Richa Singh
Publication date
2023/1/5
Conference paper
2023 IEEE 17th International Conference on Automatic Face and Gesture Recognition (FG)
Pages
1-7
Publisher
IEEE
Description
The presence of bias in deep models leads to unfair outcomes for certain demographic subgroups. Research on bias focuses primarily on facial recognition and attribute prediction, with scarce emphasis on face detection. Existing studies consider face detection as binary classification into ‘face’ and ‘non-face’ classes. In this work, we investigate possible bias in the domain of face detection through facial region localization, which is currently unexplored. Since facial region localization is an essential task for all face recognition pipelines, it is imperative to analyze the presence of such bias in popular deep models. Most existing face detection datasets lack suitable annotation for such analysis. Therefore, we web-curate the Fair Face Localization with Attributes (F2LA) dataset and manually annotate more than 10 attributes per face, including facial localization information. Utilizing the extensive annotations from F2LA …