Authors
Omkar N Kulkarni, Vikram Patil, Vivek K Singh, Pradeep K Atrey
Publication date
2021/11/15
Conference paper
2021 IEEE Seventh International Conference on Multimedia Big Data (BigMM)
Pages
17-24
Publisher
IEEE
Description
Despite being highly accurate, widely used multimedia analysis algorithms can suffer from bias, affecting users’ trust in them. For instance, algorithms for face recognition, pedestrian detection, and image search have recently been reported to be biased. In this paper, we move the discussion on algorithmic fairness to a new domain, namely, pupil detection. We audit a widely used OpenCV algorithm for pupil detection from the perspective of fairness and accuracy. The algorithm is audited using two different datasets: a single-person image dataset (CelebA), and a group image dataset (Images of Groups). In both datasets, we found the OpenCV pupil detection algorithm to provide reasonably high accuracy but also yield statistically significant bias with respect to gender. The results provide the first empirical evidence for the existence of gender bias in pupil detection algorithms in both single person as well as group …
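The abstract reports that the gender gap in detection accuracy was found to be statistically significant. A common way to test whether detection rates differ significantly between two groups is a two-proportion z-test; the sketch below (stdlib only, with hypothetical counts — the paper's actual figures and test procedure may differ) illustrates the idea:

```python
import math

def two_proportion_z_test(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference in detection rates between two groups.

    success_*: number of correctly detected pupils in each group.
    n_*: total audited images in each group.
    Returns the z statistic and the two-sided p-value.
    """
    p_a = success_a / n_a
    p_b = success_b / n_b
    # Pooled proportion under the null hypothesis of equal rates.
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF, computed via erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical example: 88% vs 84% detection accuracy over 1000 images per group.
z, p = two_proportion_z_test(880, 1000, 840, 1000)
```

With these illustrative counts the gap of four percentage points yields p < 0.05, i.e. the difference would be judged statistically significant at the conventional level.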
Total citations
Scholar articles
ON Kulkarni, V Patil, VK Singh, PK Atrey - 2021 IEEE Seventh International Conference on …, 2021