M Qraitem, K Saenko, BA Plummer - IEEE Computer Society Conference …, 2023 - par.nsf.gov
This article addresses the problem of dynamic online estimation and compensation of hard-iron and soft-iron biases of three-axis magnetometers under dynamic motion in field …
M Qraitem, K Saenko, BA Plummer - arXiv preprint arXiv:2209.15605, 2022 - arxiv.org
Prior work has shown that Visual Recognition datasets frequently underrepresent bias groups $B$ (e.g., Female) within class labels $Y$ (e.g., Programmers). This dataset bias can …
M Qraitem, K Saenko, BA Plummer - cvpr2023.thecvf.com
Bias Mimicking: A Simple Sampling Approach For Bias Mitigation. Maan Qraitem, Kate Saenko, Bryan A. Plummer.
Prior work has shown that Visual Recognition datasets frequently under-represent sensitive groups (e.g., Female) within a category (e.g., Programmers). This dataset bias can lead to …
M Qraitem, K Saenko, B Plummer - IEEE/CVF Conference on …, 2023 - research.ibm.com