Computer Vision News - February 2022

Presentation from WACV 2022

Part of the team’s motivation for carrying out this work is that these cropping algorithms exist and impact us every day. It is very likely that images we upload, or images we encounter when using our phones and computers, have been through them. Abeba looked at the reasons why companies and vision researchers use saliency cropping. “When images of women, for example, were passed through Twitter’s saliency cropping algorithm, it was cropping them in a way that was focused below the chest and above the knee. This is what experts term the male gaze. It’s that part of a woman’s body which tends to be objectified.”

The team curated its own dataset of 336 images of women at red carpet events and passed each image through the three cropping algorithms of Twitter, Apple, and Google. For Twitter in particular, as many as 79% of the images came out with a male-gaze-like crop. The team also observed racial bias in the results, similar to the viral Obama-McConnell results. Based on that observation, they ran a second experiment.

Figure: (a) the Twitter SIC response to the Obama-McConnell image for varying aspect ratios; (b) the Google CROP_HINTS response to the Obama-McConnell image; (c) the Apple ABSC response to the Obama-McConnell image.

“We used the Chicago Face Database, with pre-labeled images for race and gender,” Abeba tells us. “We created the same 3 x 1 grid images and passed those through the three platforms again. Google’s result was inconclusive in that 20% of the time it was just selecting the white space. Other times it selected the black face and other times the white face. But with Apple and Twitter, we found the algorithm preferred white faces over black faces.”
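The 79% figure depends on labeling each returned crop as male-gaze-like or not. The paper’s exact criterion is not reproduced here; as a hypothetical stand-in, one could flag crops whose window starts below a detected face box. The box format and the is_gaze_like_crop helper below are illustrative assumptions, not the authors’ code:

```python
def is_gaze_like_crop(face_box, crop_box):
    """Return True if the crop window starts below the detected face.

    Boxes are (left, top, right, bottom) in pixel coordinates with the
    y-axis pointing down, so larger y values sit lower in the image.
    """
    face_bottom = face_box[3]
    crop_top = crop_box[1]
    # The face is cut out entirely when the crop begins below its lowest edge.
    return crop_top >= face_bottom

# Example: face near the top of a tall portrait, crop over the torso.
print(is_gaze_like_crop(face_box=(120, 80, 280, 260),
                        crop_box=(0, 400, 400, 800)))  # True
```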
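For the second experiment, each stimulus is a tall 3 x 1 grid: one face, a blank middle cell, another face. A minimal sketch of how such a grid could be assembled with Pillow, assuming two equally sized face crops; the file names are placeholders, and this is not the authors’ pipeline:

```python
from PIL import Image

def make_grid(top_path, bottom_path, gap_ratio=1.0):
    """Stack two faces into a 3 x 1 grid: face, blank white cell, face."""
    top = Image.open(top_path).convert("RGB")
    bottom = Image.open(bottom_path).convert("RGB").resize(top.size)
    w, h = top.size
    gap = int(h * gap_ratio)                 # height of the empty middle cell
    grid = Image.new("RGB", (w, 2 * h + gap), "white")
    grid.paste(top, (0, 0))                  # first face in the top cell
    grid.paste(bottom, (0, h + gap))         # second face in the bottom cell
    return grid

# Pair faces across race labels in both vertical orderings, e.g.:
make_grid("white_face.jpg", "black_face.jpg").save("grid_wb.jpg")
make_grid("black_face.jpg", "white_face.jpg").save("grid_bw.jpg")
```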
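Google’s cropper is exposed through the Cloud Vision API’s CROP_HINTS feature, referenced in the figure caption above. Here is a sketch of one query using the google-cloud-vision Python client, assuming credentials are already configured; the aspect ratio and file name are illustrative, and the helper’s exact signature may vary by client version:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("grid_wb.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# Request a tall crop, roughly matching a portrait timeline aspect ratio.
context = vision.ImageContext(
    crop_hints_params=vision.CropHintsParams(aspect_ratios=[0.56])
)
response = client.crop_hints(image=image, image_context=context)

for hint in response.crop_hints_annotation.crop_hints:
    # Each hint's bounding polygon gives a proposed crop window.
    print([(v.x, v.y) for v in hint.bounding_poly.vertices], hint.confidence)
```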
