The software would profile the user after taking a photograph of her. I tried the software many times on two different terminals. The first time it thought I was 23 years old. Then 26, 27, 32. Luckily, never more than that. In addition to age, it also guessed sex, hair colour, and whether the user was wearing glasses or make-up. One machine thought I was wearing make-up, and the other didn’t. And my face dimensions changed every time.
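I don’t know how the workshop demo was actually wired up, but the attributes it guessed (age, sex, glasses, make-up, hair colour) match what Microsoft’s Face API exposes through its `detect` call’s `returnFaceAttributes` parameter. As a rough sketch only — the endpoint URL, key, and helper function below are all placeholders of mine, not anything from the demo:

```python
# Hypothetical sketch of a Face API `detect` request asking for the same
# attributes the workshop software guessed. Nothing is sent over the network
# here; we only assemble the request.
from urllib.parse import urlencode

# Placeholders -- a real call needs your own Azure resource and key.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
ATTRIBUTES = ["age", "gender", "glasses", "makeup", "hair"]

def build_detect_request(image_url: str):
    """Assemble (but do not send) a Face API detect request."""
    query = urlencode({
        "returnFaceId": "false",
        "returnFaceAttributes": ",".join(ATTRIBUTES),
    })
    url = f"{ENDPOINT}/face/v1.0/detect?{query}"
    headers = {
        "Ocp-Apim-Subscription-Key": "YOUR_KEY",  # placeholder
        "Content-Type": "application/json",
    }
    body = {"url": image_url}
    return url, headers, body

url, headers, body = build_detect_request("https://example.com/selfie.jpg")
print(url)
```

One line of query string is all it takes to request this kind of profiling, which is part of what makes it so easy to bury inside a “fun” photo app.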
No one at the workshop thought the software was accurate, and an inaccurate system like this can easily produce biased outcomes once it is actually deployed. Sadly, I seem to be so used to being profiled, by AI and by people, that it is no longer an alien experience to me.
The software tried to be playful. Once it ‘guessed’ the user’s age, it would play a tune that ‘commemorates’ that age. Much of what is troubling about profiling people with this AI algorithm is thus hidden behind the playfulness.
Judging by Microsoft’s own website, the software also seems to be aimed mainly at white users (see, e.g., ‘Step 2: Create the PersonGroup’ in this user document, where photographs of white people are used as examples).
Knowing how popular and accessible facial recognition software like Face has become really worries me. My trust in those selfie photo booths found at theme parks and in public places is completely broken. Who knows whether the facial recognition algorithms embedded in the photo-booth software are profiling me? Should I sacrifice my privacy in exchange for a fun experience? I think not.