I am an expert in first impressions and body language. I wrote the book SNAP: Making the Most of First Impressions, Body Language, and Charisma.
There is racial bias in facial recognition software.
1.) How common is this problem? Are there problems with racial stereotyping in facial recognition software?
Researcher Patrick Grother said race-based biases were evident in "the majority of the face recognition algorithms we studied." Compared to their performance on white faces, some algorithms were up to 100 times more likely to confuse two different non-white people.
(https://edition.cnn.com/2019/12/19/tech/facial-recognition-study-racial-bias/index.html)
Accurate first impressions are formed in the limbic system and are based on reading up to 10,000 nonverbal cues in less than one minute. We can form that impression quickly and accurately because it draws on an enormous amount of nonverbal data.
2.) Should facial recognition be used in schools? I don't think we can stop the use of facial recognition software. We have too many guns and too much violence, so people want to protect their children, and the companies that make these systems can win huge government contracts and make millions of dollars installing and maintaining them, so they will lobby for them. It will happen, so we need to make the software as accurate as possible and take out the racial bias.
Stereotypes are formed in the neocortex and are based on things like parenting, experiences, film, TV, and social media. Most research shows that impressions based on stereotypes are 30 percent accurate or lower.
3.) What can be done to ensure that facial recognition doesn't show bias?
Research shows we form stereotypes based on skin tone and even more so based on facial structure (Skin and Bones: The Contribution of Skin Tone and Facial Structure to Racial Prototypicality Ratings, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3399873/).
I feel that we need to improve the software to account for the implicit bias associated with skin tone and facial shape and to increase its accuracy, as sketched below.
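As a minimal sketch of what checking for that bias could look like in practice, the Python below compares false match rates across demographic groups, the kind of disparity the NIST study reported. The data, group labels, threshold, and function names here are hypothetical illustrations, not the study's actual methodology or any vendor's API.

# Minimal sketch (hypothetical data and names): compare false match rates
# across demographic groups for a face matcher's similarity scores.

def false_match_rate(pairs, threshold):
    # pairs: list of (similarity_score, same_person) tuples.
    # A false match is an impostor pair (same_person is False) whose
    # score clears the decision threshold.
    impostor_scores = [score for score, same in pairs if not same]
    if not impostor_scores:
        return 0.0
    return sum(score >= threshold for score in impostor_scores) / len(impostor_scores)

def audit_by_group(pairs_by_group, threshold=0.8):
    # pairs_by_group: dict mapping demographic group -> list of scored pairs.
    # Returns the false match rate per group so disparities are visible.
    return {group: false_match_rate(pairs, threshold)
            for group, pairs in pairs_by_group.items()}

if __name__ == "__main__":
    # Toy numbers only; a real audit uses large labeled image sets.
    data = {
        "group_a": [(0.95, True), (0.40, False), (0.30, False), (0.20, False)],
        "group_b": [(0.92, True), (0.85, False), (0.82, False), (0.25, False)],
    }
    for group, fmr in audit_by_group(data).items():
        print(group, "false match rate:", round(fmr, 2))

If one group's false match rate is many times another's, as in this toy example, that is the kind of disparity the software makers would need to measure and correct before these systems are trusted in schools.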