Published On: Sun, Mar 10th, 2019

Report: Facial Recognition Software ‘Misgendering’ Trans People, leftist VICE fears ‘dangerous results in the future’

In the category of the obvious comes a story from the leftist website Vice, attacking the fact that facial recognition software is struggling to identify transgender and “nonbinary” people, noting that this “could have dangerous results in the future.”

“The problems can be severe for transgender and nonbinary people because most facial recognition software is programmed to sort people into two groups—male or female,” claimed Vice’s Motherboard, Tuesday. “Because these systems aren’t designed with transgender and gender nonconforming people in mind, something as common as catching a flight can become a complicated nightmare. It’s a problem that will only get worse as the TSA moves to a full biometric system at all airports and facial recognition technology spreads.”
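The behavior Motherboard describes can be sketched in a few lines: a classifier head with only two output labels must return one of them for every input, no matter how uncertain the score. The labels, logits, and function names below are illustrative assumptions, not any real vendor's API.

```python
import math

# Hypothetical sketch of a two-class gender recognition head, as
# described in the article. All names and numbers are illustrative.
LABELS = ["male", "female"]  # binary label set -- no third option exists

def softmax(logits):
    """Convert raw scores into probabilities that sum to 1."""
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits):
    """Force every face into one of the two labels, however uncertain."""
    probs = softmax(logits)
    best = max(range(len(LABELS)), key=lambda i: probs[i])
    return LABELS[best], probs[best]

# Even a near-50/50 score still produces a definite binary answer:
label, confidence = classify([0.05, 0.0])
print(label, round(confidence, 3))  # a coin-flip score, reported as a label
```

The point of the sketch is structural: with only two classes defined, "neither" is not a possible output, which is the design limitation the study criticizes.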

“These biases programmed into facial recognition software means that transgender and gender non-conforming people may not be able to use facial recognition advancements that are at least nominally intended to make people’s lives easier, and, perhaps more importantly, may be unfairly targeted, discriminated against, misgendered, or otherwise misidentified by the creeping surveillance state’s facial recognition software,” Motherboard continued, before citing a study that revealed facial recognition research papers “followed a binary model of gender more than 90 percent of the time.”

Os Keyes, the Ph.D. student behind the study, proclaimed, “We’re talking about the extension of trans erasure… That has immediate consequences. The more stuff you build a particular way of thinking into, the harder it is to unpick that way of thinking… AGR research fundamentally ignores the existence of transgender people, with dangerous results.”

“The average [computer science] student is never going to take a gender studies class,” Keyes expressed. “They’re not probably going to even take an ethics class. It’d be good if they did.”

Facial recognition tools have frequently misidentified many groups of people.

Amazon’s facial recognition tool, Rekognition, reportedly misidentified 28 members of Congress as police suspects, and mistook criminals on the FBI’s Most Wanted List for famous celebrities.

Are we sure that is an error, or just accurate pulls from the criminal database?

About the Author

- Writer and Co-Founder of The Global Dispatch, Brandon has been covering news and offering commentary for years, beginning professionally in 2008 on sites like Examiner and blogs Desk of Brian and Crazed Fanboy. Appearing on several radio shows, Brandon has hosted Dispatch Radio, written his first novel (The Rise of the Templar), and will be a licensed Assembly of God Pastor by the Spring of 2017. "Why do we do this?" I was asked, and the answer is simple: "I just want the truth. I want a source of information that tells me what's going on and clearly attempts to separate opinion from fact. Set aside left and right, old and young, just point to the world and say, 'Look!'" To Contact Brandon email [email protected] ATTN: BRANDON
