Published On: Sun, Mar 10th, 2019

Report: Facial Recognition Software ‘Misgendering’ Trans People, leftist VICE fears ‘dangerous results in the future’

In the category of the obvious comes a story from the leftist website Vice, attacking the fact that facial recognition software is struggling to identify transgender and “nonbinary” people, noting that this “could have dangerous results in the future.”

“The problems can be severe for transgender and nonbinary people because most facial recognition software is programmed to sort people into two groups—male or female,” claimed Vice’s Motherboard, Tuesday. “Because these systems aren’t designed with transgender and gender nonconforming people in mind, something as common as catching a flight can become a complicated nightmare. It’s a problem that will only get worse as the TSA moves to a full biometric system at all airports and facial recognition technology spreads.”

“These biases programmed into facial recognition software means that transgender and gender non-conforming people may not be able to use facial recognition advancements that are at least nominally intended to make people’s lives easier, and, perhaps more importantly, may be unfairly targeted, discriminated against, misgendered, or otherwise misidentified by the creeping surveillance state’s facial recognition software,” Motherboard continued, before citing a study that revealed facial recognition research papers “followed a binary model of gender more than 90 percent of the time.”

Os Keyes, the Ph.D. student behind the study, proclaimed, “We’re talking about the extension of trans erasure… That has immediate consequences. The more stuff you build a particular way of thinking into, the harder it is to unpick that way of thinking… AGR research fundamentally ignores the existence of transgender people, with dangerous results.”

“The average [computer science] student is never going to take a gender studies class,” Keyes expressed. “They’re probably not even going to take an ethics class. It’d be good if they did.”

Facial recognition tools have frequently misidentified many groups of people.

Amazon’s facial recognition tool, Rekognition, reportedly misidentified 28 members of Congress as police suspects, and mistook criminals on the FBI’s Most Wanted List for famous celebrities.

Are we sure that is an error, or just accurate pulls from the criminal database?


About the Author

- Writer and Co-Founder of The Global Dispatch, Brandon has been covering news and offering commentary for years, beginning professionally in 2003 on Crazed Fanboy before expanding into other blogs and sites. Appearing on several radio shows, Brandon has hosted Dispatch Radio, written his first novel (The Rise of the Templar) and completed the three-year Global University program in Ministerial Studies to be a pastor. To Contact Brandon email [email protected] ATTN: BRANDON
