Fighting for existence in the digital future

Janus Rose Kopfstein had a “strict no photos policy.” None of her friends were allowed to take pictures of her, and they definitely couldn’t post them on the internet. She didn’t feel comfortable in her body and wanted control over who was able to see her. “I just felt like all this technology was progressing so quickly and it was outpacing my ability to assert control over my own image and over my own data,” explained Kopfstein. This was before she came out as transgender and started to make her outer image mirror the way she felt on the inside.

Kopfstein’s demands were rooted in more than just insecurity. She didn’t want to create a “data trail” for Facebook’s algorithms to process, recording information like who she was friends with, what she looked like at the time and where she went. This internet-mediated loss of control is what sparked Kopfstein’s interest in exploring issues of privacy and consent online.

Last year Kopfstein wrote an article for Dazed magazine detailing why algorithms concern her as a trans woman. In the article she described algorithms as a series of assumptions about how the world should work, which can be dangerous. “Left to their own devices, algorithms can function as tools of oppression, entrenching the structural inequality that permeates our society,” wrote Kopfstein.
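To make that idea concrete, consider a minimal, entirely hypothetical sketch of how such an assumption gets baked in. Nothing below comes from any real product; the point is that the label space a developer writes down, before any data is ever seen, already encodes a worldview in which gender is binary and readable from a face.

    # Hypothetical sketch: the developer's worldview is encoded in the
    # label space itself, before any training data enters the picture.
    GENDER_LABELS = ["male", "female"]  # anyone outside this set cannot be represented

    def classify_gender(face_embedding, model):
        """Return whichever binary label the model scores highest.

        There is no way for this function to answer "neither" or
        "unknown": those outcomes were designed out at line one.
        """
        scores = model.predict(face_embedding)  # e.g. [0.91, 0.09]
        return GENDER_LABELS[scores.index(max(scores))]

Every photo fed to a function shaped like this comes back sorted into one of two boxes, no matter who is in it.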

She goes on to describe how gender recognition, a subcategory of facial recognition, can harm the trans community. Kopfstein points to a study conducted at the University of North Carolina at Wilmington that created a program that could identify trans people before and after hormone replacement therapy. Without asking for consent, the researchers used transition timeline videos from YouTubers who had documented their transition process. Kopfstein says this is harmful to trans people in a number of ways, but primarily because it was done without the YouTubers’ approval, with the intention “to train predictive algorithms that effectively ‘out’ trans people using archived photos.”

“We’re not a problem for you to solve.”

Karl Ricanek, the researcher who started the project, told The Verge that he just wanted to “illuminate what problem areas exist.” The problem he’s referring to is that facial recognition services can fail to identify someone who has undergone appearance-altering hormone therapy. Kopfstein said this way of thinking is a big part of the issue: tech developers treat people who don’t fit into rigid gender categories as problems to be solved, when not fitting in is often the point of being trans. “We’re people, we’re not a problem for you to solve,” said Kopfstein.

The program at the University of North Carolina at Wilmington isn’t the only algorithm biased against trans people. Os Keyes, a former data scientist and current Ph.D student at the Data Ecologies Laboratory at the University of Washington, believes practically all algorithms carry bias.

“The idea of an algorithmic system that doesn’t have biases is laughable. And the idea of an algorithmic system that doesn’t then communicate that bias to the people using it is laughable too,” said Keyes, who has seen firsthand how bias gets embedded into algorithms.

When Keyes was working as a data scientist, they didn’t feel safe coming out as trans or speaking up about trans issues for fear of being outed. Keyes recalls one incident when their company was working on a service that presented a security risk for marginalized people.


Now a sociologist, Keyes has done research showing how little the data scientists developing algorithms consider the effects of their programs on marginalized groups like women and gender non-conforming people. Keyes’ research involved tracking how often engineers acknowledged the transgender community when explaining in research papers how their programs function. “All of the papers I ran into were just about cis people and just used the gender binary and didn't really explore and unpack gender whatsoever,” said Keyes of their findings.
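As a rough illustration of what that kind of tracking can look like, here is a hypothetical sketch of a crude keyword pass over a corpus of papers. The term list and variable names are assumptions for illustration only; content analysis like Keyes’ typically rests on careful reading and manual coding rather than pattern matching.

    import re

    # Hypothetical sketch: flag whether a paper's text acknowledges
    # trans or non-binary people at all. The term list is an assumption.
    TRANS_TERMS = re.compile(r"\b(trans(gender)?|non-?binary|genderqueer)\b",
                             re.IGNORECASE)

    def acknowledges_trans_people(paper_text: str) -> bool:
        """True if the paper mentions trans or non-binary people even once."""
        return bool(TRANS_TERMS.search(paper_text))

    # Given a list of full-text strings, the share of papers clearing
    # this (very low) bar could then be computed:
    # coverage = sum(map(acknowledges_trans_people, papers)) / len(papers)

Even a bar this low, a single mention anywhere in the text, is one that the papers Keyes examined failed to clear.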

Like Kopfstein, Keyes is also concerned about the future impact of gender recognition on trans people in terms of how the world views them and how they view themselves.

Keyes mentioned a Reddit thread where trans people upload their photographs to an app that categorizes the person in the photo as a man or a woman. Doing this allows them to gauge how well they pass as cisgender that day. Apps like this are one example of how technology creates the standards by which people think about gender and where they belong within it. “When you’re designing infrastructure that says that gender works this certain way and if you don’t meet these standards, then you don’t count, you’re coercing people into presenting in those certain ways,” said Keyes.
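For a sense of how little machinery this takes, here is a hypothetical sketch of the kind of “do I pass today?” check the thread describes, assuming a face-analysis service that returns a binary gender label with a confidence score. The response format is an assumption, not taken from any specific product, though many commercial services are shaped roughly this way.

    # Hypothetical sketch of a "do I pass?" check built on a binary
    # gender classifier. The response format is assumed.
    def passing_score(api_response: dict, own_gender: str) -> float:
        """Reduce a person's presentation to one number.

        api_response is assumed to look like:
            {"gender": "female", "confidence": 0.87}
        Returns the model's confidence when it agrees with the person's
        own gender, else 0.0. The app, not the person, defines "passing".
        """
        if api_response["gender"] == own_gender:
            return api_response["confidence"]
        return 0.0

    # Example: passing_score({"gender": "female", "confidence": 0.87}, "female")
    # returns 0.87; a response of "male" would return 0.0.

The coercion Keyes describes lives in that conditional: the model’s binary judgment, not the person’s own sense of self, decides what counts.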

This power to shape the ideology of the public in relation to gender is especially harmful for queer people because the foundation of queerness is the autonomy to define oneself. Living in a system that determines how someone can identify is what Keyes calls “the death knell of queerness and trans existence.”

Trans people are one of the most marginalized groups in American society. They face high rates of suicide, hate crimes and unemployment, among many other obstacles. Keyes and Kopfstein suggest that the rise of algorithmic bias against the trans community will only exacerbate these problems, pushing trans people further towards the margins.

Though things look dark for trans people as algorithms start to rule our world, both Kopfstein and Keyes believe there’s a way out. Kopfstein suggested that engineers exert their collective power to address algorithmic bias: “What if there was a union at Google and it explicitly stated that people who are in the union refuse to participate in projects that involve face recognition or projects that involve drones?” Keyes proposed more community involvement in deciding how algorithms work and what information local communities are willing to give up. It’s an option that would give people more autonomy in an age when technology seems to be happening to the public rather than for the public.