Uncovering the bias in swiping left


About Tynesha and Racism on Dating Apps

In the summer of 2018, Tynesha McCullers tried looking for a summer fling on Tinder. The University of Maryland resident director had just gotten out of a long-term relationship and wasn’t looking for anything serious. Despite her already low expectations, nothing could prepare McCullers for the blatant racism that permeated the app.

“I would start going through profiles and I was seeing things like ‘I don’t discriminate, but I have a thing for like Asian girls’ or ‘Latina girls drive me wild,’” McCullers recalled. As a dark-skinned black woman, McCullers found those kinds of disclaimers disheartening. She would look through dozens of profiles, some that seemed like potential matches, only to find a message at the bottom of the page indicating a preference for white girls.

The preference issue didn’t stop at discrimination, however. Sometimes users would indicate that they really loved black women, which also raised a red flag for McCullers. She would see white guys saying how much they “loved dark chocolate,” which made her feel like a fetish rather than someone they’d be genuinely interested in.

Many people have online dating experiences just like McCullers’. According to a blog post published by OkCupid co-founder Christian Rudder, black women and Asian men were the least desirable demographics on the dating site. Some argue that everyone is entitled to their personal preferences, especially when dating, but Rudder believes our preferences are shaped by our cultural values. “...There is an evident trend showing that race is a factor for many individuals, and in a consistent way. This might say more about the cultural biases passed down in our society than individuals within it,” Rudder wrote.

Researchers support Rudder’s claims but also argue that dating apps like the one he co-founded are making the problem worse. Jevan Hutson, an attorney and fellow at the University of Washington, conducted research on several dating apps to observe how their design enables racial discrimination.

One feature Hutson and his team analyzed was the filter tool, which allows users to select a set of characteristics that they identify with and that they’re looking for in a partner. “Screening tools based on protected characteristics undermine the potential of intimate platforms to bridge social distance as they allow users of different social or economic backgrounds to be made invisible,” wrote Hutson. Filters can be helpful, but not when they include someone’s race or ethnicity. The feature runs contrary to dating apps’ stated intention to “help daters look beyond appearance and connect on a deeper level,” as Rudder put it in his blog post.
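No dating app publishes its filter implementation, but the mechanism Hutson describes can be sketched in a few lines. This is a hypothetical illustration (the field names and candidate pool are invented): a hard filter on a protected characteristic doesn’t rank excluded profiles lower, it removes them from view entirely.

```python
def visible_candidates(candidates, filters):
    """Apply a user's screening filters. Anyone who fails a filter is
    never shown at all -- made invisible, in Hutson's terms."""
    return [c for c in candidates
            if all(c.get(key) in allowed for key, allowed in filters.items())]

# Hypothetical candidate pool
pool = [{"name": "A", "ethnicity": "white"},
        {"name": "B", "ethnicity": "black"},
        {"name": "C", "ethnicity": "asian"}]

# A filter on ethnicity erases B and C from the pool before the user
# ever sees them -- there is no chance to "look beyond appearance."
shown = visible_candidates(pool, {"ethnicity": {"white"}})
```

The point of the sketch is that exclusion happens upstream of any swipe: the filtered-out users simply never enter the deck.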

Hutson and his team also pointed to matching algorithms that determine who a user would be most compatible with. The paper references the dating app Coffee Meets Bagel, which came under fire for showing users potential partners of their own race even when they did not specify a racial preference. This happened because the algorithm assumed that every user preferred their own race. “Platforms like these define a ‘good’ future match by using the definition of a ‘good’ past match, without considering how those past matches came to be.” So even if a Coffee Meets Bagel user did want to meet someone from a different ethnic background, they wouldn’t be presented with the opportunity on the app.
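Coffee Meets Bagel has not published its matching code, but the feedback loop Hutson’s team describes can be sketched with a deliberately naive recommender (all names, fields and data here are hypothetical): if “good future match” is defined purely by past matches, a history that skews same-race produces recommendations that skew the same way, regardless of what the user actually wants.

```python
from collections import Counter

def recommend(past_matches, candidates):
    """Rank candidates by how often their ethnicity appeared in the
    user's past matches -- a naive 'good past match' heuristic."""
    freq = Counter(m["ethnicity"] for m in past_matches)
    # Candidates whose ethnicity never appeared score 0 and sink to the
    # bottom, even if the user stated no racial preference at all.
    return sorted(candidates, key=lambda c: freq[c["ethnicity"]], reverse=True)

# A user whose early matches happened to be same-race, perhaps because
# of who the app showed them in the first place
past = [{"ethnicity": "white"}] * 5

pool = [{"name": "A", "ethnicity": "white"},
        {"name": "B", "ethnicity": "black"},
        {"name": "C", "ethnicity": "asian"}]

ranked = recommend(past, pool)
```

Because the history feeds the ranking and the ranking shapes the next round of history, the loop closes on itself: the algorithm never asks how those past matches came to be.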

Here’s a reminder about the different gender identities. There are many more, but these are the ones you need to know to understand this story. FYI: I referenced glossaries made by Refinery29 and GLAAD to make this gallery.



Bumble & Gender Discrimination

Some dating apps enforce gender stereotypes in their design. Researcher Rena Bivens studied the self-proclaimed feminist dating app Bumble and how it perpetuates gender roles and excludes queer and non-binary people. Bumble is unique in that only women can initiate conversation. According to CEO Whitney Wolfe, the intention behind that feature was both to break the antiquated gender norm of women waiting to be approached and to avoid “all the ugly stuff, like aggression and abuse” that rejection can evoke in men. If women message first, they’ll only be talking to people they’re interested in, thereby preventing rejection.

Bivens points out a few issues with this approach in her study. First, she argues that the design is based on the assumption that men and women behave in particular ways. “The whole principle behind the app suggests that femininity is something that’s gentle, that’s not aggressive and is attached to women’s bodies and has nothing to do with masculinity,” explained Bivens. Men, on the other hand, are cast as fragile brutes, ready to spew misogynistic hate speech when they don’t get their way.

The app does not allow room for the spectrum of femininity, masculinity and the people who live in between. Queer and gender non-conforming people do not have the same protections as straight women on Bumble. “Ladies get to ask first, but as soon as you put different bodies in there, as soon as you make it about female-identified folks wanting to consider a relationship, the whole principle here just washes away,” said Bivens.

Queer people can also be put in precarious positions as a result of the app’s pitfalls. Bivens explained that the app allows users to look for friendship in addition to a romantic partner. However, queer people looking for love are sometimes paired with a straight person who is just looking for friendship, which, in addition to being super awkward, can also be dangerous.

Bivens offers an intersectional outlook as a way to make dating apps more inclusive. “We can’t keep encoding the same traditional, limited, narrow, cisnormative, heteronormative biases into the technologies because they get reflected back at us,” she says. And Bivens believes that making this change requires engineers to be educated about issues of race, gender, sexuality and class so that they do not embed these biases into the apps.

Discrimination on dating apps might sound inconsequential compared to the other ways algorithms can be biased against minorities, but who we find attractive matters. The research by Bivens, Rudder and Hutson demonstrates the power these programs have to dictate our social interactions and reinforce our biases without us even knowing it. Hutson encouraged app designers to consider dating apps as “architectures of intimacy,” because then they might understand how their apps have the power to perpetuate bias and discrimination.