
Lipstick and big data: how Cambridge Analytica got your identity wrong

December 04, 2018
topic:Digital Rights
tags:#Cambridge Analytica, #human rights, #Donald Trump, #Brexit, #LGBTQ, #big data
partner:Screen Shot
located:United Kingdom
Does your makeup define you? I hate the thought of my lipstick choice being used by data scientists to decide who I am. I might feel a shade of Velvet Teddy on a Monday in the office, but I'll be a smudged Lady Danger at 5am on a Saturday. I use makeup to explore my own identity, a slippery notion which is not at all easily expressed, even from my own lips. Those who wear or like makeup are almost always subject to prejudice: we are either wearing too much, wearing too little, or doing it wrong. People who have been criticised for wearing, wanting to wear or not wearing makeup will know that makeup has its own language.

Words by Eleanor Flowers

A glance at Instagram (owned by Facebook) seems to testify to the growing commitment amongst beauty professionals to champion diversity and creative self-expression. As Sissi Johnson, an industry scholar who lectures on ‘Multiculturalism in the Beauty Industry’, tells me, “beauty commodities have always been well-positioned for expressing the nuances of identity as they can be highly personalised. Makeup is particularly intimate and its significance to consumers varies throughout cultures and age groups”. My own concern, however, is that all our hard work towards fearless self-expression could go to waste. Beauty professionals offer up huge amounts of data to be organised and analysed by the tech gods, but are the values of today’s beauty industry being heard? How do our beauty choices translate into the language of data science?

Remember the Cambridge Analytica scandal? Cast your mind as far back as March 2018, when the international press exploded with cries of data breaches. Mainstream media outlets reported that Cambridge Analytica had gathered data from approximately 87 million Facebook users and used this information to design “mind-reading software”. Speculation then turned to whether this technology had been used to successfully target individual voters with political ads, and whether it was responsible for both Trump's victory and Brexit. In times of political turbulence, it is often simpler to blame technology than it is to blame people. While the idea of having your data used to read your mind is scary, there are even more sinister forces at play here. What if I told you that the scariest thing was that Cambridge Analytica had got people all wrong, but that they had taken it upon themselves to speak on other people’s behalf anyway?

When divining the innermost thoughts and feelings of voters in the U.S. and U.K., the scientists at Cambridge Analytica used “psychographics”. It is reasonable that most people wouldn’t question something that sounds so authoritatively sciencey. Here’s the thing though: the social science they used was problematic, to say the least. The original research behind Cambridge Analytica’s so-called “high-tech” methods was headed up by Michal Kosinski at the University of Cambridge in 2012. The team started by correlating personality features, for instance neuroticism, with online behaviour, such as displaying high levels of Facebook activity or, erm, having loads of friends. Fine. But then in 2013, they wrote another paper that lauded the “predictive power of likes”, and this is where things got silly. Kosinski and his pals wrote that likes for groups such as “Sephora” and “I Love Being a Mom” were highly accurate indicators of low intelligence. Misogyny, anyone? It is unclear what these scientists deem to be intelligent, but it doesn’t seem that makeup-wearing mothers stand much of a chance of being counted as such. Shopping at Sephora and mothering children have diddly squat to do with my intelligence levels and I am upset by people who tell me that they do. Kosinski has since raised concerns that this type of research might be unethical and was quoted in the Guardian as having said, “I did not build the bomb. I only showed that it exists”. How easy it can be to assert that technology exists separately from those who created it.
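It is worth seeing just how mundane the machinery behind the “predictive power of likes” reportedly is: a sparse user-by-like matrix, squashed down by singular value decomposition, with a plain regression fitted on top. Below is a minimal sketch of that kind of pipeline in Python; the data, the dimensions and the binary “trait” label are all synthetic inventions for illustration, not anything taken from the actual study.

```python
# A minimal sketch of likes-based trait prediction: reduce a sparse
# user x like matrix with SVD, then fit a logistic regression on the
# components. All data below is randomly generated for illustration.
import numpy as np
from scipy.sparse import random as sparse_random
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

n_users, n_likes = 5_000, 2_000
# Sparse binary matrix: each row is a user, each column a page they
# may have liked (a 1 means "liked").
likes = sparse_random(n_users, n_likes, density=0.01, random_state=0)
likes.data[:] = 1.0

# A synthetic binary "trait" label, standing in for whatever attribute
# is being predicted. In real research this would come from a survey.
trait = rng.integers(0, 2, size=n_users)

# Step 1: compress thousands of like-columns into a few dozen components.
svd = TruncatedSVD(n_components=50, random_state=0)
components = svd.fit_transform(likes)

# Step 2: an ordinary logistic regression over those components.
X_train, X_test, y_train, y_test = train_test_split(
    components, trait, test_size=0.2, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# With random labels the AUC hovers around 0.5 -- a coin flip.
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

The sketch makes the central point plain: a like for Sephora enters the model as one anonymous column among thousands, and the “prediction” is a dressed-up correlation. Nothing in the pipeline reads minds, which is exactly how it can end up confidently mislabelling makeup-wearing mothers.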

The paper then goes on to assert that a “like” for MAC Cosmetics is a good predictor of male homosexuality. Bafflingly, the researchers couldn’t work out why “gay men” hadn’t “liked” such groups as “Being Gay” or even “I Love Being Gay”. Instead, they had to make do with “likes” for makeup, this time as an imagined hallmark of gayness rather than stupidity. The kind of discourse that accounts for queerness in terms of being only a “gay man” is outdated and dangerous.

Kosinski and co. have since forayed into deploying deep neural networks to infer male sexual orientation from facial images. Basically, they are training computers to automatically spot which men look “gay” and which do not. The Stanford University study, which was based on data from white individuals only, was met with anger from GLAAD, the world’s largest LGBTQ media advocacy organisation. Jim Halloran, GLAAD’s Chief Digital Officer, stated that Kosinski’s research “isn’t science or news, but it’s a description of beauty standards on dating sites that ignores huge segments of the LGBTQ community, including people of colour and transgender people”. Kosinski's weird ideas about beauty choices serve principally to demonstrate that biases against people still exist and that they are reinforced by technologies such as psychographics. Beauty consumption is about expression, not crappy categorisation.
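For context on what “training computers to spot” means here, the reported recipe is not exotic either: run each photo through a pretrained face network to get a numeric embedding, then fit a simple classifier on top of those embeddings. The sketch below shows that shape of pipeline using random stand-in vectors instead of real network features and random stand-in labels; it is purely illustrative, and none of the numbers come from the actual study.

```python
# A minimal, illustrative sketch of an embeddings-plus-classifier
# pipeline. The "embeddings" are random vectors standing in for the
# output of a pretrained face network, and the labels are random too.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

n_photos, embedding_dim = 1_000, 128
embeddings = rng.normal(size=(n_photos, embedding_dim))  # stand-in features
labels = rng.integers(0, 2, size=n_photos)               # stand-in labels

# A plain linear classifier fitted on top of the embeddings.
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, embeddings, labels, cv=5, scoring="roc_auc")
print("mean AUC:", scores.mean())

# Whatever such a classifier picks up, it is patterns in the photos it
# was shown (grooming, makeup, camera angle), not an inner truth about
# the people in them -- which is GLAAD's objection in a nutshell.
```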

No algorithm could have predicted the wealth of diverse self-expression that much of the makeup industry and beauty-blogging world continues to nourish. But the problem is that these Instagram and Facebook “likes”, say a “like” for Sephora or MAC, are vulnerable to getting translated back into something we didn’t ask for. When we use social media platforms, we leave our data open to being used to misrepresent us and our consumer choices. This insult to lipstick wearers around the world is a gross mismeasure of fxmme. Your makeup bag is a weapon for positive change and those who ridicule your consumer choices need to know they have got it twisted.

Call to Action
Support the Russian LGBT Network to protect the rights and lives of LGBTQ individuals in Russia and Chechnya!