
Coding the victim: How artificial intelligence is creating gender bias among consumers

Artificial intelligence is creating a gender bias among consumers. Female-driven consumer spending has doubled globally, from $20 trillion in 2009 to $40 trillion in 2018. Women are responsible for around 80% of all consumer purchasing, and they often make the purchasing decisions for their households. Yet the consumer industries continue to ignore these facts, in an environment where women drive the global economy.


Women mean big business


If you were to take a look at some of the biggest businesses in the country right now, you’d be unsurprised to see that most of them are driven by female consumers.


Pretty Little Thing, Boohoo, and Missguided are just some of the big hitters in the rapidly growing fast fashion industry, and female consumers play a huge role in driving it.


Female consumers drive more than just the fashion industry. In fact, according to this study, women make the purchasing decisions for 94% of home furnishings, 92% of travel destinations, and 91% of homes.


Women are not just contributing to the consumer economy by frivolously scrolling through ASOS. They’re driving the consumer economy globally through their dominance of industries like fitness, food, beauty, and real estate.


It isn’t only first-hand consumption that puts women at the top of the consumer economy, either. In almost every society in the world, women act as the primary caregivers for children, the elderly, and the vulnerable. This means that women are often buying on behalf of the people they care for. They are multiple markets in one.


Artificial (un)intelligence


So why are women still so criminally underestimated? And, even more importantly, why are emerging technologies so determined to overlook them?


Artificial intelligence is new, and pretty scary. Few people really know what it does or how it works, including many of those who implement it. Understanding the code, and how to deploy it, takes seriously advanced knowledge of the subject. Naturally, then, we leave the experts to it and hope they know what they’re doing.


Unfortunately, this means there is limited accountability when it goes wrong. After all, how do you blame an algorithm when things turn sour?


As with all new technologies, the bright, shiny piece of tech has everyone in a daze. With promises to improve service, enhance customer experience, and save everyone money, we rarely take a step back and ask: do we actually need this?


In many cases, and especially with artificial intelligence, we don’t. Meredith Broussard, a leader in data journalism, calls this unnecessary reliance on tech ‘technochauvinism’. And it’s more dangerous than we think.


Gender bias and credit


A couple of months ago, Apple co-founder Steve Wozniak and his wife ran into issues applying for Apple’s new credit card. He wasn’t the only one, either. Tech entrepreneur David Heinemeier Hansson spotted something fishy, too.


After applying for Apple’s latest and greatest venture, Hansson noticed that his wife had been offered one-twentieth of the credit limit he had, even though she had the better credit score.


Wozniak, similarly, found himself offered ten times his wife’s credit limit, even though they share their assets and her credit score was, again, the better one. But how does this happen? Surely a credit algorithm doesn’t see ‘female’ on the application and automatically offer less? It’s not quite as straightforward as that.


Algorithms are built from rules and the data they are trained on. When judging whether someone should be approved for credit, for example, they weigh a long list of factors, not just whether the applicant is male or female.


Interestingly, consumer habits are a big factor here. Where a person shops can determine whether they are approved for credit, and for how much. So while the algorithm may never see ‘female’ and make a judgement based on gender directly, it does look at where that person shops, how much they spend, and how often, and it makes judgements based on that.


But how does this work?

Let’s look at a hypothetical situation:


There’s a woman who works a high-powered job with a great salary and disposable income. Her husband is in a similar position, but he can’t really be bothered with the weekly shopping. The woman does the food shopping, buys the kids’ birthday gifts, and scrambles through Amazon late at night looking for those eco-friendly cotton pads her friends told her about.


Naturally, as consumerism is led by women, she spends more than her husband, because he simply doesn’t want to. They both apply for a credit card; he is offered more credit because she looks like a ‘reckless spender’. This is how these algorithms work, and they’re insanely tricky to hold to account.
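To make that concrete, here is a deliberately simplified sketch, in Python, of how a scorer like this might behave. Everything in it is invented for illustration: the field names, thresholds, and penalty are hypothetical, and no real credit model is this crude.

# Hypothetical sketch of "proxy" bias: the scorer never sees gender,
# yet it penalises a spending pattern (frequent retail purchases)
# that, in practice, tracks whoever does the household shopping.
# All names and numbers below are invented for illustration.

def credit_limit(applicant):
    limit = 10_000                               # hypothetical base limit
    if applicant["credit_score"] > 700:          # reward a strong score
        limit += 5_000
    if applicant["purchases_per_month"] > 40:    # penalise "frequent spenders"
        limit -= 7_000
    return max(limit, 1_000)

# Same household, shared assets; she simply does the family's shopping.
wife = {"credit_score": 780, "purchases_per_month": 55}
husband = {"credit_score": 720, "purchases_per_month": 10}

print(credit_limit(wife))      # 8000
print(credit_limit(husband))   # 15000

The scorer never asks for gender, yet it hands the wife the lower limit despite her better credit score, purely because her shopping pattern trips the ‘frequent spender’ rule. That, in miniature, is why ‘we don’t use gender’ is not the same as ‘we don’t discriminate by gender’.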


What does she do now? They both know this isn’t right: she runs the family’s finances. She price-checks online in the supermarket, makes sure the bills are paid on time, and patches up her son’s jeans when they get a small hole at the knee. She’s responsible.


How can this machine tell her she’s reckless with her money?


She can complain, as many have, but the truth is that even those who implement the algorithms don’t really understand how they work. And because gender is never used directly, the companies behind them can deny the bias.


It’s not just women…

While Apple and Goldman Sachs (the bank that handles the financial side) released a statement claiming that gender isn’t even a factor the algorithm considers, that doesn’t mean the cultural behaviours exhibited by women aren’t considered.


Unfortunately, it isn’t just credit card applications where this is happening, and it isn’t just women being affected. It’s affecting minorities of all kinds.


There is currently a system in the US called COMPAS that is being used to inform sentencing decisions. Studies, most notably ProPublica’s 2016 investigation, have shown that the algorithm exhibits racial bias against African Americans, and that its predictions are unreliable, too.


It’s starting to become apparent that, across all the ways artificial intelligence is being used, it is far from able to make these decisions the way a human can, because it has no sense of context.

Yes, the married mum of four may be spending more, but only because she has to. Yes, the African American man charged with petty theft may offend again; but the white man sentenced before him, charged with armed robbery for the third time, poses a far higher risk of reoffending. That’s okay, though, apparently, because he’s from a middle-class neighbourhood.


Where do we go from here?


The point is, the use of algorithms shows no sign of slowing down. In fact, more and more companies are adopting the technology to increase profits and reduce overheads; in 2019, the global artificial intelligence market reportedly grew by 154%.


This means, of course, that we can expect to see this technology invading more areas of our lives. In the US right now, facial recognition is being trialled as a way for people to enter their own apartment buildings. And why? Just because.


It’s the shiny new toy that everyone wants, but if it continues to spread without accountability, we can be sure to see more instances of discrimination based on gender and race.


It’s sad that, in a world where women dominate the consumer economy, they are treated with so little regard. Women are smart, far smarter than any multinational conglomerate will ever give them credit for. So it’s time to start putting our money where our mouths are and stop supporting companies that show so little respect for half of the world’s population.
