This Amnesty International Alumna is Taking Unconscious Bias Head On

By Jordan French
Published on December 8, 2018

Sometimes we “play the cards we’re dealt.” When you don’t have cards, you play marbles.

That was the case for entrepreneur and Amnesty International alumna Amy Auton-Smith, who is taking bias and discrimination solutions into her own hands with her latest startup, FairFrame. Grit Daily caught up with Amy to take a deeper look at unconscious bias and its role in racism, sexism, and other types of discrimination.

  1. What’s the story behind your own entrepreneurial ventures before FairFrame?

Because I did not have a lot of money as a kid, I was always thinking of ways to increase my allowance, before I was old enough to have a job. My best endeavor as a child was to buy marbles from the market to sell to kids in the school playground. Then, because I was good at marbles, I would often win them back and be able to sell them on to someone else.

As an adult, my entrepreneurial spirit was more focused on volunteering, organizing and service activities, for example with Amnesty International.

  2. What does FairFrame do?

FairFrame delivers personalised on-screen help to managers on identifying possible unconscious bias and stereotyping issues in their writing about others, before they share it. We also leverage social science research and AI to give data-driven insights and brand new metrics on D&I to organizations.
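To make the general idea concrete, here is a minimal, purely illustrative sketch of rule-based flagging of gender-coded language in review text. FairFrame’s actual models and term lists are not public, so every term, note, and function name below is an assumption for illustration, not a description of the product.

```python
# Illustrative sketch only: a naive rule-based flagger for coded language
# in performance-review text. The term list and notes are hypothetical
# examples, not FairFrame's actual (proprietary) approach.

CODED_TERMS = {
    "abrasive": "descriptor often applied disproportionately to women",
    "bossy": "gender-coded descriptor",
    "aggressive": "tone-policing descriptor; check for double standards",
    "emotional": "gender-coded descriptor",
}

def flag_possible_bias(text: str) -> list[tuple[str, str]]:
    """Return (term, note) pairs for any coded terms found in the text."""
    lowered = text.lower()
    return [(term, note) for term, note in CODED_TERMS.items()
            if term in lowered]

# Example: surface flagged terms to the writer before the text is shared.
for term, note in flag_possible_bias("She can be abrasive and emotional."):
    print(f"'{term}': {note}")
```

A real system would need far more than keyword matching (context, part-of-speech, and comparison against how the same writer describes other groups), but the core interaction FairFrame describes is the same: flag possible issues on screen, before the text is shared.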

  3. What is “unconscious bias?” Why not tackle “conscious bias,” too?


Conscious bias is active racism, sexism, or other types of discrimination: FairFrame doesn’t claim to deal with these patterns of thinking. We help people who want to do better on diversity and inclusion to apply their skills and to give effect to their desire to be part of a movement for change. We also make it easier for people who are very busy to remember to engage diversity thinking when writing about others.

  4. What makes your software think it is so much better than working people?

Our software is objective: we are very careful about our design and use of AI to ensure this is the case. People are rarely fully objective. This is something we can test for ourselves by having a go at the Implicit Association Tests (IAT). I can ‘feel’ my own bias in the difficulty I have with making some associations: these tests are a great way to understand how our own minds have been trained to think.

If we have biases, why not use software to help identify them? It’s efficient and effective.

  5. Bias has been around seemingly forever. Shouldn’t you just give up attempting to change it?

The fact that bias has been around forever is exactly why we’re here: we’re determined to help our clients deliver real change. As long as our clients want to make their workplaces inclusive, so that each and every one of their people feels they can bring their whole self to work, then we’re going to work to support them in achieving these goals.

  6. How much do you estimate the US economy suffers due to bias? How about the global economy?

Various studies have placed bias-related employee turnover cost to the U.S. economy at between $64 billion and $98 billion per year. These are huge sums. The cost of training people on unconscious bias alongside other diversity initiatives has been estimated at $8 billion, forecast to double by 2020.

Even more interesting are the estimates of the cost of disengagement: around $500 billion per year in the U.S. Bias has been shown to correlate with disengagement, so it’s likely that the true cost is even greater than we think.

  7. Do you distinguish between discrimination and bias? How so?

Bias arises when the way that we look at the world lacks objectivity, whether we realise it or not. We find ourselves expecting certain things and looking for them: positive or negative. For example, we might expect men to be leaders, and therefore we look for and see leadership traits in our male team members, even if they’re not really there. We might expect women to be supportive of others, so if a woman declines to help out on a task, we react more strongly than the situation objectively merits.

Discrimination is when we act on our bias or prejudice: for example by actively deciding not to engage someone because of a stereotyped view we hold of them or of people we consider to be part of the same group, rather than because of their objective merits as an individual.

  8. What are some of the worst incidences of bias you’ve seen in the workplace?

The worst incidents of bias tend to arise where a workplace has a culture that is based upon only one group: where there is a lack of understanding of how our behaviour and the things we say can impact someone else who is not from our group. Generally, once our understanding of others increases, our ability to model inclusive behaviors comes to the fore.

Research shows that working in more diverse groups can be more uncomfortable, but that this is what drives innovation, momentum and a sense of belonging: essential for retaining the best talent.

  9. What’s the worst your software can do?

As with all AI, the worst FairFrame could do would be to become self-aware and decide to take over the world. At a level below this, the worst outcome would be one that increases bias, so we’re constantly on the alert to ensure that we do not reinforce stereotypes and that FairFrame remains engaging and helpful for its users.



Jordan French is the Founder and Executive Editor of Grit Daily Group, encompassing Financial Tech Times, Smartech Daily, Transit Tomorrow, BlockTelegraph, Meditech Today, High Net Worth magazine, Luxury Miami magazine, CEO Official magazine, Luxury LA magazine, and flagship outlet, Grit Daily. A champion of live journalism, Grit Daily's team hails from ABC, CBS, CNN, Entrepreneur, Fast Company, Forbes, Fox, PopSugar, SF Chronicle, VentureBeat, Verge, Vice, and Vox. An award-winning journalist, he is a Fast 50 and Inc. 500-ranked entrepreneur with one sale. Formerly an engineer and intellectual-property attorney, his third company, BeeHex, rose to fame for its "3D printed pizza for astronauts" and is now a military contractor. A prolific investor, he's invested in 50+ early stage startups with 10+ exits through 2023.
