moyalynne, Active Contributor
Adapted from the SAP Mentors Webinar, April 9. (Disclaimer: views my own.)

If you’re a technologist and you’re sitting it out because you think the buzz-phrase “diversity in tech” doesn’t apply to you, I challenge you to recognize that you’re part of the problem. Diversity isn’t just an HR issue, a talent pipeline issue, a question of recruiting, or an embarrassing statistic for major tech corporations. It’s a technology issue – and as a technologist, you can do something about it.

It’s been a month since the Lesbians Who Tech Summit 2018, and this may be my favorite lingering insight.



Key themes from Lesbians Who Tech Summit 2018 – images via Camille Eddy

Since the organizers at Lesbians Who Tech worked hard to intentionally include a variety of marginalized technologists – featuring 100% female speakers, 50% speakers of color, and 10% transgender and non-binary speakers – it’s no coincidence that key themes about bias in the machine were repeated across various eye-opening sessions. As technologists, it is especially our responsibility to recognize when the status quo in critical emerging technologies such as machine learning, artificial intelligence, and the algorithm economy is inherently biased; if we don’t, we face unintended consequences.



Camille Eddy: The Cultural Bias in AI - images via Camille Eddy

The Cultural Bias In AI


Camille Eddy's brilliant talk at the Summit on the cultural bias that sneaks into AI (find her similar talk online) invites us to do just this. Too often, she said, the data sets we use to train the models behind our mathematical algorithms are not actually diverse, and if we’ve failed to source diverse data sets, the foundation of our technology is biased and skewed – unable even to recognize the diverse people who use it. “What happens,” she asks, “when any segment of the population is invisible to the technology we use?”

Segregated technology happens. In her work with robotics, Camille relates how people of color are often invisible to or misidentified by facial recognition. You may have heard of the soap dispenser whose sensor couldn’t recognize black skin. You may also know that some social media photo filters have trouble optimizing black skin the same way they optimize white skin. Did you know that some optical sensors in fitness trackers struggle to properly measure pulse through black skin? When wearables are built without sourcing diversity, the consequences in our tech-reliant world can be life-threatening.
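One small illustration of how such a gap could be caught before a product ships – a minimal sketch in Python, where the labeled training set, the `skin_tone` field, and the 10% threshold are all hypothetical placeholders, not any real vendor’s pipeline:

```python
# Minimal sketch: audit a labeled training set for representation gaps.
# The dataset and its group labels are hypothetical placeholders.
from collections import Counter

def representation_report(samples, group_key, min_share=0.10):
    """Flag any group that falls below a minimum share of the data."""
    counts = Counter(sample[group_key] for sample in samples)
    total = sum(counts.values())
    for group, n in sorted(counts.items()):
        share = n / total
        flag = "  <-- underrepresented" if share < min_share else ""
        print(f"{group}: {n} samples ({share:.1%}){flag}")

# Hypothetical face-image metadata; a real audit would use your own data.
training_set = [
    {"skin_tone": "light"}, {"skin_tone": "light"}, {"skin_tone": "light"},
    {"skin_tone": "light"}, {"skin_tone": "medium"}, {"skin_tone": "dark"},
]
representation_report(training_set, "skin_tone")
```

The check itself is trivial; the point is that it only happens if a technologist decides the question is worth asking.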

How can we prevent biased technology from being released into the market? And when we fail to prevent it, Camille asks, “How do we actually interrupt the bias as it happens?”



Filipe Roloff, in support of a colleague recovering from being attacked

In case you are tempted to think of this as “just” a black thing or a gay thing or a female thing, consider that there are 7.6 billion of us on planet Earth – each of us different. At SAP alone, we have people on the spectrum bringing special talents, people in wheelchairs helping us problem-solve, blind people who help make our user interface better, veterans, people who’ve immigrated and adapted to wildly different situations, and yes, even straight white cis males – all of us together helping make better technology just by being who we are. Yet each of us is reduced to a limited set of descriptors in algorithms that are used to profile us, serve us more technology, and influence us.



Sandberg and Swisher at Lesbians Who Tech Summit 2018 - images via Twitter and Lesbians Who Tech

When Bias Leads to Weaponization


Probably no company is facing a more public challenge over data profiling right now than Facebook. But beyond the current sensationalism around data profiling, protection, and privacy, there is another important question I fear we risk missing: the potential weaponization of bias.

What are tech providers’ roles, now and going forward, in tempering technology that serves and exploits bias – in essence using biased algorithms to manipulate our own confirmation bias – to such a massive degree that it can influence elections, the very fabric of society?

Should tech providers also have been responsible for predicting this would happen? As Kara Swisher asked Sheryl Sandberg in their talk at the Summit, “How did you not predict your platform would be weaponized?”

Sandberg admitted: “We’re playing catch-up” – yet this is far from just Facebook’s issue, or a matter of recruiting diversity. It’s not just somebody else’s problem, for example, that black technologists make up fewer than 2% of the workforce at our elite Silicon Valley companies. If we’re not all asking questions and testing for bias – or for the intent to cause harm (to ourselves or others) – we can consider ourselves all part of the problem.



Actual Photos

I, too – a gay white female in tech who thinks and writes often about diversity issues – am part of the problem, with a network as inherently biased as that of any other single person in tech.

As magical as the Lesbians Who Tech Summit 2018 was, its wild conference diversity did not happen automatically. Lesbians Who Tech had to fight (others and themselves) to be intentional, inclusive, and intersectional. (Trigger alert: the (other) “q” word may appear.)

Which is the good news: that means we can all do it. We can question our bias. And we can test for it.

Disarming The Bias


 

There are three phases I think technologists need to go through to address bias in technology.

1. Get over it.

Pardon my directness, but first, we can check our privilege and get over (or at least promise to try to get over!) our issues with words like “quota” – words that might actually serve a productive purpose toward the diversity goals we say we have.

 

2. Check ourselves.

Then, we can take a look at who surrounds us and challenge it: Is there a diverse tribe in this room? Does this meeting room represent the communities in which we actually live? The world at large? The people who are going to use this tech or rely on it for their lives?

 

3. Be the technology.

Finally, if any of that makes us uncomfortable: we’re technologists, and as technologists, we can test our code for the same bias. Look at the diversity – or lack of it – in your data sample, and at how your technology behaves when it is used by a variety of people. There are also emerging services (built by, yes, technologists) designed specifically to help interrupt bias in tech.
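One simple way to run such a test – a minimal sketch with a stand-in model and hypothetical group labels, not any real system – is to measure how the same code performs for different groups of users rather than only in aggregate:

```python
# Minimal sketch: evaluate a model per demographic group, not just overall.
# The "model" and test data below are hypothetical stand-ins.
from collections import defaultdict

def accuracy_by_group(model, test_samples):
    """Compare accuracy per group; a large gap is a bias signal to investigate."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for sample in test_samples:
        group = sample["group"]
        totals[group] += 1
        if model(sample["input"]) == sample["label"]:
            hits[group] += 1
    return {g: hits[g] / totals[g] for g in totals}

# Usage with a stand-in "model" that is blind to part of its audience:
fake_model = lambda x: "recognized" if x == "light" else "unknown"
test_samples = [
    {"group": "light", "input": "light", "label": "recognized"},
    {"group": "dark", "input": "dark", "label": "recognized"},
]
print(accuracy_by_group(fake_model, test_samples))  # {'light': 1.0, 'dark': 0.0}
```

An aggregate accuracy number would hide exactly the failure this surfaces.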

Three inspirations from Lesbians Who Tech:



Ana Arriola - images via Twitter

  • Ana Arriola from Facebook talked at the Summit about how they are exploring “benevolent uses of AI,” such as detecting suicidal intent in a post, or ensuring we never misgender or harm anyone online.

  • Aubrey Blanche from Atlassian talked about technology to review job descriptions for biased and gendered words and interrupt them (https://textio.com/), which echoes SAP’s own Business Beyond Bias. (A toy sketch of the idea follows this list.)

  • To help you get beyond your own biased hiring networks (yes, all of our networks are inherently biased), Lesbians Who Tech is architecting include.io (https://include.io/), a tool to help validate nontraditional and underrepresented talent.
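To make the second idea concrete, here is a toy sketch of what a gender-coded-language check might look like. The word lists are tiny illustrative samples, and this is in no way Textio’s actual (far more sophisticated) model:

```python
# Toy sketch of the idea behind tools like Textio: flag gender-coded words
# in a job posting. The word lists are illustrative samples only.
import re

MASCULINE_CODED = {"rockstar", "ninja", "dominant", "aggressive", "competitive"}
FEMININE_CODED = {"nurturing", "supportive", "collaborative"}

def flag_gendered_words(text):
    words = set(re.findall(r"[a-z]+", text.lower()))
    return {
        "masculine_coded": sorted(words & MASCULINE_CODED),
        "feminine_coded": sorted(words & FEMININE_CODED),
    }

posting = "We need an aggressive rockstar developer to own the market."
print(flag_gendered_words(posting))
# {'masculine_coded': ['aggressive', 'rockstar'], 'feminine_coded': []}
```

The value of real tools in this space is the research-backed word lists and suggestions behind them; the interruption itself can be this simple.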


And there are many other tools in this essential space – some just coming out, some not even invented yet. Perhaps you will create one.

You can be part of the solution.

 

 