I am very biased

Recently, Brian Nosek from Project Implicit gave a lecture on bias at Stanford University. I had heard of his work before, but to experience it firsthand was quite an eye-opener. Imagine a lecture hall packed with 300 people, all faculty, research staff, or students at Stanford.

Lesson 1: Are you sure he just said that?

For this exercise, half of the audience closed their eyes. Brian then played a video of a man speaking two syllables and asked the audience what they had heard: baba, dada, or gaga. The blinded side of the audience heard baba. The open-eyed side of the audience was split between gaga and dada. The video was played a second time, all eyes open. Of course it was a trick video: the man made the lip motions of gaga, but an audio track of baba had been overlaid. The blinded side of the audience was not confused by the visuals and heard the truth. The unblinded side either ignored the audio and went with the visual (gaga), or blended audio and visual into dada.

Lesson 2: Are you sure what you just saw?

Brian now explained that he would show us a video consisting of two film strips overlaid on each other. Each strip showed a team of three people passing a ball to one another. We were to count how many times the white team passed the ball and to ignore the black team. With 300 people staring at the screen in full concentration, 100% of the audience got the count right. Yet 99.6% of us failed to see the woman with an umbrella strolling right through the middle of the game! The video was shown again, and we all gasped at what we had missed. The one person who had seen the woman the first time? He did not count, because he had seen the video before.

Lesson 3: Are you sure it was this person you talked to?

The audience was cut some slack this time; instead, we got to watch a video shot from the second floor of a college campus, observing people walking along. A guy dressed as a construction worker (yellow helmet, tool belt) asked one of the pedestrians for directions. In the middle of the conversation, two people carrying a door walked right between the construction worker and the pedestrian, secretly swapping the original construction worker for one of the door carriers (also wearing a helmet and a tool belt). Two thirds of the pedestrians did not notice the swap, even though the second guy wore a green sweater instead of a blue one, had no beard, and of course spoke with a different voice. I'll never trust an eyewitness on suspect identification ever again.

Lesson 4: Beat Cal

I will spare you a lengthy explanation and instead ask you to go to https://implicit.harvard.edu/implicit/ and choose one of the demonstration tests. We did a few with the audience and found a clear association of Stanford with positive words. We also learned that, as an audience, we were 0.8 seconds slower in associating women with "scientist." And we had a strong bias toward white Americans over black Americans (a gasp from the very embarrassed audience).

Lesson 5: What is in a name?

The researchers sent identical, appropriate CVs in answer to a large number of employment ads. The only word distinguishing the CVs from each other was the first name of the applicant. The group then compiled statistics on how many callbacks each CV received to advance in the application process. The result was clear: male first names got more callbacks than female ones, and white-sounding names got more callbacks than African-American-sounding names.

What have we learned?

We had to admit to ourselves how easily our senses are fooled. A yellow helmet can completely obscure the person wearing it. We don't see the obvious, out-of-place woman with the umbrella. And despite our best intentions, even knowing what the implicit association test is aiming for, we are biased. What do we do now?

Well, we took the first step and accepted that bias is something we cannot avoid, despite our best intentions. It is in our nature to distinguish between "us" and "them." Paying attention, however, goes a long way toward minimizing the negative impact of our biases. For example, the all-male orchestra that started auditioning musicians behind a curtain (to avoid bias toward an applicant who was a family member of the trombone player) suddenly discovered female musicians among its top contenders.

Unfortunately, it is virtually impossible for a faculty promotion committee to anonymize the promotion package of a faculty member aiming to take a step up the academic ladder. What can be done, however, is to look at the percentage of successful promotions depending on which "bias group" the applicants belong to. If there is a clear trend, maybe it is time to examine whether unconscious bias is at play.

Bias, in this context, is not intended to be a moral judgement. It is purely a neutral, scientific term.

The willingness or unwillingness to work with your own sets of biases to avoid unfairness, however, is a moral decision.
