
IN DEFENSE OF ALGORITHMS

IT ISN’T MY mathematical background that has me defending algorithms. Nor is it scads of angry algorithms responding to the last two days of SimanaitisSays, in which I shared Annalee Newitz’s analyses of social media.

Come to think of it, though, wouldn’t it have been fun if scads of algorithms had angrily responded?

No, this defense is based on my reading “Biased Algorithms Are Easier to Fix Than Biased People,” by Sendhil Mullainathan in The New York Times, December 6, 2019. Here are tidbits gleaned from this article.

Illustration by Tim Cook in The New York Times, December 8, 2019.

Sendhil Mullainathan is a professor of computation and behavioral science at the University of Chicago, precisely the expertise required to analyze the differences between human-devised mathematical algorithms and humans themselves.

Unintended Biases. Not all biases are intentional. Lack of analysis, indifference, or simple ignorance can bias matters. Mullainathan offers a poignant family experience in this regard: Newly arrived immigrants to Los Angeles when Sendhil was a kid, the family celebrated by going to Sears for a portrait, a rare thing in their native India.

The photos were a huge disappointment: “Our faces were barely visible,” Mullainathan writes. “Only the whites of our teeth and eyes came through. We learned later, much later, that the equipment had been calibrated for white skin.”

Racially Biased Studies. “In one study published 15 years ago,” Mullainathan writes, “two people applied for a job. Their resumés were very similar. One person was named Jamal, the other Brendan.”

“In a study published this year,” Mullainathan continues, “two patients sought medical care. Both had diabetes and high blood pressure. One patient was black, the other was white.”

Mullainathan says, “Both studies documented racial injustice: In the first, the applicant with the black-sounding name got fewer job interviews. In the second, the black patient received worse care.”

Humans and Algorithms—Both Potentially Biased. Mullainathan notes, “But they differed in one crucial respect. In the first, hiring managers made biased decisions. In the second, the culprit was a computer program. As a co-author of both studies, I see them as a lesson in contrasts. Side by side, they show stark differences between two types of bias: human and algorithmic.”

Human Biases. Human biases are cultivated over years, shaped by each individual’s background and experiences.

As Mullainathan notes, “Changing people’s hearts and minds is no simple matter. For example, implicit bias training appears to have a modest impact at best.”

Algorithmic Biases. By contrast, biases in algorithms are more easily identified. The bias in the medical study, for example, was later traced to its choice of input data. In assessing how sick patients were, Mullainathan notes, “they used the most readily available data, healthcare expenditures. But because society spends less on black patients than on equally sick white ones, the algorithm understated the black patients’ true needs.”
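To make the mechanism concrete, here is a minimal Python sketch of the proxy-label problem. It is not the study’s actual model or data; the synthetic patients, the 30 percent spending gap, and the simple linear model are all illustrative assumptions. The point is that a model trained to predict spending faithfully reproduces whatever spending gap exists, so equally sick patients in the underspent group score as less needy.

```python
# A hypothetical sketch of the proxy-label problem, NOT the study's code.
# All data is synthetic; the 30% spending gap is an illustrative assumption.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 100_000

illness = rng.gamma(2.0, 1.0, n)          # true sickness (what we want to rank by)
group = rng.integers(0, 2, n)             # two groups, equally sick on average
gap = np.where(group == 1, 0.7, 1.0)      # group 1 receives 30% less spending when equally sick

clinical = illness + rng.normal(0.0, 0.3, n)         # noisy clinical signal
past_cost = gap * illness + rng.normal(0.0, 0.1, n)
future_cost = gap * illness + rng.normal(0.0, 0.1, n)

X = np.column_stack([clinical, past_cost])

# "The most readily available data": the training label is spending, not sickness.
biased = LinearRegression().fit(X, future_cost)
scores = biased.predict(X)

flagged = scores >= np.quantile(scores, 0.9)     # top 10% get extra care
sickest = illness >= np.quantile(illness, 0.9)
print("group 1 among truly sickest:", round(group[sickest].mean(), 2))  # ~0.50
print("group 1 among flagged:     ", round(group[flagged].mean(), 2))   # well below 0.50
```

Note that the model never sees group membership; the bias rides in entirely on the spending label.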

Fixing the Algorithms. “Changing algorithms is easier than changing people,” Mullainathan says. “Software on computers can be updated; the ‘wetware’ in our brains has so far proven much less pliable.”
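What might that software update look like? In this hypothetical continuation of the sketch above (the setup is repeated so it runs on its own, and the sickness label is an assumed stand-in, not the study’s actual fix), changing a single line, the training label, from spending to a direct measure of sickness brings the flagged group much closer to the truly sickest:

```python
# Continuing the hypothetical sketch above; still synthetic, illustrative data.
# The "fix" is one changed line: train on a sickness measure instead of spending.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 100_000
illness = rng.gamma(2.0, 1.0, n)
group = rng.integers(0, 2, n)
gap = np.where(group == 1, 0.7, 1.0)
clinical = illness + rng.normal(0.0, 0.3, n)
past_cost = gap * illness + rng.normal(0.0, 0.1, n)
X = np.column_stack([clinical, past_cost])

# An assumed stand-in for a direct sickness label (e.g., a count of chronic conditions).
sickness_label = illness + rng.normal(0.0, 0.2, n)

fixed = LinearRegression().fit(X, sickness_label)  # the one-line "software update"
scores = fixed.predict(X)

flagged = scores >= np.quantile(scores, 0.9)
sickest = illness >= np.quantile(illness, 0.9)
print("group 1 among truly sickest:", round(group[sickest].mean(), 2))  # ~0.50
print("group 1 among flagged:     ", round(group[flagged].mean(), 2))   # much closer to 0.50
```

Past spending is still a feature here, so the gap shrinks rather than vanishes, but that is Mullainathan’s point: the remedy is an update, visible and testable, not a change of heart.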

Fixing computer code is easier than fixing human attitudes. ds

© Dennis Simanaitis, SimanaitisSays.com, 2019
