Thursday, October 27, 2016

Evil Software: Wells Fargo Edition

I've written a lot in the past about how important it is to pay attention to the values that are embedded in software. Code is written by people who have their own ideas about how things and people "ought" to work. When you're selecting tools for yourself (or the enterprise if you're in such a role) you often have to dig deeply to find the evil or good that lies just beneath the interface. Ignoring this phase of your software evaluation can lead you to some embarrassing and expensive failures.

This example of the software used to guide a Wells Fargo employee through a face-to-face customer experience is about as bad as it can get. It illustrates the evil that can be embedded to micromanage and control employees:

"Again and again, each time couched in slightly different terms, the survey checked on the employee to see if she had been sufficiently aggressive in cross-selling."

Unfortunately, there are far too many of these sorts of examples. For every algorithmically-based wonder product (e.g. Google Now Cards) there are tons of these horrendous examples. I post it as a warning of how bad the future of intelligent software will be if we are not vigilant.

I remain a techno-utopian to the core, but stories like these give me the creeps.


Stephen Judd said...

Perhaps the real evil will be found in the algorithms that are increasingly making decisions. While I haven't read the book yet, I've heard a couple of interviews with Cathy O'Neil, author of Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Besides an awesome title, she looks at the biases and assumptions built into many algorithms and how these can be corrosive.

Kevin Gamble said...

Good to hear from you! :) And good points!

I've been reading a lot about the bias built into the FB algorithms of late, and how they are being used to shape public opinion. Kind of scary to think about-- we're all being manipulated.

I just bought the book-- looking forward to digging in. Thank you Stephen!

Stephen Judd said...

I think the broader problem you bring up is that software apps and algorithms are generally constructed to optimize for one thing. To the developer, the side effects aren't his/her problem.

The Facebook algorithm has the goal of getting people to spend more time on FB. To the algorithm and its designers, it doesn't much matter how that goal is achieved. Additionally, some of these algorithms have accumulated from so many A/B tests that no one actually understands all of the moving pieces...

I'll be interested to hear what you think of the book. The interviews I've heard have been compelling.

Hope you get to go snowboarding soon, if you haven't already.
