Friday, June 21, 2019

leftovers #2 – true, false, racist, or… not funny?

Last week, I applied this post’s ‘Not Funny?’ framework to a Heineken ad as a way to consider how companies cannot fail at humor in the same way individuals can. While researching the ad, I thought it would be a good idea to look up some other racist commercials and see if I could gain any insight into this particular subset of advertising. I had a vague recollection of Nike putting out a questionable soccer commercial involving Japan around two decades ago (it turned out to be Pepsi, but close enough). I went on YouTube to start my investigation and entered ‘racist japan nike commercial’ into the search bar. And I ended up with…

A Chinese detergent ad? Hey now, YouTube!

I’m not going to get all huffy and indignant here (well, to the extent that being here writing about it doesn’t count as ‘getting huffy and indignant’). This little search engine redirect is hardly An Event, just a result of a generic search tool on a big website. If needed, I’m sure I could even give a decent race-free technical explanation of why my search terms redirected to an unrelated detergent ad. But I suspect that somewhere in the logic linking my search to a Chinese ad is a loose association based on the shared racial grouping of Asian. To some degree, I suppose the logic of search is the logic of racism – both take a starting idea, apply a series of casual associations, and then lump as much as possible under the newly formed umbrella.

I'm not necessarily pointing an accusing finger at YouTube. In terms of its search algorithm, the part I’m referring to – ‘japan’ – likely wasn’t the main reason the search returned a Chinese result. My best guess is that the words ‘racist’ and ‘commercial’ explain most of the result. And although I reject linking ‘japan’ to a Chinese ad, this ad is a huge deal – at the time of writing, it had tens of millions of views. If YouTube gets even the vaguest hint of a reference to such a popular video, it’s going to include it in as many sets of results as possible. When it comes to search, the algorithm will always prioritize the most commonly watched videos because those are the videos a given user is most likely trying to find.

This last point, I think, is the most important one. YouTube search is a lot like a mediocre journalist – just as the latter is never going to tell the boss there is nothing to write about on a given day, YouTube is never going to return an entirely blank page. If the search terms don't produce exact matches, the algorithm errs on the side of returning a list of popular videos to which it can make any kind of casual association. It’s for this reason, and probably this reason alone, that my search for a soccer commercial returned a detergent ad.
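The fallback behavior I’m describing can be sketched in a few lines of code. This is purely a hypothetical illustration of the idea – the video data, tags, and ranking logic below are all invented, and bear no relation to how YouTube’s actual system works:

```python
# Hypothetical sketch of the "never return a blank page" fallback described
# above. All videos, tags, and scoring here are invented for illustration.

def search(query, videos):
    terms = set(query.lower().split())

    # First pass: videos matching every search term, ranked by popularity.
    exact = [v for v in videos if terms <= v["tags"]]
    if exact:
        return sorted(exact, key=lambda v: -v["views"])

    # Fallback: rather than return nothing, accept any loose association
    # (a single shared tag) and let popularity dominate the ranking.
    loose = [v for v in videos if terms & v["tags"]]
    return sorted(loose, key=lambda v: -v["views"])

videos = [
    {"title": "Pepsi soccer ad",
     "tags": {"pepsi", "soccer", "commercial"}, "views": 50_000},
    {"title": "Chinese detergent ad",
     "tags": {"racist", "commercial", "detergent"}, "views": 40_000_000},
]

results = search("racist japan nike commercial", videos)
```

No video matches all four terms, so the hugely popular detergent ad – sharing only ‘racist’ and ‘commercial’ with the query – ends up at the top of the results.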

Search is like any other tool: understanding how it works makes it more likely to function properly – and less likely to create misunderstandings. No one accuses a hot stove of bigotry when it burns someone because we all know how a stove works. Of course, there are more serious implications to this idea. As we incorporate advanced software into an ever-increasing number of our decision tools, it becomes critical for users to understand how their new tools work. This article describing the problems with facial recognition software is a great example. It’s not that we should reject such tools out of hand – rather, we need to make sure users understand how a tool works so that it can be deployed in the right situations, its power harnessed to help people become better at their work.