
Through an algorithm, darkly?

Bethany Keeley-Jonker

Two recent news stories, about the “right to be forgotten” by Google and Facebook’s surreptitious newsfeed study, have me thinking about what technology enables us to know about each other.

In May, the European Union affirmed a “right to be forgotten,” under which people are asking for news stories and other pieces of information to be scrubbed from Google searches. It’s an amazing image of public grace that also might lead to abuse. The problem is as old as gossip: some things are hard or even impossible to live down. But it’s also new: the facts of our past are only a search query away.

I might argue that the EU solution, letting us delete things, is an odd way to address the real problem. The incomplete picture Google offers has a kind of cultural credibility that other incomplete pictures don’t. I might learn some things about you by asking your relatives or former co-workers, but I wouldn’t think those opinions were complete or objective. An Internet search doesn’t remind us of its own partiality the way a human source does.

The recent controversial Facebook study brings the same issue to light in a different way. Facebook’s newsfeed already shows you only a partial view of your friends’ posts; the researchers used that fact to justify filtering out even more posts, those with positive or negative emotionally charged words, for some test subjects. The study found that those who saw fewer positive posts were more likely to post negative content, and vice versa. The authors argue this is evidence of “emotional contagion” through social networks.


Many objectors are angry that Facebook users were manipulated without the opportunity to specifically consent. An XKCD cartoon suggests the study merely draws attention to the manipulation to which we are always subject.

I’m not thrilled about that aspect of the study. But I think it’s fascinating that the researchers seem to have fallen for Facebook’s own deception: they assumed that the content people post to Facebook is a reliable representation of their emotional life. I’m not saying that folks don’t feel or mean what they post. Most of the time they probably do. But the parts of our lives we represent on Facebook, just like the parts that show up in a Google search, are necessarily partial. We select what we think is post-worthy, and this study more likely shows that we take cues from our communities about what kinds of things to select.

In 1 Corinthians 13, Paul contrasts our partial knowledge and God’s complete knowledge, using the image of a blurry mirror. “For now we see only a reflection as in a mirror; then we shall see face to face. Now I know in part; then I shall know fully, even as I am fully known.” The King James Version puts it this way: “For now we see through a glass, darkly.”

When an algorithm chooses which parts we know, we deceive ourselves into thinking we see the whole picture of each other, the way God does. Perhaps it would be good to remind ourselves that this knowledge is actually limited in a variety of ways, and that the God who does see all still loves and forgives us anyway, while asking us to treat each other the same.
