
The algorithms! They’re after us!

September 29, 2011

I had a discussion with my roommate last night, and we got onto the topic of Google searches. He's concerned, and perhaps rightfully so, that we are living in an age where newer, "smarter" algorithms filter, predict, and even control the kind of content we see online. This happens on Google, it happens on Facebook, and it's pernicious. One of his biggest concerns is that this usually occurs while we're blissfully unaware of the paternalism and taste manipulation going on behind the scenes. The idea that someone is designing something to control the information he sees, without his knowing it, is really disturbing to him. He even sees it as a kind of censorship.

Somehow, I’m a little less alarmist on this topic. Here’s some of my reasoning:

Humans have always lived in "closed circle" societies, and information has always tended to circulate somewhat incestuously. If your friends are reading Dostoyevsky, chances are you are, too. If your entire family is nuts for the Packers, chances are you're going to know a lot about the team, too. Search engines that try to predict what you're likely to be interested in, based on your past preferences and your friends' interests, are not inherently wrong; they're just "old-fashioned," in a way.

And since the dawn of time, there have always been barriers to information. In the days of the Pony Express, information exchange was limited by distance, time, and cost. Even in today's print publishing world, there are very real, very present "gatekeepers" to information promulgation. When I was interning for a non-profit publisher here in the Twin Cities, I can't tell you how depressing (but ultimately necessary) it was to see stacks of hopeful manuscripts piled five feet deep, waiting for rejection letters. The process by which this gatekeeping happens is tacit and tricky to pin down. It can amount to a college intern on a Monday morning, bleary-eyed from last night's partying, glancing over the first few pages of your manuscript and tossing it on the rejection pile. By that standard, it's a huge injustice, if you believe that all information has a right to spread its wings and fly off to eager audiences. (I'm not, for the record, a believer in this principle. At least not in print, where there are very real resource constraints on production and distribution, and where publishing Grandma Cindy's fruitcake recipe book just might not be the best use of those resources.)

The beauty of the Internet age is that Grandma Cindy may actually have a chance. The bane of the Internet age is that Grandma Cindys everywhere now have a chance. Right now, we're dealing less with the gatekeeping issue and more with the very real dilemma of having too much information floating around for any normal human being to grasp without some sort of search aid. I'm not naive enough to think that we've eradicated gatekeeping altogether: the overhead and skill required to publish online are still high, and people are still constrained by governments, by peers, and by ideologies that intimidate and bully the principles of free information exchange. There are still corners of the world where cheap, global information-dissemination technology has not yet penetrated.

But I think my point is this: these barriers are not new. They were not created by the Internet or Google, and while the Internet and Google may both reinforce and erode these barriers, we need to look to ourselves first. It is unrealistic to place too much responsibility on our technologies, whether for putting us in philosophical prisons or for letting us float free in the flow of emancipated information.

I think part of the moral is this: learn what your search algorithm does for you, and if you don't want your Internet filtered in a Google-ian way, then don't use Google. There are not a lot of "flat" search options around, nor would you want there to be; you'd simply get too much crap to be useful. That said, we should be empowered to figure out the distortions operating behind our searches and to exercise some instrumental control over them, if we want. Contrary to popular assumption, Google and "the Internet" are not logical equivalents. There are other search engines. You could go back to the days when we mostly used direct links, or just clicked around from link to link along "webrings." (Remember those?) You could use RSS to get streams of information from your favorite news sources and blogs, or use Twitter and leverage your friends to help you find exciting and relevant information. There are huge possibilities out there.

At the same time, it comes down to the simple fact that if you're seeing one thing, you're not seeing another. We're not superheroes (at least, most of us aren't), so we need to lighten up on ourselves about this. There is only so much time in the day, so some sort of filtering will be necessary if we're going to find interesting stuff online while still leaving time for things like eating, showering, and feeding the cat.
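To make the filtering problem concrete, here's a deliberately toy sketch, invented for illustration and bearing no relation to how Google or anyone else actually ranks results. It just scores articles by how often you've clicked their topic before, which is enough to show how a feed narrows toward your existing tastes:

```python
from collections import Counter

# Hypothetical sample data: a handful of articles, each tagged with a topic.
ARTICLES = [
    {"title": "Packers win again", "topic": "sports"},
    {"title": "New Dostoyevsky translation", "topic": "books"},
    {"title": "City council budget vote", "topic": "politics"},
    {"title": "Draft day preview", "topic": "sports"},
]

def rank(articles, click_history):
    """Order articles by how often the user has clicked their topic before."""
    clicks = Counter(click_history)
    return sorted(articles, key=lambda a: clicks[a["topic"]], reverse=True)

# A reader who has mostly clicked sports stories...
history = ["sports", "sports", "books"]
feed = rank(ARTICLES, history)

# ...gets a feed with sports on top and politics at the bottom,
# no matter how important the politics story might be.
print([a["topic"] for a in feed])  # -> ['sports', 'sports', 'books', 'politics']
```

The feedback loop is the worrying part: whatever floats to the top is what gets clicked next, which pushes it further up next time. Nothing in the sketch is malicious; the narrowing falls out of the scoring rule itself.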

This is not to say we shouldn't be circumspect about the types of algorithms we're using. I can think of two big caveats where we are justified in raising alarm:

1) For me, the biggest concern is that we maintain a plurality of tools to filter information. And no, Google alone does not count as a “plurality”. We should be able to use RSS and Twitter and Bing and e-mail campaigns and word-of-mouth, etc. to help steer us to information that is relevant to our lives and our ideals. Now, we can leverage not only those “closed circle” friendships of old, but also a pretty hefty global cloud of minds to steer us to good information. Subscribe to a mailing list, ask your friends about their favorite websites, go read a local paper online and see where this all leads you. Our ways of getting information in the Internet age exist on a continuum with, and do not have to be a radical break from, face-to-face modes of communication. And just like the days of old, the path to balanced knowledge is through plurality and triangulation.

2) When search algorithms start advancing commercial ends, especially when they do so below our threshold of awareness and in ways that are not transparent, then we need to grow concerned. There may well be a good critique of Google AdWords or Facebook's marketing schemes in this respect, but I'll leave that argument for another day. Basically, insisting on principles of transparency, and on more self- and societal awareness about how we're being marketed to, could go a long way.

Have you thought about this before? Have you thought about it even more than I (and my roommate) have? Would you like to think about this more? If so, please comment or share this to keep the discussion going!

5 Comments
  1. October 2, 2011 1:35 pm

    Check out this TED talk by Kevin Slavin, I think it is very relevant to this conversation: http://www.ted.com/talks/kevin_slavin_how_algorithms_shape_our_world.html

    There is another very relevant one called “What is the Internet hiding?” http://front.moveon.org/eli-pariser-filter-bubble-ted-talk/?sms_ss=facebook&at_xt=4dd16c45db1ef3db,0

    I think part of the public work of tech people in this century will be teaching the rest of us how to become producers of our interactions with technology and information, rather than consumers. We need ways of controlling and filtering consciously and for ourselves.

  2. barretta
    October 3, 2011 12:19 pm

    It seems to me like a lot of social problems stem from someone getting what they want by manipulating everyone else’s vices. Advertisers manipulate our greed, our desire for social standing and instant gratification. The war machine manipulates our fear.

    Framed this way, one starting point for responses is to work against our own vices. In this case, it’s our tendency to shut ourselves inside an echo chamber. We only want to read about five things, so Google is learning to feed us those five things.

    If we start seeking out opposing viewpoints and unfamiliar information–if we’re in the habit of challenging our own assumptions–we get harder to manipulate. Wendell Berry says it well:

    As soon as the generals and the politicos
    can predict the motions of your mind,
    lose it. Leave it as a sign
    to mark the false trail, the way
    you didn’t go. Be like the fox
    who makes more tracks than necessary,
    some in the wrong direction.
    Practice resurrection.

    • October 3, 2011 12:59 pm

      Thanks, Adam! I love that you frame it as the act of “working against our own vices”. I think that is primarily what I was driving at, too. I am alarmed by the kind of fatalism with which we confront our technology, since it means we’re giving up our agency and ultimately dodging responsibility in something that was designed to serve our ends. (I’m not such a conspiracy theorist that I’d say technology is being designed primarily to serve some nefarious commercial/political agenda.) The technology we have available to us can have emancipatory moments, if we seek them out and demand them. The idea that “it just controls us”, or “it just does what it wants” feels SO disempowering (and inaccurate) to me.

    • November 9, 2011 12:20 am

      Thanks for the link–didn’t know of him, but now I do! I’ve been pondering how to build useful “randomness” into my media consumption (usually with Twitter), but I like the argument for “serendipity” better. And again, it was fun to “meet” you at the webinar! Good luck with the Latin… wish I remembered more of mine. All I’ve got left now is derivatives and SAT vocabulary.
