Over at the NY Times, Anna North asks if we can become more creative by using an unusual search engine called Yossarian that purports to help us see things in new ways—ways that go beyond the predictable associations we’re inclined to make when thinking about people, things, ideas, events, etc. What fascinates me about this possibility is that in order for it to be true, prediction needs to be the antidote to predictability. Without inferring where your mind is prone to wandering, neither a person nor an algorithm stands a chance of presenting something to you in a new light.
In everyday life, predictability is associated with consistency. In many cases consistency is a good thing. If your friends are so reliable that you can confidently predict they’ll stay loyal and true, you’re in with a good crowd. If you can predict how long it will take you to drive to work, you can reliably arrive on time without needing to get up earlier than necessary or feeling rushed.
Prediction has become an important feature of information and communication technology. Auto-complete makes searching on Google such an efficient experience that it can feel like mind-reading. Prediction also enables recommendation engines on services like Amazon and Pandora to determine what we’d like to purchase or listen to. And prediction makes it possible for fitness trackers to recommend when we should take a nap and for Google Now to anticipate the weather we’ll encounter when traveling. Looking toward the future, some hope that when the Internet of Things matures, our refrigerators will recognize when groceries are running low and contact stores to replenish our stock before it runs out.
While inferring what we want can save us time, make it easier for us to accomplish goals, and expedite finding things we expect will bring us pleasure, predictive technology can also create problems. Privacy scholars like Ryan Calo note that if marketers can use big data to predict when we’re susceptible to lowering our guard, they can capitalize on our vulnerabilities. A related concern was expressed when Facebook ran its infamous emotion contagion experiment. If social media companies can predict, with ever-finer precision, what makes users eager to engage with their platforms, they can design features that manipulate us accordingly.
On a more fundamental level, Cass Sunstein and others who discuss filter bubbles have expressed concern that algorithms that present us with information customized to fit our expectations of relevance can be bad for democracy: the echo chambers they create can incline us toward embracing narrow, if not extremist, worldviews, eschewing diversity, and favoring conformity.
I’ve recently joined the cadre of critics by arguing that there’s a significant cost to predictive texting, as exemplified by QuickType, a new feature of Apple’s iOS 8: it removes friction from communication by having algorithms guess what we’re likely to say. When your devices do the work of being you, you’re susceptible to becoming a predictable facsimile of yourself who gives others your second best.