  Even commentators who spend a lot of time fretting about the modern-day “crisis of distraction” rarely seem to grasp the full implications of this. For example, you hear it said that attention is a “finite resource,” and finite it certainly is: according to one calculation, by the psychologist Timothy Wilson, we’re capable of consciously attending to about 0.0004 percent of the information bombarding our brains at any given moment. But to describe attention as a “resource” is to subtly misconstrue its centrality in our lives. Most other resources on which we rely as individuals—such as food, money, and electricity—are things that facilitate life, and in some cases it’s possible to live without them, at least for a while. Attention, on the other hand, just is life: your experience of being alive consists of nothing other than the sum of everything to which you pay attention. At the end of your life, looking back, whatever compelled your attention from moment to moment is simply what your life will have been. So when you pay attention to something you don’t especially value, it’s not an exaggeration to say that you’re paying with your life. Seen this way, “distraction” needn’t refer only to momentary lapses in focus, as when you’re distracted from performing your work duties by the ping of an incoming text message, or a compellingly terrible news story. The job itself could be a distraction—that is, an investment of a portion of your attention, and therefore of your life, in something less meaningful than other options that might have been available to you.

  This was why Seneca, in On the Shortness of Life, came down so hard on his fellow Romans for pursuing political careers they didn’t really care about, holding elaborate banquets they didn’t especially enjoy, or just “baking their bodies in the sun”: they didn’t seem to realize that in succumbing to such diversions, they were squandering the very stuff of existence. Seneca risks sounding like an uptight pleasure-hater here—after all, what’s so bad about a bit of sunbathing?—and to be honest, I suspect he probably was. But the crucial point isn’t that it’s wrong to choose to spend your time relaxing, whether at the beach or on BuzzFeed. It’s that the distracted person isn’t really choosing at all. Their attention has been commandeered by forces that don’t have their highest interests at heart.

  The proper response to this situation, we’re often told today, is to render ourselves indistractible in the face of interruptions: to learn the secrets of “relentless focus”—usually involving meditation, web-blocking apps, expensive noise-canceling headphones, and more meditation—so as to win the attentional struggle once and for all. But this is a trap. When you aim for this degree of control over your attention, you’re making the mistake of addressing one truth about human limitation—your limited time, and the consequent need to use it well—by denying another truth about human limitation, which is that achieving total sovereignty over your attention is almost certainly impossible. In any case, it would be highly undesirable to be able to do exactly as you wished with your attention. If outside forces couldn’t commandeer at least some of it against your will, you’d be unable to step out of the path of oncoming buses, or hear that your baby was in distress. Nor are the benefits confined to emergencies; the same phenomenon is what allows your attention to be seized by a beautiful sunset, or your eye to be caught by a stranger’s across a room. But it’s the obvious survival advantages of this kind of distractibility that explain why we evolved that way. The Paleolithic hunter-gatherer whose attention was alerted by a rustling in the bushes, whether he liked it or not, would have been far more likely to flourish than one who heard such rustlings only after first making the conscious decision to listen out for them.

  Neuroscientists call this “bottom-up” or involuntary attention, and we’d struggle to stay alive without it. Yet the capacity to exert some influence over the other part of your attention—the “top-down” or voluntary kind—can make the whole difference between a well-lived life and a hellish one. The classic and extreme demonstration of this is the case of the Austrian psychotherapist Viktor Frankl, author of Man’s Search for Meaning, who was able to fend off despair as a prisoner in Auschwitz because he retained the ability to direct a portion of his attention toward the only domain the camp guards couldn’t violate: his inner life, which he was then able to conduct with a measure of autonomy, resisting the outer pressures that threatened to reduce him to the status of an animal. But the flip side of this inspiring truth is that a life spent in circumstances immeasurably better than a concentration camp can still end up feeling fairly meaningless if you’re incapable of directing some of your attention as you’d like. After all, to have any meaningful experience, you must be able to focus on it, at least a bit. Otherwise, are you really having it at all? Can you have an experience you don’t experience? The finest meal at a Michelin-starred restaurant might as well be a plate of instant noodles if your mind is elsewhere; and a friendship to which you never actually give a moment’s thought is a friendship in name only. “Attention is the beginning of devotion,” writes the poet Mary Oliver, pointing to the fact that distraction and care are incompatible with each other: you can’t truly love a partner or a child, dedicate yourself to a career or to a cause—or just savor the pleasure of a stroll in the park—except to the extent that you can hold your attention on the object of your devotion to begin with.

  A Machine for Misusing Your Life

  All of which helps clarify what’s so alarming about the contemporary online “attention economy,” of which we’ve heard so much in recent years: it’s essentially a giant machine for persuading you to make the wrong choices about what to do with your attention, and therefore with your finite life, by getting you to care about things you didn’t want to care about. And you have far too little control over your attention simply to decide, as if by fiat, that you’re not going to succumb to its temptations.

  Many of us are familiar by now with the basic contours of this situation. We know that the “free” social media platforms we use aren’t really free, because, as the saying goes, you’re not the customer but the product being sold: in other words, the technology companies’ profits come from seizing our attention, then selling it to advertisers. We’re at least dimly aware, too, that our smartphones are tracking our every move, recording how we swipe and click, what we linger on or scroll past, so that the data collected can be used to show us precisely that content most likely to keep us hooked, which usually means whatever makes us angriest or most horrified. All the feuds and fake news and public shamings on social media, therefore, aren’t a flaw, from the perspective of the platform owners; they’re an integral part of the business model.

  You might also be aware that all this is delivered by means of “persuasive design”—an umbrella term for an armory of psychological techniques borrowed directly from the designers of casino slot machines, for the express purpose of encouraging compulsive behavior. One example among hundreds is the ubiquitous drag-down-to-refresh gesture, which keeps people scrolling by exploiting a phenomenon known as “variable rewards”: when you can’t predict whether or not refreshing the screen will bring new posts to read, the uncertainty makes you more likely to keep trying, again and again and again, just as you would on a slot machine. When this whole system reaches a certain level of pitiless efficiency, the former Facebook investor turned detractor Roger McNamee has argued, the old cliché about users as “the product being sold” stops seeming so apt. After all, companies are generally motivated to treat even their products with a modicum of respect, which is more than can be said about how some of them treat their users. A better analogy, McNamee suggests, is that we’re the fuel: logs thrown on Silicon Valley’s fire, impersonal repositories of attention to be exploited without mercy, until we’re all used up.

  What’s far less widely appreciated than all that, though, is how deep the distraction goes, and how radically it undermines our efforts to spend our finite time as we’d like. As you surface from an hour inadvertently frittered away on Facebook, you’d be forgiven for assuming that the damage, in terms of wasted time, was limited to that single misspent hour. But you’d be wrong. Because the attention economy is designed to prioritize whatever’s most compelling—instead of whatever’s most true, or most useful—it systematically distorts the picture of the world we carry in our heads at all times. It influences our sense of what matters, what kinds of threats we face, how venal our political opponents are, and thousands of other things—and all these distorted judgments then influence how we allocate our offline time as well. If social media convinces you, for example, that violent crime is a far bigger problem in your city than it really is, you might find yourself walking the streets with unwarranted fear, staying home instead of venturing out, and avoiding interactions with strangers—and voting for a demagogue with a tough-on-crime platform. If all you ever see of your ideological opponents online is their very worst behavior, you’re liable to assume that even family members who differ from you politically must be similarly, irredeemably bad, making relationships with them hard to maintain. So it’s not simply that our devices distract us from more important matters. It’s that they change how we’re defining “important matters” in the first place. In the words of the philosopher Harry Frankfurt, they sabotage our capacity to “want what we want to want.”

  My own squalid, but I suspect entirely typical, history as a Twitter junkie might serve as a case in point. Even at the height of my dependency (I’m now in recovery), I rarely spent more than two hours a day glued to the screen. Yet Twitter’s dominion over my attention extended a great deal further than that. Long after I’d closed the app, I’d be panting on the treadmill at the gym, or chopping carrots for dinner, only to find myself mentally prosecuting a devastating argument against some idiotic holder of Wrong Opinions I’d had the misfortune to encounter online earlier that day. (It wasn’t misfortune really, of course; the algorithm showed me those posts deliberately, having learned what would wind me up.) Or my newborn son would do something adorable, and I’d catch myself speculating about how I might describe it in a tweet, as if what mattered wasn’t the experience but my (unpaid!) role as a provider of content for Twitter. And I vividly recall walking alone along a windswept Scottish beach, as dusk began to fall, when I experienced one particularly disturbing side effect of “persuasive design,” which is the twitchiness you start to feel when the activity in which you’re engaged hasn’t been crafted by a team of professional psychologists hell-bent on ensuring that your attention never wavers. I love windswept Scottish beaches at dusk more passionately than anything I can ever remember encountering on social media. But only the latter is engineered to constantly adapt to my interests and push my psychological buttons, so as to keep my attention captive. No wonder the rest of reality sometimes seems unable to compete.

  At the same time, the hopelessness of the world I encountered online began to seep into the world of the concrete. It was impossible to drink from Twitter’s fire hose of anger and suffering—of news and opinions selected for my perusal precisely because they weren’t the norm, which was what made them especially compelling—without starting to approach the rest of life as if they were the norm, which meant being constantly braced for confrontation or disaster, or harboring a nebulous sense of foreboding. Unsurprisingly, this rarely proved to be the basis for a fulfilling day. To make things more troublesome still, it can be difficult even to notice when your outlook on life is being changed in this depressing fashion, thanks to a special problem with attention, which is that it’s extremely difficult for it to monitor itself. The only faculty you can use to see what’s happening to your attention is your attention, the very thing that’s already been commandeered. This means that once the attention economy has rendered you sufficiently distracted, or annoyed, or on edge, it becomes easy to assume that this is just what life these days inevitably feels like. In T. S. Eliot’s words, we are “distracted from distraction by distraction.” The unsettling possibility is that if you’re convinced that none of this is a problem for you—that social media hasn’t turned you into an angrier, less empathetic, more anxious, or more numbed-out version of yourself—that might be because it has. Your finite time has been appropriated, without your realizing anything’s amiss.

  It’s been obvious for some time now, of course, that all this constitutes a political emergency. By portraying our opponents as beyond persuasion, social media sorts us into ever more hostile tribes, then rewards us, with likes and shares, for the most hyperbolic denunciations of the other side, fueling a vicious cycle that makes sane debate impossible. Meanwhile, we’ve learned the hard way that unscrupulous politicians can overwhelm their opposition, not to mention the fact-checking capabilities of journalists, simply by flooding a nation’s attentional bandwidth with outrage after outrage, so that each new scandal overwrites the last one in public awareness—and anyone who responds or retweets, even if their intention is to condemn the hatemongering, finds themselves rewarding it with attention, thereby helping it spread.

  As the technology critic Tristan Harris likes to say, each time you open a social media app, there are “a thousand people on the other side of the screen” paid to keep you there—and so it’s unrealistic to expect users to resist the assault on their time and attention by means of willpower alone. Political crises demand political solutions. Yet if we’re to understand distraction at the deepest level, we’ll also have to acknowledge an awkward truth at the bottom of all this, which is that “assault”—with its implications of an uninvited attack—isn’t quite the right word. We mustn’t let Silicon Valley off the hook, but we should be honest: much of the time, we give in to distraction willingly. Something in us wants to be distracted, whether by our digital devices or anything else—to not spend our lives on what we thought we cared about the most. The calls are coming from inside the house. This is among the most insidious of the obstacles we face in our efforts to use our finite lives well, so it’s time to take a closer look at it.

  6.

  The Intimate Interrupter

  Had you been walking in the Kii Mountains in southern Japan during the winter months of 1969, you might have witnessed something startling: a pale and skinny American man, entirely naked, dumping half-frozen water over his own head from a large wooden cistern. His name was Steve Young, and he was training to become a monk in the Shingon branch of Buddhism—but so far the process had been nothing but a sequence of humiliations. First, the abbot of the Mount Koya monastery had refused to let him in the door. Who on earth was this gangly white Asian studies PhD student, who’d apparently decided the life of a Japanese monk was for him? Eventually, after some badgering, Young had been permitted to stay, but only in return for performing various menial tasks around the monastery, like sweeping the hallways and washing dishes. Now, at last, he had been authorized to begin the hundred-day solo retreat that marked the first real step on the monastic journey—only to discover that it entailed living in a tiny unheated hut and conducting a thrice-daily purification ritual in which Young, who’d been raised beside the ocean in balmy California, had to douse himself with several gallons of bone-chilling melted snow. It was a “horrific ordeal,” he would recall years later. “It’s so cold that the water freezes the moment it touches the floor, and your towel freezes in your hand. So you’re sliding around barefoot on ice, trying to dry your body with a frozen hand towel.”

  Faced with physical distress—even of a much milder variety than this—most people’s instinctive reaction is to try not to pay attention to it, to attempt to focus on anything else at all. For example, if you’re mildly phobic about hypodermic syringes, like I am, you’ve probably found yourself staring very hard at the mediocre artwork in doctors’ clinics in an effort to take your mind off the jab you’re about to receive. At first, this was Young’s instinct, too: to recoil internally from the experience of the freezing water hitting his skin by thinking about something different—or else just trying, through an act of sheer will, not to feel the cold. This is hardly an unreasonable reaction: when it’s so unpleasant to stay focused on present experience, common sense would seem to suggest that mentally absenting yourself from the situation would moderate the pain.

  And yet as icy deluge followed icy deluge, Young began to understand that this was precisely the wrong strategy. In fact, the more he concentrated on the sensations of intense cold, giving his attention over to them as completely as he could, the less agonizing he found them—whereas once his “attention wandered, the suffering became unbearable.” After a few days, he began preparing for each drenching by first becoming as focused on his present experience as he possibly could so that, when the water hit, he would avoid spiraling from mere discomfort into agony. Slowly it dawned on him that this was the whole point of the ceremony. As he put it—though traditional Buddhist monks certainly would not have done so—it was a “giant biofeedback device,” designed to train him to concentrate by rewarding him (with a reduction in suffering) for as long as he could remain undistracted, and punishing him (with an increase in suffering) whenever he failed. After his retreat, Young—who is now a meditation teacher better known as Shinzen Young, his new first name having been bestowed on him by the abbot at Mount Koya—found that his powers of concentration had been transformed. Whereas staying focused on the present had made the agonies of the ice-water ritual more tolerable, it made less unpleasant undertakings—daily chores that might previously have been a source not of agony but of boredom or annoyance—positively engrossing. The more intensely he could hold his attention on the experience of whatever he was doing, the clearer it became to him that the real problem had been not the activity itself but his internal resistance to experiencing it. When he stopped trying to block out those sensations and attended to them instead, the discomfort would evaporate.