Michael Harris: The Benefits of Solitude

Our society rewards social behavior while ignoring the positive effects of time spent alone.

On April 14, 1934, Richard Byrd went out for his daily walk. The air was the usual temperature: minus 57 degrees Fahrenheit. He stepped steadily through the drifts of snow, making his rounds. And then he paused to listen. Nothing.

He attended, a little startled, to the cloud-high and overpowering silence he had stepped into. For miles around the only other life belonged to a few stubborn microbes that clung to sheltering shelves of ice. It was only 4 p.m., but the land quavered in a perpetual twilight. There was—was there?—some play on the chilled horizon, some crack in the bruised Antarctic sky. And then, unaccountably, Richard Byrd’s universe began to expand.

Later, back in his hut, huddled by a makeshift furnace, Byrd wrote in his diary:

Here were imponderable processes and forces of the cosmos, harmonious and soundless. Harmony, that was it! That was what came out of the silence—a gentle rhythm, the strain of a perfect chord, the music of the spheres, perhaps.

It was enough to catch that rhythm, momentarily to be myself a part of it. In that instant I could feel no doubt of man’s oneness with the universe.

Admiral Byrd had volunteered to staff a weather base near the South Pole for five winter months. But the reason he was there alone was far less concrete. Struggling to explain himself, Byrd admitted that he wanted “to know that kind of experience to the full . . . to taste peace and quiet and solitude long enough to find out how good they really are.” He was also after a kind of personal liberty, for he believed that “no man can hope to be completely free who lingers within reach of familiar habits.”

Byrd received the Medal of Honor for his work, but for most of us, the choice to be alone in the wild is not rewarded at all; in fact, it is highly suspect. A trek into nature is assumed to be proof of some anti-social tendency. A feral disposition. Our friends and families don’t want us to wander off in search of the expansive, euphoric revelations that Byrd experienced in his Antarctic abyss. So we keep warm, instead, within our comfortable culture of monitoring and messaging. We abhor the disconnection that the woods, the desert, the glacier threaten us with in their heartless way. Our culture leans so sharply toward the social that those who wander into the wild are lucky if they’re only considered weird. At worst, they’re Unabombers. The bias is so strong that we stop thinking about that wilderness trek altogether; besides, we tell ourselves, surely we aren’t capable of such adventures. We’d wind up rotting in a ditch. And even if we could access the wild, we probably don’t have the fine kind of soul that would get something out of it.

There is something dangerous about isolating oneself the way Admiral Byrd did. Mystic euphoria aside, he nearly died there at the frozen anchor of the world. His furnace began leaking carbon monoxide into his hut. Indeed, a company of men down at his base camp had to hike in and save him when his health deteriorated. Other solitaries without radio-handy companions have been markedly less lucky. Think of young Chris McCandless (memorialized in Jon Krakauer’s book Into the Wild), who left no trail for his acquaintances when he hiked into the Alaskan wilderness with nothing but a rifle and a 10-pound bag of rice. After 119 days he died in the wilderness he had sought—poisoned by mouldy seeds is one guess—stranded, anyway, by the vagaries of Mother Nature.

In the final days of Admiral Byrd’s solo Antarctic adventure—before men from his base camp came to rescue him—he was very close to death himself. Frostbite began to eat his body, and he mumbled like a monk in his sleeping bag, at times growing so weak he was unable to move. He cradled heat pads against himself and scraped lima beans from cans. He tried to play card games and was baffled by the weakness in his arms. He tried to read a biography of Napoleon but the words blurred and swam uselessly on the pages. “You asked for it,” a small voice within him said. “And here it is.”

But despite all this trauma, Admiral Byrd returned to society with a gift that society itself could never give him; he carried “something I had not fully possessed before,” he wrote in his memoir. It was an “appreciation of the sheer beauty and miracle of being alive . . . Civilization has not altered my ideas. I live more simply now, and with more peace.”

When Byrd and McCandless trekked into the wild, so doggedly insisting on solitude in nature, they both tapped into a human impulse that our progress has all but quashed.


When did we first step out of the wild and into the forever-crowded city? There was a time when nature was all we had—we were inextricably in it and of it. Our ancestors spent their first 2.5 million years operating as nomadic groups that gathered plants where they grew and hunted animals where they grazed. Relatively recently, around ten thousand years ago, something phenomenal shifted: beginning in modern-day Turkey, Iran, and elsewhere in the Middle East, our ancestors embarked on what’s called the Agricultural Revolution. They began to manipulate and care for plants (and animals), devoting their days to sowing seeds and battling weeds, leading herds to pastures and fighting off their predators. This was no overnight transformation; rather, bit by bit, these nomads reimagined nature as a force to be contained and managed.


Or was it nature, rather, that was doing the taming? Even as we domesticated the wheat, rice, and corn that we still rely on to feed ourselves, human lives were bent in servitude to the care of crops. The historian Yuval Noah Harari calls this exchange “history’s biggest fraud” and argues that “the Agricultural Revolution left farmers with lives generally more difficult and less satisfying than those of foragers.” The historian of food Margaret Visser agrees, calling rice, for example, a “tyrant” that
governs power structures, technological prowess, population figures, interpersonal relationships, religious custom . . . Once human beings agree to grow rice as a staple crop, they are caught in a web of consequences from which they cannot escape—if only because from that moment on rice dictates to them not only what they must do, but also what they prefer. 

Relying on single staples for the majority of one’s caloric intake can be a gamble, too: even as it allows for exponential population growth, it leaves diets less varied and the crops themselves more vulnerable to attack by pests and blight. Others have pointed out that, just as domesticated animals have smaller brains than their wild ancestors, the brain of the “domesticated human” is significantly smaller than that of our pre-agriculture, pre-city selves.

Meanwhile, the care of crops and animals required so much of humans that they were forced to cease their wandering ways and remain permanently beside their fields—and so we have wheat and its cousins to thank for the first human settlements.

Professor Harari notes that the plot of land around Jericho, in Palestine, would have originally supported “at most one roaming band of about a hundred relatively healthy and well-nourished people,” whereas, post–Agricultural Revolution (around 8500 BCE), “the oasis supported a large but cramped village of 1,000 people, who suffered far more from disease and malnourishment.” The Middle East was, by then, covered with similar, permanent settlements.

By 7500 BCE, our estrangement from nature was expressed more dramatically when the citizens of Jericho constructed an enormous wall around their city—the first of its kind. The purpose of this wall was probably twofold: it protected against floods as well as marauding enemies. What’s extraordinary about this first significantly walled city is the almost fanatical determination to withdraw from that earlier, wild world. The wall, made of stone, was five feet thick and twelve feet tall. In addition, a ditch nine feet deep and almost thirty feet wide was dug adjacent to the wall; Jericho’s workers carved this enormous bulwark against the outside from solid bedrock—a feat of determined withdrawal that would have been unthinkable to our pre-agricultural ancestors. This was a true denial of the sprawling bushland that had been our home for millennia. The “wild” had been exiled. And we never invited it back. By the fourth millennium BCE the Agricultural Revolution had evolved into an “urban revolution”—one we are living out still.


In 2007, it was announced that, for the first time in history, more people worldwide were living in cities than outside them. According to the World Health Organization, six out of every ten people will live in cities by 2030. No reversal of the trend is in sight. And as the city continues to draw us to itself, like some enormous, concrete siren, we begin to convince ourselves that this crowded existence is the only “natural” life, that there is nothing for us beyond the walls of Jericho. Perhaps, goes the myth, there never was.

As the urban revolution comes to a head and humans become more citified than not, “nature-deficit disorder” blooms in every apartment block, and the crowds of urbanity push out key components of human life that we never knew we needed to safeguard. Nature activists like Richard Louv use less poesy and more research to show that cities impoverish our sensory experience and can lead to an impoverished identity, too—one deprived of “the sense of humility required for true human intelligence,” as Louv puts it.

But what really happens when we turn too often toward society and away from the salt-smacking air of the seaside or our prickling intuition of unseen movements in a darkening forest? Do we really dismantle parts of our better selves?


A growing body of research suggests exactly that. A study from the University of London, for example, found that members of the remote cattle-herding Himba tribe in Namibia, who spend their lives in the open bush, had greater attention spans and a greater sense of contentment than urbanized Britons. When those same tribe members moved into urban centres, their attention spans and levels of contentment dropped to match those of their British counterparts. Dr. Karina Linnell, who led the study, was “staggered” by how superior the rural Himba were. She told the BBC that these profound differences were “a function of how we live our lives,” suggesting that overcrowded urban settings demand altered states of mind. Linnell even proposes that employers, were they looking to design the best workforces, consider stationing employees who need to concentrate outside the city.


Meanwhile, at Stanford University, study participants had their brains scanned before and after walks, some through grassy meadows and others beside heavy car traffic. Participants who walked in the urban environment showed markedly higher instances of “rumination”—a brooding self-criticism the researchers correlated with the onset of depression. And, just as parts of the brain associated with rumination lit up on urban walks, they calmed down during nature walks.


Photos of nature can increase your sense of affection and playfulness. A quick trip into the woods, known as “forest bathing” in Japan, reduces cortisol levels and boosts the immune system. Whether rich or poor, students perform better with access to green space. And a simple view of greenery can insulate us from stress and increase our resilience to adversity. Time in nature even boosts, in a very concrete way, our ability to smell, see, and hear. The data piles up.


The cumulative effect of all these benefits appears to be a kind of balm for the harried urban soul. In the nineteenth century, as urbanization began its enormous uptick, as overcrowded and polluted city streets became, in the words of Pip in Great Expectations, “all asmear with filth and fat and blood and foam,” doctors regularly prescribed “nature” for the anxiety and depression that ailed their patients. The smoke and noise of cities were seen as truly foreign influences that required remedy in the form of nature retreats. Sanitariums were nestled in lush, Arcadian surrounds to counteract the disruptive influence of cities. Eva Selhub and Alan Logan, the authors of Your Brain on Nature, have described how these efforts gave way, in the twentieth century, to the miracle of pills, which allowed ill people to remain in the city indefinitely, so long as they took their medicine: “The half-page advertisement for the Glen Springs Sanitarium gave way to the full-page advertisement for the anti-anxiety drug meprobamate.” In this light, today’s urban populace, which manages itself with sleeping pills and antidepressants (more than 10 per cent of Americans take antidepressants), may remind us of the soma-popping characters in Aldous Huxley’s dystopian Brave New World. That vision may be changing at last, though. Today, as the curative effects of nature come back to light, some doctors have again begun prescribing “time outdoors” for conditions as various as asthma, ADHD, obesity, diabetes, and anxiety.

To walk out of our houses and beyond our city limits is to shuck off the pretense and assumptions that we otherwise live by. This is how we open ourselves to brave new notions or independent attitudes. This is how we come to know our own minds.

For some people, a brief walk away from home has been the only respite from a suffocating domestic life. Think of an English woman in the early nineteenth century with very few activities open to her—certainly few chances to escape the confines of the drawing room. In Pride and Prejudice, Elizabeth Bennet’s determination to walk in the countryside signals her lack of convention. When her sister Jane takes ill at the wealthy Mr. Bingley’s house, Elizabeth traipses alone through fields of mud to be with her, prompting Bingley’s sister to call her “wild” in appearance with hair that has become unpardonably “blowsy”: “That she should have walked three miles so early in the day, in such dirty weather, and by herself, was almost incredible . . . they held her in contempt for it.”

The philosopher Thomas Hobbes had a walking stick with an inkhorn built into its top so he could jot things down as they popped into his head during long walks. Rousseau would have approved of the strategy; he writes, “I can only meditate when I am walking. When I stop, I cease to think; my mind only works with my legs.” Albert Einstein, for his part, was diligent about taking a walk through the woods on the Princeton campus every day. Other famous walkers include Charles Dickens and Mother Teresa, John Bunyan and Martin Luther King Jr., Francis of Assisi, and Toyohiko Kagawa. Why do so many bright minds seem set on their walks away from the desk? It can’t be just that they need a break from thinking—some of their best thinking is done during this supposed “downtime” out of doors.


In educational circles, there is a theory that helps explain the compulsion; it’s called the theory of loose parts. Originally developed by architect Simon Nicholson in 1972, when he was puzzling over how to make playgrounds more engaging, the loose parts theory suggests that one needs random elements, changing environments, in order to think independently and cobble together one’s own vision of things. Nature is an infinite source of loose parts, whereas the office or the living room, being made by people, is limited. Virginia Woolf noted that even the stuff and furniture of our homes may “enforce the memories of our own experience” and cause a narrowing, a suffocating effect. Outside of our ordered homes, though, we escape heavy memories about the way things have always been and become open to new attitudes.

But there does seem to be an art to walks; we must work at making use of those interstitial moments. Going on a hike, or even just taking the scenic route to the grocery store, is a chance to dip into our solitude—but we must seize it. If we’re compelled by our more curious selves to walk out into the world—sans phone, sans tablet, sans Internet of Everything—then we still must decide to taste the richness of things.

Outside the maelstrom of mainstream chatter, we at last meet not just the bigger world but also ourselves. Confirmed flâneur William Hazlitt paints the picture well. When he wanders out of doors he is searching for

liberty, perfect liberty, to think, feel, do, just as one pleases . . . I want to see my vague notions float like the down on the thistle before the breeze, and not to have them entangled in the briars and thorns of controversy. For once, I like to have it all my own way; and this is impossible unless you are alone.

This is the gift of even a short, solitary walk in a city park. To find, in glimpsing a sign of the elements, that one does belong to something more elemental than an urban crowd. That there is a universe of experience beyond human networks and social grooming—and that this universe is our true home. Workers in the cramped centre of Osaka may cut through Namba Park on their way to work; Torontonians may cut through Trinity Bellwoods Park on their way to the city’s best bookshop; New Yorkers may cut through Central Park on their way to the Metropolitan Museum; and Londoners may cut through Hyde Park on their way to Royal Albert Hall. Stepping off the narrow sidewalk for even a few minutes, we may come across a new (and very old) definition of ourselves, one with less reference to others.