‘The Dictator’ (2012), starring Sacha Baron Cohen, plays on the fact that kitsch is used by dictators and fundamentalists to redefine our world. Zennie Abraham/Flickr, CC BY-ND
What is the predominant aesthetic of the twenty-first century? According to sociology professors Ruth Holliday and Tracey Potts, “we are on the point of drowning in kitsch. A casual survey of the British metropolitan high street offers ample evidence of the kitschification of everyday life.”

Kitsch can also be called cheesiness or tackiness. Specialists have defined kitsch as a tasteless copy of an existing style or as the systematic display of bad taste or artistic deficiency. Garden gnomes are kitsch, just like cheap paintings for tourists, which are technically correct but express their “truths” too directly and too straightforwardly, often in the form of clichés.

Some people play with kitsch by using irony, which can lead to interesting results. However, most of the time, kitsch has negative connotations.

Terrorism prefers kitsch 

In politics, most dictators have attempted to reinforce their authority with the help of kitsch propaganda. The former Libyan leader Muammar Gaddafi was called “the kitsch-dictator”, and Saddam Hussein, who designed his own monuments in a Stalinist spirit, is one of the few turn-of-the-century leaders who could contest that title. The tastes of the nouveaux riches in Russia, China, the Middle East and the US excel in a kind of conspicuous vulgarity that perfectly matches academic definitions of kitsch.

Terrorism, graphic images of which have invaded our lives in the past two decades, prefers kitsch. Al-Qaeda propaganda indulges in romantic presentations of sunrises and pre-modern utopias, as well as Gothic presentations of skulls and bones. The Islamic studies scholar Rüdiger Lohlker, who has analysed jihadist aesthetics, wrote that the jihadi magazine Al-Qaeda Airlines displayed “a fascination with gothic elements (skulls and bones) and kitsch”.

Videos put out by the so-called Islamic State (IS) offer even more explicit kitsch expressions as they cultivate the art of violence for its shock value.

Cultural identity theft

So why is there so much kitsch? Is there more kitsch now than there’s ever been? Cheesiness has long been present in popular religious art, and Caligula is probably the kitsch champion of all time. The Enlightenment brought kitsch (then contained in Baroque art) to a temporary halt, but it seems we are catching up again. Writing in the National Review, the American journalist Kevin Williamson described Donald Trump as having “the worst taste since Caligula.”

Trump goes back to the pre-Enlightenment taste of Absolutism: his gilded Manhattan penthouse is replete with marble, Louis XIV furnishings, and haphazardly assembled historical themes.

According to my analysis, this attraction to kitsch has to do with the phenomenon of “deculturation”, the process by which a particular group is deprived of one or more aspects of its identity. The term emerged in sociology in debates about the effects of colonialism and the subsequent loss of culture, for example in Pierre Bourdieu’s early work Sociologie de l'Algérie.

Humans have always needed truths to believe in. Whereas in the past those truths tended to be transmitted through cultures, they are now increasingly produced instantaneously without cultural mediation. Kitsch employs this mechanism in the realm of aesthetics. In today’s world, kitsch is redefining our perception of truth; it is a truth devoid of culture or context.

The production of immediate, pure, and decultured truths is most obvious in the sphere of fundamentalist religions. Islam scholar Olivier Roy has shown that religious fundamentalism arises when religion is separated from the indigenous culture in which it was embedded.

Radicalisation occurs when religions attempt to define themselves as culturally neutral and “pure”. When religions are disconnected from concrete cultural values, their truths become absolute; fundamentalist religions tend to see themselves as providers of scientific truths.

Narcissistic impulse 

Studies have shown that kitsch has its roots in an intrinsically narcissistic impulse. That’s why it thrives particularly well in neoliberal environments determined by the dynamics of the information society. Social media are narcissistic because they enable individuals to recycle their own selves without being confronted with the culture of the other.

Algorithms tell us which books we like, based on previous choices. The narcissist structure of this model is obvious. Through algorithms, signs are quantified and classified along the guidelines of abstract forms of excellence. In a decultured world, the self becomes the only remaining ethical reference.

When there is no cultural other, only the “I” will be taken for granted. In the worst case, this system produces self-centered “alternative truths” and conspiracy theories, which are “kitsch-theories” because of their narcissistic, self-confirming structures.


Narcissus was so obsessed with himself that he died contemplating his own image. Narcissus, painted between 1594 and 1596. Caravaggio/Galleria Nazionale d'Arte Antica/Wikimedia
“Kitsch truths” establish themselves autonomously by narcissistically affirming their own truth. Along the same lines, alternative truths and conspiracy theories do not misinform (misinformation being the holding back of an existing truth) but they kitschify truth. In the end, this leads to the total loss of truth.





Thorsten Botz-Bornstein is a German philosopher specializing in aesthetics and intercultural philosophy. He is currently an Associate Professor of Philosophy at the Gulf University for Science and Technology.
Clamdigger (1935) by Edward Hopper. Courtesy Sharon Mollerus/Flickr via Aeon Magazine

In 1840, Edgar Allan Poe described the ‘mad energy’ of an ageing man who roved the streets of London from dusk till dawn. His excruciating despair could be temporarily relieved only by immersing himself in a tumultuous throng of city-dwellers. ‘He refuses to be alone,’ Poe wrote. He ‘is the type and the genius of deep crime … He is the man of the crowd.’

Like many poets and philosophers through the ages, Poe stressed the significance of solitude. It was ‘such a great misfortune’, he thought, to lose the capacity to be alone with oneself, to get caught up in the crowd, to surrender one’s singularity to mind-numbing conformity. Two decades later, the idea of solitude captured Ralph Waldo Emerson’s imagination in a slightly different way: quoting Pythagoras, he wrote: ‘In the morning, – solitude; … that nature may speak to the imagination, as she does never in company.’ Emerson encouraged the wisest teachers to press upon their pupils the importance of ‘periods and habits of solitude’, habits that made ‘serious and abstracted thought’ possible.

In the 20th century, the idea of solitude formed the centre of Hannah Arendt’s thought. A German-Jewish émigré who fled Nazism and found refuge in the United States, Arendt spent much of her life studying the relationship between the individual and the polis. For her, freedom was tethered to both the private sphere – the vita contemplativa – and the public, political sphere – the vita activa. She understood that freedom entailed more than the human capacity to act spontaneously and creatively in public. It also entailed the capacity to think and to judge in private, where solitude empowers the individual to contemplate her actions and develop her conscience, to escape the cacophony of the crowd – to finally hear herself think.

In 1961, The New Yorker commissioned Arendt to cover the trial of Adolf Eichmann, a Nazi SS officer who helped to orchestrate the Holocaust. How could anyone, she wanted to know, perpetrate such evil? Surely only a wicked sociopath could participate in the Shoah. But Arendt was surprised by Eichmann’s lack of imagination, his consummate conventionality. She argued that while Eichmann’s actions were evil, Eichmann himself – the person – ‘was quite ordinary, commonplace, and neither demonic nor monstrous. There was no sign in him of firm ideological convictions.’ She attributed his immorality – his capacity, even his eagerness, to commit crimes – to his ‘thoughtlessness’. It was his inability to stop and think that permitted Eichmann to participate in mass murder.

Just as Poe suspected that something sinister lurked deep within the man of the crowd, Arendt recognised that: ‘A person who does not know that silent intercourse (in which we examine what we say and what we do) will not mind contradicting himself, and this means he will never be either able or willing to account for what he says or does; nor will he mind committing any crime, since he can count on its being forgotten the next moment.’ Eichmann had shunned Socratic self-reflection. He had failed to return home to himself, to a state of solitude. He had discarded the vita contemplativa, and thus he had failed to embark upon the essential question-and-answering process that would have allowed him to examine the meaning of things, to distinguish between fact and fiction, truth and falsehood, good and evil.

‘It is better to suffer wrong than to do wrong,’ Arendt wrote, ‘because you can remain the friend of the sufferer; who would want to be the friend of and have to live together with a murderer? Not even another murderer.’ It is not that unthinking men are monsters, that the sad sleepwalkers of the world would sooner commit murder than face themselves in solitude. What Eichmann showed Arendt was that society could function freely and democratically only if it were made up of individuals engaged in the thinking activity – an activity that required solitude. Arendt believed that ‘living together with others begins with living together with oneself’.

But what if, we might ask, we become lonely in our solitude? Isn’t there some danger that we will become isolated individuals, cut off from the pleasures of friendship? Philosophers have long made a careful, and important, distinction between solitude and loneliness. In The Republic (c380 BCE), Plato proffered a parable in which Socrates celebrates the solitary philosopher. In the allegory of the cave, the philosopher escapes from the darkness of an underground den – and from the company of other humans – into the sunlight of contemplative thought. Alone but not lonely, the philosopher becomes attuned to her inner self and the world. In solitude, the soundless dialogue ‘which the soul holds with herself’ finally becomes audible.

Echoing Plato, Arendt observed: ‘Thinking, existentially speaking, is a solitary but not a lonely business; solitude is that human situation in which I keep myself company. Loneliness comes about … when I am one and without company’ but desire it and cannot find it. In solitude, Arendt never longed for companionship or craved camaraderie because she was never truly alone. Her inner self was a friend with whom she could carry on a conversation, that silent voice who posed the vital Socratic question: ‘What do you mean when you say …?’ The self, Arendt declared, ‘is the only one from whom you can never get away – except by ceasing to think.’

Arendt’s warning is well worth remembering in our own time. In our hyper-connected world, a world in which we can communicate constantly and instantly over the internet, we rarely remember to carve out spaces for solitary contemplation. We check our email hundreds of times per day; we shoot off thousands of text messages per month; we obsessively thumb through Twitter, Facebook and Instagram, aching to connect at all hours with close and casual acquaintances alike. We search for friends of friends, ex-lovers, people we barely know, people we have no business knowing. We crave constant companionship.

But, Arendt reminds us, if we lose our capacity for solitude, our ability to be alone with ourselves, then we lose our very ability to think. We risk getting caught up in the crowd. We risk being ‘swept away’, as she put it, ‘by what everybody else does and believes in’ – no longer able, in the cage of thoughtless conformity, to distinguish ‘right from wrong, beautiful from ugly’. Solitude is not only a state of mind essential to the development of an individual’s consciousness – and conscience – but also a practice that prepares one for participation in social and political life. Before we can keep company with others, we must learn to keep company with ourselves.



Jennifer Stitt

Jennifer Stitt is a graduate student in the history of philosophy at the University of Wisconsin-Madison.

This article was originally published in Aeon and has been republished under Creative Commons.
“Rosas y Estrellas” (“Roses and Stars”) by Raúl Martínez depicts 19th-century Cuban revolutionary José Martí (center) flanked by Fidel Castro and Che Guevara, with Latin American freedom fighters including Simón Bolívar behind them.
Patricia & Howard Farber Collection, New York ©Archive Raúl Martínez
These last few turbulent years, it's as if we don't have feelings - guided perception - of any sort anymore. Because of this benumbed sensation we can't even properly articulate, in non-elitist terms, the myriad disenchantments we have to go through on a daily basis, for we're mostly left with no memory of their very occurrence. Universalized disenchantment with the Order - the Dominant Group, the Political Regime, the Capitalist Economy, the Fake News, the Subaltern Cacophony, the Mediocre Public Service, the Higher Education that principally serves as the Industry of the Precariat, to state just a few of the objets petit a of our existentialist frustration - is the rule rather than the exception. As a desperate act of escaping this contagious affliction, we've come to espouse, simultaneously, fanfaronade - swaggering, empty boasting, the ostentatious display of our poorly constructed, empty-shell persona that feigns invincibility - and molysomophobia - an excessive fear of contamination by old or new alien modes of thinking, by unorthodox uses of language, by "dangerous" yet meticulous prognoses of the fragility of the immanent contingency of a single object, narrative or praxis. Hence, we have committed the inconsiderate act of concocting a mélange out of oil and water.

The only palatable way out of this conundrum, at least for the time being, appears to be a massive "inoculation" campaign by way of a recumbentibus, a sudden, wake-up knockout punch, both verbal and physical. But this metaphysical vaccination outreach must be conducted carefully, under the aegis of personalities with the prerequisite courage and knowledge to call a cunt a cunt, in lieu of a stunt performance for the incurably ultracrepidarian - individuals or groups with an irresistible urge to give opinions and advice on matters of supreme importance such as this, outside of their minimalist, parsimonious, bookish knowledge (or "expert advice", as they prefer to refer to their apparent pseudo-intellectualism and the accompanying insatiable thirst to carpe the troubled diem in light of their "forethought").
The Thinking Loop. Courtesy of IBM Design
The irresistible urge to reflect - to systemically wail, to voice out the uneasily unutterable, to extemporize one's unbearable existential triteness - basically emanates from one's frustration: frustration apropos of the burden of an imposed living, an inexplicably ubiquitous phenomenon of being tasked with a mission - to satisfy one's protectors, under whose tutelage one is constantly marionetted. One ominously feels the coming of a certain serendipitous calamity to befall him, should he fail to get the pain of the menace inhabiting his conscience off his chest. Hence the need to displace this geist pronto, before it turns into an implosive conflagration.

In a world not of our making, or of our choosing for that matter, we're consumed by the ever-present fear of aphasia, a debilitating infirmity that hampers one's capacity to reveal, as it were, one's phenomenal existence to the world out there, and of repressive amnesia, the failure to remember one's apotheosis in the form of anamnesis, an unconsciously driven compendium of piecemeal personal fragments in the form of parables, poems, songs, or else apologetic confessions on the deathbed.

If one doesn't engage, once in a while, in the obsessive fabrication and confabulation of one's nostalgia-dominated subjective perspectives regarding most things in life that might or might not matter, one is inadvertently forced to succumb to the fear of being forgotten by one's significant, or rather insignificant, others: those entities that inhabit the wide world outside one's domain of control, sometimes those of the Cimmerii, all those phenomena that dwell in the darkest corners of undiscoverability, at least for the time being, or till their hideousness gives way to an unexpected revelation in the light of the observant and imaginative mind of an eccentric persona.
Courtesy of Unraveling Magazine

Solitude by Voloshenko on deviantArt
Our society rewards social behavior while ignoring the positive effects of time spent alone.

On April 14, 1934, Richard Byrd went out for his daily walk. The air was the usual temperature: minus 57 degrees Fahrenheit. He stepped steadily through the drifts of snow, making his rounds. And then he paused to listen. Nothing.

He attended, a little startled, to the cloud-high and over-powering silence he had stepped into. For miles around the only other life belonged to a few stubborn microbes that clung to sheltering shelves of ice. It was only 4 p.m., but the land quavered in a perpetual twilight. There was—was there?—some play on the chilled horizon, some crack in the bruised Antarctic sky. And then, unaccountably, Richard Byrd’s universe began to expand.

Later, back in his hut, huddled by a makeshift furnace, Byrd wrote in his diary:

Here were imponderable processes and forces of the cosmos, harmonious and soundless. Harmony, that was it! That was what came out of the silence—a gentle rhythm, the strain of a perfect chord, the music of the spheres, perhaps.

It was enough to catch that rhythm, momentarily to be myself a part of it. In that instant I could feel no doubt of man’s oneness with the universe.
Admiral Byrd had volunteered to staff a weather base near the South Pole for five winter months. But the reason he was there alone was far less concrete. Struggling to explain his reasons, Byrd admitted that he wanted
“to know that kind of experience to the full . . . to taste peace and quiet and solitude long enough to find out how good they really are.” He was also after a kind of personal liberty, for he believed that “no man can hope to be completely free who lingers within reach of familiar habits.”
Byrd received the Medal of Honor for his work, but for most of us, the choice to be alone in the wild is not rewarded at all; in fact it is highly suspect. A trek into nature is assumed to be proof of some anti-social tendency. A feral disposition. Our friends and families don’t want us to wander off in search of the expansive, euphoric revelations that Byrd experienced in his Antarctic abyss. So we keep warm, instead, within our comfortable culture of monitoring and messaging. We abhor the disconnection that the woods, the desert, the glacier threaten us with in their heartless way. Our culture leans so sharply toward the social that those who wander into the wild are lucky if they’re only considered weird. At worst, they’re Unabombers. The bias is so strong that we stop thinking about that wilderness trek altogether; besides, we tell ourselves, surely we aren’t capable of such adventures. We’d wind up rotting in a ditch. And even if we could access the wild, we probably don’t have the fine kind of soul that would get something out of it.

There is something dangerous about isolating oneself the way Admiral Byrd did. Mystic euphoria aside, he nearly died there at the frozen anchor of the world. His furnace began leaking carbon monoxide into his hut. Indeed, a company of men down at his base camp had to hike in and save him when his health deteriorated. Other solitaries without radio-handy companions have been markedly less lucky. Think of young Chris McCandless (memorialized in Jon Krakauer’s book Into the Wild), who left no trail for his acquaintances when he hiked into the Alaskan wilderness with nothing but a rifle and a 10-pound bag of rice. After 119 days he died in the wilderness he had sought—poisoned by mouldy seeds is one guess—stranded, anyway, by the vagaries of Mother Nature.

In the final days of Admiral Byrd’s solo Antarctic adventure—before men from his base camp came to rescue him—he was very close to death himself. Frostbite began to eat his body, and he mumbled like a monk in his sleeping bag, at times growing so weak he was unable to move. He cradled heat pads against himself and scraped lima beans from cans. He tried to play card games and was baffled by the weakness in his arms. He tried to read a biography of Napoleon but the words blurred and swam uselessly on the pages. “You asked for it,” a small voice within him said. “And here it is.”

But despite all this trauma, Admiral Byrd was returned to society with a gift that society itself could never give him; he carried “something I had not fully possessed before,” he wrote in his memoir. It was an “appreciation of the sheer beauty and miracle of being alive . . . Civilization has not altered my ideas. I live more simply now, and with more peace.”

When Byrd and McCandless trekked into the wild, so doggedly insisting on solitude in nature, they both tapped into a human impulse that our progress has all but quashed.


When did we first step out of the wild and into the forever-crowded city? There was a time when all we had was access to nature—we were so inextricably in it and of it. Our ancestors spent their first 2.5 million years operating as nomadic groups that gathered plants where they grew and hunted animals where they grazed. Relatively recently, around ten thousand years ago, something phenomenal shifted: beginning in modern-day Turkey, Iran, and elsewhere in the Middle East, our ancestors embarked on what’s called the Agricultural Revolution. They began to manipulate and care for plants (and animals), devoting their days to sowing seeds and battling weeds, leading herds to pastures and fighting off their predators. This was no overnight transformation; rather, bit by bit, these nomads reimagined nature as a force to be contained and managed.


Or was it nature, rather, that was doing the taming? Even as we domesticated the wheat, rice, and corn that we still rely on to feed ourselves, human lives were bent in servitude to the care of crops. The historian Yuval Noah Harari calls this exchange “history’s biggest fraud” and argues that “the Agricultural Revolution left farmers with lives generally more difficult and less satisfying than those of foragers.” The historian of food Margaret Visser agrees, calling rice, for example, a “tyrant” that
governs power structures, technological prowess, population figures, interpersonal relationships, religious custom . . . Once human beings agree to grow rice as a staple crop, they are caught in a web of consequences from which they cannot escape—if only because from that moment on rice dictates to them not only what they must do, but also what they prefer. 
Relying on single staples for the majority of one’s caloric intake can be a gamble, too: even while it allows for exponential population growth, the diets of individuals become less varied and more vulnerable to attack by pests and blight. Others have pointed out that, just as domesticated animals have smaller brains than their wild ancestors, the brain of the “domesticated human” is significantly smaller than that of our pre-agriculture, pre-city selves.

Meanwhile, the care of crops and animals required so much of humans that they were forced to cease their wandering ways and remain permanently beside their fields—and so we have wheat and its cousins to thank for the first human settlements.

Professor Harari notes that the plot of land around Jericho, in Palestine, would have originally supported “at most one roaming band of about a hundred relatively healthy and well-nourished people,” whereas, post–Agricultural Revolution (around 8500 BCE), “the oasis supported a large but cramped village of 1,000 people, who suffered far more from disease and malnourishment.” The Middle East was, by then, covered with similar, permanent settlements.
By 7500 BCE, our disenfranchisement from nature was expressed more dramatically when the citizens of Jericho constructed an enormous wall around their city—the first of its kind. The purpose of this wall was probably twofold: it protected against floods as well as marauding enemies. What’s extraordinary about this first significantly walled city is the almost fanatical determination to withdraw from that earlier, wild world. The wall, made of stone, was five feet thick and twelve feet tall. In addition, a ditch was constructed adjacent to the wall that was nine feet deep and almost thirty feet wide. Jericho’s workers dug this enormous bulwark against the outside from solid bedrock—a feat of determined withdrawal that would have been unthinkable to our pre-agricultural ancestors. This was a true denial of the sprawling bushland that had been our home for millennia. The “wild” had been exiled. And we never invited it back. By the fourth century BCE the Agricultural Revolution had evolved into an “urban revolution”—one we are living out still.


In 2007, it was announced that more people live in cities than not. According to the World Health Organization, six out of every ten people will live in cities by 2030. No reversal of the trend is in sight. And as the city continues to draw us to itself, like some enormous, concrete siren, we begin to convince ourselves that this crowded existence is the only “natural” life, that there is nothing for us beyond the walls of Jericho. Perhaps, goes the myth, there never was.

As the urban revolution reaches a head and humans become more citified than not, “nature deficit disorder” blooms in every apartment block, and the crowds of urbanity push out key components of human life that we never knew we needed to safeguard. Nature activists like Richard Louv use less poesy and more research to prove that cities impoverish our sensory experience and can lead to an impoverished identity, too—one deprived of “the sense of humility required for true human intelligence,” as Louv puts it.

But what really happens when we turn too often toward society and away from the salt-smacking air of the seaside or our prickling intuition of unseen movements in a darkening forest? Do we really dismantle parts of our better selves?


A growing body of research suggests exactly that. A study from the University of London, for example, found that members of the remote cattle-herding Himba tribe in Namibia, who spend their lives in the open bush, had greater attention spans and a greater sense of contentment than urbanized Britons and, when those same tribe members moved into urban centres, their attention spans and levels of contentment dropped to match their British counterparts. Dr. Karina Linnell, who led the study, was “staggered” by how superior the rural Himba were. She told the BBC that these profound differences were “a function of how we live our lives,” suggesting that overcrowded urban settings demand altered states of mind. Linnell even proposes that employers, were they looking to design the best workforces, consider stationing employees who need to concentrate outside the city.


Meanwhile, at Stanford University, study participants had their brains scanned before and after walking in grassy meadows and then beside heavy car traffic. Participants walking in urban environments had markedly higher instances of “rumination”—a brooding and self-criticism the researchers correlated with the onset of depression. And, just as parts of the brain associated with rumination lit up on urban walks, they calmed down during nature walks.


Photos of nature will increase your sense of affection and playfulness. A quick trip into the woods, known as “forest bathing” in Japan, reduces cortisol levels and boosts the immune system. Whether rich or poor, students perform better with access to green space. And a simple view of greenery can insulate us from stress and increase our resilience to adversity. Time in nature even boosts, in a very concrete way, our ability to smell, see, and hear. The data piles up.


The cumulative effect of all these benefits appears to be a kind of balm for the harried urban soul. In the nineteenth century, as urbanization began its enormous uptick, as overcrowded and polluted city streets became, in the words of Pip in Great Expectations, “all asmear with filth and fat and blood and foam,” doctors regularly prescribed “nature” for the anxiety and depression that ailed their patients. The smoke and noise of cities were seen as truly foreign influences that required remedy in the form of nature retreats. Sanitariums were nestled in lush, Arcadian surrounds to counteract the disruptive influence of cities. Eva Selhub and Alan Logan, the authors of Your Brain on Nature, have described how these efforts gave way, in the twentieth century, to the miracle of pills, which allowed ill people to remain in the city indefinitely, so long as they took their medicine: “The half-page advertisement for the Glen Springs Sanitarium gave way to the full-page advertisement for the anti-anxiety drug meprobamate.” In this light, today’s urban populace, which manages itself with sleeping pills and antidepressants (more than 10 per cent of Americans take antidepressants), may remind us of the soma-popping characters in Aldous Huxley’s dystopian Brave New World. That vision may be changing at last, though. Today, as the curative effects of nature come back to light, some doctors have again begun prescribing “time outdoors” for conditions as various as asthma, ADHD, obesity, diabetes, and anxiety.

To walk out of our houses and beyond our city limits is to shuck off the pretense and assumptions that we otherwise live by. This is how we open ourselves to brave new notions or independent attitudes. This is how we come to know our own minds.

For some people, a brief walk away from home has been the only respite from a suffocating domestic life. Think of an English woman in the early nineteenth century with very few activities open to her—certainly few chances to escape the confines of the drawing room. In Pride and Prejudice, Elizabeth Bennet’s determination to walk in the countryside signals her lack of convention. When her sister Jane takes ill at the wealthy Mr. Bingley’s house, Elizabeth traipses alone through fields of mud to be with her, prompting Bingley’s sister to call her “wild” in appearance with hair that has become unpardonably “blowsy”: “That she should have walked three miles so early in the day, in such dirty weather, and by herself, was almost incredible . . . they held her in contempt for it.”
 

The philosopher Thomas Hobbes had a walking stick with an inkhorn built into its top so he could jot things down as they popped into his head during long walks. Rousseau would have approved of the strategy; he writes, “I can only meditate when I am walking. When I stop, I cease to think; my mind only works with my legs.” Albert Einstein, for his part, was diligent about taking a walk through the woods on the Princeton campus every day. Other famous walkers include Charles Dickens and Mother Teresa, John Bunyan and Martin Luther King Jr., Francis of Assisi, and Toyohiko Kagawa. Why do so many bright minds seem set on their walks away from the desk? It can’t be just that they need a break from thinking—some of their best thinking is done during this supposed “downtime” out of doors.


In educational circles, there is a theory that helps explain the compulsion; it’s called the theory of loose parts. Originally developed by architect Simon Nicholson in 1972, when he was puzzling over how to make playgrounds more engaging, the loose parts theory suggests that one needs random elements, changing environments, in order to think independently and cobble together one’s own vision of things. Nature is an infinite source of loose parts, whereas the office or the living room, being made by people, is limited. Virginia Woolf noted that even the stuff and furniture of our homes may “enforce the memories of our own experience” and cause a narrowing, a suffocating effect. Outside of our ordered homes, though, we escape heavy memories about the way things have always been and become open to new attitudes.

But there does seem to be an art to walks; we must work at making use of those interstitial moments. Going on a hike, or even just taking the scenic route to the grocery store, is a chance to dip into our solitude—but we must seize it. If we’re compelled by our more curious selves to walk out into the world—sans phone, sans tablet, sans Internet of Everything—then we still must decide to taste the richness of things.

Outside the maelstrom of mainstream chatter, we at last meet not just the bigger world but also ourselves. Confirmed flâneur William Hazlitt paints the picture well. When he wanders out of doors he is searching for

liberty, perfect liberty, to think, feel, do, just as one pleases . . . I want to see my vague notions float like the down on the thistle before the breeze, and not to have them entangled in the briars and thorns of controversy. For once, I like to have it all my own way; and this is impossible unless you are alone.
This is the gift of even a short, solitary walk in a city park. To find, in glimpsing a sign of the elements, that one does belong to something more elemental than an urban crowd. That there is a universe of experience beyond human networks and social grooming—and that this universe is our true home. Workers in the cramped centre of Osaka may cut through Namba Park on their way to work; Torontonians may cut through Trinity Bellwoods Park on their way to the city’s best bookshop; New Yorkers may cut through Central Park on their way to the Metropolitan Museum; and Londoners may cut through Hyde Park on their way to Royal Albert Hall. Stepping off the narrow sidewalk for even a few minutes, we may come across a new (and very old) definition of ourselves, one with less reference to others.