In Search of an American Mythology


Deep inside the novel Jonathan Strange & Mr Norrell lies a passage that has stayed with me. Describing an arcane theory of the novel’s fictional world, author Susanna Clarke writes, “Men and fairies both contain within them a faculty of reason and a faculty of magic. In men reason is strong and magic is weak. With fairies it is the other way round: magic comes very naturally to them, but by human standards they are barely sane.”

The book tells the tale of two magicians practicing their trade in an alternate Napoleonic-era Europe where magic is an accepted fact of history. During their quest to “restore English magic,” the two men’s personalities clash memorably. The first, Gilbert Norrell, is a pedantic, antisocial, somewhat jealous scholar intent on monopolizing the magical arts. His counterpart Jonathan Strange comes from that noble tradition in British literature: the aimless young gentleman in search of a profession (or, failing that, anything at all to do).

The story’s villain is a fairy, the mercurial king of a realm called Lost-Hope. Like most of the gods and spirits populating the world’s myths, he is quite out of his mind. But that is as it should be. No one expects or wants a character like that to act in a sober and rational way.

The novel’s portrayal of humankind and fairies, the race of reason and the race of madness, maps onto our concepts of the “real” and the “romantic.” But what is reasonable isn’t always what is most real, especially to those with little stake in the modern world’s bureaucratic regularities. For all their otherworldly allure, “fairy” characters in literature are humans too. Given the power, many of us would behave with a similar impulsiveness, the same willful disregard for social propriety. And we’d have fun doing it.

Sometimes it seems like American history goes through cycles of real and romantic. Definable periods of prosperous bureaucratic order intermixed with madder, headier times. But sometimes everything feels jumbled up, resistant to these blanket categorizations. It’s no wonder academics abandoned that kind of thinking long ago, at least when it comes to the scholarship they generate. Their own lives, no doubt, are an entirely different matter.

Despite (or because of) the technocrats’ best and continuing efforts, I can’t help but feel like we’re in a fairy-world of our own today, a place where astonishing ignorance and wayward power freely associate with the most advanced calculation in history. A place where the compelling case for “progress” has to contend with equally compelling evidence that all such progress is a mere pipe dream.

In the British myths that Clarke draws from, the same places Tolkien looked, there is no bureaucracy to speak of, little real law. Power means violence, and yet modern patterns of totalizing domination are absent. Instead, everything is fickle and free. The mighty are mighty indeed, but they’re often cast down without warning, replaced by servants. People are not as they appear. Chance and coincidence aren’t narrative flaws.

Without a national mythology of that caliber, we Americans are left with a set of vague, recent-ish prescriptions from the founding fathers (and the memory of the Native American civilizations we wiped out). We’re missing the grounding mythology provides. And so we’re stuck in a cycle of national adolescence: proud and self-important one day, insecure and pouty the next.

Maybe one day, after the five-hundredth movie reboot, Batman and Superman and Star Wars will mature into the role. Maybe one day we’ll be able to properly contextualize our slogans of freedom! and democracy! Maybe we’ll recognize them for the compelling, dangerous things they are, not so easily had or taken away. Until then, we can console ourselves with the fairy stories of foreign lands.

If Driving Can’t Be Fun…


Hi there! – the Google self-driving car

In a world of driverless cars, the driving man is king. Or so I thought last weekend at a meetup where one laptop-cradling attendee asked the assembly: would you give up your right to drive for the convenience of a self-driving fleet? About two-thirds of the group raised their hands.

A strict libertarian I am not, but we should be wary when someone proposes the removal of rights. I didn’t raise my hand, but not out of ideological certitude. While the man’s question is a potent one, it seems too complex and too contingent for a straw poll to do it justice. So here’s my attempt at an answer.

It doesn’t take a professional planner to recognize that modern cities treat the right to drive as a given. Driving is as necessary as eating for most of us. And despite our daily struggles with traffic, the automobile lives very close to the beating heart of American culture. The 2008 automotive industry bailouts got so much attention partly because we picture American industry’s golden age as rows of men on the Detroit assembly line, churning out cars. Our concept of leisure, even for supposedly driving-averse Millennials, rests on the assumption that you need a car to get anywhere worth going.

Perhaps an automotive century molded our sense of urban life in its image just as concretely as it transformed our physical infrastructure. We may be stuck with the private car. Transit, bikes, and walkable neighborhoods are all well and good, but what we really need are cars, driverless and battery-powered and app-summoned. Give up your right to drive, abandon these silly buses and trains, and declare your allegiance to Silicon Valley’s new world-changing venture.

It’s easy to get cynical. Silicon Valley’s insistence on disruption hasn’t delivered the revolutionary improvements in lifestyle it promised. Some say we’re more beholden to our machines than ever before, adapting our minds and bodies to their rhythms while maintaining the precarious illusion of control. Wouldn’t a citywide network of self-driving cars be an extension of that trend? Wouldn’t it become the civic answer to a Disneyland ride, a place to cede our right to travel, for a fee, to those who have our happiness at heart?

The Game of Driving

Beyond the cynicism, the doubt, and the usual politics around any discussion of “rights” lies the brutal reality of driving as it exists now. And I’m not referring to the tens of thousands of Americans who lose their lives every year to variations on driver error. I’m not talking about grinding hours on clogged, crumbling highways, bumper-to-bumper with the rest of the restless city. I’m not even talking about the land use disaster that is parking, the sky-high garage rates in city centers and the vast deserts of empty asphalt girdling suburban malls.

I’m referring to the game of driving, the unique psychological and political environment we enter when we pull away from garage or driveway and into that fraught arena, the street. We take for granted, I think, the degree to which this game shapes our politics and personalities. Though we are awash with self-help and self-realization guides of every kind, they offer no comment on the specific and often intense exercise that occupies hours of our time, every single day.

And yet there are engineers and programmers at Google, even now, who take this game of driving very seriously. They’re thinking about it every day. As I write this, intricate protocols are being written to guide self-driving prototypes safely down the streets of Mountain View. They’re even tackling ethical challenges, like whether an autonomous car can or should choose between the lives of its occupants and the lives of pedestrians on the road. Those programmers know that driving fits one definition of a game: a set of activities bounded by concrete rules. And that game is enforced, traffic cops aside, by the collective agreement of all parties involved to follow those rules.

Imagine you’re alone in your car and you approach a red light. Even though you can see clearly that no drivers are speeding down the perpendicular road, you stop at the light and wait. Imagine you’re on a two-lane street in front of a pack of cars. You want to drive at 30 mph, but the strangers behind you crowd close and you can tell they’re in a hurry. You speed up to 45 mph. Imagine you’re on a boulevard with three lanes traveling in your direction. You know there aren’t cars to your right or left, but you stick to your lane anyway, or feel a little jolt of embarrassment when you realize you’re drifting.

These are all examples of gameplay in an unpredictable arena where the stakes are at once commonplace (getting to the grocery store before it closes) and extreme (avoiding death-by-collision at that blind turn on the way there). That combination of extremes has always struck me as psychologically significant.
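As a playful aside, those scenarios translate surprisingly well into the explicit, machine-readable rules a self-driving engineer has to write down. Here is a minimal Python sketch of the idea; every function name and threshold is hypothetical, invented purely for illustration:

    # A toy model of the "game of driving": the unwritten rules above,
    # made explicit. All names and thresholds are hypothetical.

    def should_stop(light_is_red, cross_traffic_visible):
        # Rule: stop at a red light even when the cross street is empty.
        return light_is_red  # visible traffic never enters into it

    def speed_to_drive(preferred_mph, pack_in_a_hurry):
        # Rule: social pressure from the pack behind overrides preference.
        return preferred_mph + 15 if pack_in_a_hurry else preferred_mph

    def stay_in_lane(adjacent_lanes_empty):
        # Rule: keep to your lane even when no one is beside you.
        return True  # drifting earns embarrassment, not a ticket

    print(should_stop(light_is_red=True, cross_traffic_visible=False))  # True
    print(speed_to_drive(30, pack_in_a_hurry=True))                     # 45

The joke, of course, is that the real rules live in fear, anger, and shame, which no function signature captures.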

A Political Exercise

And it’s all very political. Consider the enforcement of rules by horn and by rule-bending squad car. The masculine posturing and the constant preemptive stereotyping by vehicle and driver identity. The protocol of turn signals. The ways we improvise when a cyclist, a construction zone, or a fallen tree modifies the game-board. And then there’s road rage. That flash of supreme fury when one of these anonymous nobodies breaks the rules and puts us in a disadvantageous position.

The game of driving can be brutal in the city. We are all strangers when we walk down the sidewalk, but at least we are human strangers, bound to consider the people we encounter as something like our conscious selves. With cars it isn’t so easy. We know intellectually that the driver is a person, but what we see first is the car’s inhuman form. Moreover, we encounter that form in a competitive and rule-stricken environment where fear, anger, and shame are never too far away.

That isn’t how it was supposed to be. In a blander, less-crowded America, the automobile was a multiplication of the horse, a tool for liberation and self-discovery. In many parts of the country, out away from the cities, it can still be that way. And city driving isn’t always a challenge. The danger and the rules remain during pleasurable trips, but they soften. Far from obstacles or assholes, fellow drivers become incidental peers in a world where we’re all playing the same game. In those cases, driver control can be a positive social exercise – as well as a lot of fun.

A Google transportation network sounds like the final triumph of Silicon Valley boosterism, but we need to resist the urge to kill the self-driving car. Liberated from the need to play this high-stakes, low-payout game of driving, we can regain some of the psychological and political energy it currently drains from us. We can keep the right to drive, but only in places where we derive some pleasure from it, where road interactions don’t need to rely on an attitude of defensiveness. Maybe the self-driving car can help us make space for the games we actually enjoy.

Photo credit: Lars Plougmann

Tragedy in the Culture War


We don’t talk about them that way, but “urban” and “rural” are fuzzy things. Town and country may have been cleanly divided in medieval Europe, where landed barons ruled the fields and merchants plied their trade in tightly-defined non-feudal centers. But now that we’re all (ideally) free merchants of our time and energies, now that the private corporation has become something very different from the public corporation, urban and rural are just geographic – or demographic – terms.

But not necessarily to the political right. According to conservative scholar Victor Davis Hanson, urban and rural still determine how we view the world. In “The Oldest Divide,” published in City Journal (also in the LA Times), Hanson defends the wounded honor of rural America. As a city-dwelling progressive, I can’t say I felt wholly comfortable with Hanson’s piece. But it was a good read.

Hanson portrays urban America as a bloated and elitist place, cut off from the land, its residents totally ignorant about how food gets to table. The ideals of Jefferson and the gentleman farmer have steadily corroded, weakening the American republic and its citizens. America needs its greatness again.

It’s easy to argue that Hanson has it all backwards. That rural places, not urban ones, are over-represented in state and federal government. That the gentleman farmer is just that: white, male, and fairly well-off. It can be argued that Hanson’s dual residence in a Central Valley farmhouse and a Stanford University apartment lets him sample both places, shielding him from their problems. And his disdain for ecology and conservation places him vaguely out-of-time, an intellectual emissary from the early 20th century. A case can even be made that Hanson romanticizes rural America: exactly what he accuses city-dwellers of.

But aggrieved right-wing rhetoric aside, I found Hanson’s article insightful in a number of ways, and it felt good to escape my usual online echo chambers for a while.

“Living safely” versus “living together”

Hanson writes, “For rural residents, existential issues on the national level are seen as magnified versions of personal considerations: Does the country have enough food, fuel and minerals? Can America defend itself, protect its friends and punish its enemies? These concerns differ markedly from the urbanite’s worry about whether the government will provide services to take care of vulnerable populations or whether those of different races and religions can get along in such a crowded environment.”

This election season (if two years can be called that), disbelief about Donald Trump is the refrain from progressives. We can’t believe people are falling for his coarse, xenophobic nativism. He’s a dabbling bully, a liar, a frat boy idol. But for rural white people, living in declining places with neighbors “on the take” from governmental aid programs, current Republican rhetoric satisfies these “magnified personal considerations”.

I’m reminded of a theory that traces America’s clashing political ideologies back to the colonists’ regional origins in England. According to the hypothesis, New England’s settlers, liberal politics in tow, emigrated from urban and lowland England, places associated with royal and parliamentary London elites. Those who settled the American South, by contrast, hailed from rural upland areas further from government control. Hence the worry, in Hanson’s piece and among conservatives at large, about urban “elites” and their regimented, tight-lipped greed. They’re worried about a new serfdom, a new aristocracy.

Which “culture of dependence”?

Since they’re closer to the land, says Hanson, rural people see only weakness in “dependence,” especially on the government that may be the predominant authority they encounter. In cities we live differently. Certainly, social programs do exist and some people may be unduly dependent on them. But any dependence on government – the public corporation – is eclipsed by an all-encompassing cultural dependency on the private corporation.

There is very little to do in the modern city that doesn’t involve paying (or being encouraged to pay, or working to pay) for some non-essential product or service. Government authority is present in the background; commerce is front-and-center.

In his article, Hanson described an Obama administration bulletin this way: “Its dependency narrative defined the life of an everywoman character as one of cradle-to-grave government reliance — a desirable thing.” Multiply that little message a thousand-fold, Mr. Hanson, and realize that urban life is a dependency narrative of cradle-to-grave corporate reliance, hammered into us as a desirable thing by a million commercials, ads, and the constant strain to live like those “elites” you imagine comprise the entire urban population.

Perhaps agrarians see corporations as enablers, as providers of seed and equipment, as buyers of what they coax from the earth. And that leaves them the luxury of seeing government – where they get their subsidies – as the necessary evil in their midst. Of course, actual farmers are a minority these days. The aggrieved small-town population has been slighted, not by the government, but by the corporate culture that outsourced their jobs and is returning to its natural metropolitan home. The fact that neighbors turn to drugs, drink, or federal alms is tragic but predictable. After all, didn’t the cities turn to drugs, crime, and government aid when the economy abandoned them last century?

Rural tragedy, urban therapy

It turns out Hanson is a classicist, a defender of the rural ideal not only on Jeffersonian terms, but through a republican lineage going back to Greece and Rome. While that background may account for his conservative outlook, it also enables him to observe the following:

“Physical and mental balance, practicality, a sense of the tragic rather than the therapeutic — all these were birthed by rural life and yet proved essential to the survival of a nation that would inevitably become more mannered, sophisticated and urban.”

I think that’s one of the key statements in Hanson’s essay, a point of divergence that does much to explain the great impasse of American politics. If rural politics amplifies personal problems, a tragic view makes sense: life ends in death, and nothing stays the same forever. Progressive urban politics soothes and solves; people learn to live with their differences. But they may fall prey to utopian and ahistorical thinking.

It could be argued that today’s Republican politics aren’t politics at all, but a tragic personal narrative written on a national scale – a narrative of decline, anger, inevitable change. It is bizarre (see Donald Trump) because it isn’t really about living together; it’s a tragic drama where the old ways go out with a bang and fade to black.

If progressives really want to confront the old rural values Hanson defends, maybe we should take the time to understand them and explain ourselves on their terms. We’re not all yuppies and “elites”. City life can be tragic too.

Photo credit: Mark Engelbrecht via Flickr cc

Don’t let Banksy get you down


We’ve all seen it by now. A hasty peace sign with the Eiffel Tower set inside, calling to mind recent tragedy. Like most viral memes, “Peace for Paris” quickly escaped its original creator, a little-known graphic designer named Jean Jullien. Subsequent reporting attached Jullien’s name to the work, but I’d wager most of us still haven’t heard of him.

In fact, following the attacks, another artist’s name dominated online speculation about the image: an artist whose name is a buzzword for all that is insufferably hip. Perhaps Jullien’s casual brushstrokes evoked spray-paint on concrete. Perhaps people pictured the symbol painted, with studious irony, on a wall outside the UK Ministry of Defence. That’s right, I’m talking about Banksy.

I have nothing against Banksy’s art. Much of it is brilliant, a stenciled satire on the hypocrisy, venality, and all-around badness of society. It is vandalism, sure, but the places where it appears are so barren and ugly that modification could only improve them. Combining technical skill, an eye for irony, and a healthy absurdist streak, Banksy lets even the most oppressive facades denounce their own culture of oppression.

I’ve never been a street art nerd, but I admired Banksy enough to feature one of his works – a rioter throwing a bouquet of flowers – as my Facebook profile picture for several months. I’ve seen his film Exit Through the Gift Shop, a strange ode to fellow street artist Thierry Guetta. His ribbing of the police and CCTV is appreciated. But wasn’t there something obscenely predictable about our eagerness to credit him for the Paris drawing?

From the moment the rumor hit social media, I doubted that Banksy had really created the image. Not to imply that Banksy (whoever he/she/they are) doesn’t share in the collective grief over what happened. But anyone familiar with his repertoire knows that Banksy’s work is rarely straightforward. There’s always an angle, an implied critique of the viewer and of society.

If I had to guess, I’d expect Banksy to side with Team Beirut and lace his Paris picture with a critique of colonial bias. But never mind that “Peace for Paris” wasn’t his usual style: it was a pop image with some meaning, and thus he must be behind it. Banksy’s brand is so powerful that building owners can count themselves lucky when he “defaces” their property. If I owned a building, I’d expose a blank concrete wall, maybe mount a fake camera and some official-looking sign, hoping he’ll show up.

This isn’t an attack on Banksy. It’s a critique of the social thinking that sustains him. In the internet era more than ever, we expect regular doses of our favorite stuff delivered in a predictable way. Even someone like Banksy, who exists to disrupt that culture, is now a poster-boy for middle class would-be revolutionaries. Our media culture neutralizes criticism by appropriating it.

Maybe the real problem lies with street art itself. And hasn’t the name of Banksy become almost synonymous with that enterprise? He’s an unknown quantity, an outlaw, and anonymity has only added to the allure. Banksy’s mystique has driven speculation that he may be a collective of many artists, each working in “his” signature style.

Hmm. That’s not so very different from the co-creation, co-authorship, sharing, and collective brainstorming favored by digital capital. Like an online alias, “Banksy” lets him operate without dragging his own name around. Even if there is someone from Bristol who pioneered the Banksy phenomenon, what’s to prevent a skilled artist from passing off a forged Banksy? Art, writing, video, music – all intellectual property is now at risk from those who would devalue it. Or worse still, those who would catapult it to fame as a meme: something that issued from the roiling froth without an owner, author, or history.

If I had to criticize street art, it would be for making the ephemeral cool. For telling young people that the system will inevitably erase what they create, and convincing them that is a good thing. It is perfect irony that Snapchat has built a billion-dollar business around graffiti’s promise: that your subversive image will linger for a moment and then be washed away.

Of course, when those creations are powerful they’re not really erased. What can be erased are the rights and property privileges of the original creators. And it’s doubly hard when you’re coming from the left: if your creative portfolio consists of thinly-veiled socialism, what right do you have to demand royalties? The creative knowledge worker, who wants to work for good, finds that he or she can’t get paid for it.

I know nothing about Jean Jullien. I can’t say what kind of person he is. But I know he has to call himself a graphic designer, not an artist, because he wants to make a living doing what he enjoys. Banksy can call himself an artist: he deserves to. But Jean Jullien created an image that won’t disappear.

Photo credit: fingertrouble via Flickr cc

The new flower children


I’d like to talk about my feelings. Specifically, my pain. Today my pain comes from how badly people treat political correctness. I sit here, listening to people criticize political correctness, and, well, they’re not bad people who want to hurt political correctness’ feelings. They just don’t understand that they’re causing pain. They don’t understand that their microaggressions are making political correctness feel bad about itself.

And it’s not political correctness’ responsibility to educate its critics about why they’re being problematic. The critics need to educate themselves. Why should political correctness teach people what’s not okay? Why should the debate be framed on the critics’ terms? As a former critic of political correctness, I took the time to learn about what it has gone through – what it still goes through – in this country. So can you.

You see, political correctness wasn’t born into privilege. It spent its childhood braving the cramped corners of the American psyche that deal with guilt, suffering, and centuries of deferred justice. Political correctness had a hardscrabble adolescence filled with mockery and condemnation. Political correctness wasn’t the kind of idea that could just fit in with the rest. It was different, and it knew it was different. It just didn’t understand why different was bad.

So political correctness suffered. There was a lot of pain, a lot of impotent anger. But when it got older, it met some other ideas that looked just like it did: they were the same kind. And these new friends weren’t awkward and insecure and ashamed of who they were. They liked themselves. Mostly they were older ideas, veteran tenants of the collective psyche, an eclectic bunch. But they accepted political correctness as one of their own.

The coolest members of this little set were freedom and its partner equality. They were the de facto leaders, though the group didn’t really need leadership. They were friends. Freedom and equality were actually pretty popular, but they remained edgy. Not like bland, beautiful health and prosperity. Freedom and equality fought a lot, but somehow they always got back together.

And then there were democracy and socialism, one well-groomed, the other haggard and limping. Environmentalism smoked and swore and got chided for it by sustainability. Political correctness had a crush on diversity but didn’t know what to say. Diversity was very attractive after all, and got a lot of praise for it.

Political correctness still felt uncomfortable with its new friends, but at least it wasn’t alone. And miraculously, things got better. The gatekeepers of the collective American psyche changed their tune. College campuses, newsrooms, corporate offices – everywhere it went, political correctness caught on. It was an exciting time.

But it was also a sad and painful time, because even though a lot of people went along with political correctness, it didn’t feel like it belonged. It wasn’t cool like the rest of them. Even on the campuses there was a sense of quiet judgement. Political correctness worried that it just wasn’t likable. That it made people feel very uncomfortable about themselves and others, like they had to walk on eggshells all the time.

And that’s not what political correctness wanted. The problem was, political correctness didn’t know what it wanted. People tugged it this way and that, trying to benefit from it while condemning it under their breath. They said political correctness and freedom couldn’t get along, and the vibe between the two got weird. Even sweet diversity became distant, and political correctness thought they’d never be together.

So political correctness sulked and complained and pissed people off. This went on for a while. One day, freedom and equality had one of their spats. Equality came outside, sat down next to political correctness, and said, “I want to tell you about somebody you’ve never met, someone who doesn’t come around here anymore.”

“Dead?”

“No, fairness isn’t dead. Freedom says fairness lives inside all of us, but I don’t know about that. What I do know is that fairness is like us, an idea like you and I. But fairness is also unique.”

“Special?”

“Maybe. Fairness is important when you’re young and fades when you get older. It sounds childish, and people call it by different names, like justice. Fairness makes people uncomfortable just like you do. But fairness has the power to move people in the moment, to make them angry, or sad, or joyful. When people talk about us, usually they’re really talking about fairness. Fairness is a child’s voice speaking ancient wisdom. Fairness is…well, fairness unites us and makes us powerful. And yeah, special.”

“But what does fairness have to do with me? I’m not wise, how can I be!”

Equality grinned. “Fairness speaks through you. Really. You are brash and sensitive, you provoke people. They might not like you, or even respect you. They’ll say life isn’t fair, just deal with it. And people will deal with it. But in the back of their minds you’ll be there, growing every day. In the back of their minds you’ll say: Life isn’t fair. But it should be.”

Photo credit: Francisco Osorio via Flickr cc

What Is Like?


One day, Twitter changed its favorite button to a “like” button adorned with a little red heart. For a few hours the mock-rage was palpable. Then, in a quick turnaround, that all evaporated as users settled into the new normal. Did we like the favorite button? Are we sad it’s gone? It doesn’t matter – today it feels like the favorite never existed.

It’s become a bit of a tech cliche. Every time Facebook, Twitter, or some other social network changes its user interface, a million voices cry out and are suddenly silent. They lose interest: what was strange one hour becomes old news the next. Our “learning curve” on tech operates with blistering efficiency. Where else but the corporate web do we instantly adapt to changes we didn’t choose or anticipate?

Before I wrote this, I made a point to avoid the tech insiders and social media experts. I wanted my take on the change to be pure and unadulterated. And my take is this: the new like button is a very important change, not in itself, but because of what it stands for.

Building the bird’s nest

For better or worse, social media is our foremost utility for giving and receiving the written word. Twitter has become the roosting ground of choice for journalists, critics, and assorted members of the commentariat. It represents everything the mainstream writing market prizes: wit, brevity, celebrity, saying a lot with a little. It’s also a serious place to escape from “friends” and interact with strangers.

On the day of the change, I checked some of those strangers’ accounts, people I don’t know in person, people who care about written communication. Predictably, they had already posted variations on snark, witticism, and honest comment. Mostly tongue-in-cheek condemnations of the new interface. I may be more gullible than they: my initial response was simple confusion.

For a second or two after logging on, I didn’t comprehend that the favorite had a new face, that the change was just cosmetic. For a brief moment Twitter was an alien planet, an arena with changed rules and I the last to know. And that’s the problem with social media as a communications utility: great writing breaks rules, makes them up, and takes the reader beyond what’s on the page. But corporate social media binds us to its will from the moment we check the box marked “I agree.”

Is social media any worse than 20th century mass media, with its publishers and networks and gatekeepers? I’d argue that it does better on idea exchange (but not idea monetization). But that doesn’t mean we shouldn’t address the new problems it engenders.

Here are two reasons I think Twitter’s minor change is noteworthy. You may have considered these before, in various guises, but I’ll put them out there anyway.

“Liking” culture

What’s the history of the like? Why did such a noncommittal expression become so laden with social meaning? I’m taken back to elementary school, when it denoted childish crushes. Back then, “she likes him” was a very different statement than “I like playing with Legos.”

As adults, we’ve made the vocabulary of liking even more complicated. On social media, we subject a potential like to a series of mental conditionals before we even click the button. If I like this post, then what will so-and-so think? Liking, now as then, can mean many things. But the language of the like isn’t one of signifiers and semantics so much as one of messy, unspoken codes.
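Spelled out as actual conditionals, that pre-like calculus might look something like the following. A tongue-in-cheek Python sketch; every name and rule here is invented:

    # The mental conditionals behind a single like, made explicit.
    # Entirely hypothetical: the real ones are messier and unspoken.

    def should_like(post_author, my_boss, my_crush, is_earnest_tragedy):
        if is_earnest_tragedy:
            return False  # a like feels wrong; cobble together a comment instead
        if post_author == my_boss:
            return True   # the politic like
        if post_author == my_crush:
            return None   # agonize, hover over the button, close the app
        return True       # the default: spontaneous and quickly forgotten

    print(should_like("coworker", my_boss="coworker",
                      my_crush="alex", is_earnest_tragedy=False))  # True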

Facebook’s proposed dislike button reveals the like’s feebleness when something is said earnestly, seriously, inviting love or hate. Those are times when the like feels wrong, when we are driven to cobble together phrases and write a comment, however ungrammatical it might be. Writing is a productive act, in the simplest sense of making something that wasn’t there before.

On the flip side, the like button is consumerist. It’s a way to package and create reputation the same way money packages and creates value. It reduces all things to an eternal present where the eyes glaze over and the feed scrolls on, full of cute selfies, baseball games, and trips to Machu Picchu.

The Orwell effect

Don’t get me wrong, the present is a great place to be, maybe the best place. But when we write, we link past to present to future the way storytellers did for countless fire-lit centuries. We can like stories, but we can’t really tell them with likes alone. Social media simultaneously magnifies and suppresses our collective ability to tell stories.

It’s ironic that while Twitter is the utility of choice for writers and journalists, Facebook does a better job weaving historical record into its interface. Facebook’s timelines let users revisit posts and celebrate (or laugh at) past events and opinions. That can be jarring for Millennials who have radically changed during their years on Facebook! Twitter, for all its timeliness, is stuck in the present. All of its features prioritize what’s going on right now. All of them, that is, except the erstwhile favorite.

The name says it all: a favorite tweet was a bookmark, a way to preserve fleeting witticisms before they disappeared in the scroll-down abyss. Favorites let users keep track of interesting people without actually following them. The tool let us acknowledge the tragic or the negative without implying approval. Sure, we can do these things with the like, but a like isn’t really a bookmark at all. On Facebook, Instagram, and the rest, liking is spontaneous and quickly forgotten.

Which brings me to Orwell. In Nineteen Eighty-Four, the ruling regime edited old newspapers to make all of history match the Party line. Twitter’s change is less drastic, but it follows the pattern. Not only has Twitter abandoned the favorite; it has obliterated all evidence that the favorite ever existed. To a new user joining today, the like was always the like. My favorites are now listed as likes. But they weren’t likes when I clicked the button – they were favorites!

I don’t mean to complain about a petty distinction. I just want to point out how easy it is to rewrite history on social media, for the user and for the company. If these platforms are the record of our times, shouldn’t key decisions about them involve more than data collection and targeted advertising?

In the end, I like the like. To dislike the like would be pointless, and worse still, uncreative. The like is enfolding us in its bland, uncritical embrace. There’s nothing wrong with the like. But a world of likes has got me asking: where is the love?

Photo credit: Owen W Brown via Flickr cc

Has Environmentalism Lost Touch with the Wild?


Desperation. Rising seas, phantom hurricanes, the polar bear’s plight. To channel the climate deniers and the corporate right, environmentalists have long resorted to “alarmism” and “fearmongering” to spread their message, appeals that arouse our society’s collective fear of dying. As an unabashed environmentalist, I have to agree with the opposition: our prophecies may be based in fact, but that doesn’t make them any less sensational – and subject to the criticism sensational rhetoric draws. But if environmentalists abandon this script, are we watering down the message beyond repair?

In a piece published by the LA Times, Yale Environmental Protection Clinic director Joshua Galperin takes on the theme of desperation. These days, he argues, young environmentalists use the language of the “enemy” hoping for nominal, halfhearted concessions. They’re into appeasement, not bold progress. They’re desperate. Galperin yearns for the days when environmentalists rallied behind their “awe at the grandeur, interconnectedness and unpredictability of the ecosystems and wild landscapes.” Lifted out of themselves by the natural world, the old guard advocated for transformative change, an end to pollution and corporate malfeasance.

The argument is this: old-school environmentalists demanded revolution and got concessions. The new generation demands concessions and gets nothing. Cue desperation.

I think Galperin gives “young environmentalists” too much credit. His Yale students may be revolutionaries at heart, but I wouldn’t say the same for the majority of professionals in the environmental field. For one thing, the American public already supports environmentalist aims despite the power brokers’ successful campaign to stymie real action. Faced with an uncertain future, Millennials are hedging their bets, going along with the corporate responsibility narrative in case this all blows over and the revolutionaries are hung out to dry in a vibrant clean economy.

The die-hard greens, meanwhile, find themselves in a world that talks their talk without doing any of the associated legwork. Dirty capitalism is rebranding itself and maybe that’s how it should be. Climate change is happening, but maybe, um, it won’t really be that bad for people like us. Maybe people will live happy, decent lives in Elon Musk’s new order.

The left-wing magazine Jacobin published an article today chronicling how government power was (and is) essential to the development of capital. The piece touts the Soviet Union, not as a model to emulate, but as a gray counterweight to liberal economics, a bogeyman that let mid-century leftists extract concessions from corporate America. That “viable alternative” disintegrated around 1990, when, Galperin notes, the last substantive federal legislation on climate was passed. Back then, people my age were infants, poised to grow up in the world’s first truly global civilization.

With the Soviets gone and European “socialism” looking less attractive – as it botches immigration from its former colonies – we’re left with a voracious global economy and those who say we can knead it into shape. Meanwhile, inexorably, the old guard’s awe has slipped away. Where did it go?

For some young people it’s still there, hidden in a different place. Human civilization, for better or worse, has achieved cosmic proportions worthy of awe. The engines of global capital, people in their teeming billions, and now the omniscient, omnipresent digital overlay on our lives – these things are grand, interconnected, unpredictable, beyond understanding.

As the world urbanizes and cities grow at unprecedented rates, we find more everyday awe in man-made things. Green is an afterthought and an ornament. We drive through endless sprawl adorned with facsimiles of nature. We walk in tightly-bounded parks, places to pause and watch the cars go by. Increasingly, only those of us with stability, with disposable income, with time to spare can escape to “real nature.” And what about people who live far from the city center? Against Galperin’s narrative, they favor the policymakers who want to sully their hills and valleys for a better bottom line.

Centuries of science have convinced us that we know something about nature. And decades of unprecedented human change have us convinced that civilization is a mysterious thing. I agree with Galperin: mainstream environmentalism needs to rediscover wilderness, to see a new Apple laptop beside a houseplant and recognize which one is more complex, more awe-inspiring. Environmentalists need to study the human world, to acknowledge things like the federal government, investment banking, and the tech industry not for their innovation, or their evil, but for their simplicity.

The planet doesn’t require desperation, but it does deserve awe. The environment isn’t some fragile thing in need of protection – it’s us we need to worry about. If we can get beyond the relative smallness of global capital, we might find wilderness again, beyond the city lights, up in the stars, in ourselves.

Photo credit: Kevin Krejci via Flickr

Tech as Ritual


“It is an impersonal god, without name, without history, immanent in the world, diffused in a numberless multitude of things.”

That’s how the “father of sociology” Emile Durkheim described the native state of religion. By discussing religion as something outside god, outside the supernatural, Durkheim overcame the modern impulse to separate experiences into the real and the magical. To so-called “primitive” peoples, wrote Durkheim, religion is bound up in the rituals of day-to-day existence: without scientific rationality everything becomes laden with meaning, with arcane portent. The physical is spiritual, and the spiritual is intimate and commonplace.

As a young boy attending Sunday Mass, I took in the proceedings as might be expected, in silent anticipation that the Holy Spirit – a luminous and cloud-like entity – would coalesce from the glass over the altar with a miraculous message for me alone. The spirit never showed itself, and I was left to repeat the tired rituals of Catholic communion.

To a son of a different age, lacking today’s scientific indoctrination and rigorous denial of the supernatural, the rituals themselves might have been evidence enough. To stand in a large room, evocatively designed, with true believers instead of the obligatory Sunday crowd, to witness the physical transformation of simple food into something essentially spiritual: maybe that’s the flavor of genuine Christian experience today. Practice and ritual providing not “evidence” of the magical, but a new world where the omnipresence of God makes for wonderful and terrible things.

Whatever the case, religion as Durkheim described it exists today, independent of God, independent of the supernatural, without history, diffused throughout the world. I’m referring to the construct of “tech,” not just an inventory of useful tools, but a clergy, a catechism, a divine mission. Tech will create a world, tech will destroy a world.

Turning to transhumanism

There’s a pinch of Genesis in the corporate language of high tech. Google defines its mission as the following: to organize the world’s information and make it universally accessible and useful. This is, one can assume, a task the Lord God might have left for the seventh day had He not chosen to rest. Google’s “Ten things we know to be true” is as much a modern confession of faith as anything else I’ve come across. In its quest to innovate and disrupt, tech moves beyond bourgeois motives into a realm still freshly baked from the sixties, a California Jerusalem characterized by the need to achieve oneness with God.

Faced with my own lack of religious faith, I struggled to find a substitute for God. My solution started from the many unsolvable questions I had about the world. I recall the feeling clearly: grasping at a word, an idea, and mashing it around to extract meaning. But what meaning there was could be subjected to the same torture, and on and on until all meaning derailed into the ultimate unknown. The only way to discern the “meaning of life,” I decided, was to become all-knowing. Only then could one access the answers and finally know what to do. This adolescent quest sparked my interest in transhumanism.

The transhuman dream, still one of the most interesting developments in modern religiosity, is techno-optimism taken to its logical conclusion: that the human race, having achieved singularity and runaway intelligence, might become 1) all-knowing and 2) immortal. It’s the need to atone for our dying bodies and decaying minds. Transhumanists, like many religious people, need to become the god they seek. It’s a way to reconcile the obvious benefits of applied science with the squidgy moral stuff science doesn’t address.

But transhumanism is still a super-natural framework. To the disappointment of men like Ray Kurzweil, we remain trapped in mortal bodies. There is reason to believe Moore’s Law cannot hold, that the growth rate of digital processing power will peak before our machines achieve self-awareness. Transhumanism is a theology, dealing with God before man. But tech on the ground feels more like religious practice, a framework of daily rituals and obligations. Their sum total isn’t God, but the way we live now, phone addiction and email and Paypal and Youtube and Instagram and Uber and Netflix – the list goes on.
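For a sense of what the transhumanists are banking on: Moore’s Law, read naively as a doubling of processing power every two years, is pure exponential arithmetic. A few lines of Python make the shape of the promise (and the fragility of the timeline) concrete; the starting figure is hypothetical:

    # Moore's Law as naive arithmetic: processing power doubles every
    # two years. The starting figure is hypothetical; the point is the
    # shape of the curve the singularity timeline depends on.

    power = 1.0  # relative processing power today
    for year in range(0, 41, 2):
        print(f"year {year:2d}: {power:12,.0f}x")
        power *= 2
    # By year 40 the printout reads ~1,048,576x. If the doubling stalls,
    # as many expect, the whole eschatology quietly falls apart.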

Rituals, not “god”

What would a visitor from a few centuries ago think about our lived world? How would she parse appliances, lighting fixtures, our buildings, cars, airplanes, thermostats, laptops, TVs, our clothes, phones, headphones, bicycles, plastic? Maybe she’d see echoes of her time in the trees outside the window, our books (though they look different), the way people love and hate each other. Our world is a wonderful and terrible place.

As many have said before, digital tech takes all this up a notch. There are older folks bored on Facebook right now to whom “Facebook” would have been incomprehensible a few decades ago. Because we’re so adaptable, we can function within a miraculous environment once we know it isn’t an illusion. Durkheim’s point is that before science, many peoples saw myth and magic as commonplace, granting them the ability to survive and simultaneously develop culture.

Rituals certainly govern our relationship with pre-digital tech. Before vacuuming we unspool the cord and plug it into the wall socket, facilitating the flow of electricity. When we drive we buckle our seatbelts, turn the key in the ignition, and begin a complex series of hand-eye maneuvers to govern the vehicle as it travels. We brush our teeth. Factory workers operate complex machines and do some tasks by hand.

But these actions are still inherently physical. They take place in one world, the “real world.” The rituals of Sunday Mass, of temple, of meditation, are very different. In church, physically meaningless ritual brings the practitioner from the profane world into sacred space. The rules are different there. People aren’t what they seem, and neither are things. There is a sense of order, of authority, of morality, but it’s very distinct from the laws of the outside. In the sacred space, you’re empowered in some ways and handicapped in others. When you enter the sacred space with others, you experience what Durkheim called “collective effervescence,” the passage into a treacherous borderland where anything can happen.

The rituals of modern tech are ergonomic, but only so that we can overcome ergonomics and “go inside.” We tap at our smartphones or switch on laptops. We disassociate from the physical world and enter a state of distracted concentration, gazing into a lighted screen. We wander the onion-world of the internet, where every page reveals the seed of the next. And from those pages leap infinite possibilities. Does the person we’re messaging live next door or across an ocean? Will this article reveal a new way to solve an impossible problem? Will we marry the next swipe on Tinder?

We get bored on tech, just as the devotee sometimes gets bored during a service. But our boredom only reinforces the all-encompassing nature of that which we inhabit. We are bored, but we stay in the matrix. There’s even a tech clergy – the programmers and specialists who’d dismiss this post as a layman’s ravings. To those who understand its hidden mechanics (they’d tell me, citing Asimov), a system cannot be a religion. Sacred space is just bits and electrons, basic programs disguised as something more. But when those programmers and engineers go off the clock, where do they go for entertainment, for love, for release?

The Catholic Church once reserved literacy for clerical elites, convinced that commoners reading the Bible would ruin religion. But when Martin Luther opened the floodgates and let all people read, what emerged was a tighter, stricter version of Christianity based on a personal relationship with the text. Religious ritual changed colors, put on a fresh face. Where once was MySpace now is Snapchat.

Where is all of this leading?

Our interaction with tech is ritualistic, and through ritual we gain access to the realm where, increasingly, we achieve all significant things. The question, as always, is whether all of this is good or bad. I’m on the fence. If we acknowledge a progression from traditional religious practice into a fraught modern era of religious doubt, where does tech practice fit in? Is it scientific/rational? Is it like religion?

One train of thought – sponsored by Silicon Valley and the techno-optimists – promises that digital tech will heal the disconnect between science and meaningful ritual. The alienation of 20th-century mass media, its friendliness to propaganda, will give way to a new age of open data and social media. Connection will breed compassion. But the pessimists fear that by resurrecting social networks and neutering mass media, tech will duplicate the ritual-bound ignorance of the past. The rational individual will lose his power to a new clergy, a new feudalism led by higher earthly powers. The selfie and the tl;dr tribe will replace criticism, the essay, and political economy.

I really hope it’ll be the former and not the latter. Time will tell. But until the picture becomes clearer, let me return to the ritual of posting on WordPress.

Photo credit: Ken Walton via Flickr

Fight for the Right to Ramble


Mass shootings are interesting – horrifically so – because they escape the mold of social crime (the “prison-industrial complex”) and flow from the psychoses of an individual. They have given rise to fitful ideological intercourse about gun control, autonomy, and how we should talk about the issue. I discussed the first two items on that list (however briefly) in my previous post, and now I’d like to touch on the final one.

High-profile mass killers tend to leave behind manifestos justifying or explaining their intent. Off the top of my head, I can think of Dylann Roof, Elliot Rodger, and Christopher Dorner (the rogue LAPD officer: not really a mass killer, but treated like one). The content of these documents is, in most cases, unconscionable. But it is striking that when stories about the manifestos do break, they are described, almost invariably, as “rambling,” as in “a rambling 5,000-word account entitled ‘God’s Plan For Me.’”

From the limited sample of such “literature” I have read, that description is quite accurate. Deep in their psychoses and paranoias, the perpetrators have little time for concise argument. Historically, we’ve seen more eloquent defenders of racism than Dylann Roof. Better rhetoricians than Elliot Rodger have advocated for patriarchy. But why are journalists and other observers so eager to apply the word “rambling” to the writings of depraved individuals? Since when did the literary ramble become associated with sick, twisted thinking?

This must sound quite petty and nit-picky, but it points to a deep and pervasive trap the world of words has fallen into. Namely, we’ve entered an era of Orwellian Newspeak, of abbreviated language and meaning. But there’s no Big Brother, no repressive authority demanding we edit down our dictionaries. We’re doing it to ourselves. We need to stop and think, maybe leave the building and go for a little ramble round the block.

See, the word ramble means “to walk or go from one place to another place without a specific goal, purpose, or direction.” (That’s from Merriam-Webster.) Rambling is the domain of the transient, the wanderer, the pilgrim: a romantic type, prone to sudden outbursts of poetry punctuated by spells in drinking holes and houses of ill repute (or, perhaps, country churches). The ramble excites a venturesome, individualistic part of the human mind, making it a favored plot device of novelists and screenwriters. To ramble is to cut fetters and be free.

That’s why I’m suspicious of any attempt to criminalize the ramble, especially in writing. The fact that someone chose to write in a rambling fashion has no bearing on the content’s truth or falsehood. In many cases, a roundabout and indirect path to meaning sets the reader on an emotional or intellectual journey. All good fiction rambles. An abstract summary of a novel’s “meaning” might help booksellers, but it’s not what the reader comes for. Even the great expository and scientific works of bygone eras rambled to a sometimes preposterous degree. That’s because their authors felt those digressions were important, even if we do not.

Maybe I’m giving the ramblers undue credit. In professional and business circles, rambling just doesn’t work, and many literary ramblers simply lack the talent to sharpen their prose. But editorial passion for concision can overextend itself, cutting into subtlety. And user-friendliness advocates have released a torrent of abbreviated listicles, self-help pieces, and trivia features. These aren’t necessarily bad, but they can be very limiting.

We have entered, for better or worse, the age of the brainstorm. The belief that sudden flashes of insight can be had by sitting down and working at thought until the picture pops into focus. There’s a lot to be said for the brainstorm, but I don’t think it can be complete without the meditative ramble. The one complements the other. The off-topic ramble may be impotent without the brainstorm’s aggression, but the brainstorm lacks perspective without the ramble, the meditation, the silent moment. I think we have the brainstorm down, but society could use some rambling practice.

In conclusion. Maybe citing the manifestos of mass shooters isn’t the right tactic to promote something I think is positive. Or maybe we really are in a struggle to preserve and extend the diversity of language, to feel confident writing something that doesn’t score well on SEO and won’t attract a tribe of people who want to live better. Maybe this isn’t a joke and I mean what I say. Or maybe it’s just a ramble.

Photo credit: Tanti Ruwani via Flickr

Preppers, Guns, and Environmentalism


I know next to nothing about guns, aside from what the odd video game has to say. I’m a west coast liberal, a progressive, and I chuckle with wry agreement when Europeans refer to the “sick gun culture” in America. They’re right, of course. There are too many guns. But I’ve never been an ardent gun control advocate, and I only just realized why.

As unlikely as it sounds, guns and green have a lot in common. Modern environmentalism aims for the same psychological sweet spot that shows like The Walking Dead – in all their high-caliber glory – target for high ratings and repeat views. I’m referring to the strain of millenarian end-of-the-world environmentalism that lies beneath every viral article on melting permafrost, oceanic die-offs, and refugees from submerged cities. When it comes right down to it, the green movement needs the apocalypse as much as the gun lobby does. Which brings me to the American prepper.

I’ll admit to a fascination with the prepper phenomenon. The stereotype of rednecks hiding out in bunkers belies an active and earnest subculture of regular folks (often from the “heartland”) who prefer self-reliance in the face of major catastrophe. A whole cottage industry has sprung up, especially online, marketing specialized survival goods, manuals, and assorted bric-a-brac to preppers. There’s even a TV program about the subculture (American Preppers), which I haven’t had a chance to see.

The lengths to which some of these individuals pursue their hobby are impressive. In the event of TEOTWAWKI (the end of the world as we know it), preppers prefer to own a rural “bug-out” location already stocked and ready to serve as home for the long haul. They mull over ways to keep their families away from the “Golden Horde” of panicked non-preppers. (It’s always families: apparently the single person is rare in prepper-land.) They often accumulate a vast array of canned food, survival equipment, and, of course, weaponry. If a sudden TEOTWAWKI event does occur, I’m certain they’ll have the last laugh.

As mass shootings dominate the news, tired old arguments for and against gun control get dragged back and forth, achieving little. I wish more critics would brave the liberal-conservative border fence and acknowledge the real issue: we Americans love us some autonomy. And we love being able to say that we have more autonomy than those drones on the other side. The prepper movement is the face of a powerful undercurrent in the collective American psyche.

What’s funny is that most preppers are sustainability advocates. Their love of autonomy and self-sufficiency – being “off the grid” – drives them to solar energy, recycling, composting, and conservation. And their predilection for the end of the world (as we know it) sounds mighty similar to the torrent of articles about climate scientists and their worst fears. Guns, like household solar panels, symbolize the common person’s power in the face of global threats.

The fact that modern environmentalists align so closely with “liberal” positions like gun control, queer rights, abortion rights, and the like is a recent fluke of political history. Previously, conservation environmentalism embraced hunters, cowboy enthusiasts (Edward Abbey), and assorted wild men who’d laugh out loud at Dick Cheney’s famous steady hand. The modern pivot toward a progressive environmentalism has attempted to forge common cause with successes in identity politics, letting some of the previous generation’s prized autonomy slide.

It’s been said before, but I think we need to take another hard look at the green movement’s inherent conservatism. As in “conservation,” not “Republican.” Too often, those of us caught up in the fervor of today’s identity movement (a heroic and worthwhile endeavor) see the past as a dark stain, a pit of obscene human rights violations, of racism, of misogyny, of homophobia. Amid all this valid criticism, we miss the fact that the neoconservative/neoliberal agenda cares nothing for tradition, autonomy, or identity. In a blind quest for maximization and growth, silly human notions like “love” or “religion” mean very little.

For all its tragedies, the past is a fertile field for useful stories. Stories that explain things like conservative and progressive, liberal and authoritarian, capitalist and corporatist. It’s up to each of us to look at those stories and decide which labels are good and bad, and in which circumstances. What I can say is that the current global system is fundamentally at odds with both environmentalism and autonomy. We have allowed most of our ideological markers to fuse and blend into meaninglessness.

Sanders’ and Trump’s populist campaigns prove that autonomy remains a viable force in the American mind. They also prove how divided we have become. Through shotguns or solar panels, I hope our love of self-reliance can help us find some common ground other than liberal capitalist progressive authoritarian conservative libertarian corporatism. (I choose the solar panels.)