What Is Like?


One day, Twitter changed its favorite button to a “like” button adorned with a little red heart. For a few hours the mock-rage was palpable. Then, in a quick turnaround, that all evaporated as users settled into the new normal. Did we like the favorite button? Are we sad it’s gone? It doesn’t matter – today it feels like the favorite never existed.

It’s become a bit of a tech cliche. Every time Facebook, Twitter, or some other social network changes its user interface, a million voices cry out and are suddenly silent. They lose interest: what was strange one hour becomes old news the next. Our “learning curve” on tech operates with blistering efficiency. Where else but the corporate web do we instantly adapt to changes we didn’t choose or anticipate?

Before I wrote this, I made a point to avoid the tech insiders and social media experts. I wanted my take on the change to be pure and unadulterated. And my take is this: the new like button is a very important change, not in itself, but because of what it stands for.

Building the bird’s nest

For better or worse, social media is our foremost utility for giving and receiving the written word. Twitter has become the roosting ground of choice for journalists, critics, and assorted members of the commentariat. It represents everything the mainstream writing market prizes: wit, brevity, celebrity, saying a lot with a little. It’s also a serious place to escape from “friends” and interact with strangers.

On the day of the change, I checked some of those strangers’ accounts, people I don’t know in person, people who care about written communication. Predictably, they had already posted variations on snark, witticism, and honest comment, mostly tongue-in-cheek condemnations of the new interface. I may be more gullible than they: my initial response was simple confusion.

For a second or two after logging on, I didn’t comprehend that the favorite had a new face, that the change was just cosmetic. For a brief moment Twitter was an alien planet, an arena with changed rules and I the last to know. And that’s the problem with social media as a communications utility: great writing breaks rules, makes them up, and takes the reader beyond what’s on the page. But corporate social media binds us to its will from the moment we check the box marked “I agree.”

Is social media any worse than 20th century mass media, with its publishers and networks and gatekeepers? I’d argue that it does better on idea exchange (but not idea monetization). But that doesn’t mean we shouldn’t address the new problems it engenders.

Here are two reasons I think Twitter’s minor change is noteworthy. You may have considered these before, in various guises, but I’ll put them out there anyway.

“Liking” culture

What’s the history of the like? Why did such a noncommittal expression become so laden with social meaning? I’m taken back to elementary school, when it denoted childish crushes. Back then, “she likes him” was a very different statement than “I like playing with Legos.”

As adults, we’ve made the vocabulary of liking even more complicated. On social media, we subject a potential like to a series of mental conditionals before we even click the button. If I like this post, then what will so-and-so think? Liking, now as then, can mean many things. But the language of the like isn’t one of signifiers and semantics as much as messy unspoken codes.

Facebook’s proposed dislike button reveals the like’s feebleness when something is said earnestly, seriously, inviting love or hate. Those are times when the like feels wrong, when we are driven to cobble together phrases and write a comment, however ungrammatical it might be. Writing is a productive act, in the simplest sense of making something that wasn’t there before.

On the flip side, the like button is consumerist. It’s a way to package and create reputation the same way money packages and creates value. It reduces all things to an eternal present where the eyes glaze over and the feed scrolls on, full of cute selfies, baseball games, and trips to Machu Picchu.

The Orwell effect

Don’t get me wrong, the present is a great place to be, maybe the best place. But when we write, we link past to present to future the way storytellers did for countless fire-lit centuries. We can like stories, but we can’t really tell them with likes alone. Social media simultaneously magnifies and suppresses our collective ability to tell stories.

It’s ironic that while Twitter is the utility of choice for writers and journalists, Facebook does a better job weaving historical record into its interface. Facebook’s timelines let users revisit posts and celebrate (or laugh at) past events and opinions. That can be jarring for Millennials who have radically changed during their years on Facebook! Twitter, for all its timeliness, is stuck in the present. All of its features prioritize what’s going on right now. All of them, that is, except the erstwhile favorite.

The name says it all: a favorite tweet was a bookmark, a way to preserve fleeting witticisms before they disappeared in the scroll-down abyss. Favorites let users keep track of interesting people without actually following them. The tool let us acknowledge the tragic or the negative without implying approval. Sure, we can do these things with the like, but a like isn’t really a bookmark at all. On Facebook, Instagram, and the rest, liking is spontaneous and quickly forgotten.

Which brings me to Orwell. In Nineteen Eighty-Four, the ruling regime edited old newspapers to make all of history match the Party line. Twitter’s change is less drastic, but it follows the pattern. Not only has Twitter abandoned the favorite; it has obliterated all evidence that the favorite ever existed. To a new user joining today, the like was always the like. My favorites are now listed as likes. But they weren’t likes when I clicked the button – they were favorites!

I don’t mean to complain about a petty distinction. I just want to point out how easy it is to rewrite history on social media, for the user and for the company. If these platforms are the record of our times, shouldn’t key decisions about them involve more than data collection and targeted advertising?

In the end, I like the like. To dislike the like would be pointless, and worse still, uncreative. The like is enfolding us in its bland, uncritical embrace. There’s nothing wrong with the like. But a world of likes has got me questioning: where is the love?

Photo credit: Owen W Brown via Flickr cc


Tech as Ritual


It is an impersonal god, without name, without history, immanent in the world, diffused in a numberless multitude of things.

That’s how the “father of sociology” Emile Durkheim described the native state of religion. By discussing religion as something outside god, outside the supernatural, Durkheim overcame the modern impulse to separate experiences into the real and the magical. To so-called “primitive” peoples, wrote Durkheim, religion is bound up in the rituals of day-to-day existence: without scientific rationality everything becomes laden with meaning, with arcane portent. The physical is spiritual, and the spiritual is intimate and commonplace.

As a young boy attending Sunday Mass, I took in the proceedings as might be expected, in silent anticipation that the Holy Spirit – a luminous and cloud-like entity – would coalesce from the glass over the altar with a miraculous message for me alone. The spirit never showed itself, and I was left to repeat the tired rituals of Catholic communion.

To a son of a different age, lacking today’s scientific indoctrination and rigorous denial of the supernatural, the rituals themselves might have been evidence enough. To stand in a large room, evocatively designed, with true believers instead of the obligatory Sunday crowd, to witness the physical transformation of simple food into something essentially spiritual: maybe that’s the flavor of genuine Christian experience today. Practice and ritual providing not “evidence” of the magical, but a new world where the omnipresence of God makes for wonderful and terrible things.

Whatever the case, religion as Durkheim described it exists today, independent of God, independent of the supernatural, without history, diffused throughout the world. I’m referring to the construct of “tech,” not just an inventory of useful tools, but a clergy, a catechism, a divine mission. Tech will create a world, tech will destroy a world.

Turning to transhumanism

There’s a pinch of Genesis in the corporate language of high tech. Google defines its mission as the following: to organize the world’s information and make it universally accessible and useful. This is, one can assume, a task the Lord God might have left for the seventh day had He not chosen to rest. Google’s “Ten things we know to be true” is as much a modern confession of faith as anything else I’ve come across. In its quest to innovate and disrupt, tech moves beyond bourgeois motives into a realm still freshly baked from the sixties, a California Jerusalem characterized by the need to achieve oneness with God.

Faced with my own lack of religious faith, I struggled to find a substitute for God. My solution started from the many unsolvable questions I had about the world. I recall the feeling clearly: grasping at a word, an idea, and mashing it around to extract meaning. But what meaning there was could be subjected to the same torture, and on and on until all meaning derailed into the ultimate unknown. The only way to discern the “meaning of life,” I decided, was to become all-knowing. Only then could one access the answers and finally know what to do. This adolescent quest sparked my interest in transhumanism.

The transhuman dream, still one of the most interesting developments in modern religiosity, is techno-optimism taken to its logical conclusion: that the human race, having achieved singularity and runaway intelligence, might become 1) all knowing and 2) immortal. It’s the need to atone for our dying bodies and decaying minds. Transhumanists, like many religious people, need to become the god they seek. It’s a way to reconcile the obvious benefits of applied science with the squidgy moral stuff science doesn’t address.

But transhumanism is still a supernatural framework. To the disappointment of men like Ray Kurzweil, we remain trapped in mortal bodies. There is reason to believe Moore’s Law cannot hold, that the growth rate of digital processing power will peak before our machines achieve self-awareness. Transhumanism is a theology, dealing with God before man. But tech on the ground feels more like religious practice, a framework of daily rituals and obligations. Their sum total isn’t God, but the way we live now, phone addiction and email and PayPal and YouTube and Instagram and Uber and Netflix – the list goes on.

Rituals, not “god”

What would a visitor from a few centuries ago think about our lived world? How would she parse appliances, lighting fixtures, our buildings, cars, airplanes, thermostats, laptops, TVs, our clothes, phones, headphones, bicycles, plastic? Maybe she’d see echoes of her time in the trees outside the window, our books (though they look different), the way people love and hate each other. Our world is a wonderful and terrible place.

As many have said before, digital tech takes all this up a notch. There are older folks bored on Facebook right now to whom “Facebook” would have been incomprehensible a few decades ago. Because we’re so adaptable, we can function within a miraculous environment once we know it isn’t an illusion. Durkheim’s point is that before science, many peoples saw myth and magic as commonplace, granting them the ability to survive and simultaneously develop culture.

Rituals certainly govern our relationship with pre-digital tech. Before vacuuming we unspool the cord and plug it into the wall socket, facilitating the flow of electricity. When we drive we buckle our seatbelts, turn the key in the ignition, and begin a complex series of hand-eye maneuvers to govern the vehicle as it travels. We brush our teeth. Factory workers operate complex machines and do some tasks by hand.

But these actions are still inherently physical. They take place in one world, the “real world.” The rituals of Sunday Mass, of temple, of meditation, are very different. In church, physically meaningless ritual brings the practitioner from the profane world into sacred space. The rules are different there. People aren’t what they seem, and neither are things. There is a sense of order, of authority, of morality, but it’s very distinct from the laws of the outside. In the sacred space, you’re empowered in some ways and handicapped in others. When you enter the sacred space with others, you experience what Durkheim called “collective effervescence,” the passage into a treacherous borderland where anything can happen.

The rituals of modern tech are ergonomic, but only so that we can overcome ergonomics and “go inside.” We tap at our smartphones or switch on laptops. We disassociate from the physical world and enter a state of distracted concentration, gazing into a lighted screen. We wander the onion-world of the internet, where every page reveals the seed of the next. And from those pages leap infinite possibilities. Does the person we’re messaging live next door or across an ocean? Will this article reveal a new way to solve an impossible problem? Will we marry the next swipe on Tinder?

We get bored on tech, just as the devotee gets bored sometimes during service. But our boredom only reinforces the all-encompassing nature of that which we inhabit. We are bored, but we stay in the matrix. There’s even a tech clergy – the programmers and specialists who’d dismiss this post as a layman’s ravings. To those who understand its hidden mechanics (they’d tell me, citing Asimov), a system cannot be a religion. Sacred space is just bits and electrons, basic programs disguised as something more. But when those programmers and engineers go off the clock, where do they go for entertainment, for love, for release?

The Catholic Church once reserved literacy for clerical elites, convinced that commoners reading the Bible would ruin religion. But when Martin Luther opened the floodgates and let all people read, what emerged was a tighter, stricter version of Christianity based on a personal relationship with the text. Religious ritual changed colors, put on a fresh face. Where once was MySpace now is Snapchat.

Where is all of this leading?

Our interaction with tech is ritualistic, and through ritual we gain access to the realm where, increasingly, we achieve all significant things. The question, as always, is whether all of this is good or bad. I’m on the fence. If we acknowledge a progression from traditional religious practice into a fraught modern era of religious doubt, where does tech practice fit in? Is it scientific/rational? Is it like religion?

One train of thought – sponsored by Silicon Valley and the techno-optimists – promises that digital tech will heal the disconnect between science and meaningful ritual. The alienation of 20th-century mass media, its friendliness to propaganda, will give way to a new age of open data and social media. Connection will breed compassion. But the pessimists fear that by resurrecting social networks and neutering mass media, tech will duplicate the ritual-bound ignorance of the past. The rational individual will lose his power to a new clergy, a new feudalism led by higher earthly powers. The selfie and the tl;dr tribe will replace criticism, the essay, and political economy.

I really hope it’ll be the former and not the latter. Time will tell. But until the picture becomes clearer, let me return to the ritual of posting on WordPress.

Photo credit: Ken Walton via Flickr

The Millennial Experience Trap


Quick note: It’s been a while since I last posted here. This month, I decided to post more regularly and on a wide range of topics. Anything I deem interesting. I hope my (currently sparse) readership will enjoy. If anything, it’s decent writing practice.

To start in again, I wanted to make a few observations about a self-replicating cycle (I wouldn’t call it vicious, but maybe I should) that bedevils the relationship we have with media, the environment, and life itself. Yes, it is quite serious! It might be called the experience trap.

I’m not referring to the modern epidemic of unpaid internships, a resume-industrial complex that glimmers with the false sheen of necessity for too many college grads. No, the issue at play here is far subtler, and characteristically postmodern in its implications.

On one level, the experience trap operates as an advertising scheme. Young people, as we have been encouraged to know, crave experience over possession. We live and consume in the moment, putting the beach sunset, the cafe buzz, the skydiving trip above the whole business of mortgages, ladder climbing, and delayed returns. Marketers incorporate this theme into their creations, further replicating it.

And so we’re left with the flighty Millennial stereotype, of Facebook-addicted twenty-somethings leading a legion of high schoolers to text and Instagram their way into the abyss. Youngsters’ heads buried in smartphones, their parents following suit. This is all very frightening. Of course it’s also simplistic, just a symptom of the real malaise.

The Millennial, commentators report, has begun to settle down. Some are having kids, moving to the suburbs, doing as their ancestors have done. But that doesn’t change the fact that media – especially advertising media – has embraced the mythic, ecstatic, in-the-moment experience as what we all should crave.

Along with the stream of articles on Millennial childbirth and the suburban Millennial come reports of Millennial frugality, of a willingness to rent, of young people returning to the Rust Belt for cheap housing and hipster grit. Postmodern counterculturalists reject the overt status symbols of American Dream 1.0 and embrace an updated version. In American Dream 2.0 (or is it 1.1?), apps, bikes, micro-apartments, and the sharing economy will liberate us from the crushing burden of owning things. We will save the planet by having easily-categorized experiences rather than actual possessions. We will rent services, and go without if need be.

The counterargument is obvious. If we don’t own anything in the sharing economy, we end up forever indentured to those who do. If we give up driving in cities with no bike lanes, we get crushed beneath the wheels of those who couldn’t care less about climate change. By ceding the desire to accumulate – a simple, small-minded fixation – in favor of a wish to “just live life,” we risk disenfranchising ourselves at a crucial moment.

After all, climate change is upon us. Income inequality is gruesome. People are stripped of opportunity and dignity because of their identities. Imperialism’s legacy remains fully entrenched in the global economy.

But this is also an exciting time because we’re finally acknowledging the underlying social psychoses beneath not just these issues, but all their permutations and offspring. A worldwide web of data, dumb and mute in and of itself, has become available to millions of minds, making us, as a species, more mature, more aware, more critical.

Still, whatever progress civilization might be making, what about our own lives? Can the experiences we allow ourselves ever measure up to our world-class expectations? There is a protean fear at work beneath the glossy veneer of mainstream culture, a burrowing doubt that grafts itself into every counterculture, onto every attempt to tell a social story with legitimacy and self-respect.

It’s what makes people turn to possessions, to ownership, again and again and against all the evidence that money can’t buy happiness. It’s the nagging sense that the experiences we have – the kind we are allowed to have – aren’t enough to replace what consumerism offers. (Which is what, again?) Most of the time we can’t even untangle our activities from consumerism, let alone tease out something revolutionary.

If we intend to value experiences over possessions, to look back on a life worth living, we need to reexamine the context of those experiences. Are they truly memorable? Do they stick in the mind without electronic aid? Did they contribute to real growth, to an evolving mind and spirit? Are we alive to the difference between long-term and short-term memory, or do we let weeks and months slip by unremembered?

Above all, can we really say that we want a latte-sipping life, free from real danger and pain, free of the thought of death, hemmed in by the invisible bounds of virtuality, anesthetized by the knowledge that we all live in an age of distraction? Too disenchanted to fully embrace corporate capitalism, too civilized to actually challenge it? A life in which curation becomes creativity, and one experience cannot be meaningfully distinguished from the next?

The experience peddlers tell us to step up and do what makes us come alive. But that’s wrong, isn’t it? We’re alive already. We need to throw off what deadens us.

Photo credit: Ted Eytan via Flickr