July 20 2011

16:00

Webs and whirligigs: Marshall McLuhan in his time and ours

Thursday, July 21 would have been the 100th birthday of Marshall McLuhan, the Canadian media theorist who was one of the most influential — or at least one of the most quoted — media thinkers of the 20th century. (And certainly the only one to feature, memorably, in Annie Hall.) To celebrate, we’re having a mini McLuhan Week here at the Lab. To kick us off, here’s our own Megan Garber.

Marshall McLuhan is generally best known — and to some extent exclusively known — for a single maxim: “The medium is the message.” This is mostly unfortunate. McLuhan was the author of several books of varying forms, a pioneering intellectual celebrity, and the founder of a field; five words, plump and alliterative though they may be, are wildly inadequate. But McLuhan had, in his way, a sense of humor; he appreciated as much as anyone the absurdity of his own meta-maxim (M.M. = (M=M)), and he ended up feeding and fighting his own reductive celebrity in pretty much equal measure. A lover of poetry, probes, and extremely bad puns, he named one of his later books The Medium Is the Massage.

Today, 100 years after his birth and nearly 50 after he gave us language that made “media” into a thing, McLuhan is a Media Guru of the first order, which is to say that he is often quoted and rarely read. (The second-most-famous McLuhanism: “You know nothing of my work!”) When he died in late 1980, obituaries remembered him, with no apparent irony, as the “apostle of the electronic age.” But what will he be for the digital? Do his insights, focused as they were on the vagaries of television, apply equally well to the brave new world of bytes and bits?

For all the “visionary” status we confer on him today, it’s worth remembering that McLuhan was constrained by his time as much as we are by our own: He wrote not just about Tribal Man and Graphic Man, about the cultural and cognitive effects of communication as they sweep the span of human history, but also about jukeboxes and miniskirts and magazines and hosiery. Women, to him, were accessories to men. And his thinking (to repeat: Tribal Man) was pretty much implicitly paternalistic. When he talked about a “global village” — another maybe-claim to web-visionary fame — he wasn’t talking about a world community where New Yorkers go jeans-shopping with Londoners and Guineans share sugar with Laotians and everyone finally meets at a communal table to sip artisanal tea and discuss newly localized world events; he was talking about an encroaching dystopia that renders Tribal Man — or, more accurately, re-tribalized humanity — increasingly connected to, and yet actually disconnected from, each other via the barely-contained buzz of electric wires. A global village, McLuhan feared, was one that would be populated by automatons.

“What if he is right?” Tom Wolfe asked, ominously. “What…if…he…is…right?”

And: He was right, not about everything but about a lot, which is why today he is a Media Guru and a YouTube sensation and a ubiquitous subject of biographies both cheeky and earnest and a fixture of culture both nerd and pop, which are increasingly the same thing. He is the patron saint of Wired. Today, as the “electronic” age zips and zaps into the digital, as we are spun by the centrifugal forces of a nascent revolution that we can’t fully perceive because we’re the ones doing the spinning, McLuhan’s theories seem epic and urgent and obvious all at the same time. And McLuhan himself — the teacher, the thinker, the darling of the media he both measured and mocked — seems both more relevant, and less so, than ever before.

More, because, as the tale goes, McLuhan pretty much foresaw this whole Internet business. But less, too, because whatever foreseeing he did arrived, as foresight often does, prematurely. In the ’60s, at the height of his fame, McLuhan’s ideas were thrilling and shocking and, more generously, radical. Fifty years later, tempered by time, those same ideas have coalesced into conventionality (less generously: cliché). “The medium is the message” has been used to describe everything from cars to computers. I’m pretty sure I remember Bart Simpson writing it on a blackboard. McLuhan, controversial in his own time, has gone mainstream; the basic tenets of his thought — to the extent that his “thought,” an impressionistic assemblage of ideas that sweep and swoop and sometimes snap with self-contradiction, is a unit at all — have been, basically, accepted. We shape our tools, and afterward our tools shape us. Yeah, definitely. But…now what?

McLuhan wasn’t a journalistic thinker; he was a media theorist, and is most interesting when he’s talking not about the news itself, but about more theory-y things — modes and nodes and all the rest. (Though: If you want a treat, check out The Mechanical Bride, the collection of essays that formed his first book and that feature McLuhan before he became, fully, McLuhan — McLuhan not as an enigmatic intellect so much as a classic critic, trenchant and crotchety and indignant and delightful.) One feature of McLuhan’s thought that is newly relevant, though — to the world of the web, and to the new forms of journalism that live within it — is the one that is both core and corollary to the medium is the message: the basic tenet that our communications tools aren’t actually tools at all, but forces that disrupt human culture by way of human psychology. And vice versa.

Before print came along, McLuhan argues, we were, as a species, “ear-oriented”: Human culture was oral culture, with everything — community, ephemerality, memory — that that implies. Print changed all that, pretty much: It changed us, certainly — cultural evolution can take place approximately 1.5 million times faster than genetic evolution can — by imbuing us with a newly “graphic” orientation. Which brought with it literacy, which brought with it the easy outsourcing of memory, which brought with it an increased, if not wholly novel, notion of human individuality. Print captured and conjured the world at the same time, giving us a new kind of power over our environment that was almost — almost — mystical. Thoth, the Egyptian god of writing, was also the god of magic.

But writing, as it filled us with notions of our own narrative nobility, inspired in us something else, too: a need — a desire, an impulse — for containment. With print’s easy ubiquity, the default tumult of oral culture gave way to something more linear, more ordered — something that aspired to a sense of completeness. Communicating became not so much about interpreting the world as about capturing it.

And — here’s where things get especially relevant for our purposes — the media (“media,” now, in the daily-journalism sense) have been key agents of that shift. What journalism has been as much as anything else, on the mass-and-macro level of culture, is a collective attempt to commodify time. Not just in its staccatoed stories of human events, but in its measurements and mechanics: the daily paper. The weekly magazine. The nightly news. “The Epiphanator,” Paul Ford has called it. So journalism, for everything else it has done, has also carved out a social space — the newshole, the object that results when you attempt to stanch the flood of history with a beanbag — from the stretches of time. The newshole has been the graphic-man version of Mumford’s clock, a revolution in words and images and increments, implying if not imposing human agency, ticking and tocking to the beat of human events.

And the sense it has engendered of time as an episodic thing has translated, as well, to the content of journalism. Stories, in short, have endings. And they have beginnings. That is, in fact, what makes them stories. A Mumfordian media is one that is composed of a series of episodes, modular events that can be figured and configured and then reconfigured for our narrative needs. Frank Kermode looked at the sweep of literary history and saw within it a pattern of “end-determined fictions” — stories that were defined by arbitration and apocalypse and, overall, “the sense of an ending”; the kind of structural eschatology he describes, though, isn’t limited to literature. Nonfiction stories, too, have been defined by the sense — actually, the assumption — of an ending.

But! The web. The web, with its feeds and flows and rivers and streams. The web, which has endowed us with — and the phrase is, of course, telling — “real time.” Online, publishing schedules (and, increasingly, broadcasting schedules), byproducts of the industrial world, are increasingly out of place. Online, we are time-shifters. Online, the wonder of the whirligig — the cheerful circuity of oral culture — is returning to us, and we to it. The Gutenberg Parenthesis is quickly closing. The web is de-incrementalizing history. “Real time” is real precisely because it is timeless. It lacks a schedule. It is incessant.

And so are our media, made newly social. Facebook and Twitter and Google+ and all the rest swim with time’s flow, rather than attempting to stanch it. And they are, despite that but mostly because of it, increasingly defining our journalism. They are also, as it were, McLuhanesque. (Google+: extension of man.) Because if McLuhan is to be believed, the much-discussed and often-assumed human need for narrative — or, at least, our need for narrative that has explicit beginnings and endings — may be contingent rather than implicit. Which means that as conditions change, so may — so will — we. We may evolve past our need, in other words, for containment, for conclusions, for answers.

McLuhan’s vision is, finally, of a world of frayed ends rather than neat endings, one in which stock loses out to flow — a media environment, which is to say simply an environment, in which all that is solid melts…and then, finally, floods. And for journalism and journalists, of course, that represents a tension of rather epic, and certainly existential, dimensions. Paul Ford:

We’ll still need professionals to organize the events of the world into narratives, and our story-craving brains will still need the narrative hooks, the cold opens, the dramatic climaxes, and that all-important “■” to help us make sense of the great glut of recent history that is dumped over us every morning. No matter what comes along in streams, feeds, and walls, we will still have need of an ending.

To which McLuhan whispers, ominously: “No, Paul, no. No, we may not….”

Images by Leo Reynolds and Phil Hollman used under a Creative Commons license.

June 02 2011

17:30

Is Twitter writing, or is it speech? Why we need a new paradigm for our social media platforms

New tools are at their most powerful, Clay Shirky says, once they’re ubiquitous enough to become invisible. Twitter may be increasingly pervasive — a Pew study released yesterday shows that 13 percent of online adults use the service, which is up from 8 percent six months ago — but it’s pretty much the opposite of invisible. We talk on Twitter, yes, but almost as much, it seems, we talk about it.

The big debates about Twitter’s overall efficacy as a medium — like the one launched by, say, Malcolm Gladwell and, more recently, Bill Keller, whose resignation from the New York Times editorship people have (jokingly, I think?) chalked up to his Twitter-take-on column — tend to devolve into contingents rather than resolve into consensus. An even more recent debate between Mathew Ingram and Jeff Jarvis (which was comparatively nuanced, comparatively polite) ended with Ingram writing, “I guess we will have to agree to disagree.”

But why all the third-railiness? Twitter, like many other subjects of political pique, tends to be framed in extremes: On the one hand, there’s Twitter, the cheeky, geeky little platform — the perky Twitter bird! the collective of “tweets”! all the twee new words that have emerged with the advent of the tw-efix! — and on the other, there’s Twitter, the disruptor: the real-time reporting tool. The pseudo-enabler of democratic revolution. The existential threat to the narrative primacy of the news article. Twetcetera.

The dissonance here could be chalked up to the fact that Twitter is simply a medium like any other medium, and, in that, will make of itself (conversation-enabler, LOLCat passer-onner, rebellion-facilitator) whatever we, its users, make of it. But that doesn’t fully account for Twitter’s capacity to inspire so much angst (“Is Twitter making us ____?”), or, for that matter, to inspire so much joy. The McLuhany mindset toward Twitter — the assumption of a medium that is not only the message to, but the molder of, its users — seems to be rooted in a notion of what Twitter should be as much as what it is.

Which raises the question: What is Twitter, actually? (No, seriously!) And what type of communication is it, finally? If we’re wondering why heated debates about Twitter’s effect on information/politics/us tend to be at once so ubiquitous and so generally unsatisfying…the answer may be that, collectively, we have yet to come to consensus on a much more basic question: Is Twitter writing, or is it speech?

Twitter versus “Twitter”

The broader answer, sure, is that it shouldn’t matter. Twitter is…Twitter. It is what it is, and that should be enough. As a culture, though, we tend to insist on categorizing our communication, drawing thick lines between words that are spoken and words that are written. So libel is, legally, a different offense than slander; the written word, we assume, carries the heft of both deliberation and proliferation and therefore a moral weight that the spoken word does not. Text, we figure, is: conclusive, in that its words are the deliberate products of discourse; inclusive, in that it is available equally to anyone who happens to read it; exclusive, in that it filters those words selectively; archival, in that it preserves information for posterity; and static, in that, once published, its words are final.

And speech, while we’re at it, is discursive and ephemeral and, importantly, continual. A conversation will end, yes, but it is not the ending that defines it.

Those characteristics give way to categories. Writing is X; speaking is Y; and both have different normative dimensions that are based on, ultimately, the dynamics of power versus peer — the talking to versus the talking with. So when we talk about Twitter, we tend to base our assessments on its performance as a tool of either orality or textuality. Bill Keller seems to see Twitter as text that happens also to be conversation, and, in that, finds the form understandably lacking. His detractors, on the other hand, seem to see Twitter as conversation that happens also to be text, and, in that, find it understandably awesome.

Which would all be fine — nuanced, even! — were it not for the fact that Twitter-as-text and Twitter-as-conversation tend to be indicated by the same word: “Twitter.” In the manner of “blogger” and “journalist” and even “journalism” itself, “Twitter” has become emblematic of a certain psychology — or, more specifically, of several different psychologies packed awkwardly into a single signifier. And to the extent that it’s become a loaded word, “Twitter” has also become a problematic one: #Twittermakesyoustupid is unfair, but #”Twitter”makesyoustupid has a point. The framework of text and speech falls apart once we recognize that Twitter is both and neither at once. It’s its own thing, a new category.

Our language, however, doesn’t yet recognize that. Our rhetoric hasn’t yet caught up to our reality — for Twitter and, by extension, for other social media.

We might deem Twitter a text-based mechanism of orality, as the scholar Zeynep Tufekci has suggested, or of a “secondary orality,” as Walter Ong has argued, or of something else entirely (tweech? twext? something even more grating, if that’s possible?). It almost doesn’t matter. The point is to acknowledge, online, a new environment — indeed, a new culture — in which writing and speech, textuality and orality, collapse into each other. Speaking is no longer fully ephemeral. And text is no longer simply a repository of thought, composed by an author and bestowed upon the world in an ecstasy of self-containment. On the web, writing is newly dynamic. It talks. It twists. It has people on the other end of it. You read it, sure, but it reads you back.

“The Internet looking back at you”

In his social media-themed session at last year’s ONA conference, former Lab writer and current Wall Street Journal outreach editor Zach Seward talked about being, essentially, the voice of the outlet’s news feed on Twitter. When readers tweeted responses to news stories, @WSJ might respond in kind — possibly surprising them and probably delighting them and maybe, just for a second, sort of freaking them out.

The Journal’s readers were confronted, in other words, with text’s increasingly implicit mutuality. And their “whoa, it’s human!” experience — the Soylent Greenification of online news consumption — can bring, along with its obvious benefits, the same kind of momentary unease that accompanies the de-commodification of, basically, anything: the man behind the curtain, the ghost in the machine, etc. Concerns expressed about Twitter, from that perspective, may well be stand-ins for concerns about privacy and clickstream tracking and algorithmic recommendation and all the other bugs and features of the newly reciprocal reading experience. As the filmmaker Tze Chun noted to The New York Times this weekend, discussing the increasingly personalized workings of the web: “You are used to looking at the Internet voyeuristically. It’s weird to have the Internet looking back at you….”

So a Panoptic reading experience is also, it’s worth remembering, a revolutionary reading experience. Online, words themselves, once silent and still, are suddenly springing to life. And that can be, in every sense, a shock to the system. (Awesome! And also: Aaaah!) Text, after all, as an artifact and a construct, has generally been a noun rather than a verb, defined by its solidity, by its thingness — and, in that, by its passive willingness to be the object of interpretation by active human minds. Entire schools of literary criticism have been devoted to that assumption.

And in written words’ temporal capacity as both repositories and relics, in their power to colonize our collective past in the service of our collective future, they have suggested, ultimately, order. “The printed page,” Neil Postman had it, “revealed the world, line by line, page by page, to be a serious, coherent place, capable of management by reason, and of improvement by logical and relevant criticism.” In their architecture of sequentialism, neatly packaged in manuscripts of varying forms, written words have been bridges, solid and tangible, that have linked the past to the future. As such, they have carried an assurance of cultural continuity.

It’s that preservative function that, for the moment, Twitter is largely lacking. As a platform, it does a great job of connecting; it does, however, a significantly less-great job of conserving. It’s getting better every day; in the meantime, though, as a vessel of cultural memory, it carries legitimately entropic implications.

But, then, concerns about Twitter’s ephemerality are also generally based on a notion of Twitter-as-text. In that, they assume a zero-sum relationship between the writing published on Twitter and the writing published elsewhere. They see the written, printed word — the bridge, the badge of a kind of informational immortality — dissolving into the digital. They see back-end edits revising stories (which is to say, histories) in an instant. They see hacks erasing those stories altogether. They see links dying off at an alarming rate. They see all that is solid melting into bits.

And they have, in that perspective, a point: While new curatorial tools, Storify and its ilk, will become increasingly effective, they might not be able to recapture print’s assurance, tenacious if tenuous, of a neatly captured world. That’s partly because print’s promise of epistemic completeness has always been, to some extent, empty; but it’s also because those tools will be operating within a digital world that is increasingly — and actually kind of wonderfully — dynamic and discursive.

But what the concerns about Twitter tend to forget is that language is not, and has never been, solid. Expression allows itself room to expand. Twitter is emblematic, if not predictive, of the Gutenberg Parenthesis: the notion that, under the web’s influence, our text-ordered world is resolving back into something more traditionally oral — more conversational and, yes, more ephemeral. “Chaos is our lot,” Clay Shirky notes; “the best we can do is identify the various forces at work shaping various possible futures.” One of those forces — and, indeed, one of those futures — is the hybrid linguistic form that we are shaping online even as it shapes us. And so the digital sphere calls for a new paradigm of communication: one that is discursive as well as conservative, one that acquiesces to chaos even as it resists it, one that relies on text even as it sheds the mantle of textuality. A paradigm we might call “Twitter.”

Photos by olalindberg and Tony Hall used under a Creative Commons license.

April 07 2010

16:00

The Gutenberg Parenthesis: Thomas Pettitt on parallels between the pre-print era and our own Internet age

Could the most reliable futurist of the digital age be…Johannes Gutenberg?

Possibly. Or, definitely, if you subscribe to the theory of the Gutenberg Parenthesis: the idea that the post-Gutenberg era — the period from, roughly, the 15th century to the 20th, an age defined by textuality — was essentially an interruption in the broader arc of human communication. And that we are now, via the discursive architecture of the web, slowly returning to a state in which orality — conversation, gossip, the ephemeral — defines our media culture.

It’s a controversial idea, but a fascinating one. And one whose back-to-the-future sensibility (particularly now, with the introduction of the iPad and other Potential Game-Changers) seems increasingly relevant: When you’re living through a revolution, it’s helpful to know what you may be turning toward.

On hand to discuss the theory further, at an MIT-sponsored colloquium late last week, was Professor Thomas Pettitt of the University of Southern Denmark, who has focused academically on the Gutenberg Parenthesis and its implications. (More on his work, including links to papers he’s presented on the subject, here.)

At the talk, Professor Pettitt discussed, among other things, the implications of the book as an intellectual object — in particular, the idea that truth itself can be contained in text. For the Lab’s purposes, I wanted to hear more about the journalistic implications of that idea — and what it means for our media if we are, indeed, moving into a post-print age.

I spoke with Professor Pettitt and asked him about those implications — and about, in particular, the challenges to a notion of normative truth that they suggest. Here’s what he told me; a transcript of his thoughts is below.

There are things going on that are related changes. The big revolution with Gutenberg changed, or was related to big changes in other aspects — for example, the way we look at the world and the way we categorize things in the world. And if the same thing is happening now, and if we are reversing that revolution in these things as well, then this idea can predict the future. Because we are going forward to the past.

And with regard to things like truth, or the things like the reliability of what you hear in the media, then I think, well, in a way we’re in for a bad time. Because there was a hierarchy. In the parenthesis, people like to categorize — and that includes the things they read. So the idea clearly was that in books, you have the truth. Because it was solid, it looked straight, it looked like someone very clever or someone very intelligent had made this thing, this artifact. Words, printed words — in nice, straight columns, in beautifully bound volumes — you could rely on them. That was the idea.

And then paperback books weren’t quite as reliable, and newspapers and newssheets were even less reliable. And rumors you heard in the street were the least reliable of all. You knew where you were — or you thought you knew where you were. Because the truth was that those bound books were probably no more truthful than the rumors you heard on the street, quite likely.

I often tell my students that they should start their literature work, their work here, by tearing a book to pieces: Take a book, take some second-hand book, that looks impressive — and just rip it to pieces. And you can see that it’s just made, it’s just glued, it’s just stitched. And it’s not invulnerable. It’s just that someone’s made it. It doesn’t have to be true because it looks good.

And that’s what’s happening now. What’s happening now is there’s a breakdown in the categories. Yes. Informal messaging is starting to look like books. And books are being made more and more quickly. Some books seem to be like they are like bound photocopies. You can make a book — you can do desktop publishing. We can no longer assume that what’s in — we’re not distinguishing so much: ‘if it’s in a book, it’s right,’ ‘if it’s in writing, it’s less right,’ and ‘if it’s in speech, it’s less reliable.’ We don’t know where we are.

And I suppose the press, and journalism, and newspapers, will have to find their way. They will have to find some way of distinguishing themselves in this — it’s now a world of overlapping forms of communication. People will no longer assume that if it’s in a newspaper, it’s right. Newspapers are spreading urban legends, some of the time. Or at least now we know that they pass on urban legends. And the formal press will need somehow to find a new place in this chaos of communication where you can’t decide the level, the status, the value of the message by the form of the message. Print is no longer a guarantee of truth. And speech no longer undermines truth. And so newspapers, or the press, will need to find some other signals — it’s got to find a way through this.

And it might do well to take a look at rumors and, sort of, more primitive forms of the press in the 16th century and the 15th century. How did people themselves — when there were no books, how did people sort out the truth? How did they decide what they would rely on and what they wouldn’t rely on? It’ll be a — it’s a new world to find your way around. But that new world is in some ways an old world. It’s the world from before print, and the identifiable newspapers.
