
July 21 2011

17:00

Marshall McLuhan, Superstar

Today would have been Marshall McLuhan’s 100th birthday. Continuing our informal McLuhan Week at the Lab, we present this essay by Maria Bustillos on McLuhan’s unique status as a media theorist who was also a media star.

There was no longer a single thing in [the] environment that was not interesting [...] “Even if it’s some place I don’t find congenial, like a dull movie or a nightclub, I’m busy perceiving patterns,” he once told a reporter. A street sign, a building, a sports car — what, he would ask himself and others, did these things mean?

—Philip Marchand, Marshall McLuhan:
The Medium and the Messenger

The public intellectual was invented in the mid-20th century. Certainly there were others before that who started the ball rolling — talented writers and academics with flexible, open minds taking the whole culture into account, trying to make sense of things as they were happening — but few of them penetrated far beyond the walls of the academy or the confines of some other single discipline. We might count Bertrand Russell as an early prototype, with his prominence in pacifist circles and campaigns for nuclear disarmament, or better still G.B. Shaw, an autodidact of boundless energy who cofounded the London School of Economics and also helped popularize Jaeger’s “sanitary” woolen undies. Until Al Gore came along, Shaw was the only person to have won both a Nobel Prize and an Oscar.

Both Russell and Shaw gained a great deal of influence outside their own spheres of work, but remained above it all, too; they were “authorities” who might be called on to offer their views to the public on this topic or that. But it was a devoutly Catholic, rather conservative Canadian academic who first succeeded in breaking down every barrier there was in the intensity of his effort to understand, interpret, and influence the world. Marshall McLuhan was quite possibly the first real public intellectual. That wide-ranging role having once been instantiated, others came to fill it, in ever-increasing numbers.

Though McLuhan was an ordinary English prof by trade, his work had measurable effects on the worlds of art, business, politics, advertising and broadcasting. He appeared on the cover of Newsweek and had office space at Time. Tom Wolfe took him to a “topless restaurant” and wrote about him for New York magazine (“What If He Is Right?”). He was consulted by IBM and General Motors, and, according to Timothy Leary, he coined the phrase “Turn on, tune in, drop out.” He made the Canadian Prime Minister, Pierre Trudeau, shave off his beard.

In 1969, McLuhan gave one of the best and most revealing interviews Playboy ever published (a high bar, there).

PLAYBOY: Have you ever taken LSD yourself?

McLUHAN: No, I never have. I’m an observer in these matters, not a participant. I had an operation last year to remove a tumor that was expanding my brain in a less than pleasant manner, and during my prolonged convalescence I’m not allowed any stimulant stronger than coffee. Alas! A few months ago, however, I was almost “busted” on a drug charge. On a plane returning from Vancouver, where a university had awarded me an honorary degree, I ran into a colleague who asked me where I’d been. “To Vancouver to pick up my LL.D.,” I told him. I noticed a fellow passenger looking at me with a strange expression, and when I got off the plane at Toronto Airport, two customs guards pulled me into a little room and started going over my luggage. “Do you know Timothy Leary?” one asked. I replied I did and that seemed to wrap it up for him. “All right,” he said. “Where’s the stuff? We know you told somebody you’d gone to Vancouver to pick up some LL.D.” After a laborious dialog, I persuaded him that an LL.D. has nothing to do with consciousness expansion — just the opposite, in fact — and I was released.

Until mid-century, there was a wall between what we now call popular culture and the “high culture” of the rich and educated, and there was another wall, at least as thick, between popular and academic discourse. Cracks had begun to appear by the 1930s, when the Marxist theorists of the Frankfurt School began to take on the subject of mass culture, culminating in works such as Theodor Adorno and Max Horkheimer’s The Culture Industry: Enlightenment as Mass Deception (1944). These academics saw popular culture as a positive evil, though: it undermined the chances of revolution, a new kind of “opiate of the masses.” Later critics such as Edward Shils and Herbert J. Gans would elaborate on the same themes. But none of these writers personally identified with mass culture in any way. Far from it. Indeed, Shils said in 1959: “Some people dislike the working classes more than the middle classes, depending on their political backgrounds. But the real fact is that from an esthetic and moral standpoint, the objects of mass culture are repulsive to us.” To some degree, that academic standoffishness, the sneering of the “high” at the “low,” is with us even today.

Marshall McLuhan’s first book, The Mechanical Bride: The Folklore of Industrial Man, was published in 1951, and it took a quite different approach to the task of lifting the veil of mass culture in order to expose the workings beneath. The chief difference was that McLuhan never saw or really even acknowledged that wall between the critic of culture and the culture itself. After all, he too was a human being, a citizen, a reader of newspapers and magazines. McLuhan’s critique took place from the inside.

“[B]eing highbrow, in McLuhan’s eyes, never conferred the slightest moral value on anything,” observed his biographer, Philip Marchand.

McLuhan’s student Walter J. Ong wrote magnificently on this theme in his essay, “McLuhan as Teacher: The Future Is a Thing of the Past,” published in the September 1981 issue of the Journal of Communication.

When [McLuhan] did attend to [...] popular works, as in his first book, The Mechanical Bride (1951), it was to invest them with high seriousness. He showed that such things as advertising and comic strips were in their own way as deeply into certain cyclonic centers of human existence — sex, death, religion, and the human-technology relationship — as was the most “serious” art, though both naively and meretriciously. However, awareness of the facts here was neither naive nor meretricious; it was upsetting and liberating.

Marshall Soules of Malaspina University-College had this comment on the “high seriousness” with which McLuhan treated popular works:

It is this strategic stance which distinguishes McLuhan from many media critics — like those associated with the Frankfurt or Birmingham Schools, or like Neil Postman, Mark Miller, Stewart Ewen and others — whose views imply an idealized literate culture corrupted by popular, commercialised, and manipulative media. McLuhan used his training as a literary critic to engage in a dialogue with the media from the centre of the maelstrom.

The Mechanical Bride consists of a selection of advertisements with essays and captions attached.

Where did you see that bug-eyed romantic of action before?

Was it in a Hemingway novel?

Is the news world a cheap suburb for the artist’s bohemia?

— from The Mechanical Bride

The playful and wide-ranging tone of The Mechanical Bride was entirely new, given that its intentions were as serious as a heart attack. McLuhan thought that the manipulative characteristics of advertising might be resisted once they were understood. “It was, if anything, a critique of an entire culture, an exhilarating tour of the illusions behind John Wayne westerns, deodorants, and Buick ads. The tone of McLuhan’s essays was not without an occasional hint of admiration for the skill of advertisers in capturing the anxieties and appetites of that culture,” Marchand wrote.

The Mechanical Bride was way too far ahead of its time, selling only a few hundred copies, but that was okay because the author was just warming up. McLuhan had found the voice and style of inquiry that he would employ for the rest of his career. In the Playboy interview he said, “I consider myself a generalist, not a specialist who has staked out a tiny plot of study as his intellectual turf and is oblivious to everything else [...] Only by standing aside from any phenomenon and taking an overview can you discover its operative principles and lines of force.”

This inclusiveness, the penetrating, metaphorical, free-for-all investigative method that appeared in McLuhan’s first book, would gain him increasing admiration as an understanding of the “rearview mirror view” of the world he used to talk about gained currency: “[A]n environment becomes fully visible only when it has been superseded by a new environment; thus we are always one step behind in our view of the world [...] The present is always invisible because it’s environmental and saturates the whole field of attention so overwhelmingly; thus everyone but the artist, the man of integral awareness, is alive in an earlier day.”

Because he refused to put himself on a pedestal, because everything was of interest to him, McLuhan was able to join the wires of pure academic curiosity with the vast cultural output of the mid-century to create an explosion of insights (or a “galaxy”, I should say) that is still incandescent with possibility a half-century later. Simply by taking the whole of society as a fit subject for serious discourse, he unshackled the intellectuals from their first-class seats, and they have been quite free to roam about the cabin of culture ever since.

As his books were published, McLuhan’s influence continued to spread through high culture and low. He loved being interviewed and would talk his head off to practically anyone, about the Symbolist poets and about Joyce, about car advertisements and cuneiform. You might say that he embraced the culture, and the culture embraced him right back. The Smothers Brothers loved him, and so did Glenn Gould and Goldie Hawn, Susan Sontag, John Lennon and Woody Allen. (Apropos of the latter, McLuhan very much enjoyed doing the famous cameo in Annie Hall, though he had, characteristically, his own ideas about what his lines ought to have been, and a “sharp exchange” occurred between him and Allen. McLuhan’s most famous line in the movie, “You know nothing of my work,” is in fact one that he had long employed in real life as a put-down of opponents in debate.)

An aside: In 1977, Woody Allen was very far from being the grand old man of cinema that he is now. He had yet to win an Oscar, and had at that time directed only extremely goofy comedies. It was a mark of McLuhan’s willingness to get out there and try stuff, his total unpretentiousness, that he went along with the idea of being in a Woody Allen film. Only imagine any of today’s intellectuals being asked, say, to appear in an Apatow comedy. Would Noam Chomsky do it? Jürgen Habermas? Slavoj Zizek? (Well, Zizek might.)

Even better was Henry Gibson’s recurring two-line poem about McLuhan on the U.S. television show Laugh-In:

Marshall McLuhan,
What are you doin’?

Last year, I briefly attended the Modern Language Association conference in Los Angeles, met a number of eminent English scholars, and attended some of their presentations on Wordsworth and Derrida and on the development of that new, McLuhanesque-sounding discipline, the digital humanities. What I wished most, when I left the conference, was that these fascinating theorists were not all locked away behind the walls of the academy, and that anyone could come and enjoy their talks. The McLuhan manner of appearing anywhere he found interesting, which is to say all over the place, instead of just during office hours, does not diminish serious academics or writers: It enlarges them.

Is this, when it comes down to it, a mere matter of shyness? Or is it a matter of professional dignity, of amour-propre? The academy has so much to contribute to the broader culture; huge numbers of non-academics, I feel sure, would enjoy a great deal of what academics have to say, and perhaps vice versa. But somehow I find it difficult to imagine most of the academics I know agreeing to visit a topless restaurant with Tom Wolfe (on the record, at least). I hope, though, that they will consider venturing out to try such things more and more, that today’s Wolfes will feel emboldened to ask them, and that the culture indeed becomes more egalitarian, blurrier, “retribalized,” as McLuhan seemed to believe it would.

Personally, I have a great faith in the resiliency and adaptability of man, and I tend to look to our tomorrows with a surge of excitement and hope.

— from the 1969 Playboy interview

July 20 2011

16:00

Webs and whirligigs: Marshall McLuhan in his time and ours

Thursday, July 21 would have been the 100th birthday of Marshall McLuhan, the Canadian media theorist who was one of the most influential — or at least one of the most quoted — media thinkers of the 20th century. (And certainly the only one to feature, memorably, in Annie Hall.) To celebrate, we’re having a mini McLuhan Week here at the Lab. To kick us off, here’s our own Megan Garber.

Marshall McLuhan is generally best known — and to some extent exclusively known — for a single maxim: “The medium is the message.” This is mostly unfortunate. McLuhan was the author of several books of varying forms, a pioneering intellectual celebrity, and the founder of a field; five words, plump and alliterative though they may be, are wildly inadequate. But McLuhan had, in his way, a sense of humor; he appreciated as much as anyone the absurdity of his own meta-maxim (M.M. = (M=M)), and he ended up feeding and fighting his own reductive celebrity in pretty much equal measure. A lover of poetry, probes, and extremely bad puns, he named one of his later books The Medium Is the Massage.

Today, 100 years after his birth and nearly 50 after he gave us language that made “media” into a thing, McLuhan is a Media Guru of the first order, which is to say that he is often quoted and rarely read. (The second-most-famous McLuhanism: “You know nothing of my work!”) When he died in late 1980, obituaries remembered him, with no apparent irony, as the “apostle of the electronic age.” But what will he be for the digital? Do his insights, focused as they were on the vagaries of television, apply equally well to the brave new world of bytes and bits?

For all the “visionary” status we confer on him today, it’s worth remembering that McLuhan was constrained by his time as much as we are by our own: He wrote not just about Tribal Man and Graphic Man, about the cultural and cognitive effects of communication as they sweep the span of human history, but also about jukeboxes and miniskirts and magazines and hosiery. Women, to him, were accessories to men. And his thinking (to repeat: Tribal Man) was pretty much implicitly paternalistic. When he talked about a “global village” — another maybe-claim to web-visionary fame — he wasn’t talking about a world community where New Yorkers go jeans-shopping with Londoners and Guineans share sugar with Laotians and everyone finally meets at a communal table to sip artisanal tea and discuss newly localized world events; he was talking about an encroaching dystopia that renders Tribal Man — or, more accurately, re-tribalized humanity — increasingly connected to, and yet actually disconnected from, each other via the barely-contained buzz of electric wires. A global village, McLuhan feared, was one that would be populated by automatons.


“What if he is right?” Tom Wolfe asked, ominously. “What…if…he…is…right?”

And: He was right, not about everything but about a lot, which is why today he is a Media Guru and a YouTube sensation and a ubiquitous subject of biographies both cheeky and earnest and a fixture of culture both nerd and pop, which are increasingly the same thing. He is the patron saint of Wired. Today, as the “electronic” age zips and zaps into the digital, as we are spun by the centrifugal forces of a nascent revolution that we can’t fully perceive because we’re the ones doing the spinning, McLuhan’s theories seem epic and urgent and obvious all at the same time. And McLuhan himself — the teacher, the thinker, the darling of the media he both measured and mocked — seems both more relevant, and less so, than ever before.

More, because, as the tale goes, McLuhan pretty much foresaw this whole Internet business. But less, too, because whatever foreseeing he did arrived, as foresight often does, prematurely. In the ’60s, at the height of his fame, McLuhan’s ideas were thrilling and shocking and, more generously, radical. Fifty years later, tempered by time, those same ideas have coalesced into conventionality (less generously: cliché). “The medium is the message” has been used to describe everything from cars to computers. I’m pretty sure I remember Bart Simpson writing it on a blackboard. McLuhan, controversial in his own time, has gone mainstream; the basic tenets of his thought — to the extent that his “thought,” an impressionistic assemblage of ideas that sweep and swoop and sometimes snap with self-contradiction, is a unit at all — have been, basically, accepted. We shape our tools, and afterward our tools shape us. Yeah, definitely. But…now what?

McLuhan wasn’t a journalistic thinker; he was a media theorist, and is most interesting when he’s talking not about the news itself, but about more theory-y things — modes and nodes and all the rest. (Though: If you want a treat, check out The Mechanical Bride, the collection of essays that formed his first book and that feature McLuhan before he became, fully, McLuhan — McLuhan not as an enigmatic intellect so much as a classic critic, trenchant and crotchety and indignant and delightful.) One feature of McLuhan’s thought that is newly relevant, though — to the world of the web, and to the new forms of journalism that live within it — is the one that is both core and corollary to the medium is the message: the basic tenet that our communications tools aren’t actually tools at all, but forces that disrupt human culture by way of human psychology. And vice versa.

Before print came along, McLuhan argues, we were, as a species, “ear-oriented”: Human culture was oral culture, with everything — community, ephemerality, memory — that that implies. Print changed all that, pretty much: It changed us, certainly — cultural evolution can take place approximately 1.5 million times faster than genetic evolution can — by imbuing us with a newly “graphic” orientation. Which brought with it literacy, which brought with it the easy outsourcing of memory, which brought with it an increased, if not wholly novel, notion of human individuality. Print captured and conjured the world at the same time, giving us a new kind of power over our environment that was almost — almost — mystical. Thoth, the Egyptian god of writing, was also the god of magic.


But writing, as it filled us with notions of our own narrative nobility, inspired in us something else, too: a need — a desire, an impulse — for containment. With print’s easy ubiquity, the default tumult of oral culture gave way to something more linear, more ordered — something that aspired to a sense of completeness. Communicating became not so much about interpreting the world as about capturing it.

And — here’s where things get especially relevant for our purposes — the media (“media,” now, in the daily-journalism sense) have been key agents of that shift. What journalism has been as much as anything else, on the mass-and-macro level of culture, is a collective attempt to commodify time. Not just in its staccatoed stories of human events, but in its measurements and mechanics: the daily paper. The weekly magazine. The nightly news. “The Epiphanator,” Paul Ford has called it. So journalism, for everything else it has done, has also carved out a social space — the newshole, the object that results when you attempt to stanch the flood of history with a beanbag — from the stretches of time. The newshole has been the graphic-man version of Mumford’s clock, a revolution in words and images and increments, implying if not imposing human agency, ticking and tocking to the beat of human events.

And the sense it has engendered of time as an episodic thing has translated, as well, to the content of journalism. Stories, in short, have endings. And they have beginnings. That is, in fact, what makes them stories. A Mumfordian medium is one composed of a series of episodes, modular events that can be figured and configured and then reconfigured for our narrative needs. Frank Kermode looked at the sweep of literary history and saw within it a pattern of “end-determined fictions” — stories that were defined by arbitration and apocalypse and, overall, “the sense of an ending”; the kind of structural eschatology he describes, though, isn’t limited to literature. Nonfiction stories, too, have been defined by the sense — actually, the assumption — of an ending.

But! The web. The web, with its feeds and flows and rivers and streams. The web, which has endowed us with — and the phrase is, of course, telling — “real time.” Online, publishing schedules (and, increasingly, broadcasting schedules), byproducts of the industrial world, are increasingly out of place. Online, we are time-shifters. Online, the wonder of the whirligig — the cheerful circuity of oral culture — is returning to us, and we to it. The Gutenberg Parenthesis is quickly closing. The web is de-incrementalizing history. “Real time” is real precisely because it is timeless. It lacks a schedule. It is incessant.

And so are our media, made newly social. Facebook and Twitter and Google+ and all the rest swim with time’s flow, rather than attempting to stanch it. And they are, despite that but mostly because of it, increasingly defining our journalism. They are also, as it were, McLuhanesque. (Google+: extension of man.) Because if McLuhan is to be believed, the much-discussed and often-assumed human need for narrative — or, at least, our need for narrative that has explicit beginnings and endings — may be contingent rather than implicit. Which means that as conditions change, so may — so will — we. We may evolve past our need, in other words, for containment, for conclusions, for answers.

McLuhan’s vision is, finally, of a world of frayed ends rather than neat endings, one in which stock loses out to flow — a media environment, which is to say simply an environment, in which all that is solid melts…and then, finally, floods. And for journalism and journalists, of course, that represents a tension of rather epic, and certainly existential, dimensions. Paul Ford:

We’ll still need professionals to organize the events of the world into narratives, and our story-craving brains will still need the narrative hooks, the cold opens, the dramatic climaxes, and that all-important “■” to help us make sense of the great glut of recent history that is dumped over us every morning. No matter what comes along streams, feeds, and walls, we will still have need of an ending.

To which McLuhan whispers, ominously: “No, Paul, no. No, we may not….”

Images by Leo Reynolds and Phil Hollman used under a Creative Commons license.

September 15 2010

17:00

Twitter as broadcast: What #newtwitter might mean for networked journalism

So Twitter.com’s updated interface — #newtwitter, as the Twittery hashtag goes — is upon us. (Well, upon some of us.)

The most obvious, and noteworthy, changes involved in #newtwitter are (a) the two-panel interface, which — like Tweetdeck and Seesmic and other third-party apps — emphasizes the interactive aspects of Twitter; and (b) the embeddable media elements: YouTube videos, Flickr photos (and entire streams!), Twitpics, etc. And the most obvious implications of those changes are (a) the nice little stage for advertising that the interface builds up; and (b) the threat that #newtwitter represents to third-party apps.
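As a rough illustration of change (b), here is a minimal sketch, in Python, of how a client might decide whether a tweeted URL points to inline-embeddable media. The host list and the embed_type helper are invented for illustration; this is not Twitter’s actual embedding logic.

```python
from urllib.parse import urlparse

# Hypothetical allowlist of the media hosts named above; an illustration
# of the idea, not Twitter's actual implementation.
EMBEDDABLE_HOSTS = {
    "youtube.com": "video",
    "youtu.be": "video",
    "flickr.com": "photo",
    "twitpic.com": "photo",
}

def embed_type(url):
    """Return "video" or "photo" if the URL's host is embeddable, else None."""
    host = urlparse(url).netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    return EMBEDDABLE_HOSTS.get(host)

print(embed_type("http://www.youtube.com/watch?v=abc123"))  # video
print(embed_type("http://example.com/story"))               # None
```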

Taken together, those point to a broader implication: Twitter.com as an increasingly centralized space for information. And even, for our more specific purposes, news. Twitter itself, as Ev Williams put it during the company’s announcement of @anywhere, is “an information network that helps people understand what’s going on in the world that they care about.” And #newtwitter, likely, will help further that understanding. From the point of view of consumption, contextual tweets — with images! and videos! — will certainly create a richer experience for users, from both a future-of-context perspective and a more pragmatic usability-oriented one. But what about from the point of view of production — the people and organizations who feed Twitter?

The benefits of restriction

We commonly call Twitter a “platform,” the better to emphasize its emptiness, its openness, its agnosticism. More properly, though, Twitter is a medium, with all the McLuhanesque implications that term suggests. The architecture of Twitter as an interface necessarily affects the content its users produce and distribute.

And one of the key benefits of Twitter has been the fact of its constraint — which has also been the fact of its restraint. The medium’s character limitation has meant that everyone, from the user with two friends following her to the million-follower-strong media organizations, has had the same space, the same tools, to work with. Twitter has democratized narrative even more than blogs have, you could argue, because its interface — your 140 characters next to my 140 characters next to Justin Bieber’s 140 characters, all sharing the space of the screen — has been not only universal, but universally restricted. The sameness of tweets’ structures, and the resulting leveling of narrative authority, has played a big part in Twitter’s evolution into the medium we know today: throngs of users, relatively unconcerned with presentation, relatively un-self-conscious, reporting and sharing and producing the buzzing, evolving resource we call “news.” Freed of the need to present information “journalistically,” they have instead presented it organically. Liberation by way of limitation.
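That universal restriction is trivial to express in code, which is part of its power. A minimal sketch, with a hypothetical validate_tweet helper: the same check applies to the two-follower user and the million-follower organization alike.

```python
TWEET_LIMIT = 140  # the same ceiling for every account, famous or not

def validate_tweet(text):
    """Accept a tweet only if it fits the universal limit (illustrative only)."""
    if len(text) > TWEET_LIMIT:
        raise ValueError(
            "tweet is %d characters; the limit is %d" % (len(text), TWEET_LIMIT)
        )
    return text

validate_tweet("Reporting from the scene.")  # fine, for anyone
# validate_tweet("x" * 200)                  # rejected, for everyone, equally
```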

So what will happen when Twitter, the organism, grows in complexity? What will take place when Twitter becomes a bit more like Tumblr, with a bit of its productive limitation — text, link, publish — taken away?

The changes Twitter’s rolling out are not just cosmetic; embedded images and videos, in particular, are far more than mere adornment. A link is fundamentally, architecturally, different from an image or a video. Links are bridges: structures unto themselves, sure, but more significantly routes to other places — they’re both conversation and content, endings and beginnings at once. An image or a video, on the other hand, from a purely architectural perspective, is an end point, nothing more. It leads nowhere but itself.
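Put schematically, in an interpretive sketch with invented entity names (not any real Twitter API), a link routes onward while an image or video terminates:

```python
def classify(entity_kind):
    """Label a tweet entity by the architectural role described above."""
    if entity_kind == "link":
        return "bridge"    # a route elsewhere: conversation and content at once
    if entity_kind in ("photo", "video"):
        return "endpoint"  # self-contained: it leads nowhere but itself
    return "text"

for kind in ("link", "photo", "video"):
    print(kind, "->", classify(kind))
```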

For a Twitter interface newly focused on image-based content, that distinction matters. Up until now, the only contextual components of a tweet — aside from peripheral metadata like “time sent,” “retweeted by,” etc. — have been the text and the link. The link may have led to more text or images or videos; but it also would have led to a different platform. Now, though, within Twitter itself, we’re seeing a shift from text-and-link toward text-and-image — which is to say, away from conversation and toward pure information. Which is also to say, away from communication…and toward something more traditionally journalistic. Tweets have always been little nuggets of narrative; with #newtwitter, though, individual tweets get closer to news articles.

We’ve established already that Twitter is, effectively if not officially, a news platform unto itself. #Newtwitter solidifies that fact, and then doubles down on it: It moves the news proposition away from a text-based framework…and toward an image-based one. If #twitterclassic established itself as a news platform, in other words, #newtwitter suggests that the news in question may increasingly be of the broadcast variety.

From “What are you doing?” to “What’s happening?”

“Twttr” began as a pure communications platform: text messages, web-ified. The idea was simply to take the ephemeral interactions of SMS and send them to — capture them in — the cloud. The point was simplicity, casualness. (Even its name celebrated that idea: “The definition [of Twitter] was ‘a short burst of inconsequential information,’ and ‘chirps from birds,’” Jack Dorsey told the Los Angeles Times. “And that’s exactly what the product was.”)

The interface that rolled out last night — and that will continue rolling out over the next couple of weeks to users around the world — bears little resemblance to that initial vision of Twitter as captured inconsequence. Since its launch (okay, okay: its hatch), Twitter has undergone a gradual, but steady, evolution — from ephemeral conversations to more consequential information. (Recall the change in the web interface’s prompt late last year, from “What are you doing?” to “What’s happening?” That little semantic shift — from an individual frame to a universal one — marked a major shift in how Twitter shapes its users’ conception, and therefore use, of the platform. In its way, that move foreshadowed today’s new interface.) Infrastructural innovations like Lists have heightened people’s awareness of their status not simply as communicators, but as broadcasters. The frenzy of breaking-news events — from natural disasters like Haiti’s earthquake to political events like last summer’s Iranian “revolution” — have highlighted Twitter’s value as a platform for information dissemination that transcends divisions of state. They’ve also enforced users’ conception of their own tweets: visible to your followers, but visible, also, to the world. It’s always been the case, but its’ one that’s increasingly apparent: Each tweet is its own little piece of broadcast journalism.

What all that will mean for tweets’ production, and consumption, remains to be seen; Twitterers, end-user innovation-style, have a way of deciding for themselves how the medium’s interface will, and will not, be put to practice. And Twitter is still, you know, Twitter; it’s still, finally and fundamentally, about communication. But the smallness, the spareness, the convivial conversation that used to define it against other media platforms is giving way — perhaps — to the more comprehensive sensibility of the networked news organization. The Twitter.com of today, as compared to the Twitter.com of yesterday, is much more about information that’s meaningful and contextual and impactful. Which is to say, it’s much more about journalism.
