Is hipsterism in decline?

LIBREAS.Library Ideas

A look at current discourses.

by Ben Kaden / @bkaden

"There will no longer be someone like him in the digital age."

the publisher Michael Krüger closes his obituary for Henning Ritter in the Wednesday edition of the NZZ (Michael Krüger: Readers, collectors and private scholars. Henning Ritter - a memory. In: NZZ, 25.06.2013), and leaves the reader touched in a different way than an obituary usually does. For in a certain way he denies the present, which is defined not least as dominated by the digital, the possibility of producing an intellectual culture of the kind Henning Ritter stood for. That sounds almost resigned, and it can also be explained by a deep feeling of loss. The sad insight is also light years away from the furore that essays such as Adam Soboczynski's on the hatred (!) of intellectuals on the Internet briefly added, just a few years ago, to the debate about a gap between the Internet and the culture of thought. (Adam Soboczynski: The Net as Enemy. In: Die ZEIT, May 20, 2009) I do not know whether the net is now understood as a friend. In any case, three interpretations have prevailed: the net as a medium (soberly functional), the net as a marketplace, and the net as a social sphere.

In Germany, on the other hand, it is less apparent that the Internet is also suitable as a venue for intellectual discourse, simply because the sparse counterparts to the near-abundance of the corresponding magazine culture in the English-speaking world, which has long been a hybrid e- and print culture ranging from the New Yorker to The Point to the White Review, are barely present on the net. Perhaps there really are too few (young) intellectuals of this calibre in Germany. (Some wear themselves out on the thankless topic of copyright law, and now and then something slips into a blog that points in this direction.) And there are certainly a number of reasons for this deficit. The fact that we have now embedded digital media forms in our everyday lives, however, is hardly likely to be the cause.

That the intellectual culture of any given present differs from that of the present before and after it is also, obviously, in the nature of culture. The digital age will produce no Giordano Bruno (at most a Giordano Bruno Foundation), no Walter Benjamin (at most a Walter-Benjamin-Platz) and no Pyotr Kropotkin (at most an inappropriately thrown-in quotation from a clown in a political science proseminar). But sending "postcards with glued-on drawings or mixed-up messages, letters that consisted of nothing but quotations" - that still happens. And indeed in postcard and letter format, and also, for example, on Tumblr, which has almost unleashed this cultural practice of collaging, elsewhere regarded as the decline of Western culture.

Experience shows that there are still intellectual bohemians, and in Berlin at least they are not too scarce. They simply no longer sit - a few young columnists excepted - at "Lutter & Wegner", but sometimes in a publishing programme such as the edition unseld, and otherwise, if they are a little more progressive still, doubly excluded (voluntarily, and because the sales calculators ascribe no more than a trace of a market to this kind of thinking) from the conventional publishing establishment, in discourse spaces (often more like chambers) of their own. And even the "independent private scholar who looks down on academic luminaries with a certain malicious contempt [...]" can be found almost in packs at any digital-bohemian party. The times when one counted as avant-garde for distinguishing oneself from staid functional science at the universities were over before anyone wrote e-mails. Hipsterism has almost turned this attitude into a recreational accessory. Everyone knows by now that progressive thinking and university careers do not enter into a causal relationship but at best coincide by chance, since the university and science systems, as rational functional bodies, are almost naturally designed to standardize thinking and guide it into very narrow channels.

The melancholy of the Hanser publishing director Michael Krüger is undoubtedly understandable. Read neutrally, the closing sentence would be perfectly correct and self-evident. The temptation to glorify one's own lifetime as the best of all possible variants of existence will surely press upon almost everyone at some point. And one day a luminary of digital intellectual culture will no doubt write a similarly sad sentence about a network intellectual. One may wish, however, that he writes it without naming the era that follows (that is, without the then-current "digital age"). For how much truer the statement would be if it simply read: "There will be no one like him again." Or better still: "There is no one like him." Great personalities, among whom Henning Ritter, homme de lettres and human being as human being, undoubtedly belonged, need no comparative placement in time.


Perhaps the "method freak" (Süddeutsche Zeitung) Franco Moretti is in this "once“In retrospect, they are seen as a leading figure in the humanities in the age of their digital re-evocability. Lothar Müller dedicates the showcaseDigital humanist from Stanford already today in the feature section of the Süddeutsche Zeitung on Wednesday a full six-column (Lothar Müller: Einethodfreak. In: Süddeutsche Zeitung, 26.06.2013, p. 14) with an impressive portrait shot from the camera of the well-known with its shots of the German national football team become Regina Schmeken. In terms of image representation, the literary scholar already plays in a top league. Isolde Ohlbaum, on the other hand, from whom the central obituary photo by Henning Ritter comes (including Tuesday in the FAZ feature section), is better known from her photos of Alberto Moravia, Susan Sontag and Elias Canetti. The cultural break seems to be inscribed in the photograph.

Franco Moretti, who originally (and "ideally") wanted to become a theoretical physicist, is now breaking into the settled culture of literary analysis and, with his quantitative reading practice, calls close reading into question - carefully worded, and perhaps more radically than any thesis of the death of the author.

For practical reasons, close reading was for a long time the only practicable form of truly thorough engagement with text. Given the immense amounts of published material, an intellectual middle-distance reading usually led only to - often very impressive - cross-sectional knowledge, excellent for the fireside chat but too superficial for hard scholarly debate. Meticulous knowledge of a few texts and their associated contexts has therefore long been considered a noble goal of the interpreting text disciplines - although meticulous knowledge of many texts would have been better, it was simply not feasible within a normal academic working life. Hence canonization, hence the restriction to key texts and leading figures. Hence, too, the third book about Achim (e.g. von Arnim).


Digital literary studies now means drawing on the strengths of computer technology (= the automatic processing of large amounts of data) in order to grasp, open up and (initially) make interpretable what has been written. Machine manipulation - perhaps as "distant writing" - would be easily conceivable as an encore in the digital world. Moretti thus carries the scientific ideal of the reliable model into the study of literature:

"I'm into models [...] into the analysis of structures. What I don't like about 'close reading' is the sense of well-being found in most of its variations in the infinite multiplication of interpretations - for example in 'deconstruction', where people are the happier the more contradictions they discover in a text. This restriction to the ever richer interpretation of individual works has never appealed to me."

Thus Moretti, who of course completely ignores the purpose of deconstruction, which expressly and for good reason turns against the reduction of the world to general rules and towards the recognition, or rather the investigation, of the specific. Reducing this approach to well-being and happiness shows above all how deep the rifts between the schools in the United States must be. Anyone who has ever tried deconstructive reading knows how uncomfortable and inevitably unsatisfying this endless labour in the constant shifting of meaning must be. Deconstruction can hardly be reconciled with a model-oriented understanding of science - presumably not even with a science oriented towards knowledge (instead of understanding) and control (instead of recognition). This way of dealing with text does not produce certain knowledge but, on the contrary, if you will, deliberately unsettling knowledge. That may be what makes it so uncomfortable, and also so hard to put to use.


What Moretti does when he evaluates masses of text quantitatively and statistically corresponds in principle to digital economic data analysis, which, as we now know, can also be used for the secret services' counter-terrorism. His concept - he calls it quantitative formalism - likewise serves to uncover the structure of what is documented, i.e. the empirical, whereby deviations can be systematically isolated and then, if necessary, examined qualitatively. This provides a more solid basis for dealing with texts in their entirety (provided they are digitized in their entirety and part of the corpus). With the right analysis, one learns a great deal about the syntactic and sometimes also the conceptual relationships between text products and utterances. Cross-connections and references can be isolated and tracked. Sometimes plagiarism and duplicate creations are uncovered as well. Statements can be made about the conditions of production and reception of texts (for example by mapping against lending statistics from public libraries). From many perspectives we can make statements about the subjects of literary studies more precisely than ever before in the history of the discipline, draw conclusions and cross-connections, and prepare insights. We are able to view, and at the same time survey, a huge field of literature from a kind of bird's-eye view.
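The basic move described here - reduce each text to a few measurable surface features, then isolate the statistical deviations for qualitative follow-up - can be sketched in a few lines. The following toy example only illustrates the principle; it is not the Stanford Literary Lab's actual tooling, and the miniature corpus, the chosen features (mean sentence length, type-token ratio) and the deviation threshold are all invented for demonstration.

```python
# A minimal "distant reading" sketch: reduce each text in a tiny toy
# corpus to a few surface statistics and flag the statistical outlier,
# which a human reader could then examine qualitatively (close reading).
import re
from statistics import mean, stdev

def features(text):
    """Reduce a text to (mean sentence length in words, type-token ratio)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-zäöüß]+", text.lower())
    mean_len = len(words) / len(sentences)
    ttr = len(set(words)) / len(words)  # vocabulary richness
    return mean_len, ttr

def outliers(corpus, threshold=1.0):
    """Return titles whose mean sentence length deviates strongly
    from the corpus average (simple z-score test)."""
    stats = {title: features(text) for title, text in corpus.items()}
    lengths = [s[0] for s in stats.values()]
    mu, sigma = mean(lengths), stdev(lengths)
    return [t for t, (l, _) in stats.items()
            if sigma > 0 and abs(l - mu) / sigma > threshold]

corpus = {
    "A": "Short lines. Very short. Clipped prose. It ends.",
    "B": "Short again. Still brief. Plain talk. Done now.",
    "C": ("One enormously long sentence that winds on and on through "
          "clause after clause without ever quite arriving anywhere."),
}
print(outliers(corpus))  # → ['C']
```

Run on three tiny "texts", the script hands back the one whose sentence rhythm deviates from the corpus norm - exactly the kind of systematically isolated deviation that, in Moretti's scheme, would then be passed on for qualitative examination.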

Admittedly - and that is the difference between a good close reading and the best distant reading imaginable - the literature itself inevitably remains incomprehensible from the basket of that hot-air balloon. With our perception we cannot be above things and in things at the same time. In the best case, then, we can combine close and distant reading, using Moretti's top-view method like a drone to specify access, depending on our state of knowledge and interest in action. In this respect, statistical approaches are a grandiose extension of what is scientifically possible. Even deconstruction could make good use of quantitative formalism where it is a matter of shifting its own established perspectives. As an alternative, however, the method of statistically developed full empiricism would obviously be unsuitable, since it simply looks at something completely different from the practice of proximity.


"Only those who serve the cause purely have 'personality'", writes Max Weber, Moretti's idol. The cause as a substitute for God: the identity of the scholarly personality results solely from service to the object. The empirical piety that has, unfortunately, often been derived from this to the present day, however, has little to do with enlightenment. One dogma merely fights another, and for real intellectuals this prize fight for the truth has never been a particularly attractive spectacle, especially since they were usually only allowed to watch from the cheap seats. A post-factual, that is, consistently de-dogmatized science would no longer be science as we know it, appreciate it and consider it meaningful. But it might be systematized intellectuality. Or simply what was meant, in the best sense, when the humanities were conceived precisely not as sciences.

The digital humanities and quantitative literary studies are not, as is often assumed, a further development of humanities practice, but a novel approach to its objects. Which is perfectly fine, as long as Max Weber adepts do not at the same time dismiss the softer approaches as inferior on the assumption that literature is above all empirical - for there they go astray. Since the twentieth century at the latest, the claims of empiricism have been one thing above all in literature and art: thoroughly suspect. Literature - like deconstruction, by the way - is no longer about mirroring the world, but about withdrawing as far as possible from the mirrors, bending the image in order to make visible what might lie behind it. This kind of literature does not aim at truth, which presumably makes it uncanny to many and attractive precisely to a few friends of close reading. If they are creative enough not to read it for truth, they even succeed in deriving from it something fruitful and connectable that no statistic can ever provide.


It remains to be said what this fruitfulness might consist of. A look at the present, or at the somewhat more recent past - that is, at the formative period of Michael Krüger's "digital age" (as if the reference to aeons still had any discursive value) - provides clues. Anyone who follows the development of digital culture over the past two decades can easily see that its success rests on the combination of two basic principles: that of narrative and that of statistics. Platforms like Facebook are biographical mapping surfaces and social files, addressed down to the smallest cultural unit: the individual. While the numbers (friends, likes, shares, etc.) - an adoption from economics - precisely document our success in the social integration (networking) of our lives, we use the narration of our selves to influence these numbers (our social acceptance). In the next step, of course, it is not only the quantity that counts but also - FOAF - the quality of our contacts.

A turning point lies roughly in the middle of these two decades of public network culture. The first ten years established the digital communication space largely on the pattern of the analogue world (mass-media communication here, private communication via e-mail, chat and P2P file sharing there, plus a few mail-order companies with online catalogues). With the so-called Web 2.0, on the other hand, the partition walls gradually became permeable, so that it is now visible on our private profiles which newspaper articles we have read and which music titles we have played. The particular, as the basis of these action structures, comes to the fore. We oscillate between me-too and me-first, the latter with the chance of being in on a trend very early and scoring with it (the pyramid scheme of social prestige), and at the same time with the risk of getting it wrong.

These statistics of our cultural behaviour, relatively easy to grasp and visualize, help us on the one hand to gain a statistically stabilized overview of our usage activities in extremely complex media environments, and on the other to integrate them into the auto-fiction of our identity. What we can take over from Max Weber, then, is subjective meaning and what is embedded in it - that which remains irreducible even before the sum of what is individually shared. According to a supplementary thesis, this subjective meaning is not completely absorbed into what we want others to read into our actions; it is also constantly interpreted by ourselves, namely against the horizon of our self-perception and with regard to the coherence of our self-narrative.

The explication media for our social behaviour now help us to keep this identity narrative permanently updated, more systematically than would otherwise have been possible with anything short of a meticulously kept diary. What is appealing for library and information science in this context is the observable expansion of documentary practices into the self-administration of identity. Where we explicitly record our lives - the traces from which we can generate memories and thus make them verifiable, sometimes publicly visible - it is not unlikely that we will organize our lives in the sense of a documented continuity and coherence of these overall processes. It cannot be overlooked that social pressure, arising from the shared travel, consumption and activity balances of others, plays a significant role here.

Our web self is manipulated, or can be manipulated, for certain effects, and our - let us call it that - offline self is adapted accordingly. Experience shows, especially with phenomena such as online dating, that differences often arise between the expected image (or the assumed expected image) and the actually redeemable offline original. Purely structurally, the platforms support the transformation of our social representation into a supposedly objectively (statistically) assessable commodity. This applies not only to our addressability for advertising, but also to the pressure of representation towards our social contacts, who can remove us from their explicit network at any time if we appear inappropriate or irrelevant to them.

Corresponding effects are even more evident at the interface with the professional environment. Where potential employers can come into possession of party photos and on this basis make decisions that harm us in the real world (a big debate a few years ago), these structures automatically affect our behaviour. Every current mobile phone has a camera - so we can be photographed anywhere and must therefore appear adequate everywhere. One is almost glad that this extreme of behaviour control has not yet fully arrived and that the party tourists in Berlin-Mitte still behave as if there were neither Instagram nor a tomorrow. And finally there are those who no longer care to use these administrative media at all, for whom the risk of external observation and/or the effort of such self-(image-)control is too high. What remains open, however, is what happens when a virtual representation becomes normative, perhaps even state-controlled. Anyone who thinks this absurd should take a look at the history of the identity card. It then seems almost logical that sooner or later unique digital ID markers will be introduced as necessary for our virtual representations.


A nice recapitulation of the general development of digital culture to date is offered, in almost encyclopedic form (after the demise of the encyclopedias, after all, and with them the idea that a general canon of factual knowledge could be represented in closed form), by a special issue, published in May and still available, of the central accompanying medium of network culture, the magazine Wired: Wired. The First 20 Years. (Interestingly, the much more anarchic Vice Magazine, which became the parallel medium of pop culture, was founded almost at the same time. Anyone who has preserved the volumes of these two publications should possess a solid research basis for a genealogy of the popular-culture present, together with its associated culture industry.)

Wired. The white issue, here placed on a retro reclining cushion. On the just-invented Moretti close-distant-reading scale, the reading recommendation for this issue is: medium. Those who like smooth Californian self-reflection journalism will find exactly what they expect, and can simply skim it. Those who would like to remember how it all began with Internet culture will discover the right keywords (or their madeleines, as Proustians would say). Those looking for gentle insider jokes about TED Talk presentation practice will get entertainment of about the length of ten tweets. All in all, the issue is a neat collector's item, and in ten years we will certainly leaf through it again with some astonishment and no little joy in nostalgia.

Those who got involved with net culture in the 1990s will also find in the anniversary issue documentation of key phenomena from their own past. The entry points, from A as in Angry Birds, apps and Arab Spring, via Friendster, Microsoft (very detailed), Napster (one and a half columns), trolling ("But when skillful trolls do choose to converse with their critics, they poison the discussion with more over-the-top provocations masquerading as sincere statements.", p. 160) and Wikileaks, to Z as in Zeus, are in any case diverse.

A nice trigger, of course, is the keyword blogs: in 1998, when most user-generated content still lived with free hosting providers such as Geocities, there were, as one learns on page 34, 23 weblogs. Web-wide. A year later it was a few tens of thousands, and the key was a comparatively simple web application called Blogger, by a two-person company called Pyra Labs.

This story, like that of Google, which took over Blogger in 2003, shows the fascinating core of digital culture. The decisive developments were always characterized by the commitment of a few actors (a portrait gallery by the presidents' photographer Platon presents all these luminary figures in black and white, from Marissa Mayer to Tim Cook), very simple ideas, little long-term planning, and luck. All these traces are documented more comprehensively than in any other powerful cultural development in human history. Today's big players of net culture all started with little more than a comparatively minimal budget and an idea. Where things proceeded according to plan and a lot of money was invested, they usually went wrong.

In the 1990s, exhausted by the end of history, a fresh wave of future broke in with the Internet, and the promise was that anyone with just a computer, a modem and a little imagination could not only enter a new world but even create one. (And, with just one idea, get very rich.) For the culture industry, the cards were reshuffled at this time - which many representatives of the established branches neither understood nor even noticed. (See also Sony, p. 136.)

Of course, things could also fail - for example, the idea of establishing non-linear forms of narration thanks to hypertext. "You can see this as a classic failure of futurism", writes Steven Johnson under the corresponding keyword (p. 92), "Even those of us who actually have a grasp of longterm trends can't predict the real consequences of those trends." Genuine web literature never conquered an audience, and even those who love non-linear texts prefer (at least in my experience) to read their Danielewski on paper.

The transfer of the networking principle to the reproduction of an administrative file for personal contacts, however, has become the current central habitat of the web population (see Facebook). That the main emphasis actually lies on identity (the face) and not merely on contact (a Rolodex 2.0 would surely have failed on the name alone) is part of the recipe.

The white Wired issue itself matches the fluffy, anecdote-laden and never boring style peculiar to the web, making it a good contemporary document of the journalistic standards of network culture, and it is also great to read on the beach. Ideally one with network coverage, for the article on the keyword QR code is, consistently, printed as a QR-code link. It seems at least as astonishing that the company 1&1 from Montabaur placed an advertisement on page 193 whose appearance would have looked old-fashioned even in the Wired of 1993. But perhaps that is precisely the point.


After reading the issue, does one know more precisely where we stand after 20 years of Wired (or 20 years of online popular culture)? Perhaps about as much as one knows about a literary genre after reading the Brockhaus article, or a Moretti essay, on the subject. One has an overview and a few isolated facts. But without direct experience it remains a bare surface. Direct experience of the present shows a certain consolidation of usage practices, which merge with everyday life. The Wired issue shows what has actually shifted over the past 20 years, and now we sit here in front of the MacBook Air and actually think it is okay if there is a small crack in the display, that is, a break in the facade. The hypertext entry says: "and in the long run the web needed the poets and philosophers almost as much as it needed the coders." (p. 92)

To me, that relationship seems to have long since reversed. Digital early-enlightenment figures such as Evgeny Morozov and counter-revolutionaries such as Jaron Lanier have long been present in the metadiscourse at eye level, and with the same permanence, as the long-dominant futurists and heralds of salvation. This polyphony is not a bad development. In this respect, too, a great deal of maturing has happened in recent years. Anyone over twenty knows that twenty years is not a long period at all. If one keeps in mind how fundamental and disruptive the cultural shift actually is (the Wired issue only hints at it), it has proceeded downright peacefully and naively - even against the horizon of historical knowledge about cultural upheavals - which is certainly also a result of the fast and fairly comprehensive appropriation, control, moderation and shaping of these processes by what is now digital capitalism.

So far, the unfolding of the dark side of technology, its intrusiveness, has been balanced by its emancipatory potential. We are well aware of its dystopian content (and anyone with some catching up to do should read Gary Shteyngart's Super Sad True Love Story).

Edward Snowden's insights into Tempora and Prism are certainly the next turning point and an important step towards understanding the risks of the omni-measurement and networking of our living environments. In any case, they show the need for a superordinate counter-check on the ordered monitoring of our digital actions. The role of intellectuals, and of a deconstructing intellectual discourse on all this, is correspondingly important. It does not necessarily have to take place in science. But of course it would do science no harm to lead it.

The fruit of this braking discourse, by the way, would be that society constantly reflects the entire range of its values, wishes and fears against what is digitally feasible and against digital culture, in order to be able to raise objections early and clearly enough if something gets out of hand and new tendencies towards totalization, of whatever kind, emerge. Literature manages to keep us sensitive to this, precisely by touching us in close reading. I am not sure how the style analysis of the distant readings from the Stanford Literary Lab could be used for that.