"If we lose those quiet spaces, or fill them up with “content,” we will sacrifice something important not only in ourselves but in our culture.”

If we lose those quiet spaces, or fill them up with “content,” we will sacrifice something important not only in ourselves but in our culture. In a recent essay, the playwright Richard Foreman eloquently described what’s at stake: “I come from a tradition of Western culture, in which the ideal (my ideal) was the complex, dense and ‘cathedral-like’ structure of the highly educated and articulate personality—a man or woman who carried inside themselves a personally constructed and unique version of the entire heritage of the West. [But now] I see within us all (myself included) the replacement of complex inner density with a new kind of self—evolving under the pressure of information overload and the technology of the ‘instantly available.’”

As we are drained of our “inner repertory of dense cultural inheritance,” Foreman concluded, we risk turning into “‘pancake people’—spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.”

Carr, N. (July/August 2008). Is Google making us stupid?, The Atlantic Magazine

"Digitization and knowledge have a complex and quite paradoxical relationship."

Knowledge both admits of a vast number of characterizations and comes in different flavors. While it is possible to hold shared views on the purview of knowledge while at the same time disagreeing on its exact definition, disagreement may still loom on the horizon. Whether tacit or practical knowledge refers to a phenomenon that can be subsumed under one heading along with scientific knowledge, or knowledge as traditionally conceived by epistemologists, is a question that remains largely open to debate. For that reason, it would at first glance seem illusory to contrast (and not necessarily to oppose) a unified concept of knowledge with non-knowledge. Yet, without such a unified concept, the need for a correlative unified concept of non-knowledge becomes, at best, moot. The best-known philosophical answer to the question, “What exactly is knowledge?” has long been “justified true belief.” Despite the paradoxes this definition lends itself to (in particular the Gettier problem), let us take it as a departure point and add that knowledge is knowledge of a referent, whether in the form of an accurate description of it or true predictions regarding its behavior, etc.

What about digitization, then? Digitization and knowledge have a complex and quite paradoxical relationship. Going back to the concept of “knowledge economy,” made possible by the advances of digitization, one immediately sees this relation for what it is: a relation of commodification. “Knowledge” in the knowledge economy no longer denotes any norm or domain (which it merely connotes) but rather betokens a broad assimilation to a commodity, essentially cultivated in order to sustain growth. Both the normative and pluralistic aspects of knowledge have as a consequence seemingly vanished or at least been largely obscured.

While paradoxical, this evolution shouldn’t come as a surprise for it may very well characterize digitization as such. As a result, one of the claims in this paper will be that digitality has both overplayed and downplayed salient aspects of knowledge, to the point that we might, on initial approximation, think of this evolution as bringing knowledge nearer to its negation, what might be called “non-knowledge.” As we shall see, however, as we progressively move away from epistemic questions, the case for introducing an additional category and situating non-knowledge on a different plane will become more and more compelling.

Overplayed, I would argue, because conceptual knowledge already grasps its referent in a simplified way, if only to articulate true propositions where, for instance, proper nouns denote individuals, and common nouns denote properties (a conceit still used within logical artificial intelligence (AI)). Mathematical models, despite potentially being very complex, must nonetheless simplify reality in order to allow for more accurate predictions. In this regard, they may be revised to accommodate some of the minute details of a world they never exhaust. Science, then, produces knowledge about the world but not necessarily one conclusive picture.

Now, with digitality, models and abstractions have become not only a sign of the portability of conceptual knowledge but also a means to perform assemblages that induce new realities instead of deferring, one way or another, to some preexisting world—again in the name of simplification and formalization. Make no mistake: deferring to the world involves taking into account the intricate ways in which the world is being transformed by our own activity—especially in the Anthropocene! That said, digitization tends to consider its models within its own reality without always properly deferring to the world. Google’s PageRank algorithm is a good example. It construes incoming hyperlinks as votes or endorsements (never as signs of defiance!) in its willingness to redefine the web by using measures of authority, while pretending to remain neutral—even though its own existence modifies the very topology of the thing it was supposed to measure independently.
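As an illustrative aside (not part of Monnin's text): the computation the passage alludes to is, at its core, power iteration over a link graph, in which every incoming hyperlink raises the target page's score. The graph and parameter values below are hypothetical, chosen only for illustration; note that nothing in the model can register a link made in criticism rather than endorsement.

```python
# Minimal PageRank sketch via power iteration (illustrative only).
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if outgoing:
                # Every outgoing link passes on an equal share of the
                # page's rank: a link always counts as an endorsement.
                share = rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += damping * share
            else:
                # Dangling page: spread its rank evenly over all pages.
                for target in pages:
                    new_rank[target] += damping * rank[page] / n
        rank = new_rank
    return rank

# Hypothetical three-page web: "a" receives the most endorsements.
ranks = pagerank({"a": ["b"], "b": ["a", "c"], "c": ["a"]})
```

The sketch makes the essay's point visible in miniature: once such a score exists, authors link with the score in mind, so the measure reshapes the very topology it claims to measure from outside.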

And then downplayed since the commodification of knowledge, made possible by the lack of regard for traditional norms of knowledge (in a sense “anything goes” in the knowledge economy so long as its goals are achieved), resulted in more and more data, metadata, documents, and so on and so forth—what I would term “knowledge traces”—being produced, gathered and made available with unforeseen consequences that are well worth examining.

Innovation is better served, or so it seems, by people who have little regard for the minutiae of everyday life, assured as they are of the well-foundedness of their mission to transform it.

Monnin, A. (2018). "Digitality, (Un)knowledge, and the Ontological Character of Non-Knowledge". In Bernard, A., Koch, M. & Leeker, M. (Eds.). Non-Knowledge: Digital Cultures. (1 ed., pp. 106-109) Meson Press

"And what the Net seems to be doing is chipping away my capacity for concentration and contemplation.”

For me, as for others, the Net is becoming a universal medium, the conduit for most of the information that flows through my eyes and ears and into my mind. The advantages of having immediate access to such an incredibly rich store of information are many, and they’ve been widely described and duly applauded. “The perfect recall of silicon memory,” Wired’s Clive Thompson has written, “can be an enormous boon to thinking.” But that boon comes at a price. As the media theorist Marshall McLuhan pointed out in the 1960s, media are not just passive channels of information. They supply the stuff of thought, but they also shape the process of thought. And what the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.

I’m not the only one. When I mention my troubles with reading to friends and acquaintances—literary types, most of them—many say they’re having similar experiences. The more they use the Web, the more they have to fight to stay focused on long pieces of writing. Some of the bloggers I follow have also begun mentioning the phenomenon. Scott Karp, who writes a blog about online media, recently confessed that he has stopped reading books altogether. “I was a lit major in college, and used to be [a] voracious book reader,” he wrote. “What happened?” He speculates on the answer: “What if I do all my reading on the web not so much because the way I read has changed, i.e. I’m just seeking convenience, but because the way I THINK has changed?”

Bruce Friedman, who blogs regularly about the use of computers in medicine, also has described how the Internet has altered his mental habits. “I now have almost totally lost the ability to read and absorb a longish article on the web or in print,” he wrote earlier this year. A pathologist who has long been on the faculty of the University of Michigan Medical School, Friedman elaborated on his comment in a telephone conversation with me. His thinking, he said, has taken on a “staccato” quality, reflecting the way he quickly scans short passages of text from many sources online. “I can’t read War and Peace anymore,” he admitted. “I’ve lost the ability to do that. Even a blog post of more than three or four paragraphs is too much to absorb. I skim it.”

Carr, N. (July/August 2008). Is Google making us stupid?, The Atlantic Magazine

“Today, we are ourselves now submitted to the contemporary narcosis of the internet and its accompanying technologies.”

One can certainly hear McLuhan’s ghost rattling its chains in these various statements (not forgetting that ‘ghosts’ in such a context have to be considered dissimulating traces of older media situations lingering as attenuated figures amidst the broken realities of the new). As McLuhan famously phrased it: ‘The medium is the message,’ before adapting this to ‘The medium is the massage.’ Media don’t only convey information, but constitute regimes of existence for humans; that is, they not only establish the limits of reality as such, but thereby also provide forms of psycho-physiological therapy. As such, all media systems are also remedial. And they are so in a number of senses. They are remedial because they remedy or supplement physical and psychical deficiencies, and do so by providing their so-called ‘users’ with doses of information that are at once anxiety-inducing and pleasure-giving. In a different-if-related context, Jacques Derrida drew a word from Plato to nominate such a double effect as that of the pharmakon, simultaneously poison and cure; for his part, Jacques Lacan spoke of the scraps of ‘surplus-pleasure’ or jouissance that humans convulsively seek to squeeze from their subjection to media. This irrevocable human submission to media creates a kind of ‘Narcissus-Narcosis,’ that is, the druggy, self-loving affects that media users experience in using their preferred media, for the most part unable to even suspect that the limits of the medium are the limits of their world. For McLuhan, we are all narcoleptic narcissists, except at those rare moments when the emergence of a new medium shocks us all into a momentary recognition that our ways of thinking are programmed by media — before we settle back again into the next generation of self-confirming narcoses. Aside from anything else, this means that media are not, whatever the claims made for them, primarily epistemic; they are rather affective.

Yet this remedial quality in the sense of sensorial therapeutics is doubled by another key structural feature for McLuhan: the ‘content’ of any medium is always another medium or media. A medium is also re-medial in this sense. Media never ratify any particular viewpoint, but render any particular viewpoint relative to their own operations; indeed, they drive the proliferation of competing viewpoints in the service of their own dissemination. In other words, media are always correlated to a mass-age, in that they not only give our cerebra a nice buffing, buffeting and buffering, but effectively produce masses of anonymous persons who, despite not knowing each other at all, start to act without knowing it in coordinated ways precisely because they are all subject to the same medium or media. Today, we are ourselves now submitted to the contemporary narcosis of the internet and its accompanying technologies... and in a way that is impossible for us to ‘think’ or ‘feel’ our ways out of.

McLuhan called the ‘Gutenberg Galaxy’ the material restructuring for the new interstellar vistas opened up by the early-modern print revolution. For McLuhan, the linearity, anonymity, and mass-production of print have certain consequences for the human sensorium, including the suppression of orality-aurality to the benefit of vision, the in-dividuation of persons, and social segmentation. The ‘visual homogenizing of experience’ and functional differentiation of modern social systems is an effect of the dominance of a particular medium, print, a dominance that is now at best residual. For McLuhan, it was the new audio-visual mayhem ushered in by television that gave Gutenberg Man his most serious shock; today, it is what we call ‘the Internet’ that has even swallowed the TV in its infinite maw.

Clemens, J. (2018). "The Paradox of Dissemination". In Haylock, B. (Ed.) Distributed (1 ed., pp. 229-231) Open Editions

"The more powerful the means of distribution, the weaker the messages thereby distributed"

‘Similar crypto-analyses must become universal and mechanical in the chaos of codes that begins with the world-historical dismissal of everyday language in favor of a universal discrete machine.’ Friedrich Kittler

The German media theorist Friedrich Kittler has pointed out that, with the development and domination of the Internet, not only do we have — for the first time in history — a real-time integrated system of absolute knowledge founded upon a universal binary code capable of transcribing and disseminating any possible message globally with vast volumes of other data at essentially the speed of light, but one which thereby renders absolutely anything any individual human can think, know, or say utterly irrelevant. No wonder Kittler invokes the Absolute Knowledge of G.W.F. Hegel as now not only possible, but accomplished. If such knowledge is, to use Hegel’s jargon, both finally for- and in-itself, it is also definitively not-for-us.

The consequences — epistemological, existential, and environmental — couldn’t be more extreme. First, nothing anyone can say anymore will be either true or false. Second, nothing anyone can say anymore is in principle distinguishable from anything ‘said’ by a bot: we have ourselves been integrated into the system as minor components; the projected AI supplantation of humans is not a dystopian future, but an achieved (and banal) actuality. Third, there is no longer any message, or even vast aggregates or sets of messages, that will enable any further significant transformation in the system of knowledge. Yet, fourth, as Kittler immediately adds, one crucial message will forever remain to be delivered: the apocalypse itself. This ‘message about the message’ doesn’t simply mean that ‘the revolution will not be televised’ (of course it won’t). It rather means: the only message that can affect the system is foreclosed from the circuits of the system itself; the message can only be received with and as the system’s obliteration, even as the system continues to transmit right up to the very nanosecond before impact.

This is therefore a decisive paradox, or, perhaps, one of the great revelations of contemporary media dissemination: the more powerful the means of distribution, the weaker the messages thereby distributed — except for the ultimate message itself, the one that bears upon the totality of the system as such. Part of the uniqueness of the Internet as a ‘discourse network’ (another Kittlerian term, adapted from the paranoid-schizoid visions of Judge Schreber) is that its powers of representation are patently immeasurably in excess of the content it can represent. No wonder everybody always seems to feel that they’re ‘under-represented’ whenever they consider where they are in the field of representation. Because they are; but they aren’t, too. For this feeling is not only mutual, but mandatory. It’s also essentially correct, if not in the sense that people usually seem to give it.

Rather, to repeat, under the current conditions of digital media, the content as well as the impact of the content of any message tends towards zero. The epitome of a contemporary message is that it has minimal content and minimal qualities — indeed, these tend towards the absolutely infinitesimal — although its mere circulation produces hitherto-unprecedented volumes of further information. In other terms, this is what the Italian philosopher Giorgio Agamben considers the exposure of the essence of communicability — and not just communication. For the first time in human history, at the very ‘end’ of this history, we are confronted by a communicability which is itself incommunicable.

Or, as the French linguist Jean-Claude Milner puts it, there is no name of the name. There are, accordingly, very many ways of phrasing the difficulties, each almost as unsatisfactory as the others. Accomplished ‘accelerated’ distributivity entails the evacuation of meaning, indistinguishable from a chaos or chaosmos of meanings. Because any proposition regarding the putative context under which any meaning can be established is itself subject to another context which exceeds that proposition’s delimitations — if one wishes one could refer to the familiar paradoxes of self-reference or to decision problems or to the re-entry-of-the-form-into-the-form, or whatever — it turns out that only the end of all things can constitute a stable enough referent for such an establishment. Until then, the difference between so-called ‘metadata’ and ‘data’ will remain utterly unstable — and only the aforementioned apocalypse will stop this abyssal and truth-corrosive precession of metas.

In the meantime — or at least in what James Joyce called ‘the pastime of past time’ — we can while away the hours with speculative archaeologies. Here and there, drawing on some of the familiar authorities regarding the status of the distributivity of modern media regimes, such as Marshall McLuhan and Jacques Rancière, let’s enumerate some basic affordances of our own media situation that literally demolish — or, perhaps, remolish — the received architectures of production and circulation of prior media, including almost all the older justifications for particular kinds of address, hierarchy and transmission. When all media have become subject to a single über-medium, that of ‘the internet,’ media themselves disappear. In a phrase: we are no longer subject to ‘the distribution of a-medium-in-dominance’ nor to ‘regimes of the sensible,’ but to a dissemination of the distribution of insensibles.

Clemens, J. (2018). "The Paradox of Dissemination". In Haylock, B. (Ed.) Distributed (1 ed., pp. 229-231) Open Editions

“Information is a kind of commodity, a utilitarian resource that can be mined and processed with industrial efficiency. The more pieces of information we can “access” and the faster we can extract their gist, the more productive we become as thinkers.”

Google’s headquarters, in Mountain View, California—the Googleplex—is the Internet’s high church, and the religion practiced inside its walls is Taylorism. Google, says its chief executive, Eric Schmidt, is “a company that’s founded around the science of measurement,” and it is striving to “systematize everything” it does. Drawing on the terabytes of behavioral data it collects through its search engine and other sites, it carries out thousands of experiments a day, according to the Harvard Business Review, and it uses the results to refine the algorithms that increasingly control how people find information and extract meaning from it. What Taylor did for the work of the hand, Google is doing for the work of the mind.

The company has declared that its mission is “to organize the world’s information and make it universally accessible and useful.” It seeks to develop “the perfect search engine,” which it defines as something that “understands exactly what you mean and gives you back exactly what you want.” In Google’s view, information is a kind of commodity, a utilitarian resource that can be mined and processed with industrial efficiency. The more pieces of information we can “access” and the faster we can extract their gist, the more productive we become as thinkers.

Carr, N. (July/August 2008). Is Google making us stupid?, The Atlantic Magazine

“When we go online, there’s little place for the fuzziness of contemplation. Ambiguity is not an opening for insight but a bug to be fixed.”

In Google’s world, the world we enter when we go online, there’s little place for the fuzziness of contemplation. Ambiguity is not an opening for insight but a bug to be fixed. The human brain is just an outdated computer that needs a faster processor and a bigger hard drive.

The idea that our minds should operate as high-speed data-processing machines is not only built into the workings of the Internet, it is the network’s reigning business model as well. The faster we surf across the Web—the more links we click and pages we view—the more opportunities Google and other companies gain to collect information about us and to feed us advertisements. Most of the proprietors of the commercial Internet have a financial stake in collecting the crumbs of data we leave behind as we flit from link to link—the more crumbs, the better. The last thing these companies want is to encourage leisurely reading or slow, concentrated thought. It’s in their economic interest to drive us to distraction.

Carr, N. (July/August 2008). Is Google making us stupid?, The Atlantic Magazine

"Technology either eliminates non-knowledge or plants it deep within contemporary cultures"

Digital media today are accompanied by emphatic stances on knowledge, non-knowledge, and their relation to one another. Generating, distributing, and making available massive amounts of data that take form by modeling, digital media provide us with abundant information and potentially new ways of gaining knowledge. This has been attracting various, sometimes radical scenarios in which technology either eliminates non-knowledge or plants it deep within contemporary cultures, due to the alleged universal power and opacity of algorithms. Both conceptualizing and researching non-knowledge have proven to be epistemological challenges that are key to understanding contemporary digital cultures.
(...)
Discussing the epistemological challenges tied to non-knowledge and its relation to knowledge is of great value to digital cultures research. It brings up the question of whether digital technology goes along with a qualitatively new mode of entangling knowing and not knowing. This question currently fuels vast amounts of research, attracting both emphatic stances on the alleged revolutionary nature of digital technology and careful, tentative descriptions of the historical, technological, and epistemological conditions of knowing and not knowing today. One prominent topos in current research is that at the core of contemporary media culture there is a fundamental epistemic opacity (Humphreys 2009), which relates to thoughts about the unrepresentability of algorithms (Galloway 2012, 78–100) and their governmental power (Rouvroy 2011). Other key factors for this opacity are found in the ubiquity of digital media and their deep insertion into all sorts of everyday practices, perception, and body techniques, leading up to a “transformation of the contemporary affective fabrics” (Baxmann, Beyes, and Pias 2012, 9, translated by the author). All-encompassing and altering the capacities of sensation, such a situation has been called an ecology of affect (Angerer 2017).

All this makes digital cultures research a prominent case of the perceived contemporary crisis of representation, and focusing on non-knowledge promises to deliver valuable insights into these epistemological dilemmas. It implies discussing the means, range, and limits of current scientific description and understanding.

It also highlights the basic questions of what is thought of as known/not known and knowable/not knowable today, the various historical contexts of today’s situation, and even the question of whether one can operationalize non-knowledge to learn about digital cultures. Relating non-knowledge to digital cultures may not only tell us something about the status of digital media as a topic of research, it may also tell us something about the status of contemporary interdisciplinary media research itself.

Koch, M. (2018). "Introduction: Non-Knowledge and Digital Cultures". In Bernard, A., Koch, M. & Leeker, M. (Eds.). Non-Knowledge: Digital Cultures. (1 ed., pp. 11-16) Meson Press