Three Annotations

//**ALEXANDER GALLOWAY: PROTOCOL**//

__**What three quotes capture the critical import of the text?**__

P7: Thus, protocol is a technique for achieving voluntary regulation within a contingent environment.

P241: (discussing speed bumps vs reducing the legal speed limit) Bumps, on the other hand, create a physical system of organization. They materially force the driver to acquiesce. Driving slower becomes advantageous. With bumps, the driver //wants// to drive more slowly. With bumps, it becomes a virtue to drive slowly. But with police presence, driving slowly can never be more than coerced behavior. Thus, signage appeals to the mind, while protocol always appeals to the body. Protocol is not a superego (like the police); instead it always operates at the level of desire, at the level of “what we want.”

P244: But protocol is more than simply a synonym for “the rules.” Instead, protocol is like the trace of footprints left in snow, or a mountain trail whose route becomes fixed only after years of constant wear. One is always free to pick a different route. But protocol makes one instantly aware of the best route – and why wouldn't one want to follow it?

__**What is the main argument of the text?**__

The text argues that protocol (i.e., standards of implementation) is the means by which control is established after decentralization. It argues that protocols are not coercive mechanisms. Rather, they point to the path of least resistance; they offer advantages to those who follow them and disadvantages to those who choose to walk beside them. As such, the text argues that the Internet, despite possessing a decentralized structure, remains a highly hierarchical network by virtue of the very standards of implementation that undergird its fabric.

__**Describe at least three ways that the main argument is supported.**__

1. Galloway analyses the content of the original Internet RFCs in order to understand how they have structured the ways that information can and cannot circulate over the Internet (the TCP/IP structure resulting in a distributed, decentralized, but still controllable network).

2. Galloway examines the rise of hacking in parallel to the establishment of protocols. Hacking becomes a way to break free of the restrictions imposed by protocols.

3. Galloway examines decentralization by focusing his attention on the physical infrastructure that composes the Internet. He examines server hierarchies, IP address structures, and the HTML language in order to demonstrate a protocological tendency towards ease of control.

__**Describe the main literatures that the text draws on and contributes to, and the particular contribution made by the text.**__

This text can be situated within the institutional literature that elaborates upon the intersection of discourse and social fabric. In particular, it can be located in the lineage composed of Hobbes, Foucault and Deleuze. Each of the previous authors elaborated upon a different kind of society that was a reflection of the institutions present within them – Hobbes' sovereign societies, Foucault's disciplinary societies, Deleuze's control societies. This text takes the argument one level further by asking what shape control takes after systems have been decentralized.

__**Explain how the argument and evidence in the text supports, challenges or otherwise relates to the argument or narrative that you imagine developing.**__

This book is interesting to me because of the paradox it presents. Namely, it argues that in order to be accessible, distributed networks must be standardized and controllable, or "protocological," as Galloway puts it. As such, models and interfaces must impose a particular subject effect in order to be effectively navigated by the largest number of people possible.

__**List of at least three details or examples from the text that you can use to support the argument or narrative that you are developing.**__

1. “Protocol is synonymous with possibility” (p.167). Here, possibility is equated with access. Therefore, protocols create increased access by allowing a greater number of people to navigate cyberspace. When hackers attempt to 'open' up proprietary technologies, they are in fact attempting to create a protocol by which the information contained within can be accessed.

2. Protocols can fail. When technologies are proprietary (e.g. MS Windows), protocol isn't allowed to act as a network of circulation.

3. Cyberfeminism demonstrates that when protocol rises, patriarchy declines. One example is the possibilities enacted by virtual communities when it comes to creating new bodies.

//**PAUL EDWARDS: A VAST MACHINE**//

__**What three quotes capture the critical import of the text?**__

Pxiii-xiv: Nor is there any such thing as a pure climate simulation. Yes, we get a lot of knowledge from simulation models. But this book will show you that the models we use to project the future of climate are //not// pure theories, ungrounded in observation. Instead, they are filled with data – data that bind the models to measurable realities. Does that guarantee that the models are correct? Of course not. There is still a lot wrong with climate models, and many of the problems may never be solved. But the idea that you can avoid those problems by waiting for (model-independent) data and the idea that climate models are fantasies untethered from atmospheric reality are utterly, completely wrong. //Everything we know about the world's climate – past, present, and future – we know through models//.

P109: In the age of the World Wide Web, it is easy to forget that data are never an abstraction, never just “out there.” We speak of “collecting” data, as if they were apples or clams, but in fact we literally //make// data: marks on paper, microscopic pits on an optical disk, electrical charges in a silicon chip. With instrument design and automation, we put the production of data beyond subjective influences. But data remain a human creation, and they are always material; they always exist in a medium. Every interface between one data process and another – collecting, recording, transmitting, receiving, correcting, storing – has a cost in time, effort, and potential error: data friction.

P433: Another of this book's arguments regards the mutually constitutive character of models and data: //model-data symbiosis//. Since the 1950s, computer models have played four complementary roles in the infrastructures of weather and climate knowledge. First, the demands of forecast models provided powerful incentives to build planetary data networks – to //make global data//. Without the models, forecasters could never have used huge volumes of information, so they would never have tried to collect them. Second, data-assimilation models gave scientists the ability to //make data global//: to process heterogeneous, spotty, inconsistent, error-ridden signals into homogeneous, complete, physically consistent data images. Modern 4-D assimilation systems literally synthesize global data, constrained but not determined by observations. Third, general circulation models – based on theory, yet constrained by data used in their parameterizations – let scientists forecast the whole world's weather, simulate climate dynamics, and perform climate experiments. Finally, in the 1980s the reanalysis of historical weather data for climate studies, using 4-D data assimilation models, reunited forecasting with climate science.

__**What is the main argument of the text?**__

The text argues that, through a global climate knowledge infrastructure, information is gathered at multiple points over the globe (making global data). This data is then processed through models that allow the production of universal climate data (making data global). However, in the process, the historical and contextual specificity of the data-gathering is often ignored (data friction). As such, current debates on climate change science are built upon a false dichotomy, that of models versus data. The text demonstrates that models and data cannot be separated so easily, and that the study of large-scale distributed systems such as climate cannot – and perhaps never will – be done independently of models.

__**Describe at least three ways that the main argument is supported.**__

1. Edwards traces the historical change of US climate politics from the 1960s (focus on weather modification), through the 1970s (focus on the greenhouse effect), to the 1980s (focus on ozone depletion), a trajectory prompted by the increasing sophistication of simulation models.

2. Edwards discusses the role of the World Weather Watch as the beginnings of an international infrastructure to allow climate modeling. This shifted the focus of research from the climate to the models themselves.

3. Edwards describes how the IPCC's focus on assessment rather than research brought about an era of climate data and climate model auditing that played an important role in the existing controversy about global climate change.

__**Describe the main literatures that the text draws on and contributes to, and the particular contribution made by the text.**__

This book interacts with the literature on scientific controversies in general, and climate change controversies in particular. However, it goes beyond many of the works in these literatures by attempting to build a conceptual framework for understanding how climate change science is produced. Rather than examining each side of the controversy, the text delves deep into the structural and infrastructural level to understand what kind of knowledge about climate change can be produced.

__**Explain how the argument and evidence in the text supports, challenges or otherwise relates to the argument or narrative that you imagine developing.**__

An interesting aspect of this book for my own project is its exploration of how models tend to refocus all the attention towards themselves. As such, the subject effects of models become increasingly influenced by the models themselves, as opposed to being equally divided between user and model through practice.

__**List of at least three details or examples from the text that you can use to support the argument or narrative that you are developing.**__

1. The proliferation of models shifts the focus from the data collected to the models themselves.

2. The reliance on models can lead to the creation of counter-models, which forces science to retreat from research and enter into a phase of assessment.

3. As models become more sophisticated, their projections can be simultaneously extended (globalized) and retracted (localized) more effectively.

//**CHRISTOPHER KELTY: TWO BITS**//

__**What three quotes capture the critical import of the text?**__

P23: If recursive public is a useful concept, it is because it helps elaborate the general question of the “reorientation of knowledge and power.” In particular it is meant to bring into relief the ways in which the Internet and Free Software are related to the political and economy of modern society through the creation not only of new knowledge, but of new infrastructures for circulating, maintaining, and modifying it.

P61-62: My answer, in contrast, is that geeks' affinity with one another is structured by shared moral and technical understandings of order. They are a public, an independent public that has the ability to build, maintain, and modify itself, that is not restricted to the activities of speaking, writing, arguing, or protesting. Recursive publics form through their experience with the Internet precisely because the Internet is the kind of thing they can inhabit and transform. Two things make recursive publics distinctive: the ability to include the practice of creating this infrastructure as part of the activity of being public or contesting control; and the ability to “recurse” through the layers of that infrastructure, maintaining its publicness at each level without making it into an unchanging, static, unmodifiable thing.

P310: Understanding how Free Software works and how it has developed along with the Internet and certain practices of legal and cultural critique may be essential to understanding the reliable foundation of knowledge production and circulation on which we still seek to ground legitimate forms of governance. Without Free Software, the only response to the continuing forms of excess we associate with illegitimate, unaccountable, unjust forms of governance might just be mute cynicism. With it, we are in possession of a range of practical tools, structured responses and clever ways of working through our complexity toward the promises of a shared imagination of legitimate and just governance.

__**What is the main argument of the text?**__

The text argues that the Free Software movement and the Internet provide an opportunity to consider and enact a reorientation of power and knowledge. They accomplish this by critically experimenting with the public sphere through the configuration of recursive publics that can actively contribute to the form and content of that which makes them a public in the first place. By espousing values of availability and modifiability, recursive publics are able to reorient power and knowledge through the active modification of the avenues in which these objects circulate, and are able to create a self-leveling environment that enhances social justice and decreases inequality.

__**Describe at least three ways that the main argument is supported.**__

1. Kelty explores how and why the Free Software movement emerged at the moment and in the manner it did. Rather than understanding the Free Software movement as a novel phenomenon created by new circumstances, he attempts to follow the movement's genealogy back through the 1950s.

2. Kelty examines and compares the cases of the Creative Commons and of Connexions to understand how recursive publics influence practice, power, and knowledge.

3. Kelty focuses some of his attention on the stories geeks tell in order to understand how they make sense of their own political economy.

__**Describe the main literatures that the text draws on and contributes to, and the particular contribution made by the text.**__

The main bodies of literature that the text interacts with are those of the public sphere and the social imaginary. As such, the text engages heavily with the work of Habermas, Taylor and Warner. However, the text differs from other “information technology” studies of the Internet, which are mostly concerned with examining whether the medium is an ideal public sphere. Rather, the text critically examines the consequences of recursive publics for the shape the public sphere can take, and the work it can do.

__**Explain how the argument and evidence in the text supports, challenges or otherwise relates to the argument or narrative that you imagine developing.**__

This book is interesting to me on many levels. First, it is an example of how to conduct research within a distributed medium such as cyberspace or the Internet. Parts of my own project will likely be conducted within similar environments, and therefore the text provides me with an interesting methodological example. Second, the text deals explicitly with the concept of openness by making it the central concern of the public it defines. As such, Kelty's concepts and analyses can serve as useful analytical tools for approaching my own project. For example, an examination of the material conditions of cyberspace would allow an understanding of recursive publics rooted in physical space as well as in cyberspace, which might highlight additional ways in which power and knowledge are reoriented.

__**List of at least three details or examples from the text that you can use to support the argument or narrative that you are developing.**__

1. Geeks argue their points not through prose but through a technological semiotic. They interact and create technologies (software, websites, etc.) in order to demonstrate the very concepts and world views they espouse.

2. The two key aspects of the reorientation of power made possible by recursive publics are 1) availability, and 2) modifiability.

3. Modulations: practices of one domain that can be exported into another domain. When following modulations, it is possible to ask why certain practices can be modulated to another domain and why other practices cannot.