Ignorance

** The Business of Risk is the Business of Managing Ignorance **

The //business of risk//[1], one could argue, is the business of managing ignorance [2]. I write this, contextualized by the larger shift in the conceptual thinking about risks – as described amongst others by Beck (1992), Bauman (2001), Luhmann (1993), Giddens (1990), and brought together by Deborah Lupton: “The modernist concept of risk represented a new way of viewing the world and its chaotic manifestations, its contingencies and uncertainties. It assumed that unanticipated outcomes may be the consequence of human action rather than ‘expressing the hidden meanings of nature or ineffable intentions of the Deity’, largely replacing earlier concepts of fate or fortuna. As Reddy argues: ‘Moderns had eliminated genuine indeterminacy, or ‘uncertainty’, by inventing ‘risk’. They had learnt to transform a radically indeterminate cosmos into a manageable one, through the myth of calculability’” (Lupton, 1999, 7). Risk, in this “modern” (i.e. modernist, post-industrial, progress-oriented) sense, re-views dangers and hazards as in principle calculable and steerable (manageable!), emphasizing human agency in and over the world – and particularly over their own fate. Hand in hand goes, quasi, the rise of the //business(es) of risk//: risk perception analysis, probability calculation, statistics, risk assessment [3], etc. As such, these businesses are by definition concerned with the control and management of uncertainties and unknowns/unknowables, expressed in and as risk and quantified in measures of risk. In doing so, they also actively re-produce [4] the dangers and hazards that had, prior to this shift, been assessed as a “matter of fate” [5]. The future, as the domain of risks and uncertainty (cp. e.g. Brown et al., 2000), becomes opened up to human imagination: Felt et al. associate this relatively new interest in futures with recent contemporary dynamics, where “we witness a ‘colonization of the future’ that […] has become visible through massive investments in the development of anticipatory methods such as technology assessment and foresight exercises” (Felt et al., 2013, 16; cp. also Adam & Groves, 2007). Unknowns and uncertainties (dangers and hazards), projected into potentially risky futures, become recognized as //unknowns//, bearing not only the potential (danger) but also the //calculable// probabilities (risk) of affecting human life, whether known or not. As unknowns to be made known, the business of risk thus, at first and in principle, is concerned:

- with the management of such unknowns, future and present
- and with the production of knowledge about these unknowns

These risky sciences structurally attempt to articulate, quantify and – through that – manage //ignorance//: both in terms of (external) hazards and (internal) vulnerabilities (cp. …). This notion – that the business of risk is the business of managing ignorances – has four iterative meanings that I am going to turn to subsequently over the following pages. The purpose of this exercise is to //re-view a specific sampling of literatures// on governing ignorance [6], whilst taking the term //re-//viewing literally [7].

** Non-Knowledge **

The first iterative reading of my little introductory notion was already alluded to, //pointing to the technocratic quantification of uncertainty in risk assessment// (cp. Castel, 1991): Managing risk requires means for identification, articulation and assessment – most commonly in terms of indicators.
Whilst there are different techniques for doing so, from risk perception analysis to risk and technology assessment (cp. Slovic, 1992; Fischhoff et al., 1978; Starr, 1969; Auyero & Swistun, 2008) – all with their own logics of risk (cp. Lupton, 1999, Chapter 1) – what they generally have in common is the problematization of what necessarily is/remains in the unknown. Where there are risks and uncertainties, there is need for control, mitigation and containment. With Lupton: “An apparatus of expert research, knowledge and advice has developed around the concept of risk: risk analysis, risk assessment, risk communication and risk management are all major fields of research and practice, used to measure and control risk in areas as far-ranging as medicine and public health, finance, the law, and business and industry” (Ibid., 9). This logic points to a **first set of – the most basic – readings of ignorance**: **non-knowledge** and **negative knowledge** (cp. for both Gross, …, 68): knowledge that is known not to be known (non-knowledge), or even deemed dangerous to be known at all (negative knowledge). As Gross, for example, describes, managing these known unknowns so as to restrict the potential negative consequences of such ignorances is a key element of knowledge society: “The contemporary explosion of knowledge or the observation that our current age is the beginning of a knowledge society thus has a little remarked on corollary: new knowledge also means more ignorance. Thus, surprising events will occur more frequently and become more and more likely. If this is the case, handling ignorance and surprise becomes one of the distinctive features of decision making in contemporary society. The challenge in dealing with surprises lies in the fact that they lie beyond the spheres of probability and risk” (Ibid., 1). This also ties into what Beck and others described as the increase of uncertainty, to an extent of uncontrollable unknowns (cp. …, …, …), culminating in a key shift in how risks are recognized and managed. Described by Beck and later … as the consequence of stratified postmodernity and the rise of global technological risks, their management and control becomes increasingly understood as impossible (which does not mean that the risks of earlier societies were any more controllable). Risk management, in a general conception, is thus increasingly forced to find means for dealing with unknowns and uncertainties that must necessarily remain such (cp. …). “Early” risk management sciences sought (and claimed to be able) to control objective risks, and critiques focused on showing how the notion of “objective” risk is not only misleading, but concealing (cp. …; for critiques cp. e.g. Douglas & Wildavsky, 1982; Rayner, 1992); operating on a notion of risks, managerial capacities, and regulatory science that assumed that uncertainties can indeed not only be made known, but fixed and managed (cp. e.g. …). Yet the recognition of the high complexities and uncertainties of techno-social risks (in the rise of what Beck describes as risk society, or post-modernity) demands new means and strategies for addressing //non-knowledge//, where it is increasingly understood as impossible to undo, to bring into knowing. Magnus (2008) points us in this context to the development of new strategies in governing risks that shift from an (illusion of) control to a larger structural attempt at dealing with risks: precautionary vs. risk-taking governance.
Magnus describes how contemporary risk management has built structural ignorances into its seeking to limit – and even eliminate – risks: “This approach to risk management might be characterized as ‘we know what we know and we ignore what we don't know.’ Taking this approach is difficult in the absence of good information about risks, and, in a sense, it invites agnogenesis since creating uncertainty about the existence of risks reduces the role risks play in the assessment and therefore limits the impetus to manage those risks. Risk management, as it is currently practiced, is essentially an invitation to move forward with an activity in the face of a great deal of uncertainty in the hope that serious environmental problems do not emerge” (Ibid., 253). Whilst the US is taking such a risk-taking approach to risk management (cp. Frickel et al., 2009; also Gross, …, 2ff), the precautionary principle rather seeks to emphasize inherent non-knowledge about potential risks, instead of ignoring them: “Third, the precautionary principle became an important tool for risk management. Ignoring uncertainty was simply not sufficient for adequate risk management. The precautionary principle provided managers or regulators with a new tool that would allow them to reasonably move forward when there was clearly sufficient evidence to warrant concern, but not sufficient evidence to establish risks with a high degree of certainty. Sometimes we know what we don't know—and the precautionary principle turned ignorance into knowledge” (Ibid., 254). These two general approaches to dealing with risk in the context of non-knowledge foreground two different strategies in risk management, both built on the basic assumption that something always remains in the unknown and bears the potential for hazardous surprise (cp. Gross, …). The one, however, chooses the path of ignorance in its more narrow conceptualization, as outlined taxonomically by Tuana (…). The other seeks to explicitly address non-knowledge and actively account for it in the ways risk is governed and managed. Both simultaneously deal with uncertainty and actively //produce// risk, by making unknowns visible in quantitative terms of probability. This, in any case, is but one iterative reading of my opening sentence. Tuana’s taxonomy of ignorance can provide useful coordinates to open up and navigate further readings of “the business of risk is the business of managing ignorance”. Let me now turn to another set of readings of ignorance in this context: structural ignorance.

** Structural Ignorance **

Fleck and Kuhn already articulated in their concepts how knowing and not-knowing are tightly coupled, pointing out that dominant thought styles (Fleck, 1981) or paradigms (Kuhn, 1962) in different ways foreclose and open up certain questions, and simultaneously shape the ways answers to those questions take shape. What is known is in that sense always built on what remains unknown and on what can possibly become known. Fleck, for example, points not only to the historic heritage of concepts “with all its errors” (Fleck, 1981, 20), but also to the resistance against accepting anything contradictory to established knowledge – that which points to unknowns or ignorances in the epistemological structures thought collectives build on: “Once a structurally complete and closed system of opinions consisting of many details and relations has been formed, it offers enduring resistance to anything that contradicts it” (Ibid., 21).
A lot of subsequent work in STS, from SSK, SCOT and EPOR (and after all, to play with acronyms, SSI stands in strong reference to SSK [8]) through ANT to postcolonial and postmodern studies can be read not only as studying how we know, but also how we systematically fall short in our knowing, and how the social and cultural structures of knowledge in science and technology (and beyond) constantly produce ignorances and unknowns, with large political consequences and effects. This kind of not knowing and ignoring is a third kind of ignorance writ large: **structural ignorance**. Feminist studies in particular are explicit about the structural ignorances of our knowledge, and articulate a strong opposition to their political consequences for marginalized life, both human and nonhuman. Donna Haraway reminds us of the absences in witnessing science and the consequences these ignorances produce (Haraway, 1997). Londa Schiebinger (1986) describes far-reaching ignorances in the charting of the skeleton, and Tuana (2006) explicates in the //speculum of ignorance// the need for “the ability partially to translate practices of ignorance among very different—and power-differentiated—communities. Ignorance, like knowledge, is situated. Understanding the various manifestations of ignorance and how they intersect with power requires attention to the permutations of ignorance in its different contexts” (Tuana, 2006, 3). And particularly Frickel et al. (2009) make the argument that the structures of knowledge production themselves produce ignorances with far-reaching political consequences. All of these studies point towards a general – and only seeming – contradiction in risk governance: whilst we hope for risk governance to reduce or at least control uncertainty, and with that risk, it arguably itself produces new ignorances. This can happen either by method and scientific inquiry, as discussed here as //structural ignorance//, or strategically, to avoid accountability or deflect responsibility (chapter 3). Intersecting both structural and strategic ignorance, I will discuss in the next chapter (chapter 2) organizational ignorances: ignorances and unknowns that are produced in the organizational infrastructures and set-up of the management of risk. Frickel et al. (…, 2) describe how the “shift in focus to the institutional politics of knowledge and innovation brings into sharper relief the problem of ‘undone science’, that is, areas of research identified by social movements and other civil society organizations as having potentially broad social benefit that are left unfunded, incomplete, or generally ignored.” By showing how social movements unmask ignorances due to undone science – similar to the studying-up approach in feminist STS – the authors detail quasi built-in disciplinary ignorances: where important knowledge (but also negative knowledge) is not produced or left undone due to disciplinary, organizational or funding constraints, intertwinement with political, industrial and economic interests, and the paradigmatic blind spots that e.g. Kuhn or Fleck point out. For Frickel et al., civic movements and marginalized groups more generally – relying and depending on knowledge that so far remains undone – can be a powerful source for changing research agendas. Another powerful case study in this regard is Epstein’s study of AIDS activists and their success in shifting research agendas by pushing for an opening up of medication test trials.
Both exemplify, on the one hand, structural ignorances; on the other – and more importantly – they point to the deep political implications and marginalizing consequences of science remaining undone. A similar aspect is foregrounded by Frickel and Vincent (2007), who articulate in what they call **organized ignorance** how de-contextualization and the conflation into single points of measurement (e.g. in contamination testing) ultimately make environmental health regulation partisan in the production of non-knowledge and ignorances – amongst others to preserve and maintain expert authority and power positions (more on this later on). They describe //organized ignorance// as “a system of knowledge production that articulates risk in ways that leave much potential knowledge ‘undone’” (Ibid., 181). Expert systems, particularly in regulatory science, produce specific understandings of past, present and future threats that – in effect – minimize the context of the knowledge produced, and thus limit its scope. As such, organized ignorance stands for a form of knowledge production “that articulates risk in ways that do not, and perhaps cannot, answer some of our most basic questions concerning safety, health, and sustainability” (Ibid., 182), specifically by erasing social and historical contexts from the knowledge that is produced, but also – on a larger scale – by the compartmentalization of regulatory science (e.g. along the media they are dealing with – heat, water, air, soil, …). This latter argument mirrors Vaughan’s case study, as discussed next, yet it simultaneously also makes explicit the technological materializations of ignorance, and in consequence their part in producing and simultaneously hiding ignorance. A data point, the authors argue, is merely a snapshot. And many data points are thus only many snapshots: “In other words, the kinds of results testing produces are programmed into the testing parameters a priori. We find what we seek, not necessarily what is there” (Ibid., 184). Testing devices become disciplining devices. Many further recent contributions (…) outline similar aspects of the structural production of ignorance, and how it goes hand in hand with attempts at managing risk. In juxtaposition to the simplistic yet persistent view that the primary aim of risk management, and of regulatory science in particular, is the reduction of ignorance, or at least finding systematic ways to address and manage uncertainties, these works quite substantially complicate such a notion.

… Talk about discounting lives, and other effects of these issues … (Guthman, etc.)

** Organizational Ignorance **

Vaughan (1990) describes how the structural organization of NASA’s safety and risk management itself produced problematic ignorances. Playing off of Perrow’s (2011) argument for the normalcy (and unavoidability) of accidents – an argument that mirrors the larger recognition of the inescapability of disaster and the shifts of risk management that came with it, as described above – she articulates the ignorance towards signaled danger: In reference to Turner, she points to the “incubation periods” of disaster, going unrecognized, hidden in structural organizations: “The organizations that run these risky enterprises often contribute to their own technological failures. Turner (1976, 1978) has investigated accidents and social disasters, seeking any systematic organizational patterns that might have preceded these events.
He found that disasters had long incubation periods characterized by a number of discrepant events signaling danger. These events were overlooked or misinterpreted, accumulating unnoticed” (Vaughan, 1990, 225). Vaughan points, in further reference to Turner, to the fact that “among the organizational patterns contributing to these ‘failures of foresight’ (Turner, 1978: 51) were norms and culturally accepted beliefs about hazards, poor communication, inadequate information handling in complex situations, and failure to comply with existing regulations instituted to assure safety (Turner, 1976: 391)” (Ibid.). Vaughan then continues to carve out how NASA’s organizational safety (infra-)structures contributed to the Challenger disaster. Although she does not explicitly use the terminology of ignorance, her description of the multi-layered and multi-scale failures within NASA’s safety and risk management leading up to the Challenger disaster can easily be read as an exemplary case study of how such organizational structures themselves produce the ignorances and unknowns they are built to address: for example through lost information, bad access to informants or information, double binds between regulators and regulated, insufficient funding, or conflicting interests. This also points to one of the central social functions [9] that Moore and Tumin (1949) identify for ignorance. On the one hand they point out that ignorance plays a key role in preserving privileged positions through information control (p. 787ff), and later go on to describe “formal bureaucratic structures” (p. 792) as “by their nature depend[ing] upon narrowly and precisely defined roles and, therefore, personalities”. Where the authors discuss this further in terms of interpersonal ignorances to maintain work, Vaughan’s case can be read as a re-iteration, re-tracing these very built-in ignorances across scales and as playing a large part in the production of disaster. Edited by Sullivan and Tuana, the outstanding book “Race and Epistemologies of Ignorance” (…) plays a lot into what has been articulated so far. For example, Mills (Chapter 1) points out how organizational ignorances maintain racial relationships through “White Ignorance”, and Alcoff (Chapter 2) describes how ignorance emerges from the situated knowers. Both go hand in hand, as I have argued here. Other aspects will follow. What makes this volume particularly important are the closely-knit and complex perspectives it offers on the organizational production, reproduction and fostering of ignorances, unmasking the racial invisibilities created at many scales and on many sites. Tuana herself takes it on elsewhere (2006) to provide a taxonomy of ignorance, describing the inherently power-laden and oppressive characteristics of ignorance, and particularly carving out how these different kinds of ignorance are always structurally situated, but also produced. Articulating these ignorances then becomes a powerful means for challenging the authoritarian power inherently embedded in these structures. A similar story unfolds in such a re-reading of Vaughan, where the clear double binds and interdependencies are unmasked in the structural re-tracing of the ignorance inherently embedded in NASA’s safety management. This problem has since also been articulated more explicitly with regard to risk governance and risk management, on multiple scales.
** Strategic Ignorance → Trust, Authority, Expertise … **

Yet, so far the structural production of ignorance was at the forefront – no doubt an important matter to pay scholarly attention to. Less articulated, and only present between the lines, was the matter of responsibility and accountability, and, going hand in hand with that, the maintenance of power and authority: the contradictory demands of maintaining trust and authority on the one hand, and the recognition and //public// articulation of new ignorances, risks and hazards on the other. One work I cannot quote here, as it is not yet published, is that of Laura Rabinow, who is in the process of formulating such a critique in explicit ways [10]. So I might build on what others had to say on this matter. What I am alluding to here, in any case, is the recognition of new risks a posteriori to their being brought into the world, and how this recognition in public ways is discouraged not only structurally, as described above, but also more explicitly through the emergent contradictions with the need for self-preservation: How can it be that the poisoning of the residents of Flint, Hoosick Falls, Bennington and many, many other communities was not recognized despite already available, or at least emergent, knowledge of the poisonous effects of PFOA? How is it possible that chlorine chemicals continue to be put into the world, despite their toxic characteristics, as revealed by Murphy (…, …, …)? An array of literatures intersects here with what has been previously discussed under the large umbrella of structural ignorance, yet extends it by the problematization of maintaining power, securing privilege, and ensuring the trust of publics that constitutes positions of expertise, authority, regulatory power and political control alike. The lines are fluid, as Vaughan already articulates. More revealing maybe, as embedded in the cold, objective language of statistical analysis in organizational management studies, is this quote from an analysis of those reports attempting to explain what “really” happened in the 2003 heat wave in France: “Reports are sensemaking devices aimed at restoring social order by depoliticizing the event, (re)legitimizing the institutions concerned, and restoring trust in public action and control” (Boudes & Laroche, 2009, 389). And they continue: “According to Brändström and Kuipers blame can be attributed either to an actor or a network (an organization or a network of organizations), focusing on either a lower operational level or a higher strategic level. Brändström and Kuipers label the failing policy makers configuration (blaming higher-level actors) as the most politically explosive, whereas they label scapegoating as the attribution of blame to a lower-level actor. Organizational mishap and failing policy incriminate organizations at operational and strategic levels, respectively” (391). One of the key findings of their study is that the reports managed to avoid assigning blame, and thus individual political (and legal) responsibility, by locating the explanation of the 2003 heat disaster in the bureaucratic organizational shortcomings displayed in the disaster. Not only do these reports appear as narrative registers for articulating explanation and justification, they also play towards a similar assertion about how structural ignorance often works //seemingly// detached from individual actors.
Their assertion becomes explicit, however, when they call out this move as a) allowing for the active avoidance of responsibility and blame, and b) pointing out that these can become deliberate measures. Structural ignorance is not only problematic in the mere production of ignorance in domains where such ignorances are deeply disturbing and often can have deadly consequences – such as risk management. In this perspective it also becomes problematic as the structures allow for the explanation of ignorance whilst avoiding accountability and responsibilities, across different scales. These structural conditions also point towards a key contradiction: whilst maintaining safety often requires recognizing that one did something wrong, or was ignorant of something one should not have been ignorant of (e.g. for reasons of hazardous consequences), securing one’s position within an organization (keeping one’s job) or securing the trust in a regulatory organization often requires the contrary. The role of trust was repeatedly articulated, for example by Shaffer (trust & authority of science), more generally also in Beck, as discussed in the introduction, and more explicitly with regard to regulatory and scientific authority in the wide arrays of science and policy communication literatures (some points of departure for those interested in these literatures: …). As such, ignorance must be explicitly understood as a political tool for managing responsibility and accountability on the one hand, and claims of authority and power on the other. Here, a vast literature expands on matters of secrecy (Moore & …; …, …, … [11]), but this is not the point I seek to make here. Rather, the point I want to put forward is situated at the intersection of both aspects. McGoey (2012) describes ignorance as a function for justifying sociological authority. On the one hand, non-knowledge is tightly coupled with the maintenance of monopolies in expertise: who gets to know, and to know what is unknown, secures the authority to speak on both knowledge and non-knowledge. Secondly, ignorance becomes an alibi for maintaining organizational power, by scaling ignorance over organizational hierarchies (this plays into what has been described as the dispersal of accountability in bureaucratic power (Weber, …) and bureaucratic organizational principles). The term McGoey uses here, strategic ignorance, points towards a similar phenomenon, where the expert figure that did not know is justified in not knowing, because – in a direct function – nobody could have known. Proctor and Schiebinger (2008), in a similar assessment, also point out how this runs into maintaining the authority claims of experts towards publics, pointing out that … Oreskes and Conway (2011) provide one of the most powerful examples of one nuance of this dynamic, where tobacco lobbies and climate doubters actively engage in the business of constructing doubt – playing into the category of ignorance in very specific ways, and under a new iteration in contrast to what has been discussed so far. And, as a further example, Michaels articulates observations similar to Oreskes, yet he asserts: “All scientific work is incomplete—whether it be observational or experimental. All scientific work is liable to be upset or modified by advancing knowledge. That does not confer upon us a freedom to ignore the knowledge we already have, or to postpone action that it appears to demand at a given time” (Michaels, 2008, 109).
This passage points to the double binds between industry and scientific knowledge production (cp. here also the expanding literatures on the commercialization of science and its problematization, …), yet it also points to the active obscuring and mismanagement of available information, going beyond the structural conditions that facilitate the loss of such crucial information, as pointed out by Vaughan. Beyond just “looking away”, Michaels (2008) – although more nuanced in language – suggests an active obscuring, a political choice of seeking ignorance to maintain and preserve one’s own position (here a departure into the more socio-psychological literature on ignorance and self-preservation, as problematic as it often gets (cp. …), can only be recommended, yet not actualized in this context of writing). The notion of ignorance is further expanded by: Katz (1979), Messick (1999), Carrillo and Thomas (2000), Gershon (2000), McGoey (2012), and Rappert (2012). Related: ‘intentional ignorance’ in Kaptchuk (1998).

Maybe further expand with: “Roberts and Armitage (2008) developed the concept of the ignorance economy, which they advanced would be of particular interest for economists, managers, scientists, and policymakers. The authors argue that ‘the knowledge economy is one wherein the production and use of knowledge also imply the creation and exploitation of ignorance, for not only knowledge but also ignorance now play a main role in the formation of advanced global capitalism’ (2008:345). Also see High (2012:122-123) in the context of social scientists engaging in the scientific ignorance economy (i.e., competing ignorance claims in grant review dynamics).”

Conclude with public communication, science and the public + ignorance, e.g.: Woodhouse, E. J., Hess, D., Breyman, S., & Martin, B. (2002). Science Studies and Activism: Possibilities and Problems for Reconstructivist Agendas. //Social Studies of Science//, 32, 297-319. Wynne, B. (1992). Uncertainty and environmental learning: Reconceiving science and policy in the preventive paradigm. //Global Environmental Change//, 2, 111-127.

** Conclusion: What is the study of Ignorance in Risk Governance? **

** References **

Adam, B., & Groves, C. (2007). //Future Matters: Action, Knowledge, Ethics//. Leiden: Brill.
Auyero, J., & Swistun, D. (2008). The social production of toxic uncertainty. //American Sociological Review//, 73(3), 357-379.
Bauman, Z. (2001). //Liquid Modernity//. Cambridge: Polity Press.
Beck, U. (1992). //Risk Society: Towards a New Modernity//. London: Sage.
Boudes, T., & Laroche, H. (2009). Taking off the Heat: Narrative Sensemaking in Post-crisis Inquiry Reports. //Organization Studies//, 30(4), 377-396. http://doi.org/10.1177/0170840608101141
Brown, N., Rappert, B., & Webster, A. (Eds.). (2000). //Contested Futures: A Sociology of Prospective Techno-Science//. Aldershot: Ashgate.
Castel, R. (1991). From dangerousness to risk. In G. Burchell, C. Gordon, & P. Miller (Eds.), //The Foucault Effect: Studies in Governmentality// (pp. 281-298). London: Harvester Wheatsheaf.
Douglas, M., & Wildavsky, A. B. (1982). //Risk and Culture: An Essay on the Selection of Technical and Environmental Dangers//. Berkeley: University of California Press.
Felt, U., Barben, D., Irwin, A., Joly, P. B., Rip, A., Stirling, A., & Stöckelová, T. (2013). Science in Society: Caring for our futures in turbulent times. //Policy Briefing//, 50.
Fischhoff, B., Slovic, P., Lichtenstein, S., Read, S., & Combs, B. (2000) [1978]. How safe is safe enough? A psychometric study of attitudes toward technological risks and benefits. In P. Slovic (Ed.), //The Perception of Risk// (pp. 80-103). London/Sterling, VA: Earthscan Publications.
Fleck, L. (1981). //Genesis and Development of a Scientific Fact//. Chicago: University of Chicago Press.
Frickel, S., & Vincent, M. B. (2007). Hurricane Katrina, contamination, and the unintended organization of ignorance. //Technology in Society//, 29(2), 181-188. http://doi.org/10.1016/j.techsoc.2007.01.007
Frickel, S., Gibbon, S., Howard, J., Ottinger, G., & Hess, D. (2009). Undone science: charting social movement and civil society challenges to research agenda setting. //Science, Technology & Human Values//.
Giddens, A. (1990). //The Consequences of Modernity//. Cambridge: Polity Press.
Gross, M. (2010). //Ignorance and Surprise: Science, Society, and Ecological Design//. Cambridge, MA: MIT Press.
Haraway, D. J. (1997). //Modest_Witness@Second_Millennium. FemaleMan_Meets_OncoMouse: Feminism and Technoscience//. Psychology Press.
Kuhn, T. S. (1962). //The Structure of Scientific Revolutions//. Chicago: University of Chicago Press.
Luhmann, N. (1993). //Risk: A Sociological Theory//. New York: Aldine de Gruyter.
Lupton, D. (1999). //Risk// (Key Ideas). London and New York: Routledge.
Magnus, D. (2008). Risk Management versus the Precautionary Principle: Agnotology as a Strategy in the Debate over Genetically Engineered Organisms. In R. Proctor & L. Schiebinger (Eds.), //Agnotology: The Making and Unmaking of Ignorance// (pp. 250-265). Stanford: Stanford University Press.
McGoey, L. (2012). The Logic of Strategic Ignorance. //British Journal of Sociology//, 63(3), 533-576.
Michaels, D. (2008). Manufactured Uncertainty: Contested Science and the Protection of the Public's Health and Environment. In R. Proctor & L. Schiebinger (Eds.), //Agnotology: The Making and Unmaking of Ignorance//. Stanford: Stanford University Press.
Moore, P. G. (1983). //The Business of Risk//. Cambridge: Cambridge University Press.
Moore, W. E., & Tumin, M. M. (1949). Some social functions of ignorance. //American Sociological Review//, 14(6), 787-795.
Oreskes, N., & Conway, E. M. (2011). //Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming//. Bloomsbury Publishing USA.
Perrow, C. (2011). //Normal Accidents: Living with High Risk Technologies//. Princeton: Princeton University Press.
Proctor, R. N., & Schiebinger, L. (Eds.). (2008). //Agnotology: The Making and Unmaking of Ignorance//. Stanford, CA: Stanford University Press.
Rayner, S. (1992). Cultural theory and risk analysis. In S. Krimsky & D. Golding (Eds.), //Social Theories of Risk// (pp. 83-115). Westport: Praeger.
Schiebinger, L. (1986). Skeletons in the closet: The first illustrations of the female skeleton in eighteenth-century anatomy. //Representations//, (14), 42-82.
Slovic, P. (1992). Perception of risk: reflections on the psychometric paradigm. In S. Krimsky & D. Golding (Eds.), //Social Theories of Risk// (pp. 117-152). Westport: Praeger.
Starr, C. (1969). Social benefit versus technological risk. //Science//, 165, 1232-1238.
Sullivan, S., & Tuana, N. (Eds.). (2007). //Race and Epistemologies of Ignorance//. Albany: State University of New York Press.
Tuana, N. (2006). The speculum of ignorance: The women's health movement and epistemologies of ignorance. //Hypatia//, 21(3), 1-19.
Vaughan, D. (1990). Autonomy, Interdependence, and Social Control: NASA and the Space Shuttle Challenger. //Administrative Science Quarterly//, 35(2), 225-257. http://doi.org/10.2307/2393390


[1] Alluding to Peter Moore, who wrote “The Business of Risk”, and who was very much himself in this business: “The three principal messages conveyed [in his book] are: first, risk arises in some form or other in virtually all fields of endeavor; second, it is important neither to ignore risk nor to be frightened by it; third, systematic methods to assess and handle risks can be developed” (Moore, 1983, ix).
[2] Overview on ignorance in STS.
[3] Lupton, quoting Castel: “grandiose technocratic rationalizing dream of absolute control of the accidental, understood as the irruption of the unpredictable ... a vast hygienist utopia plays on the alternate registers of fear and security, inducing a delirium of rationality, an absolute reign of calculative reason and a no less absolute prerogative of its agents, planners and technocrats, administrators of happiness for a life to which nothing happens.”
[4] As in: newly producing our understanding thereof.
[5] Fate, as in danger striking at the will of a deity that has written all past, present and future.
[6] As in: the governing of ignorance, but also governing through ignorance.
[7] Reassembling the literatures under discussion so as to make a larger point for thinking through risk assessment, attending to the central role that ignorances in their multiple meanings play. Re-viewing, in the sense of viewing differently/newly in looking back.
[8] SSI, the Sociology of Scientific Ignorance: Holstein (1993, 2009), Wehling (2001), Gross (2007, 2010), Frickel et al. (2010), Hess (2009, 2010, 2011), Kempner et al. (2011), and Gaudet (2013). Related: ‘science-based ignorance’ (Ravetz, 1990s), ‘specified ignorance’ (Merton, 1987), ‘ignorance mobilization’ (Gaudet, 2013), ‘forbidden knowledge’ (Kempner et al., 2011), ‘(scientific) cultures of non-knowledge’ (Böschen et al., 2006, 2010), and ‘undone science’ (Hess, 2010; Frickel et al., 2010).
[9] Pointing – amongst other things – to their problematic functionalist language, which clearly needs reading in reflection of the time of writing.
[10] I am writing this as most of my working through the ignorance literature under discussion here was accomplished in fruitful conversations with Laura Rabinow, but also Kim Fortun and of course our colleagues in class.
[11] “Kempner et al. (2011), including a review of the contested conceptualizations of ‘forbidden knowledge’. In a typology and dynamic model proposed by Gross (2010:68), forbidden knowledge is under ‘negative knowledge’. In a typology and dynamic model proposed by Gaudet (2013), it is under ‘latent non-knowledge’.”