December 01, 2019 |  RAFIQ TANTRAY

Society and Technological Risks

The study of risk, danger, and catastrophe is a special case of the larger field of social breakdown

 

If there is an organizing theme in social institutions and social relationships, it is social order: what it looks like, how to think about the various forms it takes, and how to explain it. Conversely, what happens when social order breaks down? What changes are wrought in how people see the world, and most important, what is altered in how they relate to one another when social order goes awry? The study of risk, danger, and catastrophe is a special case of the larger field of social breakdown. Social scientists have long been interested in phenomena that harm people and what people value. Until recently, most of this work concentrated on harm from natural events such as earthquakes, floods, and tornadoes, but many researchers now write about ‘technical’ or ‘technological’ risks.

In some ways the distinction between natural and technological risks or disasters is not helpful: There is no objective difference between a death caused by a fire and a death caused by an airplane crash. Yet in other ways those who have been fascinated by how modern technologies fail people have asked a broader set of questions than they could have if they did not see a difference between natural and technological risks. They have asked new questions about the functions of expertise and science in modern society, the roles of power and authority in the creation of danger, and the capacity of people to build systems they cannot control. Risk and danger are treated mainly as a sociological problem, but this is not necessarily the case. Scholars writing about these issues come from economics, geography, psychology, anthropology, and even engineering and physics. This is basically a good thing: Too much sociology is self-referential and inbred, and truly interdisciplinary work creates considerable intellectual, if not professional, excitement.

No one can write about technological risks in an interesting way without reading and thinking in interdisciplinary terms. Scholars concerned with technological risks have addressed a wide variety of topics that range from how individuals think about risks to how nation-states develop strategies to mitigate threats from failures of high technology. Some scholars even write about risks that might be faced by societies far in the future. Toxic threats have drawn particularly close scrutiny from scholars, and there are important sociological studies about nuclear waste and nuclear weapons. One reason for this is that toxic risks invert the way natural disasters do damage. Rather than assaulting people from the outside, as other calamities do, toxic hazards assault bodies from within. Toxic injuries also have no definable end, and so their victims can never know when they are safe from further damage. The point here is that the meaning of toxic threats is fundamentally different from that of natural disasters. The disruption in social order thus can be largely internal, taking the form of psychological and emotional suffering rather than the breakdown of external social systems.

In general, the sociology of risk is concerned with researching and explaining how interactions between technology and modes of social organization create hazards or the potential for hazards. A hazard can be an actual threat to people’s lives (toxic chemical contamination, for example) or the perception that there is a threat. Indeed, many analysts focus on risk perception: what people think is dangerous and why they think what they do. The word ‘technology’ refers to the social and mechanical tools people use to accomplish something, such as the design of a nuclear power plant and the vocabularies experts use when they talk about effectively evacuating an urban area after a major radiation release from such a plant. ‘Modes of social organization’ refers to both social structure (e.g., hierarchies of power) and culture (e.g., the degree of legitimacy granted to experts). In the twenty-first century, society will continue its march toward social and technical complexity. One expression of this complexity is a capacity to create machines and institutional arrangements that are at once grand and terrifying. With these developments, it seems, publics are increasingly aware of the potentially devastating consequences of system failures even as they enjoy the cornucopia engendered by modern social organization and modern technology. Much work in psychology and economics, echoing the concerns of political and economic elites, examines public perception of risk. Much of that work has shown that the general public does not make decisions in accordance with a hyperrational calculus in which its preferences and values are always consistent and, more to the point, agree with those of trained scientific researchers.

Consonant with the concern with public irrationality is the notion that people panic when faced with risks they do not understand. It is easy to find this idea in media reports of high-level politicians’ remarks after a large accident: Politicians worry that people will overreact in very unproductive ways. The image is one of people trampling each other to make for the exits of a burning building or to escape a sniper’s random rifle shots. Translated to the perception of risk regarding accidents and disasters, the image becomes one of individuals pursuing their self-interest to the exclusion of that of their neighbors and communities: to get out of a dangerous situation, to run away from a volcanic eruption, or to flee a burning airplane that has barely managed to land.

In fact, research indicates that people rarely panic even when it might be rational to do so. The U.S. firebombing of Tokyo in World War II did elicit some cases of panic, but with exceptions of that sort, it is hard to find widespread panic after any type of disaster. Even in events such as fires and stampedes, which are commonly thought of as examples of panic, the modal reaction is not panic but terror, followed by stunned reflection or sometimes anomie, and ending with a fairly orderly response (e.g., reconstruction or evacuation). Even in the horrors chronicled by the U.S. Strategic Bombing Survey, cities burn, bodies explode, houses fall down, and still people do not panic. One way to classify research on risk is in terms of micro and macro perspectives. Both micro and macro studies have made important contributions to an understanding of the connections between risk, technology, and society. Micro-level research, generally speaking, is concerned with the personal, political, and social dilemmas posed by technology and activities that threaten the quality of people’s lives.

Macro-level work on risk does not deny the importance of micro-oriented research but asks different questions and seeks answers to those questions at an institutional level of analysis. Macro work emphasizes the importance of the institutional context within which decisions about risk are made. Sociologists of risk are keen to distinguish between public and private decisions. Some people make choices that affect mainly themselves, while those in positions of authority make choices that have important implications for others. This is only one among many ways in which the sociology of risk is concerned with issues of power and the distribution of hazards and benefits.
