industry on scientific and technological controversies

(for example Dorothy Nelkin, Controversy: Politics

of Technical Decisions, 1979; Nelkin, The Atom Besieged,

1982; and H. Tristram Engelhardt and Arthur L.

Caplan, Scientific Controversies, 1987). Such work

exposes the divergent theoretical assumptions,

rival experimental designs, and contrary evidentiary

interpretations, at the same time displaying

the communally developed procedures for reaching

closure on debate to restore continuity and

consensus (Harry M. Collins and Trevor Pinch,

The Golem: What Everyone Should Know About Science,

1993, 1998; The Golem at Large: What Everyone Should

Know About Technology, 1998). Other lines of research

in this general rubric focus on science

institutions and funding; science education and

public understandings of science; and technological

innovation, planning, and assessment.

Closely related are studies of the role of science

and science advising in government (for example

Chandra Mukerji, A Fragile Power, 1989; Sheila

Jasanoff, The Fifth Branch, 1990) and the role of scientific

evidence in law (such as Roger Smith and

Brian Wynne, Expert Evidence: Interpreting Science in

the Law, 1989; Sheila Jasanoff, Science at the Bar,

1995; and Simon Cole, Suspect Identities, 2001). Using

science to make policy, law, and property constitutes

a thick strand of STS scholarship (see law and

society). Since the 1980s, when American law

changed markedly, allowing the results of publicly

funded research to be patented and licensed, the

institutional and distributional issues associated

with technology licensing and transfer have been

the subject of extensive research. These topics were

present in the pre-STS work, primarily in political

science and policy studies. STS contributed a critical

dimension, revealing and unpacking the embedded,

often unreflective claims of scientific

expertise in law and elsewhere; at the same time,

research explores the ways in which such expert

authority is constructed and legitimated in and

through government policies and programs (for

instance in Brian Wynne, Risk Management and Hazardous

Wastes, 1987; or Stephen Hilgartner, Science

on Stage, 2000). STS scholars also study public and

private systems of risk analysis in such diverse

fields as weapons, environmental management,

and financial markets (for example, Donald MacKenzie,

Inventing Accuracy, 1990; MacKenzie, Mechanizing

Proof: Computing, Risk, and Trust, 2001; Hugh

Gusterson, People of the Bomb, 2004). Some, though not all,

of this research adopts a distinctly progressive,

democratic stance, worrying about the consequences

of concentrated expertise and public exclusion

from critical decisions and the public

responsibilities of science. This is an outgrowth of

movements such as Science for the People that

emerged as organized opposition to the American

war in Vietnam; the movement and publications

continue to this day in studies concerning such

issues as genetically modified foods, the explosion in

the use and marketing of pharmaceuticals, as well

as global warming and worldwide environmental

degradation, unplanned growth, resource depletion,

and inequality. Other works look at the

human–machine interface from the point of view

of instrument design as well as the role of technology,

for example computers, in human relations

and development (Sherry Turkle, The Second Self,

1984), while yet other research focuses on human

relations with animals or nature in general (for

example Donna Haraway, Primate Visions, 1990;

Bruno Latour, Politics of Nature, 2004). In essence,

this thread of STS scholarship marries in-depth

technical knowledge of particular scientific fields

or pieces of technology with examinations of the

public and private uses for business, management,

government, and interpersonal relations.

The second general rubric of STS research looks

more centrally at the production of science and

technology than at their appropriation, distribution,

regulation, and use. Beginning in the 1970s,

anthropologists and sociologists undertook closely

observed, ethnographic studies of laboratory practices,

processes of scientific discovery, and technological

invention. Subjecting scientists, and later

engineers in work groups, to the same scrutiny and

in-depth analysis of social organization, culture,

and epistemology that anthropologists had long

applied to small-scale, often pre-industrial societies

and human groups, STS researchers produced

rich descriptions of the unarticulated and

often tacit understandings that made science

and scientists. In this way, they demonstrated

that science is not a distinct realm of social action,

but is like other social settings, rife with conflict,

compromise, pragmatic adjustments, and power,

as well as taken-for-granted habits that make

social settings transparent and familiar to socially

competent members but alien and uninterpretable

to non-member outsiders (Bruno Latour and

Steve Woolgar, Laboratory Life, 1979; Sharon Traweek,

Beamtimes and Lifetimes, 1988; Karin Knorr-

Cetina, The Manufacture of Knowledge, 1981; Michael

Lynch, Art and Artifact in Laboratory Science, 1985;

Lynch, Scientific Practice and Ordinary Action, 1993;

Harry Collins, Changing Order, 1992; Collins,

Gravity’s Shadow: The Search for Gravitational Waves,

2004; Joseph Dumit, Picturing Personhood, 2004).

These studies built on and critiqued earlier research

in the history and sociology of science

that had identified functional, normative requisites

for scientific communities (Robert K. Merton,

The Sociology of Science, 1973) and the paradigmatic

development of scientific theories (Thomas

Samuel Kuhn, The Structure of Scientific Revolutions,

1962). While both Merton and Kuhn had described

the structures of normal science, for example dialectical

developments among theory, experimentation,

and career advancement, STS scholars

adopted insights from European critical theory to

pay particularly close attention to the cumulative

consequences of micro-transactions, discursive

strategies, and forms of representation, as they

produced a particular scientific fact or practice

(for example David Kaiser, Drawing Theories Apart,

2005). These same perspectives and research

methods were also adopted to study technological

innovation, engineers, and designers (Hugh Gusterson,

Nuclear Rites, 1996; Gary Lee Downey, The

Machine in Me: An Anthropologist Sits Among Computer

Engineers, 1998; Stefan Helmreich, Silicon Second

Nature: Culturing Artificial Life in a Digital World,

1998; Kathryn Henderson, On Line and On Paper,

1999; Trevor Pinch (with Frank Trocco), Analog

Days: The Invention and Impact of the Moog Synthesizer,

2002; David Mindell, Between Human and Machine:

Feedback, Control and Computing before Cybernetics,

2002). These closely observed studies of scientific

and engineering practice have led to extensive

research on processes of cognition and categorization

(G. C. Bowker and S. L. Star, Sorting Things Out:

Classification and Its Consequences, 1999). Important

work, such as Shapin and Schaffer (1985), showed

that the mechanical experiments of Robert Boyle

did not satisfy Thomas Hobbes’s criteria for philosophical

truth, and hence their work is a bridge

between the philosophy of science and the sociology

of knowledge.

These categories are organizing tools for identifying

the variation within science and technology

studies more than means for identifying the

information and analysis within any particular

text. Many studies can fit within both families of

scholarship, looking at the production of science

as well as its distribution, appropriation, and implications

for particular groups or classes (for

example, Londa Schiebinger, The Mind Has No Sex?

Women in the Origins of Modern Science, 1989; Has

Feminism Changed Science? 1999). Steven Epstein,

for example, described the ways in which gay

rights activists became expert analysts of the existing

medical knowledge concerning AIDS when the

epidemic first took hold and eventually became

co-producers of new knowledge, especially treatment

protocols in drug trials (Epstein, Impure Science:

AIDS, Activism, and the Politics of Knowledge,

1998). The research of Emily Martin and Anne

Fausto-Sterling responded to critiques of both

the science and pseudo-science of gender and

reproductive medicine while exploring both the

production and appropriation of scientific knowledge

(Martin, The Woman in the Body, 1992; Anne

Fausto-Sterling, Myths of Gender, 1992; Fausto-Sterling,

Sexing the Body, 2000). The scholarly work on

reproductive medicine and technology, like the

work on AIDS (Acquired Immunodeficiency Syndrome),

followed upon grass-roots activism that

exposed the limitations, and often ideological

or biased assumptions, of the then conventional

science in these areas. Similarly, Troy Duster has

shown how biological research can be inadvertently

used to feed racist policies, and how tacit

assumptions can feed a research agenda (Duster,

Backdoor to Eugenics, 1990, 2003). SUSAN SILBEY

scientific management

This is the theory of management advanced by

Frederick Winslow Taylor (1856–1915) in his book

The Principles of Scientific Management (1911). This

approach to the rationalization of the production

process, as well as its modern-day successors, is

often called Taylorism. Taylor proposed systematic

management as a method for achieving national

prosperity based on productivity growth.

The economic-organizational context of the development

of scientific management is the rise of the

large firm, large-scale mechanized production,

and the shift from control by ownership to control

by the professionalizing class of managers in the

late nineteenth- and twentieth-century industrial

division of labor. Within this context the notion

of scientific management represented by Taylor

was not so much a complete innovation as it was

a bringing-together and popularizing of a number

of existing trends and new practices, including

new methods of accounting, the employment

of less-skilled workers (particularly in the United

States), and moves to rationalize and formalize

management methods. Unlike sociological paradigms

that assume (group or individual-level)

conflict, scientific management takes the true interests

of employers and employees as identical.

From this point of view, industrial conflict and its

solution appear as a matter of proper organization

through the application of scientific methods to

work organization. Taylor’s approach promised

higher profits, long-term beneficial organizational

development, and higher wages (based on the

then fairly novel idea of systematic incentive

pay) as the outcomes of efficient cooperation.

In scientific management, automation is not

limited to machinery but is extended to workers’

behavior through extensive standardization. Taylor’s

system rests on close observation and control

of the labor process. Applying time-and-motion

studies to industrial organization (initially in

one plant in 1881, where he developed his main

approach), Taylor studied and proposed the breakdown

of tasks into the smallest possible units of

movement that could then be made quickly and

repetitively. To obtain optimum output levels, the

“one best method” identified in this way was to be

applied by the person best suited to the task:

Taylor’s protagonist, a worker named Schmidt,

showed a high work-rate and obedient following

of procedure, and for Taylor represented the best

traits that a worker should possess. The deskilling

and loss of autonomy implied by this procedure

had the greatest effect on a specific class of

workers, that of skilled craftsmen.

Taylor’s book also implicitly presents a theory of

leadership; it asserts the role of the new professionalizing

group of managers, so the theory is

also a legitimization project for the managerial

class. Severe criticisms of the asocial view of

humans, specifically non-managerial employees,

in work organizations have been raised throughout

the twentieth century from a range of social

science disciplines, including from management

scholars who have deemed Taylorism morally indefensible.

Nevertheless, contemporary management

theorists broadly regard Taylor’s work as a

classic of management theory and consultancy.

Taylor himself lectured and consulted widely

with managers and academics and was responsible

for the early spread and uptake of a range

of methods of scientific management. ANN VOGEL


sect

– see church–sect typology.


secularization

This term is conceptualized differently by different

scholars but for the most part refers to the

constellation of historical and modern social processes

that allegedly bring about the declining

significance of religion in social institutions,

public culture, and individual lives. The secularization

thesis has its roots in the classical theorizing

of both Max Weber and Émile Durkheim. Most

notably, Weber argued that the increased rationalization

of society – bureaucratization, scientific

and technical progress, and the expanding pervasiveness

of instrumental reason in all domains of

everyday life – would substantively attenuate the

scope of religion, both through the increased specialization

of institutional spheres (of family,

economy, law, politics) and as a result of disenchantment

in the face of competing rationalized

value spheres. Durkheim, although a strong proponent

of the centrality of the sacred to society,

nonetheless predicted that the integrative functions

performed by church religion in traditional

societies would increasingly be displaced in modern

societies by the emergence of differentiated professional

and scientific membership communities

(see sacred and profane dichotomy). The secularization

thesis, especially its Weberian understanding,

was highly influential in the paradigm of social

change articulated by modernization theorists in

the 1960s. These theorists argued that among the

inevitable and linear societal processes associated

with modernization – including urbanization, industrialization,

the expansion of education and

mass communication (see mass media and communications),

and the increased autonomy of law and

politics from traditional authority – religion would

no longer have the authority that it allegedly commanded

in traditional societies; it would become

socially invisible (Thomas Luckmann) and lose

plausibility (Peter L. Berger).

The modernization–secularization thesis was

widely accepted by western sociologists and,

though there were some exceptions (for example

A. Greeley, Unsecular Man, 1972), many assumed a

priori that religion had lost its significance in

modern societies; whatever empirical evidence

suggested otherwise was largely a vestige of a

cultural lag that would soon disappear. Various

societal factors, such as the increased public visibility

of religious social movements in the United

States, Iran, and Poland and intradisciplinary

theoretical challenges to modernization theory,

converged in the late 1970s and resulted in more

complex and nuanced approaches to the study of

secularization. Sociologists have been particularly

vigorous in debating its meaning and measurement

and investigating evidence for and

against various indicators of secularization. The

application of rational choice theory to religion

has resulted in an intense debate about the ways

in which competitive religious environments

(religious economies) produce religious vitality

and church growth. This paradigm rejects the

assumptions of secularization theory as being

more appropriate for the historically monopolized

religious markets found in Europe but at

odds with the American context of religious pluralism

(R. S. Warner, “Work in Progress Toward a

New Paradigm for the Sociological Study of Religion

in the United States,” 1993, American Journal

of Sociology). Philip S. Gorski argues that credible

empirical claims for either secularization or

religious vitality need to be grounded in longer

historical and broader geographical perspectives

in assessing changes in religion over time

(“Historicizing the Secularization Debate,” in

M. Dillon [ed.], Handbook of the Sociology of Religion, 2003).


Secularization today, then, should be understood

in terms of a balance between extensive empirical

evidence in favor of the continuing sociological

significance of religion in the public domain and

in individual lives, and the coexistence of these

trends with equally valid empirical evidence indicating

selectivity in the acceptance of religion’s

theological, moral, and political authority. Both

sets of trends must necessarily be interpreted

with a cautious and differentiated understanding

of the nature and place of religion in earlier sociohistorical

contexts, and with greater attentiveness

to how the contextual meanings of religion and

religious commitment change over time. Given

the importance of religion, especially in political

life, some sociologists, such as Berger, have

argued against the secularization thesis that contemporary

societies are going through a process of

“resacralization.” MICHELE DILLON

Sedgwick, Eve Kosofsky (1950– )

A theorist who has made a major contribution to

the understanding of the theoretical, conceptual,

and emotional scaffolding of modern sexual sensibilities,

Sedgwick was educated at undergraduate

level at Cornell University and received her PhD

from Yale University. Working from within the

discipline of literary studies, Sedgwick’s early

work Between Men: English Literature and Male Homosocial

Desire (1985) establishes the complexity of

men’s relationships in the Victorian era and focuses

on the structured limits of homosocial relationships

and homosexual desires. For example,

Sedgwick highlights how male-to-male desire

became channeled into love triangles in which men

compete for the love of a woman. Sedgwick argues that increasing

social circumscription of the expression

of desire led to a reshaping of gender and sexual

relations, something that is explored in greater

depth in Epistemology of the Closet (1990). In this

book, Sedgwick delivers an analysis of how the

male homosexual–heterosexual binary has operated

to authorize the possibilities of sexual identities.

Central to this binary is a tension between a

“minoritizing view” in which gay identity is part

of an identifiable minority and a “universalizing

view” in which same-sex desire is inherent in all

men. Thus the “closet” is emblematic of gay identity

operating to “haunt” and police an already

fractured contemporary male heterosexuality. Deconstruction

of the binaries that fail to hold

sexual and gendered categories is explored further

in her collection of essays, Tendencies (1994).

More recently, her autobiographical work A Dialogue

on Love (1999) is a more personal exploration

of the tensions and incoherency of gender/sexual

categories in her own everyday life and her hidden

desires and fantasies.



segregation

This is a naturally occurring practice whereby

groups or social classes in a dominant position

enforce the social isolation of groups or classes

stigmatized as inferior. The isolation is imposed

to limit or deny access to basic social needs, including

housing, jobs, voting, public accommodations,

or the right to travel outside the segregated

region or ghetto. The stigmas are arbitrarily

assigned with reference to social differences such as

race, gender, sexual orientation, class, ethnicity,

or politics, or position in the world-system. Segregation

is ubiquitous, occurring in virtually any

social arrangement where there are identifiable

marks of differences that can be used by those in

power to enforce their social, economic, or political

advantages or to limit their contact with the

disadvantaged; such marks include physical or

mental disability, age, and body weight. The practice

is commonly de facto (for example social

avoidance of persons with discrediting physical

deformities), but more often is de jure (for instance,

codes excluding children from neighborhoods

reserved for the elderly).

The structural cause of segregation is the scarcity

of social goods – for instance, income, social

status, political power. The social goods may be

material needs necessary to survival (for example

water, food, shelter, health care) or immaterial

desires (for example, status, respect, recognition,

or club memberships). There is no known enduring

social structure without scarcities of one or

many kinds. Segregation is, therefore, a social

means for organizing social inequality which, in

academic sociology, is commonly referred to as

the stratification system. In the modern era, segregation

was thought to be a practice internal to

discrete nation-states, but as globalization became

more salient, sociologists have come to understand

segregation as a global practice (e.g., the economic
