part of a rationalization or secularization process,

whereby attention is focused on the activities

of bureaucratic organizations and so-called

experts. For Herbert Marcuse, in such works as

One-Dimensional Man (1964), and for other critical

social theorists, technology was characterized as

the dominant form of rationality in society.

For still others, technological change is seen as

an autonomous process in its own right,

according to a technocratic story-line by which

the key actors are engineers and other human

embodiments of materiality. In recent years,

this position has been made popular in social

constructionism, for example by Wiebe Bijker

et al. in The Social Construction of Technological

Systems (1987). For the majority of social theorists,

however, technology is generally discussed in an

abstract or conceptual way, as principles of production

on the one hand, and procedures of

organization on the other.

For more empirically minded sociologists, technology

is a term that is usually subjected to qualification

or specification. Indeed, the notion of an

abstract, all-encompassing technological system

or technological rationality is viewed with suspicion,

or at least with considerable skepticism.

In many varieties of empirical sociological research,

it is rejected for what is often considered

to be its underlying technological determinism.

Instead, technology is seen as something that is

shaped by people in particular social settings or

contexts.

What is typically of interest are the ways in

which material artifacts, that is, technologies in

the plural, are produced by particular actors and

social groups, or the ways in which they are used

in various locales or arenas of social interaction.

Rather than discuss general, abstract relations

between technology and society, the dominant

tendency in recent decades has been to

differentiate among technologies, and study particular

cases, in relation either to the various societal

sectors or branches of industry or to the

variegated sites or spaces of use and application.

Most empirical sociologists of technology emphasize

the importance of local contingencies, or

contextual factors, in understanding what is

characteristically referred to as the social shaping,

or construction, of technology. Technologies,

whether they be specific artifacts or more comprehensive

systems or clusters of artifacts, are seen to

be materializations of the interests of particular

groups of people. Particularly influential has been

the so-called actor-network theory, which has been

associated with Michel Callon and Bruno Latour in

France and John Law in Britain, and the related

social construction of technology, or SCOT, program,

which has been promulgated by Wiebe Bijker

and Trevor Pinch. According to these research

approaches, technological development is investigated

as specific processes of mediation and representation,

in which even nonhuman objects can

become agents or actors.

Another influential stream of empirical sociology

has focused on user sites, or places in which

specific technologies are put to use, often homes

or offices. In these approaches, it is the domestication

or appropriation of technology that is of

interest, how artifacts are made to fit into patterns

of everyday life or organizational routines and

habits. Much of this sociology of technology has

been carried out in “transdisciplinary” settings, in

centers or institutes of science and technology

studies, cultural studies, or women’s studies.

As elsewhere in the social sciences, there is a

noticeable gap between the large number of

micro-level case studies, which have proliferated

in recent years, and the more overarching theories

at the macro-level that have been associated

with the classical writers of the nineteenth and

early twentieth centuries. In relation to technology,

the micro–macro issue has been exacerbated by

distinct national differences regarding the ways in

which sociology of technology has been funded and

institutionalized. Micro-level research has often

been part of programs funded externally, whether

by companies, by national and local governments,

or by international organizations.

There have been some attempts to help fill the

gap by drawing on the kinds of institutional or

organizational theories that have been popular

in other fields of sociology. There has also developed

a certain interest in the investigation of

social movements that have either fostered technological

developments or opposed them, such as

environmental and anti-nuclear movements. It is

to be hoped that in the future the gap can continue

to be bridged between the disparate case

studies on the construction and use of specific

technological artifacts and the broader understanding

of the role that technology plays in the

contemporary world.

ANDREW JAMISON

terrorism

Despite renewed efforts by official organizations

and academic scholarship to define terrorism in

the aftermath of September 11, 2001, there does

not yet exist a single, consensual, widely shared

definition. As a term of political discourse, terrorism

usually implies a value judgment equivalent

to moral condemnation. Although terrorism can

apply to state (state terrorism) as well as non-state

actors – which can act either on their own or in

connection to a state (state-sponsored terrorism) –

in the current international climate this term

habitually refers to the activities of non-state transnational

actors. As a concept, terrorism is usually

subject to important historical reinterpretations

(for example by the winners, in the context of

liberation struggles). As a concrete phenomenon,

it also presents itself in a variety of forms, and it

involves a wide range of social behaviors. At one

extreme, terrorism merges into organized crime,

or even psychopathic behavior by an individual

(for example the Shoe-bomber in the United

States) or a group of individuals (for example the

Aum Shinrikyo movement in Japan). At the other

extreme, it becomes indistinguishable from guerrilla

warfare and other forms of low-intensity

conflict. For analytical purposes, it is useful to

distinguish at least three main approaches to defining

terrorism. The first focuses on the intentions

of the agents perpetrating it; the second

defines it in relation to the values and institutions

of the society that it targets; and the third views

it as a technique of war or direct action.

The first approach, which looks at the intentions

of the agents, points to the historical origins

of the word terrorism. This term initially referred

to the period of the Terror (1793–4) during the

French Revolution, when terrorism was a state

policy designed to terrorize the enemies of the

French Republic, be they domestic or foreign, individuals

or collectivities. Although short-lived,

this conceptualization of terrorism as a necessary

evil to achieve the greater good of the nation

has had numerous followers. In particular, it was

reactivated in the nineteenth century by

of what became popularly known as

“propaganda by deed.” Of particular note were

the Russian revolutionaries of Narodnaya Volya

(1878–81) and the well-publicized exploits of

French anarchists such as François Koenigstein,

also known as Ravachol (1859–92), in the 1890s.

Unlike the earlier uses of terror, however, the

activities of these revolutionary and anarchist

movements were not targeting the population at

large but rather the political elite. In the early

twentieth century, and again during the period

of decolonization, such tactics became commonplace

amongst many nationalist movements fighting

European imperialism. Although terrorizing

the enemy was clearly a significant part of their

political and military strategy, the notion of terrorism

itself became charged with negative connotations

and few used it explicitly in their discourse.

Notable exceptions include the advocacy of “The Philosophy

of the Bomb” and the explosion of a bomb

under Viceroy Irwin’s special train in 1929 by the

Indian nationalists of the Hindustan Socialist Republican

Association, and the apology for political

violence by Third-Worldist thinkers such as Frantz

Fanon during the Algerian war of decolonization.

Generally, however, political players began to replace

the words terrorism and terrorist with

positively laden terms such as liberation struggle

and freedom fighter.

The second approach focuses on the actual or

potential victims of terrorism. From this perspective,

terrorism is the act of harming or planning to

harm innocent persons (usually defined as civilians)

in order to put pressure on the political elite

and force it to alter its policies. To make sense of

this perspective, it is crucial to be able to distinguish

between those social actors who can make

credible claims of exercising legitimate violence

and those who cannot. (It is also important to

determine procedurally whether those actors

whose claims to legitimacy are well substantiated

do not delegitimize themselves through their own

use of violence.) One can define this legitimate

violence from two main perspectives, legal and

moral. In the contemporary period, states have a

legal monopoly of violence over a territory – even

though they can be accused of terrorism if they

break existing rules and conventions – and non-state

actors do not. From a moral perspective,

terrorism is the epitome of illegitimate violence,

destroying not only life and property but also

breaking the norms and values of a given

social order. After September 11, 2001, the latter

became a prominent reading of terrorism and it

was popularized by the “War against Terror”

launched by the United States. From this perspective,

states as well as non-state actors can therefore

be terrorists, but the notion of terrorism becomes

harder to encapsulate as different parts of society

may have different perceptions of what level of

violence is justified, against whom (for example,

is it acceptable to terrorize terrorists?), and so on.

The third and final approach to terrorism offers a

systematic and descriptive account of the means

and techniques deployed in the production of acts

of terrorism. This perspective is most common in

the field of military science and other security-related

disciplines. What characterizes terrorism

from the point of view of strategy and tactics is

that it is a weapon employed by a militarily

weaker party against a stronger enemy. As a

weapon of war, it is usually used by a few against

the many. Because of their military inferiority, terrorists

must create an impact on their enemy that

far exceeds the actual targets that they are able to

destroy (for example the Twin Towers versus the

United States). Terrorist acts have therefore an

exaggerated impact on how states and societies

behave. At an immediate level, such impact on

popular consciousness is created by the media

coverage of these acts. More fundamentally, however,

this outcome is a consequence of the random

nature of these acts. By multiplying the number of

targets that are deemed legitimate, terrorists not

only force their opponents into a security race to

guarantee the protection of all potential targets,

but also induce the fear of attack in all corners of

society, as total protection is unrealistic. One

recent development in the field of terrorism that

has the potential to contradict these observations

is the emergence of so-called super-terrorism, particularly

in the guise of nuclear terrorism.

Through such technological developments it is

conceivable that, in the future, terrorist organizations

would be able to pose a serious direct military

threat to states and populations.

FREDERIC VOLPI

text/textuality

What counts as a text is a matter of considerable

debate in the social sciences. Since the impact of

poststructuralism and cultural studies in the

1960s and 1970s, texts can no longer be assumed

to refer simply to books. Television programs,

recorded music, magazines, films, soap operas,

and comics have all been studied as having the

properties of texts. The study of text and textuality

is bound up with concerns about the nature of

meaning and discourse. Rather than assuming

that meaning is stable, textual analysis has sought

to demonstrate the extent to which texts can be

the sites of conflict. At their most basic, texts are

simply assemblages of discourse that are combined

to produce a dominant meaning.

These meanings are usually thought to serve the

interests of certain sections of society and can

thereby serve to reinforce power relationships.

Such texts are mainly produced by dominant

media and cultural industries and employ identifiable

social and cultural conventions to make

themselves understandable to modern audiences.

However, many scholars working in these fields

now like to emphasize the ways in which texts can

have more than one meaning. In this sense, texts

might be ambiguous, internally contradictory, or

be read in novel ways by certain sections of the

audience/consumers. Texts are thus often

thought to have a polysemic potential: the

meaning of a text

depends upon the context of interpretation and

the social location of the interpreter.

Intertextuality builds upon the idea of texts

having meanings that are determined through

their relationship to other texts. In this respect,

we can broadly say that intertextuality has a horizontal

and a vertical dimension. The most important

aspect of horizontal intertextuality is that of

genre. Genre establishes a number of categories

that organize cultural production into identifiable

types. For example, television employs a

number of generic categories that include the

news, soap operas, quiz and game shows, reality

television, and so on. Vertical intertextuality, on

the other hand, concerns the relationship of any

text to other texts. This could include deliberate

attempts to reference other works of culture

within the text or the relationship between the

text and cultural commentary or publicity material.

Studying texts with this understanding requires

the interpreter to look at the ways that

meanings circulate between texts and other

aspects of cultural experience into which they

leak. In this respect, much recent work has looked

at the way in which audiences often seek to produce

their own texts as a response to popular or

other texts. This level of intertextuality has led to

the study of fanzines, websites, letters to newspapers,

or even conversations in relation to a

range of cultural material. It should be made clear

that intertextuality is not a structureless pluralism,

but usually involves questions of cultural

power that inevitably structure the ways in which

texts are understood. Cultural industries in capitalist

societies may not be able to control the ways

in which their products are understood, but wider

networks of power and influence will both favor

certain meanings over others and concentrate the

distribution of some texts rather than others.

NICK STEVENSON

thick description

– see ethnography.

Third Way politics

The notion of Third Way politics derives from the

writings of Anthony Giddens since the

mid-1990s (for example The Third Way,

1998, and Progressive Manifesto, 2003). The Third

Way is an irenic ideology, which is to say that it

postulates a set of principles and policies but does

so without polemical intent. The Third Way is

aptly named on philosophical grounds insofar as

it attempts to strike a balance between liberty and

equality, values that are antithetical when pushed

to extremes. Its policies pivot on the notion that

both increasingly global capitalist markets and

the welfare state are unavoidable institutions in

our time. Here again the Third Way tries to strike

an issue-by-issue balance, in this case between

laissez-faire and socialist policies.

The irenic tendency towards synthesizing extreme

positions is a hallmark of Giddens’s

thought at large. For example, he sees capitalism

and other modern institutional orders as replete

with risks that simultaneously present possibilities

and dangers for institutions and individuals

alike. One task of state policies from the Third

Way perspective is to increase the odds of positive

outcomes from risks while creating programs to

provide a measure of protection against the worst

consequences, should there be negative outcomes.

Along another line, Giddens has been proposing

for many years that the global and the local are

not antithetical alternatives. Rather, what matters

are the ways in which local citizens and governments

reconcile the structuring power of global

processes with their local material needs and cultural

ways of life.

The Third Way entered public life in 1997 when

adopted by the British Labour Party government

headed by Tony Blair (1953– ). Subsequently, the

term was used by Gerhard Schröder (1944– ) and

other European politicians and is now part of

political debates in many countries around

the world. The term has been used by former

President Bill Clinton (1946– ) in the United States,

although it is not prevalent in American politics

at large. Historically, Third Way perspectives

renew the spirit of centrist politics that was

influential in western Europe and the United

States following World War II. Like all centrist

politics, the Third Way includes an emphasis on

pragmatic possibilities that does not normally

evoke passionate enthusiasm. While commitments

to equality and liberty keep Third Way

policies oriented to moral ends, its supporters

must be watchful that they do not become so

mired in short-term pragmatics that they

lose sight of their moral goals. Third Way politics

has long needed a philosophical redefinition of

what equality and liberty can and should mean

in our time. In addition, Third Way politics lacks

any well-articulated conception of social justice.

Nor has it developed many policies regarding relations

between state and civil society. Giddens’s

forthcoming writings on equality may begin to

fill in some of these gaps. But the success of the

Third Way will be measured by whether it can

establish centrist policies that will receive sufficient

public support to keep ideological and political

challenges from the left and the right at bay.

IRA COHEN

Thomas, William I. (1863–1947)

Born in Russell County, Virginia, the son of a

Methodist preacher and farmer, Thomas enrolled

in the University of Tennessee in 1880 and eventually

became Professor of English at Oberlin College,

but went on to become a graduate student in the

first American Department of Sociology at the

University of Chicago. Thomas was promoted to

Associate Professor in 1900 and to a full professorship

in 1910.

Thomas has been influential in modern sociology

for two reasons. First, he developed ethnography as

a special branch of qualitative methodology in his

famous study, with Florian Znaniecki (1882–1958),

of The Polish Peasant in Europe and America, which was

published in five volumes between 1918 and 1920.

In 1908 Helen Culver, the benefactor of Hull House, had

offered Thomas $50,000 to study problems of migration

to the United States. Thomas decided to

concentrate his research efforts on the Polish community

in Chicago, but he also made research visits

to Poland, where he met Znaniecki in 1913, to

collect empirical data. Thomas analyzed 754 letters

that had been sent to Polish families in Chicago,

8,000 documents from the archives of a Polish

newspaper, documents from Polish parish histories

in Chicago, and the diaries of Polish

immigrants.

Thomas left Chicago after a scandal had sullied

his reputation with the local establishment, and

he subsequently taught at the New School for Social

Research and, in 1936–7, was appointed

to a lectureship at Harvard. At the age of seventy-two

he married his second wife, Dorothy Swaine Thomas,

who later became the first female President of the

American Sociological Association.

Second, Thomas is famous for the development

of the so-called Thomas Theorem or “the definition

of the situation” in The Unadjusted Girl

(1923) and, with Dorothy Swaine Thomas, The Child

in America (1928). This Theorem says that if people

define situations as real, they will be real in their

consequences. For example, if the residents of a

predominantly white neighborhood believe

that the presence of low-status migrants would

drive down house prices, they will try to resist

such an influx. Their subjective “definition of the

situation” (outsiders are bad for the housing

market) has objective consequences (falling house

prices and social exclusion of newcomers).

Thomas’s sociological research and publications

have had an enduring impact on the sociological

study of deviance and on ethnographic methods.

BRYAN S. TURNER

Tilly, Charles (1929– )
