Computer and Information Science Department, New Jersey Institute of Technology, University Heights, Newark, NJ 07102.
E-mail: email@example.com Published as: Whitworth, B., Van de Walle, B. & Turoff, M., (2000), Beyond Rational Decision Making, Group Decision and Negotiation 2000, Scotland
We review in this paper the assumptions of rational decision making, and present business situations where those assumptions are not met. We then explore how managers deal with such situations. The result is an expanded model of group decision making which includes relational and group processes, generating personal trust and group agreement respectively. Managers use these non-rational processes to extend decision making into many areas otherwise beyond pure rational analysis. They expect to use all three processes in group decision making, and judge the usefulness of decision support software accordingly. Unfortunately for software designers, each process demands different data structures and communication environment properties. For example, anonymity improves rational analysis but not interpersonal relationships. Supporting all three overlapping processes in a flexible manner is the challenge facing groupware today. The "tragedy of the commons", where individuals compete for a limited common resource, illustrates a problem situation beyond simple rational analysis and occurring in conservation negotiations today. There seems to be no rational solution: each individual seems rationally driven to a course of action that destroys the common resource, to the detriment of all. The solution we propose is to focus on changing the decision making entity, by designing decision support systems to integrate agreement generation with protocols, structures and decision aids. How this could be done is considered. By including group and relational support, we hope to develop groupware that is used as widely as e-mail is now.
This paper begins with a simple question. Given what we know about rational decision making, plus the power of computer systems to implement defined methods, why are decision errors still common in organisational life? Considerable work exists on rational decision making, including utility theory, game theory and Bayesian analysis. We also know human judgement has defects (or biases) which decision modelling can overcome. Surely software can guide the decision maker through the steps of rational analysis to eliminate such biases? Managers could use computers to confidently calculate the best answer for each situation. Since such decision support software exists, the same question becomes why do managers rarely use it, instead usually preferring their own judgement? Though these software tools have been available for some time, their lack of use is still widespread. Actual management decision making, it seems, involves processes that do not remotely approximate the rational ideal, yet are surprisingly effective for all that. It seems formal analysis is not used because it does not match human social demands. No amount of improvement in analytic techniques will increase their use if valid social processes are ignored. To define these processes, this paper looks at the assumptions of rational decision making. We identify those that often fail in practical business situations, and consider how managers handle these situations. We suggest social processes are designed specifically to cope with these cases. Finally we consider how, by supporting social processes, computer decision support can begin to operate outside the boundary these assumptions imply.
The assumptions of rational decision making
Rational decision making operates where a decider in an environment must choose among alternative courses of action, each leading to a different expected environmental outcome, where some outcomes are preferred over others, based on the objectives of the decider. The decider must be able to predict significant properties of expected outcomes based upon some rational understanding of cause and effect in the given environment. We use ‘outcome’ to mean the particular environment state that follows a given choice action. In this situation, the elements of rational decision making can be considered to be:
Alternatives (X). A set of alternative courses of action, each leading to an expected actual outcome, e.g. to open a branch in Tokyo or not.
Criteria (C). A set of decision criteria that define desired values for significant properties of an outcome, e.g. that the branch will make a profit. These criteria derive from one or more objectives.
Model (M). A predictive logic which requires certain defined information about causal factors (A, B, C etc) to calculate significant properties of expected outcomes (given that information), e.g. profit and loss calculations.
Information (I). Information required by the predictive model (e.g. cost of operations in Tokyo, likelihood of sales, etc.).
Analysis (A). A means of analysing and evaluating the expected properties of the alternative outcomes against the decision criteria.
Decider (D). The decision entity.
Given information (I) regarding factors A, B, C etc, the decision maker (D) uses some predictive model (M) to calculate expected values for the outcome criteria (C) of the various alternative choices (X), and then analyses these by some method (A) to see which is most likely to produce the most desirable outcome. For example given the business set-up costs in Tokyo, plus estimates of likely sales, one can estimate the likely profit, and so decide whether or not to open a branch there. The decision maker is included because the other factors are not necessarily objective - perceptions about choices, methods, criteria and information may vary between individuals. Increasing the number of outcomes, choices or criteria, or using more complex methods to match criteria to expected outcomes, doesn’t change the basic procedure, as shown in Figure 1, where IA, IB, IC, … represent information about the factors (A, B, C, …) expected to affect the final outcomes. This information essentially describes the environment to which the decision applies.
Figure 1. Rational decision making

From simple beginnings the above model has become more complex but not changed its basic form. For example, early analyses assumed a fixed set of alternatives, a single criterion with linear acceptance values, a deterministic causal model, initially available information and a single defined solution. It has become apparent that in a dynamic decision making situation, the alternatives, information, and expected outcomes may change from moment to moment. Hence each of the above factors needs to be a function of time (t). Multi-criterion decision analysis now allows for multiple criteria, which may conflict. Bayesian analysis incorporates probabilistic rather than deterministic predictive statements. Finally it has been necessary to recognize the decider, as many aspects of the process are subjective estimates. The final set of decision elements is formally defined in Table 1, with the enhancements to traditional decision theory noted. The result of a rational analysis is a final decision, which is one of the action choices xf. We can describe this process as:
xf = R (X, C, M, I, A, D)
where R is the rational analysis. Note that rational analysis is not considered to be merely the operation of the analysis method, but includes setting criteria, defining the alternatives, gathering information, applying a predictive model, applying criterion analysis, and a subjective component. We exclude the decider’s purpose because although it exists (and defines the criteria), the analysis does not operate upon it. For a similar reason the environmental outcomes are not considered an element of rational analysis. We will now consider the assumptions of rational decision making in terms of these elements.
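As a rough illustration of the formula above, the sketch below instantiates R for the paper's Tokyo branch example. All function names, figures and the use of a single profit criterion are our own illustrative assumptions, not from the paper:

```python
# A minimal sketch of the rational analysis xf = R(X, C, M, I, A, D).
# Names and numbers are illustrative only.

# Alternatives (X): open a Tokyo branch or not.
alternatives = ["open_branch", "do_not_open"]

# Information (I): estimated causal factors for the Tokyo example.
info = {"setup_cost": 2.0, "expected_sales": 3.5, "running_cost": 1.0}

# Model (M): predicts the criterion value (profit) for each alternative.
def model(choice, i):
    if choice == "open_branch":
        return i["expected_sales"] - i["setup_cost"] - i["running_cost"]
    return 0.0  # no branch: no profit, no cost

# Criteria (C) and Analysis (A): prefer the highest predicted profit.
def analyse(alts, i):
    predicted = {x: model(x, i) for x in alts}
    return max(predicted, key=predicted.get), predicted

choice, predicted = analyse(alternatives, info)
print(choice, predicted)  # open_branch, since 3.5 - 2.0 - 1.0 = 0.5 > 0
```

Adding criteria, alternatives or time dependence elaborates the same skeleton without changing its basic form.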
Table 1. The elements of rational decision making

Alternatives (Xi .. Xj)(t): A set of i to j behavioural choices which may change with time (t). Each leads to an expected outcome. Enhancement: not necessarily a fixed set.
Criteria (Cm .. Cn)(t): A set of m to n evaluative criteria which may change with time (t). Each may be applied to each expected choice outcome. Enhancement: can have multiple criteria, which may contradict; not necessarily simple linear functions.
Model (M): A logic which allows prediction of criterion outcome values given information. Enhancement: prediction may be probabilistic; allows subjective estimates.
Information (IA .. IN)(t): Information relevant to the likely criterion outcomes of the choice set. Enhancement: may vary with time.
Analysis (A): Applies the criteria to the expected outcomes to generate a result. Enhancement: the choice of method may vary with situation.
Decider (D): As other factors are usually subjective, analysis may vary with decider state. Enhancement: recognises subjective element.
Examining these elements yields a list of assumptions. We will then decide which assumptions are significant, and how they may be dealt with.
Rational analysis requires the alternatives (Xi .. Xj) be given. If not, a phase of brainstorming alternative courses of action is required. Action choices which are not known cannot be evaluated. If there are unknown alternatives (Xz) with better outcomes, rational analysis may not produce the best alternative. In the real world there are always other possible courses of action, so this set can never be exhaustive or complete. However it needs to be given for the rational analysis to proceed. Problems where the choice set is not defined can be called undefined. Rational analysis assumes a problem that is defined.
Rational analysis generally assumes the analysis result itself (xf) is not one of the factors affecting the outcome (labelled A, B, C in Figure 1). If this were so, the action of making the decision based on various factors would be one of the factors used in deriving that decision. Such self-recursion is a feature of problems that resist logical analysis, involving as they do a potentially infinite regress. For example, suppose Microsoft bought shares in a new company, based on a rational analysis that the share price will go up. That Microsoft has bought shares in a company will itself influence its share price. Buyers would think "If Microsoft supports it, it must be good", causing the share price to rise. In this situation, the original analysis becomes almost irrelevant - Microsoft could buy any company and its share price would go up. The situation arises when the decider has a major effect on the decision environment. Looking at business decisions, it is not uncommon for managers to make decisions in environments they influence (to a greater or lesser degree), and mathematical feedback models may deal with this situation to some degree. This class of problems can be called recursive, because the decision affects itself. Rational analysis is not guaranteed for recursive problems.
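The feedback in the share-purchase example can be made concrete with a small sketch. The price model, the 20% influence figure and all names below are our illustrative assumptions:

```python
# Illustrative sketch of a recursive decision: the decision to buy
# feeds back into the share price the decision is based on.
def expected_price(base_price, we_buy, influence=0.2):
    # If a major player buys, others follow and the price rises.
    return base_price * (1 + influence) if we_buy else base_price

def decide(base_price, threshold):
    # Naive analysis ignoring feedback: buy if the price is expected
    # to exceed the threshold without our involvement.
    naive = expected_price(base_price, we_buy=False) > threshold
    # Analysis including feedback: our own purchase is a causal factor.
    recursive = expected_price(base_price, we_buy=True) > threshold
    return naive, recursive

print(decide(100, 110))  # (False, True): the decision changes its own outcome
```

When the two analyses disagree, as here, the decision is effectively deciding about itself, which is exactly what rational analysis assumes away.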
Rational decision making assumes the decider knows what they want, i.e. their purpose. Without criteria, there is no basis for preferring one outcome over another. Since what one thinks one wants and what one actually wants may differ, this assumption may fail more often than usually imagined. In the classic story, King Midas wished that everything he touched would turn to gold. After unwittingly turning his only daughter into a statue of gold, he offered his kingdom to undo the outcome he had earlier desired with all his heart. While an extreme case, it illustrates that outcome criteria may change. Criterion changes move the foundation of rational decision making, and defining what the real need is may be the greatest challenge decision makers face. One might imagine some rational estimate could be made of outcome criteria certainty. But where outcome criteria error exists, any estimate of outcome criteria certainty is equally likely to be in error, if it is made by the same entity. Indeed, the larger the actual outcome error, the lower the expected outcome error estimate is likely to be – supreme outcome confidence may be one of the signs of misdirection. Problems for which the outcome criteria are unclear can be called equivocal. Rational analysis fails for equivocal problems.
Rational analysis assumes a valid causal model that allows estimation of expected choice outcomes in criterion terms, given information about certain causal factors A, B, C, …. The model must define the set of information (IA .. IN) needed to predict outcomes. If we ask whether we know that this defined required information set is complete, the answer is we don't. There may be some highly relevant information easily available (IZ) which we are not seeking, because we don't know it is relevant. Since all models are simplified representations of reality, they are necessarily incomplete. Hence we can never guarantee the completeness of the set of required information. If someone were to say "I have information relevant to your decision", we would be foolish to reply "No thank you, I already have all the necessary information." Problems where we don't have a valid causal model that correctly defines the required information can be called misunderstood.
In addition certain complex problems, not that uncommon in a business setting, are of a nature that changing a single fact changes the whole decision. Such problems are sensitive, and can be revealed by “sensitivity analysis”. While insensitivity is not an assumption of rational analysis, sensitivity does magnify the effects of misunderstanding any situation.
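One simple way to reveal such sensitivity, in the spirit of the "sensitivity analysis" mentioned above, is to perturb each input and check whether the chosen alternative flips. The figures, names and the 10% perturbation below are our illustrative assumptions:

```python
# A minimal sensitivity analysis sketch: vary each input factor and
# check whether the chosen alternative changes.
def profit(sales, cost):
    return sales - cost

def choose(sales, cost):
    return "open" if profit(sales, cost) > 0 else "stay"

def sensitive_factors(sales, cost, delta=0.1):
    base = choose(sales, cost)
    flips = []
    # Perturb each factor by delta in the unfavourable direction.
    for name, s, c in [("sales", sales * (1 - delta), cost),
                       ("cost", sales, cost * (1 + delta))]:
        if choose(s, c) != base:
            flips.append(name)
    return base, flips

print(sensitive_factors(sales=10.5, cost=10.0))
# ('open', ['sales', 'cost']): a 10% change in either factor flips the decision
```

A decision where every factor appears in the flip list is sensitive in the sense described above: misunderstanding any single fact changes the whole decision.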
Rational analysis assumes required information is:
Obtainable
Correct
Gatherable without altering the decision situation
Descriptions of problem solving often list as an early step "gather relevant information". This assumes information is like corn, waiting to be harvested. In business, this is rarely so. Information must be skilfully pursued, or it will not be obtained. For example, try calling companies for information on their staff or strategies. Ask a simple question like who made the most sales last year, or what product lines are likely to be discontinued next year. Where one cannot get the desired information, the decider can be said to be uninformed, because required information cannot be found. If relevant information is left out, the analysis may be incorrect.
A further assumption is that available information is correct, or that its degree of correctness is known. If everyone were honest, the only source of error would be genuine mistakes. However in business (as in love and war), people may deliberately seek to misinform. The military calls this "counter-intelligence" - feeding false information to the enemy's information gathering units. Misinformation may also occur when subordinates respond to superiors who may judge them. For example, if a manager asks an employee "Any problems?", the answer is likely to be no. Only a naive manager would conclude this meant there were indeed no problems. Where the information provided is invalid, the decider can be said to be misinformed. In these cases the principle of "garbage in, garbage out" applies. The rational analysis is only as good as the information fed into it.
Finally it is assumed that the gathering of information itself does not significantly change the situation. If it does, the analysis can be invalidated. The problem arises where the gathering of information is a significant action. A simple example is looking at another person, who becomes aware of being observed and wants to know why. In looking at another we reveal our interest. Consider a company deciding whether or not to discontinue a product. One factor may be how many customers would be affected by the discontinuation. Now imagine asking them that question. Asking the question alerts customers that discontinuing the product is under consideration. Some customers may subsequently convert to another product. The company may be forced to discontinue the product from lack of customer support, making the original choice academic. Problems where people may judge our intent in gathering information, which then changes the situation significantly, may be called political. In these cases the operation of the rational process may alter the situation and invalidate itself.
Rational analysis assumes the elements of the method calculation do not change significantly during the time (t) taken in decision analysis. If they did, the analysis when completed might be out of date. One might suppose this is not a problem, given computers can make complex decisions faster than any human (e.g. the use of computers to plot courses for boats in the America's Cup). However this speed presumes computer access, and it sometimes takes a relatively long time to access a computer. In some situations, in the time it takes to move to and boot up a computer to analyse some information, the situation has changed. Such problems can be called mobile, and rational analysis assumes the problem is not moving faster than the analysis.
The most subtle assumption in rational decision making is that there is a unitary decision making entity. If this is not so, the decider is in conflict. In this case, the elements of rational analysis may become undefined. Suppose the decision entity is a group, not an individual - a common situation in organisations. Each individual could have different outcome criteria, action choices, prediction models, information and analysis methods. Which set is to be used for the analysis? Can we somehow average the decision entities of group members? Common sense tells us this is not likely to be successful. For example, in an early Delphi study on the future of ferro-alloys in the steel industry, three experts were asked to design a simple flow model of commodities through various U.S. iron and steel making processes. The model had 45 legs, each representing a flow of a given material. While steel is one of the most measured industries in the U.S., thirty of these were not measured in any data source. It was decided to send the model to participating industry experts, giving them the 15 measured quantities for the previous year, and asking for estimates of the missing thirty. This turned out to be a failure because twenty-five of the 45 experts decided to redraw the model diagram, as they did not agree it was representative of the industry. Without agreement on the model, the analysis failed. Clearly in business, agreement cannot be assumed regarding the foundations of any analysis. Rational analysis assumes a single entity doing the analysis, in order to get a single context within which the analysis can proceed. Does this mean rational analysis cannot operate with groups? If the group is in conflict over any of the elements of rational decision making, this seems to be the case. How can rational analysis be used to resolve conflicts over the assumptions of that analysis? For example, how could one purpose be logically chosen over another?
These problems are deep. Even so, in business it is common for groups of people with different purposes, information and causal models to meet and try to make what all assume will be a rational decision, where the "best" option is chosen. Only upon investigating what groups actually do has it become obvious that rational analysis of some task is at best only part of what groups do. The creation of a group choice from individual choices has been found to be more complex than the simple concept of majority view implies, as Arrow's impossibility theorem indicates. Polarisation studies, where groups make decisions more extreme than their members would, also show the transformation from member decisions to group decisions is not a simple, logical one. Computer-based studies show that group polarisation on an issue occurs just as well with or without the exchange of arguments and reasons, so long as group position information is exchanged. We must conclude the resolution of group conflict is unlikely to be based on a rational process. Situations where the decider is not unitary, and in conflict regarding analysis elements, can be said to be decider conflicted. Rational analysis may be undefined where the decider is conflicted.
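The difficulty of "averaging" group members' preferences can be illustrated by Condorcet's classic voting paradox, which underlies Arrow's impossibility theorem. The three rankings below are the standard textbook example, not data from the Delphi study:

```python
# Condorcet's paradox: three members each hold a perfectly rational
# ranking, yet pairwise majority voting over the group is cyclic, so
# no consistent "group ranking" exists.
rankings = [          # each member ranks options best-to-worst
    ["A", "B", "C"],
    ["B", "C", "A"],
    ["C", "A", "B"],
]

def majority_prefers(x, y):
    votes = sum(1 for r in rankings if r.index(x) < r.index(y))
    return votes > len(rankings) / 2

for x, y in [("A", "B"), ("B", "C"), ("C", "A")]:
    print(f"majority prefers {x} over {y}:", majority_prefers(x, y))
# All three lines print True: A beats B, B beats C, yet C beats A.
```

Each member is individually rational, but the "averaged" group decider is in conflict with itself, exactly the decider conflict described above.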
Beyond rational decision making
Table 2 lists the problems that may fall outside rational analysis. These are for a single decision – not for a sequence of connected decisions, as for example in software development projects. They are not mutually exclusive. Suffice it to say that for a conflicted group that is both uninformed and misinformed to try to apply rational analysis to an undefined, recursive, equivocal, mobile and political problem that is not properly understood is not recommended.
Table 2. Problems which fall beyond rational analysis

Undefined: The set of alternative action choices is not defined, or improperly defined.
Recursive: The decision is a factor in determining the outcome criteria values.
Equivocal: There is no defined set of criteria by which outcomes may be assessed.
Mobile: The problem situation changes significantly in the decision making time.
Misunderstood: The model predicting the expected outcomes is invalid or incomplete (and the required information set is likewise invalid or incomplete).
Uninformed: Required information is not available.
Misinformed: Available information is incorrect.
Political: The gathering of information changes the decision situation.
Decider conflicted: The decision is not being made by a unitary entity.
Of the above assumptions, some can be accepted as self-evident. For rational analysis to work we must be able to define what we want, understand the situation and have a valid evaluation method. Others seem not to be general concerns. Fast moving problems occur in business, but most are not minute-critical, and with computer access continuously improving (e.g. wireless applications), this seems not a major problem. Recursive problems only have significant effects when the decider has a major influence on the decision environment, which is generally not the case. Undefined problems generally respond to a brainstorming session, where participants propose alternative solutions. However the remaining assumptions cannot be dismissed. Problems where people are uninformed, misinformed, sensitive to information gathering and in conflict seem common in business, where we can rarely assume complete and valid information, that gathering this information did not change things, and that our group agrees. Hiding information, presenting false information, sensitivity to information gathering and disagreement between members of a decision group seem the rule, rather than the exception, in organisations. To exclude all problems with these properties would be to exclude most problems. These problems can be said to arise from two base causes:
People. In business most of the information we receive comes from or through other people, who choose what information to transmit, and judge our requests for information.
Groups. In business most decisions are made in, or on behalf of, a group.
In other words, it is generally people who manipulate information, and groups which cause decider conflict. It follows that these two factors (people and group) must be added into any decision analysis model. The decision process must then include these factors, as must any decision software support. How this can be done (and is in fact done by human beings) is suggested by a recently introduced Cognitive Three Process (C3P) model, which proposes people use two additional processes, distinct from the rational analysis process (A) defined earlier. It proposes human decision making (H) is defined by:
xf = H (R, P, G)
where P is a non-rational people process, and G is a different, but also non-rational, group process. Hence while rational analysis may indicate one choice, the people process might favour another, and yet a third choice may be suggested by the group process. The actual final decision will be a human judgement using all three, and it is the balancing of these three processes that makes real decision making so complex. While human decision making may use sub-optimal rational analysis (R), its inclusion of P and G more than compensates for the deficiency, and hence it is preferred by managers for practical reasons.
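One simple way to picture how the three processes might be balanced in xf = H (R, P, G) is as a weighted combination of scores. The weighting scheme, scores and scenario below are purely our illustration; the paper does not specify any such mechanism:

```python
# A hedged sketch of combining the three processes in xf = H(R, P, G).
# Weights and scores are illustrative assumptions.
def human_decision(alternatives, rational, personal, normative,
                   weights=(0.5, 0.25, 0.25)):
    wr, wp, wg = weights
    def score(x):
        # Blend the rational (R), people (P) and group (G) assessments.
        return wr * rational[x] + wp * personal[x] + wg * normative[x]
    return max(alternatives, key=score)

alts = ["hire_best_candidate", "hire_friend"]
rational  = {"hire_best_candidate": 0.9, "hire_friend": 0.4}  # task analysis
personal  = {"hire_best_candidate": 0.2, "hire_friend": 0.9}  # relationships
normative = {"hire_best_candidate": 0.7, "hire_friend": 0.3}  # group norms
print(human_decision(alts, rational, personal, normative))
# hire_best_candidate: 0.675 vs 0.5 under these weights
```

Shifting the weights toward the people process (e.g. weights=(0.0, 1.0, 0.0)) flips the outcome to hiring the friend, mirroring how an overweighted personal process produces cronyism.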
A cognitive three process (C3P) model
The C3P model proposes that humans have adapted to social activity by developing three distinct cognitive processes. Each is designed for a distinct type of problem. The first is raised by tasks in the physical world, the second by other people as individuals, and the third by groups we belong to. Three distinct processes are necessary because each problem raises different requirements. These processes can be described as purposes, namely resolving the task, relating to others, and representing the group, defined as follows:
Resolving the task addresses a goal based in the external world. It requires the gathering and resolution of task information. It leads to knowledge of the task and task environment. It allows informational influence (influence based on what is being communicated).
Relating to others addresses the goal of developing relationships with other people, which may be positive or negative. It requires a history of interactive communication with a known other that allows predictability for future interaction. It leads to mutual understanding between oneself and other individuals. It allows personal influence (influence based on who is communicating).
Representing the group addresses the goal of group unity. It requires each individual to reflect the group’s identity, values, and structure in a “prototypical” way. It leads to common knowledge, motives and behaviours within a group. It allows normative influence (influence based on what the group is doing or normally does).
The C3P model proposes that all three processes occur simultaneously, across the same physical behaviour set. This is possible because the proposed processes are cognitive. The upshot is that in a given situation, managers may be all at once resolving task information, relating to others and representing the group, rather than resolving the task alone. They are dealing with people and group issues as well as task demands. Because individuals in organisations must deal with tasks, people and groups, they seek task, personal and group information, which makes them subject to three types of influence, namely informational, personal and normative influence (Finholt & Sproull, 1990). For example a student may complete a homework assignment because they like the professor as a person (personal influence), because they find the subject interesting (informational influence), and because everyone else is doing it (normative influence).
The C3P model presents the process of relating to another individual, and personal influence, as fundamentally different from the process of identifying with and representing a group identity, and normative influence. The purpose of relating is to create relationships (which allow inter-personal predictability), while the purpose of identifying with a group is to create group unity (which allows group action). The latter generates agreement, while the former generates trust. Our language reflects this difference, as a person without friends is lonely, but someone without group membership is alienated. The distinction between normative and personal derives from social identity theory, which reinvents the group as a cognitive rather than physical reality. It contradicts traditional views of interpersonal attraction as "the 'cement' binding together group members" [25, p229], and the main cause of group cohesiveness. As evidence, it finds cohesiveness in groups whose members dislike each other [11, 28], and that cohesive groups can have low interpersonal attraction [16, 20]. It states clearly that normative influence is impersonal, and demonstrably so [4, 31]. Its most telling argument is that large groups are as cohesive as small ones, implying a common process. Since interpersonal attraction cannot cause cohesiveness in large groups, like "America", it probably does not cause it in small groups either.
This model then gives three distinct ways for an individual to make a decision. The first is pure rational analysis of factual information (xf = H (R)). The second is to decide based upon relationships and knowledge of people (xf = H (P)). For example, we let another person carry out the analysis, take their advice, and believe in their capability and honesty. This can save time in complex areas, where we cannot personally analyse every issue. Or we could decide to give a job to a friend, even though they were not the best candidate, in order to keep a good relationship with them. Finally, one can decide using the expectations of the group one belongs to (xf = H (G)). For example, a group policy document or norm may define the case. Group norms and practices can give access to knowledge extending over the life of the group, distilling the knowledge of many individuals. The group process allows groups to enact solutions even in equivocal cases, such as which side of the road to drive upon. Members need only choose, and whatever the majority chooses will be seen as the group decision, and cohesion will occur around it. Which process is best? Each has advantages and disadvantages. The disasters of "group think" are well known, and can be considered to arise from the normative process operating without individual analytical thought. The problems created by pure "rational" task analysis, without personal relationships or concern for the group, may be just as great but are only recently becoming apparent. Likewise nepotism and cronyism (allocating special privileges to relations and friends) can be seen as an excessive application of personal influence, and seem major problems in the world today. These problems arise when processes which naturally complement each other are used in isolation. The solution seems to be to use all processes together (xf = H (R, P, G)), rather than trying to pick the best and eliminate the rest.
Without normative influence, how could a society work together? Without personal relationships, we would treat each other as objects. If we combine rather than isolate them, all processes are strengthened, and the many sides of human nature become an advantage.
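The way the group process enacts equivocal choices, such as which side of the road to drive on, can be sketched as a simple conformity dynamic. The model below, in which members adopt the perceived majority position, is our own illustration of normative influence, not a model from the paper:

```python
# Illustrative sketch: the group process enacting an equivocal choice.
# Neither option is rationally better; conformity still produces agreement.
import random

random.seed(1)  # for a reproducible illustration only

# 11 members start with arbitrary private choices (odd size avoids ties).
choices = [random.choice(["left", "right"]) for _ in range(11)]

def conform(cs):
    # Each member adopts the perceived majority position (normative influence).
    majority = max(set(cs), key=cs.count)
    return [majority] * len(cs)

choices = conform(choices)
print(set(choices))  # the group has converged on a single option
```

Whichever option the majority happens to hold becomes the group decision, and agreement forms around it, even though no rational analysis could have chosen between the alternatives.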
The benefits of relationships
Building relationships helps with the information source problems earlier found to be beyond rational analysis. For example, consider the problem of calling a company to find out information about its staff or strategies. Everything changes if one knows somebody in the company. Hence the saying: it is not what you know but who you know. In addition to being willing to actually talk to you in the first place, someone with whom you have a relationship is much less likely to give you false information, because to lie, or be unwilling to communicate, could destroy the relationship. Relationships also allow us to ask questions in sensitive situations, knowing whether the other person will over-react, or broadcast our interest. Furthermore, people who know us, and the decisions we face, may volunteer relevant information which we otherwise might not have considered. Research on relationships stresses self-disclosure, mutual understanding and predictability as key dimensions. When we understand another, and they us (through mutual self-disclosure), we form expectations regarding their future behaviour. A relationship means the carrying forward of previous data learned about the other person, and a reduction of uncertainty about their nature and motives in future interactions. If people are major sources and transmitters of information in social groups, it makes a great deal of sense to build up relationships. A relationship is a repository of information - but a mutual repository, that is mutually sustained.
The benefits of group identity
The issue of group unity addresses the most fundamental of all the problems that are beyond rational analysis – decider conflict. To decide, there must be a decider. Whenever the decision maker is a group, there can be decider conflict. If the decider is conflicted, there can be no decision. For example, if half a group wishes one thing, and the other half another, the group is paralysed (or splits apart). The ability to form (and maintain) groups is what makes us a social animal, and somehow we are able to make group decisions. The main mechanism by which this occurs is here proposed to be social identification. When people socially identify with a group, they incorporate the group's identity into their own sense of being. Hence they may react to threats towards the group, or any part of it, as if threatened individually. We have cultural groups, national groups, sports groups, family groups and organisational groups. Each group defines a set of people who think, feel and behave the same way in certain areas. For example, to be an American implies one values political freedom. Having a common identity results in common behaviour because an individual's identity, their "idea of themselves", strongly determines their behaviour. A person who believes himself to be honest is much more likely to be honest than one who does not. A common group identity generates agreement, which is an important group output because it allows the group to act as one. When a person becomes isolated or "cut off" from the group they lose more than just personal contact – they lose a sense of belonging, and of who they are. Abandoning a salient group, for example a religion or culture, has major existential ramifications. It not only affects all relationships dependent on that identity, it also changes the behaviour individuals expect of themselves. It is not difficult to see why individuals do not like to leave the haven of a social group, and why we have a powerful drive to conformity.
Such conformity occurs even when one neither knows nor interacts personally with other group members. Conformity is generally seen as a weakness of individuality, but this is not necessarily so: without conformity, groups could not act. By generating agreement, conformity allows groups to form common purposes, hold information, and believe in common causal models.
Levels of groupware support
The C3P model proposes a balanced approach to group decision making, as all three processes are considered equally important and complementary to each other. Clearly computer groups need all these processes for the same reasons face-to-face groups do. The challenge of groupware then is to manage these multiple human purposes, suggesting three levels of support:
Level I: support for task analysis and resolution.
Level II: Level I, plus support for personal relationships.
Level III: Level II, plus support for groups, norms and social structures.
Simple networks provide level I basics, e-mail systems add level II basics, and groupware offers the promise of level III, allowing balanced group decision making. Improving groupware design means not only improving support for each core process of the C3P model, but also for their combinations. Unfortunately the communication properties that help one process can hinder another. For example, the resolution of factual information is aided by tools that improve communication clarity and reduce ambiguity. However, the same ambiguity that rational communication tries to eliminate may be a beneficial “social lubricant” when relating to others. Anonymity helps the impartial consideration of ideas, but lack of sender recognition also causes interpersonal problems [7, 36]. The ability to edit e-mail before sending (its “rehearsability”) can improve message accuracy. However, the same rehearsability that improves accuracy also allows pretence, handicapping relating, which relies heavily on spontaneous “back-channels” (like tone of voice and facial expressions) to accurately reflect the sender’s state. Genuineness and rehearsed transmission are mutually incompatible.
Different processes may also require different data structures. The data structures for relating, for example, differ from the “discourse structures” of idea analysis, because relating is a sequential interactive process occurring in time, not in logical space. People often resort to re-sending the original mail in their reply, to get the continuity a turn-based conversation requires. After a few such replies, each transmission consists mainly of information that has already been transmitted, which wastes network resources and inconveniences users. A tree-based idea structure poorly serves the relating process. Each process requires different software design, and software primitives must be designed around psychological primitives. If users see themselves as in a “conversation”, the software should reflect that. So while one-to-many, one-way (1:m) communication suffices for factual information exchange, personal relationships require one-to-one, two-way (1:1) interactive communication, where a picture of the sender can be built up.
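The contrast between these two data structures can be sketched in code. The following Python sketch is purely illustrative (the class and field names are our own, not drawn from any actual groupware system): the tree node locates replies in logical space, while the conversation keeps turns in temporal order, so recent context comes from position in time rather than from re-quoting earlier mail.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Tree-structured "discourse" node: replies attach to ideas in logical space.
@dataclass
class IdeaNode:
    author: str
    text: str
    replies: List["IdeaNode"] = field(default_factory=list)

# Sequential conversation: turns are ordered in time, each implicitly
# continuing the last, so no re-sending of earlier mail is needed.
@dataclass
class Conversation:
    turns: List[Tuple[str, str]] = field(default_factory=list)

    def say(self, author: str, text: str) -> None:
        self.turns.append((author, text))

    def context(self, n: int = 3) -> List[Tuple[str, str]]:
        # The last few turns give the continuity a turn-based exchange needs.
        return self.turns[-n:]

idea = IdeaNode("A", "Proposal X")
idea.replies.append(IdeaNode("B", "Objection to X"))   # logical attachment

chat = Conversation()
chat.say("A", "Hello")
chat.say("B", "Hi, about proposal X...")               # temporal attachment
```

The same two messages live in both structures, but only the conversation preserves the order in which relating unfolds.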
Into this complexity, the C3P model introduces a third fundamental psychological process, again with distinct communication and data structure demands. Normative influence requires each group member to continuously monitor the positions of all other members, so the group can work as one. This equates to a many-to-many, two-way (m:n) communication structure. For example, in a choir everyone affects, and is affected by, everyone else’s singing. Normative influence keeps the group together – so even when they sing off key, choirs usually do so together. Computer-mediated dynamic voting can implement the normative process, because it also allows many-to-many, two-way communication. Supporting group structure means more than just maintaining a membership list. It extends to formal group norms (constitutions), group delegated powers (authorities and roles), methods of new norm formation (e.g. parliamentary systems), sanctions (laws), group defined procedures (bureaucracy), and other social functions. Groupware is only beginning to explore some of these possibilities.
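A minimal sketch of how dynamic voting realises this m:n structure (a hypothetical class of our own devising; it assumes only that every member can see, and revise, every member's current position at any time):

```python
# Dynamic voting: each member's current position is visible to all others
# (m:n, two-way), and positions may be revised until the group converges.
class DynamicVote:
    def __init__(self, members):
        self.positions = {m: None for m in members}  # None = undecided

    def cast(self, member, position):
        self.positions[member] = position  # votes can be changed at any time

    def tally(self):
        # The normative signal: what every member currently holds.
        return dict(self.positions)

    def agreed(self):
        votes = set(self.positions.values())
        return len(votes) == 1 and None not in votes

v = DynamicVote(["ann", "bob", "cai"])
v.cast("ann", "yes"); v.cast("bob", "no"); v.cast("cai", "yes")
v.cast("bob", "yes")   # seeing the group's positions, bob revises his vote
```

The continuous `tally` is what distinguishes this from a one-shot ballot: members respond to each other's positions, not just to the question.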
The tragedy of the commons
The tragedy of the commons illustrates a situation which is not amenable to individual rational analysis. Suppose a group of villagers live around a common grass area, and each owns a cow and a small plot of land. Each villager can graze their cow upon their own small plot and it will survive. However, if any cow also grazes upon the common area, it will grow fat and prosper. Unfortunately, if all the villagers’ cows do this, the commons will become overgrazed and die off. Now if each villager does a logical analysis of this situation, it seems always in their interest to graze the commons, even though this destroys it: this way they get at least some benefit (if they don’t graze, they get no benefit at all). The parallel of this situation to modern conservation problems is clear (for example, take the oceans as the commons and the nations of the world as the villagers). The question that arises is not why we sometimes destroy common resources, but why we have not already destroyed all our common resources, and ourselves as well. The answer seems to be the human ability to form a group and identify with it. If each villager identifies themselves with the village group, and thinks as if they were the group, the rational analysis gives an entirely different result. If the group is the decider, it makes no sense for it to destroy its own resource by grazing all its cattle at once. An obvious alternative is managed grazing, where each cow takes a share, and the commons is preserved. The problem in this case was resolved not by analysis, but by changing the decider. The formation of larger groups may be the way in which such conflicts have been resolved in the past. Human history can be described as the development from hunter-gatherer bands of dozens, to settled tribes of hundreds, to chiefdoms of thousands, to the states of today where many thousands live together as one group [6 p268].
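The villagers' dilemma can be made concrete with a toy payoff model (all numbers here are illustrative assumptions of our own, not part of the original analysis): for the individual, grazing is never worse than abstaining, yet the group as a single decider does strictly better by rationing.

```python
# Toy payoff model of the commons dilemma (numbers are illustrative).
# Each cow yields PLOT from its owner's plot; grazing the commons adds
# BONUS per cow, unless every cow grazes it, in which case the commons
# is destroyed and yields nothing.
PLOT, BONUS, N = 1.0, 2.0, 10   # N villagers, one cow each

def payoff(i_graze: bool, others_grazing: int) -> float:
    grazers = others_grazing + (1 if i_graze else 0)
    commons_alive = grazers < N
    return PLOT + (BONUS if i_graze and commons_alive else 0.0)

# Individually, grazing never pays less than abstaining, whatever the
# others do - so each villager is rationally driven to graze...
assert all(payoff(True, k) >= payoff(False, k) for k in range(N))

# ...yet if all graze, everyone ends up with just PLOT, while a group
# decider that rations grazing (here, resting one cow) keeps the surplus.
all_graze = N * payoff(True, N - 1)    # commons destroyed
managed = N * PLOT + (N - 1) * BONUS   # commons preserved
assert managed > all_graze
```

Changing the decider from villager to village changes which payoff row is compared, and hence which answer the same arithmetic gives.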
The benefit of this is simply that it allows the “win/win” possibilities of co-operation, rather than the “lose/lose” probabilities of conflict. Such larger groups seem to reduce internecine war, although wars then occur between the larger groups. If the formation of a larger group is indeed the way to resolve conflict, the next question is how computer interaction can support this. For example, consider negotiations in a “tragedy of the commons” situation, with resource conflict. Most negotiation software allows the parties to direct communications at each other, hoping for some sort of rational reconciliation. However this bargaining approach encourages the distinctiveness of the parties, which may be the root cause of the conflict. No place is provided for common statements, and the software does not encourage or represent the formation of a higher-level decider. How could this be done? One possibility is a window for common views. Even asking two conflicting parties to name the entity created by both parties together could help them to think in a new way. Views in this window would appear only with the agreement of all parties. The software might still present the views of each party in separate windows, as in a typical negotiation procedure. Exactly how common views could be generated into the common window is still unclear. Also, each party may consist of many individuals, who may need to form a common view of their own before presenting it to the other party. However, the software could be designed so that it is clear that the end result of any negotiation is what is produced in the commonly agreed window. If at the end of the negotiation the window was empty, no one would doubt that nothing had been achieved.
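One way the common-window idea might be sketched in software (hypothetical names and protocol; a real system would need richer proposal, endorsement and revision mechanics) is as a structure where a statement enters the shared window only when every party has endorsed it:

```python
# Sketch of a "common window" negotiation structure: each party has its
# own window of statements, but a statement is promoted to the shared
# window only when endorsed unanimously.
class Negotiation:
    def __init__(self, parties):
        self.parties = set(parties)
        self.private = {p: [] for p in parties}   # each party's own window
        self.proposed = {}                        # statement -> endorsers
        self.common = []                          # the agreed outcome

    def state(self, party, text):
        self.private[party].append(text)          # one-sided position

    def propose(self, party, text):
        self.proposed.setdefault(text, set()).add(party)

    def endorse(self, party, text):
        self.proposed[text].add(party)
        if self.proposed[text] == self.parties:   # unanimous: promote it
            self.common.append(text)
            del self.proposed[text]

n = Negotiation(["fishers", "regulators"])
n.propose("fishers", "Annual catch quota of Q tonnes")
n.endorse("regulators", "Annual catch quota of Q tonnes")
```

Here `n.common` is the negotiation's only output: if it ends empty, it is unmistakable that nothing was agreed.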
This paper has suggested that decision making in a business setting involves more than just rational analysis of factual information. Recognising the value of relationships, and of group membership, is evidently sensible in social environments. One may consider such processes irrational because they are fundamentally cognitive, built upon expectations and social perceptions that vary between individuals. Yet they have their own rationality. For an individual to give up their life for the group makes no sense in terms of individual outcomes, yet on a group level it makes sense to lose a part rather than the whole, as the good of the many outweighs the good of the few. When military units decide to withdraw, they must always leave behind a few to “hold the fort”, or the withdrawal becomes a rout, with major loss of life. Those left behind know their chances of survival are low and they cannot win. Yet history is replete with examples of such sacrifices. Such “irrationality” is what makes us human, and it has its own logic. Any process that has a logic can be supported in computer-mediated interaction. At NJIT we are looking at ways in which software can support personal and group aspects, as well as rational analysis using causal models, based on an expanded concept of group interaction. It is hoped that this will lead to groupware that is as widely accepted in business organisations as e-mail is today.
D. Abrams and M.A.E. Hogg, ed., Social Identity Theory: Constructive and Critical Advances, Hemel Hempstead: Harvester Wheatsheaf/ New York, Springer-Verlag, 1990.
K. Arrow, Social Choice and Individual Values, 2nd ed., Yale University Press, 1963.
S.E. Asch, Social Psychology, NY Prentice Hall, 1952.
R.S. Crutchfield, "Conformity and character," American Psychologist, vol. 10, 1955, pp. 191-198.
A.R. Dennis and J.S. Valacich, "Rethinking Media Richness: Towards a Theory of Media Synchronicity," in Proceedings of the 32nd Hawaii International Conference on System Sciences, Hawaii, 1999, IEEE.
J. Diamond, Guns, Germs and Steel, Vintage, 1998.
M.C. Er and A.C. Ng, "The anonymity and proximity factors in group decision support systems," Decision Support Systems, vol. 14, 1995, pp. 75-83.
S. French, Readings in Decision Analysis, Chapman and Hall, 1989.
J.J. Gabarro, "The development of working relationships," in J. Galegher, R. Kraut and C. Egido, ed., Intellectual Teamwork, Hillsdale, NJ: Lawrence Erlbaum, 1990, pp. 79-110.
M.A. Hogg, The social psychology of group cohesiveness, Harvester, Wheatsheaf, 1992.
M.A. Hogg and J.C. Turner, "Interpersonal attraction, social identification and psychological group formation," European Journal of Social Psychology, vol. 15, 1985, pp. 51-66.
I.L. Janis, Victims of Groupthink: A Psychological Study of Foreign Policy Decisions and Fiascoes, Boston: Houghton-Mifflin, 1972.
L.M. Jessup and D. Van Over, "When a system must be all things to all people," Journal of Systems Management, vol. 47, no. July/August, 1996, pp. 14-21.
M.F. Kaplan, "Discussion polarization effects in a modified jury decision paradigm," Sociometry, vol. 40, 1977, pp. 262-271.
P. Keen, "Information systems and organizational change," Communications of the ACM, vol. 24, no. 1, 1981, pp. 24-33.
D.M. Landers and G. Luschen, "Team performance outcome and the cohesiveness of competitive coacting groups," International Review of Sports Psychology, vol. 9, 1974, pp. 57-71.
H. Linstone and M. Turoff, The Delphi Method: Techniques and Application, Reading Mass.: Addison-Wesley, 1975.
R.D. Luce and H. Raiffa, Games and Decisions, Dover Publications, 1985.
T.W. Malone, K.R. Grant, F.A. Turbak, S.A. Brobst, and M.D. Cohen, "Intelligent Information Sharing Systems," Communications of the ACM, vol. 30, no. 5, 1987, pp. 390-402.
J.E. McGrath, "The influence of positive interpersonal relations on adjustment effectiveness in rifle teams," Journal of Abnormal and Social Psychology, vol. 65, 1962, pp. 365-375.
J.E. McGrath and D.A. Kravitz, "Group research," Annual Review of Psychology, vol. 33, 1982, pp. 195-230.
M. Nagasundaram and G.R. Wagner, "Ambiguity in human communication and the design of computer-mediated communication systems," in Proceedings of the 25th Hawaii International Conference on System Sciences, Hawaii, 1992, IEEE, pp. 90-100.
B.N. Reeves and A.C. Lemke, "The problem as a moving target in cooperative system design," 1991 HCI Consortium Workshop, Jan 1991.
J.R. Saul, Voltaire's Bastards: The Dictatorship of Reason in the West, Penguin, 1993.
S. Schachter, "Deviation, rejection and communication," Journal of Abnormal and Social Psychology, vol. 46, 1951.
C. Sia, C.Y. Tan, and K.K. Wei, "Will distributed GSS groups make more extreme decisions? An empirical study," in Proceedings of the 17th International Conference on Information Systems, Cleveland, Ohio, 1996, pp. 326-338.
H. Tajfel and J.C. Turner, "The social identity theory of intergroup behaviour," in S. Worchel and W.G. Austin, ed., The Psychology of Intergroup Relations, Chicago: Nelson-Hall, 1986, pp. 7-24.
J.C. Turner, I. Sachdev, and M.A. Hogg, "Social categorization, interpersonal attraction and group formation," British Journal of Social Psychology, vol. 23, 1983, pp. 97-111.
M. Turoff, R. Hiltz, M. Bieber, A. Rana, and J. Fjermestad, "Collaborative Discourse Structures in Computer Mediated Group Communications," Journal of Computer Mediated Communication, Special Issue on Persistent Conversation, vol. 4, no. 4, 1999.
A. Tversky and D. Kahneman, "Judgement under uncertainty: heuristics and biases," in D. Kahneman, P. Slovic and A. Tversky, ed., Judgement Under Uncertainty: Heuristics and Biases, New York: Cambridge University Press, 1982, pp. 3-20.
B. Whitworth, Generating group agreement in cooperative computer mediated groups: Towards an integrative model of group interaction, University of Waikato, Doctor of Philosophy thesis, 1997.
B. Whitworth, "The web of system properties - A general view of systems," Association of Computing Machinery. Special Interest Group Computer Science in Education, vol. 30, no. 4, 1998, pp. 46a-50a.
B. Whitworth and R. Felton, "Measuring disagreement in groups facing limited choice problems," THE DATABASE for Advances in Information Systems, vol. 30, no. 3 & 4, 1999, pp. 22-33.
B. Whitworth, B. Gallupe, and R. McQueen, "A cognitive three process model of computer-mediated group interaction," accepted for publication by Group Decision and Negotiation, 2000.
D. von Winterfeldt and W. Edwards, Decision Analysis and Behavioral Research, Cambridge University Press, 1986.
R.E. Yellen, "Introducing group decision support software (GDSS) in an organization," Journal of Systems Management, October 1993, pp. 6-8.
P.L. Yu, "Second order game problem: Decision dynamics in gaming phenomena," Journal of Optimization Theory and Applications, vol. 27, no. 1, 1979, pp. 147-166.