1 Cautious Cars and Cantankerous Kitchens: How Machines Take Control


Communicating With Our Machines: We Are Two Different Species

It’s morning, time to wake up. I get out of bed, and as I do so, my house detects my movement and welcomes me. “Good morning,” it says cheerfully, turning on the lights and piping the sound of the early morning news broadcast into the bedroom. My house is smart. Nice house.


It isn’t morning yet. In fact, it’s only 3 AM, but I can’t sleep, so I get out of bed. My house detects my movement and welcomes me. “Good morning,” it says cheerfully, turning on the lights and piping the sound of the early morning newscast into the bedroom. “Stop that!” yells my wife, annoyed. “Why are you waking me up at this time in the morning?” My house is stupid. Bad house.
How would I communicate with my house? How would I explain that behavior perfectly appropriate at one time is inappropriate at another? By the time of day? No. Sometimes my wife and I need to wake up early, perhaps to catch an early morning airplane. Or perhaps I have a telephone conference with colleagues in Bangalore. For the house to know how to respond appropriately, it would need to understand the context, the reasoning behind the actions. What are my goals and intentions? Am I waking up deliberately? Does my wife still wish to sleep, or does she wish to wake up also? Do I really want the coffee maker turned on? For the house to truly understand the reasons behind my awakening, it would have to know my intentions, but that requires effective communication at a level that is not possible today, nor in the near future.
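To make the difficulty concrete, here is a deliberately naive sketch, entirely my own invention rather than a description of any real home-automation system, of how such a house might decide whether to greet me. Every input it can consult is a proxy: the clock, a motion sensor, a guess about whether anyone else is asleep. The one thing it needs, my intention, never appears among the arguments.

```python
# Hypothetical sketch only: the function, its inputs, and its thresholds are
# invented for illustration. A rule-based "smart house" can consult proxies
# such as the clock and a motion sensor, but never the occupant's intention.

from datetime import time


def should_greet(now: time, motion_detected: bool, partner_asleep: bool) -> bool:
    """Decide whether to turn on the lights, the news, and a cheery greeting."""
    if not motion_detected:
        return False
    if now < time(6, 0):
        # The designer's guess: movement before 6 AM is insomnia, not waking.
        return False  # silences the 3 AM greeting, but also the 5 AM flight
    if partner_asleep:
        # How would the house actually sense this? Another guess.
        return False
    return True
```

Each added rule patches one scenario and breaks another, because the rules encode the designer's guesses about situations, not my goals.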
What do we make of autonomous artifacts, devices that are intelligent and self-supporting, controlling much of our infrastructure, and, more and more, aiding us in our intellectual pursuits? These devices are not intelligent in the ordinary sense of the word. Their intelligence is in the minds of the designers, who try to anticipate all possible situations that might be encountered and the appropriate responses. The systems have sensors that attempt to identify the situations and context, but both the sensors and the analyses are deficient, so at times the devices behave quite inappropriately. As a result, automatic, intelligent devices must still be supervised, controlled, and monitored by people. In the worst of cases, this can lead to conflict. In the best of cases, the human+machine forms a symbiotic unit, functioning well. Here is where we could say that it is humans who make machines smart.
The technologists will try to reassure us by explaining that all technologies start off by being weak and underpowered, that eventually their deficits are overcome and they become safe and trustworthy. “Don’t get excited,” they say, calmly, “all the problems you speak of are true, but this is just a temporary situation. Every problem you mention can be cured – and will be. Relax. Wait. Have patience.”
Their message is sensible and logical. And they have been saying it since the beginning of the machine age. At one level they are correct. Steam engines that propelled trains and steamships used to explode, killing many: they seldom do anymore. Early aircraft crashed frequently. Today they hardly ever do. Remember Jim’s problem with the cruise control that regained speed in an inappropriate location? I am certain that this particular situation can be avoided in future designs by coupling the speed control with the navigation system, or perhaps by systems where the roads themselves transmit the allowable speeds to the cars (hence, no more ability to exceed speed limits), or better, where the car itself determines what speed is safe given the road, its curvature, slipperiness, and the presence of other traffic or people. But faster than we can solve the old problems, new technologies and new gadgets will appear. The rate at which new technologies are being introduced into society increases each year, a trend that has been true for hundreds of years. Lots of new devices, lots of potential benefits, and lots of new, unanticipated ways to go wrong. Over time, our lives become better and safer, worse and more dangerous, simultaneously. Do we really only need more patience?
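As a purely illustrative sketch of that last idea, a car bounding its own speed from what it knows about the road, the fragment below combines a posted limit with the standard flat-curve cornering limit, v = sqrt(mu * g * r). The function name, its inputs, and the numbers are my assumptions, not a description of any actual vehicle system.

```python
# Hypothetical sketch only: how a car might cap its own speed from road data.
# The flat-curve cornering limit v = sqrt(mu * g * r) is standard physics;
# everything else (names, inputs, numbers) is invented for illustration.

import math

G = 9.81  # gravitational acceleration, m/s^2


def safe_speed_mps(curve_radius_m: float, friction_coefficient: float,
                   posted_limit_mps: float) -> float:
    """Return the lower of the posted limit and the cornering limit."""
    cornering_limit = math.sqrt(friction_coefficient * G * curve_radius_m)
    return min(posted_limit_mps, cornering_limit)


# A wet curve (mu around 0.4) of radius 50 m caps speed near 14 m/s,
# roughly 50 km/h, regardless of a 100 km/h (27.8 m/s) posted limit.
print(safe_speed_mps(curve_radius_m=50, friction_coefficient=0.4,
                     posted_limit_mps=27.8))
```

Engineers can and will build such couplings; the question is whether waiting for them is enough.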
No. I believe the problems that we face with technology are fundamental. They cannot be overcome. We need a new approach. A fundamental problem is that of communication. Communication between two individuals requires that there be give and take, a discussion, a sharing of opinions. This, in turn, requires that each understand the arguments, beliefs, and intentions of the other. Communication is a highly developed, highly advanced skill. Only people can do it, and not all people. Some conditions, such as autism, interfere with the give-and-take of real communication. Autistic people do not seem to have the same social skills, the same ability to understand issues from the other person's point of view, that other people have. Well, our machines are autistic. Problems with machines arise when the unexpected happens, or when people get in the way, or when communication is required. It is then that the autism of machines becomes apparent.


Autistic Machines

“I see you are in a bad mood,” says the house as you return home at the end of the day, “so I’m playing your favorite cheery music.”


Autism: A severe psychiatric disorder with numerous behavioral deficits, especially in communication and social interaction.
Autistic people are sometimes described as lacking empathy, the ability to understand things from another’s point of view. They are more logical than emotional. And in general, communicating with an autistic person can be difficult and frustrating. It may be unfair to autistic people to identify their difficulties with the failings of machines, but the analogies are compelling. Machines are autistic. They like simple, direct commands. They prefer to utter simple, unambiguous statements. There is no dialog possible, no mutual understanding. Yes, people can come to understand machines, but can machines understand people? Not today, not in the near future, perhaps not even in the far future. And until there is mutual understanding, mutual conversation, the communication difficulties between humans and machines will remain the fundamental roadblock to the efficient deployment of autonomous devices.
The clothes washing machine and drier, the microwave oven, and for that matter, the regular oven in my home all claim to be intelligent, to determine how and when they will wash, dry, and cook my food. But the communication with these devices is very one-sided. I tell them to start and off they go. There is no way for me to understand what they are doing or how I might modify their actions, no way to know just why they have decided upon the actions they are taking. We talk to them, but they refuse to communicate with us. The devices would be so much nicer, friendlier, more sociable if they would explain what they were doing and let us know what actions we could take. Lack of knowledge leads to lack of comfort.
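What might it look like if such a device did explain itself? The sketch below is hypothetical, not the interface of any real appliance; every class and field name is invented. The only point is the shape of the exchange: the machine reports what it sensed, what it plans to do, why, and what we may still change.

```python
# Hypothetical sketch only: an appliance that reports its readings, its plan,
# its reasoning, and the actions left open to the person. All names invented.

from dataclasses import dataclass
from typing import Tuple


@dataclass
class DrierStatus:
    moisture_percent: float     # what the sensor currently reads
    minutes_remaining: int      # the machine's current plan
    reason: str                 # why it chose that plan
    overrides: Tuple[str, ...]  # what the person may still change


def report(moisture_percent: float) -> DrierStatus:
    """Turn a one-sided decision into something a person can inspect and adjust."""
    if moisture_percent > 20:
        return DrierStatus(moisture_percent, 35,
                           "fabric still damp; extending the cycle",
                           ("stop now", "lower the heat", "add ten minutes"))
    return DrierStatus(moisture_percent, 5,
                       "fabric nearly dry; finishing with cool air",
                       ("stop now", "add ten minutes"))
```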
The fact that machines are autistic does not require us to stop developing technology that may be of use to people. It does require us to rethink the methods by which we interact with the new technologies.
I am a technologist. I believe in making lives richer and more rewarding through the use of science and technology. But that is not where our present path is taking us. We need a calmer, more reliable, more humane approach. In the current approach, our new, smart technologies act as autonomous agents, working by themselves, in theory, without human assistance. They are more and more becoming masters of their own fate, and masters of our behavior. We have become servants to our machines.
When it comes to mechanical operations, there are good reasons to trust machines more than ourselves. After all, I use a calculator precisely because I often make mistakes in arithmetic. I use a spelling corrector because I often make typing and spelling errors. Automatic safety equipment does save lives. But what happens when we move beyond the mere mechanical into the symbolic, the intentional, and the emotional? What happens when we start talking about values, and goals, and trust? Why should we trust artificial devices more than people? Good question.
The problems with our interaction with machines have been with us for a long time. How long? Maybe since the very first stone-age tool. My records only go back as far as the year 1532, with a complaint that there were so many adjustments that the 1532 model plow was far too difficult to operate: perhaps the first recorded case of featuritis. Today, an entire industry of human-centered design has grown up, attempting to ameliorate this problem. But the problem is getting worse, not better. Moreover, as intelligent, autonomous devices are introduced into everyday life, first in our automobiles, then in our homes, the problems will explode.
As we enter the era of intelligent devices, my major concern is that the communication difficulties between these two species of creatures, people (us) and machines (them), will cause the greatest difficulties. Us versus them. We intended this, they intended that. Many an accident, I fear, will result from these mismatched intentions. How do we overcome these communication problems? The problem, I fear, is that our machines suffer from autism.
But it is not good enough to complain: complaints without solutions get us nowhere. So this is a book about solutions, a call to change the approach. We must maintain our own autonomy. We need our technologies to aid us, not control us. We need more devices that act as servants, as assistants, and as collaborators. It is time for a humane technology.
We fool ourselves into thinking that we can solve these problems by adding even more intelligence to the devices, even more automation. We fool ourselves into thinking that it is only a matter of communication between the devices and people. I think the problems are much more fundamental, unlikely to be solved through these approaches. As a result I call for an entirely different approach. Augmentation, not automation. Facilitation, not intelligence. We need devices that have a natural interaction with people, not a machine interaction. Devices that do not pretend to communicate, that are based on the fact that they do not and cannot. It is time for the science of natural interaction between people and machines, an interaction very different than what we have today.


References and Notes


Licklider, J. C. R. (1960). Man-Computer Symbiosis. IRE Transactions on Human Factors in Electronics, HFE-1, 4-11.


a Copyright © 2006 Donald A. Norman. All rights reserved. http://www.jnd.org don@jnd.org. Excerpt from “The Design of Future Things”. Draft manuscript for a forthcoming book.


2 “The Sensor features detect”. Manual for General Electric Spacemaker Electric Oven, DE68-02560A. Dated January, 2006.


3 “The COTTONS, EASY CARE and DELICATES cycles automatically sense fabric dryness.” Manual for General Electric Spacemaker Driers, 175D1807P460. Dated August, 2003.


4 “the ExtraClean™ Sensor is measuring.” Manual for General Electric Triton XL™ GE Profile™ Dishwashers, 165D4700P323. Dated November, 2005.


5 “human brains and computing machines will be coupled together very tightly.” (Licklider, 1960)


6 From an emailed conference announcement. Material deleted and the acronym AmI has been spelled out (as Ambient Intelligence). See http://www.di.uniba.it/intint/ase07.html


7 Lee, C. H., Bonanni, L., Espinosa, J. H., Lieberman, H., & Selker, T. (2006). Augmenting Kitchen Appliances with a Shared Context Using Knowledge about Daily Events. Proceedings of IUI 2006.


8 http://web.media.mit.edu/~jackylee/kitchensense.htm Accessed Oct. 10, 2006.


9 Excerpt from Scott Frank’s script for the movie Minority Report. The script was found at http://www.dailyscript.com/scripts/MINORITY_REPORT_--_May_16th_2001_revised_draft_by_Scott_Frank.html (accessed Oct. 10, 2006).


