The Invention of a Microchip

Aleksandr Shpiler

Prof: Joel Vessels

History 2104

October 26, 2003

The invention of the microchip became a catalyst for advancing technology the world over. As a key component of microprocessors, the microchip has played a significant role in what is typically called the information age.

Outline
I. Introduction

A. Theories

i. Scientists had many ideas but couldn’t put them together

ii. Something had to be invented

B. Invention

i. Finally man came up with the idea

ii. Revolutionized technology the world over
II. Development

A. Production of the microchip

i. Took too long to produce

ii. Very effective but made only in small amounts, so updates had to be made

B. Chip too expensive

i. Way too costly for the public

ii. Only one in 500 people could afford a chip
III. Procedure of implementing the chip for public use

A. Almost 15 years of struggling with the price

i. For some time, progress on microchips stopped

ii. Ideas ran low

B. Invention of the very cheap silicon chip

i. The invention of a silicon-based microchip ended the idea depression

ii. Price was now affordable to everyone
IV. Effect

A. Effect on the whole world

i. Changed technology all over the world very fast

ii. Silicon chips were the new television


B. Used everywhere

i. Today used in computers and different microprocessors

ii. Also found in cell phones and all little electronic gadgets
V. Conclusion

A. People rely on chips every day

i. People do not notice they use microchips every day in at least one form

ii. Slowly, microchips are becoming something of the past

B. Improvements for the future

i. Scientists are looking to improve the silicon microchip to store more data

ii. Many different examples are already made but not available to public

Books


Reid, T. R. The Chip: How Two Americans Invented the Microchip and Launched a Revolution. New York: Random House Trade Paperbacks, October 9, 2001.

This book provides the background and history of the invention of the microchip. The first half of the book talks about how unaffordable computers were until two scientists invented a silicon microchip. The second part describes the features of the microchip and why it was such a big breakthrough and the beginning of the digital age. The author presents the material clearly and simply.


Van Zant, Peter. Microchip Fabrication: A Practical Guide to Semiconductor Processing. New York: McGraw-Hill Professional, April 3, 2000.

This book gives a clear explanation of the different ways microchips can be used. It talks about the formation and actual makeup of the microchip. It provides a technological theory of the importance of the microchip's invention and talks about an alternative future, what it would have been like if microchips had never been invented.
Wolf, Stanley. Microchip Manufacturing. Lattice Press, July 1, 2003.

This book presents the actual manufacturing of all the different microchips. It compares their costs and describes which microchips are best for certain purposes. This is basically a book about the technological production of the microchip. It goes into a great deal of detail as you read deeper into the book, where a novice on this topic would get lost.


Zygmont, Jeffrey. Microchip: An Idea, Its Genesis, and the Revolution It Created. New York: Perseus Publishing, December 24, 2002.

This is another interesting book about the revolution that the invention of the microchip created, from the beginning, when scientists came up with theories of different gadgets to revolutionize the computer world, to the present day and the uses of chips today. The good thing about this book is that it outlines a national information policy.


Furber, Steve B. ARM System-on-Chip Architecture (2nd Edition). Cambridge: Addison-Wesley, August 25, 2000.

This book goes into deep detail about how the architecture of the microchip has evolved and how it is built. It familiarizes us with how the chip is used with mobile information. It goes into the core issues of memory, debugging, and production tests. This book helps us understand what the chip actually does for us in real life.
Ball, Stuart R. Embedded Microprocessor Systems: Real World Design, Third Edition. New York: Newnes, November 2002.

This text describes the embedded features of the chip. It talks about the integration of the chip in the future and about expanding the chip's capabilities. It also talks about different applications used in everyday life that have chips in them and how they function.

Articles


"Microcomputer Abstracts." Medford, NJ: Learned Information, Inc., 1994-1999.

This article discusses abstracts of microprocessors and talks about the different microchips that were invented and are used in fully functional supercomputers and microprocessors.
Bellis, Mary. "The History of the Integrated Circuit (IC): Jack Kilby and Robert Noyce." Inventors Journal, Inventors of the Modern Computer, vol. 8 (August 2003).

This journal article, from the Inventors of the Modern Computer section of the Inventors Journal, volume 8, talks about the uses of circuits with microchips. It gets into digital logic and digital circuitry, as well as the utilization of microchips with other circuits.


Throughout the beginning of the 20th century, scientists wandered through the fields of technology in search of a breakthrough invention. Finally, in 1958, the first completed microchip was created. The successful laboratory demonstration of that first simple microchip on September 12, 1958 came to be one of the biggest events of the 20th century.

Jack Kilby, the scientist who created the first microchip, went on to pioneer military, industrial, and commercial applications of microchip technology. He headed the teams that built both the first military system and the first computer incorporating integrated circuits. He later designed the first TI calculator and a portable printer. In the 1970s, Kilby left TI to work as an independent inventor, exploring the silicon industry in search of technology to generate electrical power from sunlight. Kilby received two of the nation's most prestigious honors in science and engineering: in 1970, in a White House ceremony, he received the National Medal of Science, and in 1982 he was inducted into the National Inventors Hall of Fame, the annals of American innovation. Over his life, Kilby held more than 60 U.S. patents and received numerous medals, and he was inducted into many associations of engineers and of the inventors of the 20th century. He later received the Nobel Prize in Physics for the integrated circuit and was congratulated personally by the president of the United States. Jack Kilby's first simple circuit has grown into a worldwide integrated circuit market whose sales in 2000 totaled $177 billion. These components supported a 2000 worldwide electronic end-equipment market of nearly $1,150 billion. Billions of dollars were spent on the invention and development of other microchips intended to compete with the silicon microchip; much of that money went to waste, and over a period of 40 years nothing came close to Kilby's invention, the microchip. This just goes to show how one idea could change the world.

The actual production of the microchip was the most impressive part. Shortly before Jack Kilby invented the first simple microchip, other, more complicated microchips had been created. About three science laboratories throughout the United States had been working on the idea. Some chips were built; however, they would never be available to the public because of their complexity and the fact that they took too long to produce. Besides being very large, these early microchips had no long-term impact because they were very inefficient. Only small quantities were made, owing to their size and the time it took to produce each one, and their complexity did not help production. However, the scientists who had invented these microchips did not want to give up on their inventions. They released the microchips to the public to get a general feel for the market that was about to open up for them. The results were worse than they expected. Producing one microchip was very expensive, and when it went on sale, only 1 in 500 people could afford it. Considering that only about 2,000 scientists at the time knew what microchips were used for, that meant only 4 chips would be bought. This came as a great disappointment to the teams that had developed these microchips. The idea that an invention could be shut down like this was unheard of; unfortunately, it was true. For the next 15 years, technology struggled in terms of creating microchips. No one could come up with a design that was cost-effective and efficient enough to release for public use. People had a few ideas, but those ideas revolved around taking the previous microchip and somehow making it smaller.

September 12, 1958 came to be one of the biggest events of the 20th century, when the first simple microchip was demonstrated. Jack Kilby had beaten every other scientist to create what would revolutionize the age of technology. Coming up with an idea that millions of people worldwide would use was not one of Kilby's original purposes. This invention ended the depression of ideas that had run through the technological industry for the previous 15 years. With this new invention, people could afford the microchip and utilize it in different devices. The invention of this chip opened a doorway into a totally new era of technology; gadgets and other meaningful machines that had never been thought of were invented. Of course, like many other inventions, the microchip has had many effects on technology. Chief among them, microchips are definitely milestones in computer hardware. Computer chips are part of our everyday lives, making many of the things we use, like coffee machines, microwaves, ATMs, and computers, work reliably. These chips are no larger than a fingernail, are getting smaller every other year, and are amazingly capable of holding over 1.5 million transistors. The computer chip's history is very short, and many of us do not remember a time when we were without integrated circuits.

The development of the microchip led to the biggest breakthrough of all time: the invention of the modern computer. The first computers used components called vacuum tubes, which functioned as electronic switches. The tubes worked, except that they were not reliable: because of their big structure, vacuum tubes generated intense heat, which caused many of the components to deteriorate, and they consumed enormous amounts of power. A way to switch a smaller flow of electrons was sought, and what was found was the transistor. Transistors changed the way computers could and would be built.



The integrated circuit, or chip, then combined multiple transistors on a small silicon disc. This silicon chip was the backbone of the development of smaller systems.
Computer chips have improved in performance at a phenomenal rate over time and were considered a revolutionary device. The number of components in a single chip doubles every two years. Computers have since become cheaper, and their capabilities have increased exponentially over time. A computer chip is a tiny piece of material that contains a complex electronic circuit, made by placing and removing thin layers of insulating, conducting, and semiconducting materials in hundreds of steps. The bodies of most chips are made of silicon because silicon is a semiconductor. Clean rooms are special laboratories where much of the manufacturing process is performed; the components of a chip are so small that the tiniest dust particle could ruin one. These clean rooms are very sterile, one thousand times cleaner than hospitals. People who work in these laboratories are required to wear bunny suits and use an air shower to remove any dust from the suits before entering.
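
The doubling described above, the number of components in a chip doubling every two years, is commonly known as Moore's law, and its effect can be illustrated with a short calculation. Below is a minimal Python sketch; the starting figure of 2,300 components in 1971 (roughly the first commercial microprocessor) is an illustrative assumption, not a number taken from this paper.

    # A minimal sketch of the doubling rule above; the 1971 starting
    # figure of 2,300 components is an illustrative assumption, not a
    # number from this paper.
    def transistor_count(start_count, start_year, year):
        """Project a component count that doubles every two years."""
        doublings = (year - start_year) // 2
        return start_count * 2 ** doublings

    for year in (1971, 1981, 1991, 2001):
        print(year, transistor_count(2300, 1971, year))

Even from that modest assumed starting point, the count grows to millions of components within three decades, which is consistent with the exponential growth the paragraph above describes.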
The first step in the manufacturing process involves melting silicon crystals. After the silicon has reached its melting point, a seed crystal is carefully dipped into the melt to grow a cylindrical ingot five to ten inches in diameter and several feet long. This long ingot is then smoothed and cut into wafers; manufacturing these wafers can take from ten to thirty days, and each wafer forms the foundation for hundreds of chips. Engineers use a computer-aided circuit design program to design each layer of the chip; depending on the number of layers, a chip may take from a month to several work-years to complete. A robot performs the next step of the process, since the chemicals involved are dangerous to humans: it polishes, sterilizes, and cleans the silicon wafer in a chemical bath. The wafers are then placed in a diffusion oven, where they are coated with photoresist, a step just before all the layers of the microchip are put together. After the photoresist is applied, the wafers go through a process called photolithography, which patterns almost every layer into the shape of specific electronic components. Ultraviolet light projected through a glass mask prints each layer's circuit pattern on the photoresist; the photoresist exposed to the light becomes hard, while the photoresist covered by the mask remains soft. Channels are then carved into these layers of material: the soft photoresist and some of the surface materials are etched away with hot gases, leaving circuit pathways. This step is repeated for each layer of the chip, since each layer has different paths, which are the key to the chip's capabilities.
Of course, there are different types of chips, and different procedures have to be performed for each. For example, manufacturers add certain impurities, like boron and phosphorus, to the silicon chip to enable it to conduct electricity at room temperature. These properties give the chip the ability to do what is required at any temperature, as well as to conduct electricity at room temperature. Manufacturers need to create two types of chips: chips that conduct electricity and chips that do not. The reason is that if someone uses their cell phone for 20 minutes, they do not need the cell phone to give off heat; that is where non-conductors come into play. On the other hand, if someone microwaves some food, they want the food to be hot, which means the chip has to give off heat; that is where conductors come in. There are also subdivisions of conductors, such as semiconductors. Some chips contain millions of components. Manufacturers create thin lines of metal, usually aluminum, to connect the tiny circuits. When all the circuit layers have been added, a machine tests the individual chips on the wafer by applying electrical current.
The final steps have great importance as well. Before any chip is sold, it goes through a process called dicing, in which a diamond saw cuts the wafer into individual chips called dice. Each packaged chip is tested once more and is then ready to be sent to the companies that will include it in a wide range of products. This is how the regular microchip is created and where microprocessors come to interact with chips.

The silicon chip, on the other hand, is far more useful: it is easier to produce, stores more memory, and conducts electricity better; in other words, it is much better than the earlier microchip designs. One of the reasons silicon was chosen is that it is the second most abundant substance on earth. It is extracted from rocks and common beach sand and put through an exhaustive purification process. In this form, silicon is the purest industrial substance that man produces, with impurities comprising less than one part in a billion. That is the equivalent of one tennis ball in a string of golf balls stretching from the earth to the moon. Silicon is the raw material most often used in integrated circuit fabrication.
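
As a rough check on that tennis ball analogy, the scale can be worked out with a short Python calculation; the golf ball diameter and earth-to-moon distance used below are assumed round figures, not values from this paper.

    # A back-of-the-envelope check of the analogy; both constants are
    # assumed round figures, not values taken from this paper.
    EARTH_MOON_M = 384_400_000   # average earth-to-moon distance, meters
    GOLF_BALL_M = 0.043          # golf ball diameter, about 4.3 cm

    balls = EARTH_MOON_M / GOLF_BALL_M
    print(f"golf balls in the string: {balls:.1e}")   # about 8.9e+09
    # One odd ball among roughly nine billion is on the order of a part
    # per billion, the same purity scale the text describes.

The string works out to roughly nine billion golf balls, so a single tennis ball among them is indeed on the parts-per-billion scale of purity described above.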

The most common semiconductor material used today is silicon. It is used for transistors, integrated circuits, memories, infrared detection and lenses, light-emitting diodes (LEDs), photosensors, charge-transfer devices, radiation detectors, and a variety of other devices. Silicon belongs to group IV of the periodic table. It is a gray, brittle material with a diamond cubic structure. Silicon is conventionally combined with phosphorus, arsenic, and antimony donors, and with boron, aluminum, and gallium acceptors. The energy gap of silicon is 1.1 eV, which is not that high but permits silicon semiconductor devices to operate at higher temperatures than the germanium used in the earlier microchips. The integrated circuit (IC) and the silicon chip are brought together in the following process. Before the IC is actually created, a large-scale drawing, about 400 times larger than the actual size, is made; it takes approximately one year to design an integrated circuit. Then a mask must be made. Depending on the level of complexity, an IC will require from 5 to 18 different glass masks, or "work plates," to create the layers of circuit patterns that must be transferred to the surface of a silicon wafer. Mask-making begins with an electron-beam exposure system called MEBES, which translates the digitized data from the pattern-generating tape into physical form by shooting an intense beam of electrons at a chemically coated glass plate. The result is a precise rendering, at its exact size, of a single circuit layer, often less than one-quarter inch square. Working with incredible precision, it can produce a line one-sixtieth the width of a human hair, invisible to the naked eye. After purification, molten silicon is doped to give it a specific electrical characteristic, then grown as a crystal into a cylindrical ingot. A diamond saw is used to slice the ingot into thin, circular wafers, which are then polished mechanically and chemically to a perfect mirror finish. At this point, integrated circuit fabrication is ready to begin.

To begin the fabrication process, a silicon wafer is loaded into a 1200°C furnace through which pure oxygen flows. The end result is an added layer of silicon dioxide "grown" on the surface of the wafer. The oxidized wafer is then coated with photoresist, a light-sensitive, honey-like emulsion, just as in the regular microchip design. In this case a negative resist is used, which hardens when exposed to ultraviolet light. To transfer the first layer of circuit patterns, the appropriate glass mask is placed directly over the wafer. In a machine much like a very precise photographic enlarger, ultraviolet light is projected through the mask. The dark pattern on the mask shields the wafer beneath it, allowing the photoresist there to stay soft; in all other areas, where light passes through the clear glass, the photoresist hardens. The wafer is then washed in a solvent that removes the soft photoresist but leaves the hardened photoresist on the wafer. Where the photoresist was removed, the oxide layer is exposed. An etching bath removes this exposed oxide, as well as the remaining photoresist. What remains is a stencil of the mask pattern, in the form of minute channels of oxide and silicon. The wafer is placed in a diffusion furnace, which is filled with gaseous compounds, for a process known as impurity doping. In the hot furnace, the dopant atoms enter the areas of exposed silicon, forming a pattern of n-type material. An etching bath removes the remaining oxide, and a new layer of silicon is deposited onto the wafer.

The first layer of the chip is now complete, and the masking process begins again: a new layer of oxide is grown, the wafer is coated with photoresist, the second mask pattern is exposed onto the wafer, and the oxide is etched away to reveal new diffusion areas. The process is repeated for every mask, up to 18 times, as needed to create a particular integrated circuit. The most important part of this exercise is the precise alignment of each mask over the wafer surface: if the alignment is off by even a fraction of a millionth of a centimeter, the whole wafer is useless and must be thrown out. During the last diffusion, a layer of oxide is again grown over the wafer. Most of this oxide layer is left on the wafer to serve as an electrical insulator, and only small openings are etched through the oxide to expose circuit contact areas. To interconnect these areas, a thin layer of metal is deposited over the entire surface. The metal dips down into the circuit contact areas, touching the silicon. Most of the surface metal is then etched away, leaving an interconnection pattern between the circuit elements. The final layer is "vapox," or vapor-deposited oxide, a glass-like material that protects the integrated circuit from contamination and damage. It, too, is carved away, but only above the "bonding pads," the square aluminum areas to which wires will later be attached.
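
Because the same grow, coat, expose, etch, and dope cycle repeats once per mask, the overall flow of the process can be sketched schematically in a few lines of Python. This is only an illustration of the repetition described above; the step names are loose labels drawn from the text, not a real process recipe or process-control code.

    # Schematic sketch of the repeated masking cycle described above.
    # Step names are loose labels drawn from the text, not a real recipe.
    CYCLE = ["grow oxide", "coat photoresist", "align and expose mask",
             "wash away soft resist", "etch exposed oxide", "dope silicon"]

    def fabricate(num_masks):
        """Run the masking cycle once per mask layer (5 to 18 per the text)."""
        for layer in range(1, num_masks + 1):
            for step in CYCLE:
                print(f"mask {layer}: {step}")
        print("deposit metal interconnect, then protective vapox layer")

    fabricate(5)   # a simple IC at the low end of the 5-to-18 mask range

The sketch makes the key point of the paragraph visible: every additional mask repeats the entire cycle, which is why alignment errors on any one mask can ruin the whole wafer.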

Even though the procedure for creating a silicon chip seems a lot more complicated, it requires fewer and less expensive materials and is actually easier to perform than the original microchip design. This is the main reason silicon microchips are used almost everywhere in the world today.

As the journal states, about 76% of the population of New York City has some kind of home computer or laptop. The remarkable part is that most of these people do not realize that thousands of little microchips make these computers possible. If one were to take a laptop apart in an attempt to find and observe a microchip, one would not succeed: the features of these tiny chips are too small to be seen with the naked eye, so the chips are put under a microscope with a magnification strength of 100x. At that magnification, images are revealed in tiny spaces on the microchips that run our computers, cell phones, and video games. These images are sometimes a company logo or a favorite cartoon character like Mickey Mouse; some are pictures of industry inside jokes understood by a handful of designers, and some are unique, a sort of signature for the designer. Most people, therefore, know neither where these chips are nor what is drawn on them.

The future of the chip is very blurry at the moment; however, comic books and movies have given us plenty of ideas about what may come. There have been many theories that people will have microchips implanted in them from birth so that the government can track their every movement. In a society like the United States, however, such control of people's movements is not going to be accepted. Still, scientists are not far from developing chips that identify us for who we are. In the movie A Beautiful Mind, the mathematics professor had a microchip surgically implanted in his hand so he could read the number and enter it as a code. As absurd as it sounds, a professor at the University of Reading in England recently had a silicon chip transponder implanted in his arm. He did this to demonstrate how computers and humans might be able to communicate in the future, one of many possibilities that are getting closer and closer to becoming reality. The chip was implanted into Professor Warwick's arm in a surgical procedure that lasted only fifteen minutes. The chip is a commercially available product used in computers and other products for identification. Since the chip has been in his arm, Warwick has turned on computers just by coming into a room; doors open for him, and computers say "Hello, Professor Warwick," or tell him how many e-mail messages he has. Although the implant is temporary, the future possibilities of computer chips are endless. In the future, keyboards and mice will be of no use, predicted Warwick. "This is the communications revolution between humans and computers," he said.


Computer chips have affected our society in many ways. They have made life easier and faster. We can only imagine the kinds of things that lie ahead for these chips, but until then, our keyboards and mice will have to do what they do for us now.

