The transistor was the decisive breakthrough for a new and large family of materials used in technology and manufacturing. Semiconductors are crystalline materials whose electrical properties place them between conductors and insulators.
Initially, some uncertainty remained about how to manufacture transistors and diodes (components with three and two electrodes, respectively). The factors that determined how and why a transistor worked were only partially understood. The component's surface and its properties, for example, played an important role, as did the structure of the material itself, that is, its crystal structure.
Theoretical advances and the development of manufacturing methods therefore went hand in hand. An important step was taken in the early 1960s, when silicon replaced germanium as the dominant material. Silicon crystals had at first been difficult to manufacture; once that obstacle was overcome, the material's superior tolerance of heat could be exploited. Much later, another material, gallium arsenide, came into use, most often in optical applications. Today silicon carbide is increasingly used as well, but silicon remains the dominant material.
Over the years, manufacturers have learned to produce purer materials and much larger crystals, and manufacturing methods have been developed to exploit these gains. Crystals can be grown one atomic layer at a time, and the dopants that determine a component's characteristics can be injected by ion implantation. This makes it possible to specify component characteristics precisely, to achieve better production economy, and to fit more functions onto the same chip.
The ability to combine many functions in the same circuit rests on a breakthrough made independently in 1958-59 by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor. The technology behind this invention has variously been called microelectronics, integrated circuits, and planar technology.
Instead of making one transistor at a time, it was now possible to create many different components on the same piece of silicon, in the same manufacturing process: not only transistors but also diodes (rectifiers) and resistors. This method would eventually allow combinations of components so dense that a whole computer could be put on a single chip of silicon, as a microprocessor.
In addition to larger and better silicon crystals, another prerequisite for semiconductor development was the continuous improvement of photographic techniques, in particular photolithography. Many components were integrated on a single chip by photographically exposing layer after layer with patterns that could then be etched to form the various components, which now number in the millions on each chip. This rapid and sustained development was characterized early on by one of the pioneers, Gordon Moore, who together with Robert Noyce founded Intel in 1968. In its popular form, what is now called Moore's Law states that the cost of raw computing power halves every 18 months, a trend that has held for several decades.
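To make the arithmetic of that popular formulation concrete, the short Python sketch below compounds the 18-month halving over time. Only the 18-month period comes from the text; the time spans chosen and the function name are illustrative assumptions.

    # Illustrative arithmetic for the popular formulation of Moore's Law:
    # the cost of raw computing power halves every 18 months (1.5 years),
    # so the relative cost after t years is 0.5 ** (t / 1.5).

    HALVING_PERIOD_YEARS = 1.5  # the 18-month halving cited in the text

    def relative_cost(years: float) -> float:
        """Fraction of the original cost remaining after the given number of years."""
        return 0.5 ** (years / HALVING_PERIOD_YEARS)

    # Hypothetical time spans, chosen only to show the compounding:
    for years in (3, 15, 30):
        factor = 1 / relative_cost(years)
        print(f"after {years:2d} years, cost falls by a factor of about {factor:,.0f}")

On these assumptions, cost falls by a factor of four in three years, roughly a thousand in fifteen years, and roughly a million in thirty, which is what makes the trend so remarkable.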
Author: Bengt-Arne Vedin