The year was 1959, and Robert Noyce had created what many had only dreamed of: an invention that would revolutionize technology. This invention, the silicon-based integrated circuit, or microchip, would allow humans to control electrical energy more precisely and serve as the foundation of most electronic devices we know today. But the path to using silicon to create these small devices was not as easy as it might seem.
Silicon is an important component of many minerals and is the second most abundant element in the Earth’s crust, making up 27.7 percent of it. With the chemical symbol Si and the atomic number 14, it is the seventh most abundant element in the universe. Because of its abundance, silicon is considered one of the elements necessary for life as we know it.1 Although silicon is not found in its free state in nature, it does appear in various forms, including sand, quartz, and rock crystal, among many others. It is also found in outer space, most notably in the Sun and the stars.2

Silicon’s abundance on Earth has made it an important part of some of our most important creations. In Mesopotamia (modern-day southern Iraq), one of the very first cradles of civilization, people found several ways to use silica (the silicate mineral quartz, SiO2) to make glass.4 Because silica is very abundant and not chemically reactive, Mesopotamians were able to create concrete, brick, pottery, and glass, all important for civilization and development. One example of silica as an easy-to-use material is quartz sand, the principal ingredient in glassmaking; glass is one of the most inexpensive and versatile materials, with excellent mechanical, optical, thermal, and electrical properties.5 The early creation and widespread use of glass was a key step on the path to one of the most far-reaching inventions of the twentieth century: the microchip.
Although the glassmaking process continued to be refined, humans did not use silicon as a pure substance for thousands of years. It wasn’t until 1787 that Antoine Lavoisier (1743-1794), the famous French chemist, first identified silicon as an element in rocks. Despite its abundance, however, chemists after Lavoisier took a long time to study and learn to use this element. In 1823, almost 40 years after its initial identification, the Swedish chemist Jöns Jacob Berzelius (1779-1848) was finally able to isolate silicon as a metalloid element. This was not only a major discovery in the field of chemistry; it also opened the way for further research into silicon’s properties and potential applications.6

One of the most important innovations linked to silicon was the transistor. First built at Bell Labs in the late 1940s, these tiny semiconductor devices used the electrical properties of semiconductors, including silicon, to amplify and switch electronic signals and electrical power. Because of their small size and low energy demand, transistors were a huge improvement over their predecessor, the vacuum tube, which relied on thermionic emission from a heated filament. Initially produced and marketed by companies such as Raytheon and Western Electric, these small devices were used in countless military and consumer applications (think transistor radios) throughout the 1950s and 1960s. Ultimately, transistors opened the door to new technology companies in the 1950s.8,9
The transistor’s rise in the 1950s was propelled by the international confrontations of the mid-20th century, including World War II, the Korean conflict, and the resulting Cold War.10 These events, although terrible in many ways, mobilized some of America’s greatest scientific minds. Electronics and their applications were placed at the front line of the effort to defend the nation. Innovation rose from urgency, giving scientists and engineers the opportunity to explore the use of new materials, including silicon. One of the major outcomes of this research was the early digital computer, an invention that began as part of the war effort but was later marketed as a commercial product. As scientists worked out the basic design of a computer, they quickly identified the need for a better way to control electricity, and small, energy-efficient transistors were an obvious choice.
From the creation of the transistor and the need to commercialize it, researchers began to focus on building more compact yet more powerful devices. This “shrinking” of the transistor meant finding a way to fit more of them into an electronic product to increase its capability.12 For technology companies, it was important to make circuits that worked better, faster, and in far more complex ways than simply flipping a light switch. In 1959, Robert Noyce (1927-1990), an American physicist, used silicon to create the first monolithic integrated circuit, or microchip. This device was composed of a set of fast, small, and inexpensive electronic circuits fabricated as a single unit, allowing much finer control of electricity.13
In reality, it wasn’t only Robert Noyce who invented the microchip. Another electrical engineer, Jack Kilby (1923-2005) of Texas Instruments, independently invented a nearly identical integrated circuit at almost the same time. The success of these two scientists gave rise to the “Monolithic Idea,” which integrates all the parts of an electronic circuit into a single (“monolithic”) block of semiconductor material, in this case silicon. The availability of this resource allowed the two scientists to create what would become the building block of digital computer technology and the enabler of generations of innovation.14 One famous consequence of this invention was Moore’s Law: Gordon E. Moore, co-founder of Intel, predicted that the number of transistors that can be packed into a given unit of space would double about every two years, increasing chips’ speed and capability while cutting their cost roughly in half.15 Today, more than fifty years later, Moore’s Law continues to be a guide for technologists around the world, and the pace shows little sign of slowing. This prediction is what pushed society to transform computing from a rare and expensive experiment into an affordable necessity. Everyday technologies such as the internet, social media, and analytics all sprang from the semiconductor industry and the foundation of Moore’s Law.16

The use of silicon to create the microchip did more than just provide one of the greatest advancements in technology; it gave rise to what we have come to know as Silicon Valley in California, home to the $200 billion semiconductor industry.
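The doubling in Moore’s prediction is simple arithmetic to sketch. The snippet below is only an illustration, not data from this article: it assumes a commonly cited starting point (the 2,300-transistor Intel 4004 of 1971) and a strict doubling every two years.

```python
def projected_transistors(year, base_year=1971, base_count=2300):
    """Transistor count projected under a strict two-year doubling
    (Moore's Law), starting from base_count in base_year.
    The 1971 / 2,300 baseline (Intel 4004) is an illustrative assumption."""
    doublings = (year - base_year) // 2  # one doubling every two years
    return base_count * 2 ** doublings

# Under this model, each decade multiplies the count by 2**5 = 32.
for y in (1971, 1981, 1991, 2001):
    print(y, projected_transistors(y))
```

Real chips have not tracked this curve exactly year by year, but the compounding it captures is why transistor counts grew from thousands to billions within a few decades.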
Robert Noyce, as head of Fairchild Semiconductor Corporation in the 1960s, and of Intel Corporation shortly after, played a leading role in establishing the idea of Silicon Valley.18 The concentration of technology industries there created a center of innovation that inspired the founders of many technology companies we know today, such as Google, Apple, and Facebook, among many others.
Silicon, an abundant element and a fundamental component of many naturally occurring minerals, has become a key player in society as we know it today. From the earliest civilizations, humans used silicon in the form of quartz to make glass and other building materials that are still in use today. More recent developments have harnessed the physical and electrical properties of silicon to establish it as a driver of advances in computer technology. These advances, including the creation of the transistor, allowed better, faster electronics and applications to be developed. The power of silicon is undoubtedly far-reaching, not just because of its natural abundance, but as a driving force for the economy. So vital is this element that a whole region, Silicon Valley, has been named after it, housing some of the most well-known companies today. Although it may not be a household name, silicon can rightly be recognized as the building block of modern technology.