It has been a century since physicist Julius Edgar Lilienfeld, an immigrant to the United States, patented the idea of using a semiconductor material to make a transistor. A hundred years later, silicon microchips, some with tens of billions of transistors, are in everything from computers to cars to coffeemakers. They make our modern world possible. Now, with Artificial Intelligence, they are poised to run the world.
CHIPPED is divided into five parts. Each can be read independently, but there is an arc to the order.
It can take the better part of a year to make a microchip, and every step of the way can go wrong. This is a high-stakes, high-cost, highly subsidized business.
In the late 1980s, the government of Taiwan saw an opportunity to dominate the global microchip market and began investing heavily in a company called Taiwan Semiconductor Manufacturing Company (TSMC) to kickstart its domestic industry. Within a generation, TSMC had become a powerhouse, a one-stop-shop “foundry” producing chips designed by other companies from all over the world. It was a business model in perfect sync with the MBA wisdom of the day: “just in time” production. To improve bottom lines and shareholder returns, companies cut labor and warehouse costs by offshoring manufacturing to countries where products could be made more cheaply, importing only what was needed, when it was needed.
Then came the COVID pandemic, upending global supply chains. The savvy of the Taiwanese and the folly of everyone else soon became abundantly and uncomfortably apparent. By 2020, chips were in everything: not just phones and computers, but also cars, coffeemakers, toys and televisions. According to a study by investment bank Goldman Sachs, 169 industries were impacted by chip shortages.
To the list of air, water, food, shelter and energy, add chips as a basic necessity of modern life.
With a harsh lesson learned, the US took a page from Taiwan’s playbook and ramped up government spending through the CHIPS and Science Act to boost its own domestic production, providing tens of billions of dollars in government grants and loans to manufacturers. China, Korea and Japan have also spent vast sums of public money to ensure their positions as global chip players. Even so, one third of the world’s chips still come from TSMC in Taiwan, including 90% of the coveted Graphics Processing Unit (GPU) chips needed to run AI.
••••••••••••••••••
From the street, a chip factory (aka a “fab,” short for fabrication plant) looks like a giant warehouse: a boxy, nondescript, windowless fortress. In fact, the building is a multi-billion-dollar piece of cutting-edge technology designed to keep those precious wafers of pure silicon safe from dust, humidity, heat, cold, power outages, earthquakes and prying eyes during the long, delicate metamorphosis that turns them into chips. Assembly lines are automated. Air in “clean rooms” is changed out as often as six times per minute. Behind the scenes, an army of HVAC technicians ensures conditions are always just right.
Extreme ultraviolet (EUV) lithography is used to etch tens of billions of transistors, each measuring just a few nanometers across, onto tiny squares of silicon the size of fingernails to make the fastest, most advanced, most expensive chips: GPUs.
EUV lithography isn’t magic, but it’s close. First, a laser heats droplets of the metal tin to at least 10,000°F, turning the droplets into a plasma that emits extreme ultraviolet light. (In addition to solid, liquid and gas, plasma is a fourth state of matter, mostly found in stars.) This light is collected into a beam, shaped and redirected by a series of ultra-precise mirrors (ordinary lenses would absorb it), and finally used to etch the tiny transistors. Only one company, Netherlands-based ASML, makes EUV machines, which cost $200 million or more each.
Everything about chipmaking seems improbable, and also vulnerable. A single country, China, dominates the polysilicon market. A couple of mines in rural North Carolina provide the ultra-pure quartz sand used to make the crucibles essential to the production of wafers. A speck of dust can wreck a wafer. One company, based in the Netherlands, is the only source for the machines needed to etch nanoscale transistors onto the most advanced chips. Another company, in Taiwan, has cornered the global market on the sale of those chips.
No matter how much money governments spend on domestic production, in materials and know-how alike, it takes a world to make a chip.
SUPER DUPER
After a journey that began with rock in a quarry, included three trips through the fire to rearrange atoms just so, and ended with a spin through a fab where a beam of light emitted by a plasma made from tin etched transistors by the tens of billions onto slivers of silicon, the chips are finally ready for use.
Futurist Amy Webb sees these chips, and the AI they enable, as ushering in The Era of Living Intelligence. Others are more focused on a winner-takes-all race for superintelligence:
“…The country or the company that develops the system that is smarter than any human in the world—this is called superintelligence—can apply this to itself to get smarter and smarter and smarter. There are people who believe that such a system when it appears—and we believe it will appear somewhere in the next decade—will give…a country or company, an asymmetric power monopoly for decades to come.”
— Eric Schmidt, former CEO, Google, Amanpour and Company
“A country or a company.”
Last year, on the shores of the Mississippi River in Memphis, Tennessee, the world’s largest, fastest supercomputer, a dedicated AI training cluster called “Colossus,”* was built in four months by Elon Musk’s company, xAI.
It has 100,000 Nvidia H100 GPUs, at least as many Central Processing Units (CPUs), and all sorts of other chips and sensors as well. It has exabytes of data storage (one exabyte = one billion gigabytes) and an Ethernet network that runs 400 times faster than the fastest home network.
Colossus is a long way from ENIAC, the room-size computer with 18,000 vacuum tubes that dazzled the world nearly 80 years ago. Still, there is a family resemblance: inscrutable facades that mask a computational fury within and an organized chaos of color-coded wires on the backside.
Estimates for building “Phase One” start at $3 billion. In December, plans were announced to expand Colossus ten-fold to a million GPU chips.
xAI, a private company, is well on its way to developing Artificial General Intelligence (AGI), an AI capable of performing at a human cognitive level. Next comes superintelligence, an AI that outperforms humans.
If xAI becomes the first to develop superintelligence and Eric Schmidt is right, then a private individual, the world’s richest person, who has a majority stake in xAI, will have an “asymmetric power monopoly for decades to come.”
Colossus is one of several million-chip supercomputers currently in development, although the identities of the companies building the others have, at least for now, not been publicly revealed.
DATA CENTERS
Technically, Colossus is a data center: a building filled with computer servers, designed to keep those servers, and the precious chips within them, dust-free and thermally happy. Most data centers, though, are in the business of renting space to server farms, of which there can be several within a single center.
A server farm is simply a stack of servers, aka “the cloud.” Server space is rented out to companies whose business operations require more computing power than their on-site computers can handle.
Many of these businesses, in turn, sell “Software as a Service” (SaaS): a tax accounting service, for example, or a streaming video network. Included in the fees these companies charge their customers is the rent for server space, which is to say, rent for chips.
Digital infrastructure is not the glamorous side of tech, but it is a good business. With AI, it has become a great business, the investment opportunity of a lifetime, according to Sean Klimczak, Blackstone’s global head of infrastructure. The private equity giant owns more data centers than anyone else, with $70 billion in assets and another $100 billion in the pipeline.
“In our view, the intersection of digital infrastructure and the need for power is one of the most exciting and critical investment themes of our time. As AI continues to evolve, the demand for data centers and power will only grow, creating a wealth of opportunities…
…But it’s not just the sheer amount of data that’s growing—it’s the intensity of the data being processed. Traditional tools, like Google searches, are lightweight in terms of power consumption. Conversely, a ChatGPT query requires 10 times the power of a Google search and AI-generated images using tools like DALL-E require 50 times the power of a simple Google search. And if you ask SORA to create a video? We’re talking 10,000 times the power consumption.
To put that into perspective, creating a basic AI-generated video is the energy equivalent of charging your phone 119 times. As AI applications become more advanced and widespread, we are just beginning to scratch the surface of what I call the next wave of data intensity.” — Sean Klimczak, The Convergence of Data Centers and Power: A Generational Investment Opportunity | The Connection
The opportunity, then, is two-fold: data centers and energy. For Klimczak, it is all upside.
Or is it?
CHIPPED: Part 3 | data centers NIMBY, power play
* Did Elon Musk name his supercomputer Colossus as a nod to the Colossus of Rhodes, a massive statue considered one of the seven wonders of the ancient world? Was he simply being literal? (Dictionary definition: “a person or thing of immense size or power.”) Could it be a reference to a set of computers (Colossi) developed by British codebreakers during WWII? Or is it possible he was inspired by Colossus, a science fiction novel by D.F. Jones published in 1966?
From the book blurb:
Charles Forbin has dedicated the last ten years of his life to the construction of his own supercomputer, Colossus, rejecting romantic and social endeavors in order to create the United States' very first Artificially Intelligent defense system…
No surprise, things do not go as planned, or there would be no story.
In 1970, the book was turned into a movie. Two years after Stanley Kubrick’s 2001: A Space Odyssey gave the world HAL, a psychotic, lip-reading AI that determined humans were the problem, Colossus: The Forbin Project followed up with an Earthbound supercomputer that quickly came to the same conclusion.
They built Colossus, a computer with a mind of its own. Then they had to fight it for the world.