Energy Realities at the Nexus of Techno-Optimism
The annual COSM gathering of technologists, hosted by George Gilder, offers a glimpse of a future that excites many and frightens some. It is an optimistic view, quite different from the one offered by many short-sighted forecasts. Whatever one's technological perspective, one thing that will not change in the future is the central, indispensable role of energy.
Energy physics ties it all together. That applies to each of this year's COSM themes: the emergence of artificial intelligence in the wild, the potential of graphene as a genuinely new and revolutionary class of material, and China's role on the world stage.
This may be obvious, but society, life as we know it, indeed the universe itself would not exist without energy. Without getting too philosophical, it is inescapable in Gilder's COSM framing that all possible futures occur at the intersection of information, atoms, and energy. As Gilder has said before, the difference between our age and the Neanderthals' lies only in what we know about the building blocks of nature. Today we have vastly more information about the same atoms and the same forces that have always existed.
It is our ability to endlessly expand the stock of information, and to use it to rearrange nature's atoms in novel and near-miraculous ways, that enables humanity to create every current and future product and service. But acquiring and processing that information, and using it to rearrange atoms, always requires energy. Energy is consumed by every innovation, product, and service that makes life interesting, safe, convenient, fun, and even beautiful.
And throughout history, inventors have kept finding new things to make, each of which takes energy to produce. The arrival of materials such as alloys, polymers, pharmaceuticals, and monocrystalline silicon created new demand for the energy needed to manufacture them. Likewise, the invention of machines built from those materials, such as cars, airplanes, and computers, created new demand for the energy needed to run them.
One example of this energy inevitability is found in the machinery of computing. All software, even virtual reality, requires real machines that consume power to perform logic. This may seem obvious, but consider that the global cloud now uses roughly as much energy as global aviation, and the former is growing far faster than the latter.
This year's sessions focused on AI, a new way to put silicon engines to work. Although AI has been around for a while, November 30, 2022, the day ChatGPT was released to the public, is arguably the date AI truly arrived.
AI represents the most energy-intensive use of silicon yet. In energy terms, it is the transition from the age of steamships to the age of jet aircraft. The latter democratized personal travel around the world because it is far faster and more comfortable, and thus more efficient in the sense that it conserves our most precious commodity, time. But flight is, of course, a more energy-intensive way to move anything. The same is true of artificial intelligence.
Artificial intelligence involves a training phase, the machine learning itself, followed by an inference phase that puts the learned model to use. Relevant for anyone concerned with society's energy use, both training and inference consume a lot of energy. For example, a simple machine-learning model trained several years ago to solve a puzzle, the Rubik's Cube, consumed enough electricity to drive a Tesla a million miles. And for many real-world problems, training is not a one-time event. Then, after training, comes inference, which, although less energy-intensive per use than training, is performed repeatedly, sometimes continuously, and can add up to more total energy than training itself. Consider the analogy: the energy needed to smelt aluminum and build an airplane, versus the fuel needed to fly it year after year.
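The training-versus-inference point can be sketched with a toy calculation. All the numbers below are hypothetical placeholders, not measurements of any real model; the point is only that a modest per-query cost, multiplied by heavy usage, quickly overtakes even a large one-time training cost.

```python
# Toy comparison of one-time training energy vs. cumulative inference
# energy. ALL numbers are hypothetical assumptions for illustration.

TRAIN_ENERGY_MWH = 1_000        # assumed one-time training cost
INFER_ENERGY_KWH = 0.003        # assumed energy per query (3 Wh)
QUERIES_PER_DAY = 100_000_000   # assumed usage at scale

def days_until_inference_exceeds_training():
    """Count the days of inference needed to exceed the training cost."""
    daily_infer_mwh = INFER_ENERGY_KWH * QUERIES_PER_DAY / 1_000  # kWh -> MWh
    days, total_mwh = 0, 0.0
    while total_mwh <= TRAIN_ENERGY_MWH:
        total_mwh += daily_infer_mwh
        days += 1
    return days

print(days_until_inference_exceeds_training())
```

Under these placeholder numbers, inference overtakes training within days; with lighter usage it could take years. Either way, the recurring cost dominates over a deployed model's lifetime, which is exactly why the aluminum-smelter-versus-jet-fuel analogy fits.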
The full impact of AI on energy consumption remains to be seen, because both the software and the hardware are still in their infancy. We are at roughly the stage personal computing was in around 1980. Dystopians fear artificial intelligence, but it is an exciting and important new tool that will unlock all manner of innovation, not just self-driving cars and robots. There will be new ways to increase productivity and original inventions, many of them as yet unimaginable. And the infrastructure built to democratize AI will, to borrow Andreessen Horowitz's phrase, eat the world.
At a recent meeting of power-company executives, Elon Musk politely chided them for underestimating future electricity demand. He was referring not mainly to electric cars but, above all, to artificial intelligence. For perspective, the global cloud already consumes ten times more electricity than all the world's electric cars combined. Even if electric-vehicle adoption grows as the market forecasts, the cloud's demand for electricity will outpace it, especially as AI hardware is rapidly added to cloud infrastructure.
A common response to observations about the power appetite of computing, and of AI in particular, is that engineers will make silicon technology more efficient. Of course they will. But efficiency does not slow the growth of energy demand; it fuels it. This phenomenon is called the Jevons paradox, and information systems are a striking example of it.
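The Jevons dynamic can be made concrete with a toy model. The elasticity value below is an illustrative assumption, not data; it simply encodes the idea that cheaper computation induces disproportionately more demand for computation.

```python
# Toy model of the Jevons paradox. The demand_elasticity value is an
# illustrative assumption: a value above 1 says that every 10x drop in
# the cost per computation induces more than a 10x rise in computations
# demanded.

def total_energy(efficiency_gain, demand_elasticity=1.5):
    """Total energy used, relative to a baseline of 1.0."""
    demand = efficiency_gain ** demand_elasticity  # computations demanded
    return demand / efficiency_gain                # energy = demand / efficiency

print(total_energy(10))      # 10x more efficient -> ~3.2x MORE energy
print(total_energy(1_000_000_000))
```

With elasticity below 1 the paradox disappears and efficiency really would cut energy use. The historical record of computing, a billionfold efficiency gain accompanied by vastly more total energy use, is the elasticity-above-1 case.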
Over the past 60 years, the energy efficiency of logic machines has improved more than a billionfold. That is why there are now billions of smartphones and thousands of warehouse-scale data centers. At the computing efficiency of circa 1980, a single smartphone would use more electricity than the building you live in, and a single modern data center would need the entire US electrical grid. In other words, without that incredible gain in computing efficiency, there would be no smartphone or cloud era.
Now add a rare discovery made two decades ago: an entirely new and revolutionary class of material called graphene, unimaginably thin sheets of pure carbon, only a few atomic layers thick, with seemingly miraculous properties. Graphene is beginning to find its way into a number of commercial products. One possibility is to use graphene as a more efficient base material to replace silicon in computer chips. Count on that to turbocharge the Jevons paradox.
Graphene has other remarkable properties, both structural and biological. Notably, it is stronger than steel. And in biochemical applications, it shows promise for promoting the regrowth of damaged nerves. Graphene is one of many new materials emerging from the labs, though perhaps the most extraordinary.
But back to our theme: producing anything from any element takes energy. Compared with the centuries before modernity, when almost everything was built from a handful of materials, mostly stone, wood, and animal parts, the materials of our era take far more energy per kilogram to make. Replacing wood with a polymer, a material far more versatile and widely used, including in medicine, increases the energy needed per kilogram produced by roughly 10 times. Replacing polymer with aluminum multiplies the energy per kilogram by another 10 times. And semiconductor-grade silicon requires about 30 times more energy than aluminum; making one kilogram of such silicon takes about 100 times more energy than making a kilogram of steel. The world produces kilotons of that silicon, not just for computer chips but also for solar cells, consuming energy that could otherwise produce megatons of metal.
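The multipliers above imply a simple energy-per-kilogram ladder. The sketch below uses only the ratios stated in the text, normalized to wood = 1; treating the steps as multiplicative is my assumption.

```python
# Relative energy per kilogram implied by the text's multipliers,
# normalized to wood = 1. The ratios are the ones stated in the text;
# chaining them multiplicatively is an assumption.

energy_per_kg = {"wood": 1.0}
energy_per_kg["polymer"] = energy_per_kg["wood"] * 10      # wood -> polymer
energy_per_kg["aluminum"] = energy_per_kg["polymer"] * 10  # polymer -> aluminum
energy_per_kg["silicon"] = energy_per_kg["aluminum"] * 30  # aluminum -> silicon

# Silicon is also said to take ~100x the energy of steel, which places
# steel on this scale at:
energy_per_kg["steel"] = energy_per_kg["silicon"] / 100

for material, e in sorted(energy_per_kg.items(), key=lambda kv: kv[1]):
    print(f"{material:<8} {e:>6g}x wood")
```

On this scale, semiconductor silicon sits three and a half orders of magnitude above wood, which is the whole point: the more advanced the material, the steeper its energy price per kilogram.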
As for graphene, we are still in the early stages of figuring out how to produce it at scale. George Gilder argues that graphene may be approaching its "aluminum moment," a reference to 1886, when inventors finally found an efficient, high-volume way to produce what had been an attractive but prohibitively expensive material: pure aluminum once cost more than gold.
The technical literature, however, suggests that producing graphene is more akin to producing silicon than aluminum. So I would argue that graphene is not at its "aluminum moment" but at its Czochralski moment. The Polish metallurgist Jan Czochralski discovered, by accident in 1916, how to grow single-crystal material from a molten pool. That discovery led directly to the commercial process for monocrystalline silicon, perfected at Bell Labs in 1949, 33 years after the accidental discovery. Without monocrystalline silicon there would be no silicon computing era. If graphene takes as long to go from accidental discovery to viable commercial process, we will be waiting another decade or more. But today, armed with modern arrays of materials, machines, and data, including AI supercomputers, that timeline could be shortened considerably.
The first companies and countries to achieve commercially viable graphene at scale will reap the real rewards. Which brings us to China and energy, the third of COSM 2023's three themes.
Consider the state of play for several key materials, all of which require large amounts of energy to produce and all of which, in parallel, are fundamental to the machinery of both production and consumption.
China produces more than 60 percent of the world's aluminum. It refines more than half of the world's copper, the base material for electrical products, and 90 percent of the world's refined rare earths, which are used in many electric motors and generators and are essential in many high-tech applications, including solar panels and wind turbines. China also produces 90 percent of the world's pure gallium, the feedstock for the gallium arsenide semiconductors used in many technology products, notably lasers and light-emitting diodes; 60 percent of the world's refined lithium and 80 percent of the world's refined graphite, both used in every lithium battery; and 50 to 90 percent of the many chemical formulations and polymer components needed to make those batteries. There is more, but you get the point.
China is not afraid of energy-intensive industries; it decided two decades ago to become their principal supplier. That leadership emerged at the intersection of three kinds of policies: first, policies that encourage engineers to study the unglamorous fields of basic chemistry, electrical engineering, and materials, disciplines that rank as second- or third-tier priorities here; second, policies that encourage and accelerate, rather than resist and hinder as we do in the United States, the build-out of large-scale chemical and industrial facilities; and third, policies that ensure reliable supplies of low-cost energy to power those industrial plants. The last of these explains why two-thirds of China's electric grid runs on coal.
Now comes the Inflation Reduction Act (IRA), the largest industrial-policy spending package in American history. There is no mystery about the primary purpose of most IRA spending: to reduce the country's carbon dioxide emissions by encouraging an energy transition away from hydrocarbons. Whatever one believes about climate change and carbon dioxide, two facts at the intersection of technology, policy, and energy are worth keeping in mind.
The first fact.
If the government's estimates are correct, the roughly $2 trillion of IRA spending not yet deployed would reduce US CO2 emissions by about one gigaton. Set aside the inflated cost per ton of that theoretical reduction; the larger issue is that China is still building more coal-fired power plants, and President Xi has made clear that more are coming. That means China will retain its energy advantage in energy-intensive industries for decades. It also means that when those additional coal plants are completed, they will add roughly two gigatons to the world's CO2 emissions. Meanwhile, to eliminate one gigaton here, a large share of the $2 trillion in US taxpayer money targeted by the IRA will go to purchasing from China the energy-intensive materials needed to build wind, solar, and battery facilities.
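The arithmetic behind these two numbers is worth making explicit, using only the figures cited above.

```python
# Back-of-envelope using only the figures cited in the text.

IRA_SPEND_USD = 2e12       # ~$2 trillion of IRA spending not yet deployed
US_REDUCTION_GT = 1.0      # projected US CO2 reduction, gigatons
CHINA_ADDITION_GT = 2.0    # CO2 added by China's new coal plants, gigatons

cost_per_ton = IRA_SPEND_USD / (US_REDUCTION_GT * 1e9)   # 1 Gt = 1e9 tons
net_change_gt = CHINA_ADDITION_GT - US_REDUCTION_GT

print(f"implied cost: ${cost_per_ton:,.0f} per ton of CO2 avoided")
print(f"net change in global emissions: +{net_change_gt:g} gigaton")
```

That works out to $2,000 per ton of CO2 avoided, and global emissions still rise by a net gigaton.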
And the second fact worth remembering.
Keep China's centrality to energy-transition supply chains in mind while considering where the world stands after two decades and roughly $5 trillion of cumulative global spending on wind and solar power, all of it aimed at displacing the hydrocarbons: coal, oil, and natural gas.
That spending has reduced hydrocarbons' share of world energy, but only by about two percentage points. Hydrocarbons still supply 82 percent of the world's energy, while solar and wind combined currently supply less than 4 percent. For perspective, burning wood still supplies about 10 percent. Meanwhile, over the past two decades, the absolute quantity of hydrocarbons the world consumes has grown by an amount equivalent to adding six Saudi Arabias' worth of oil production.
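Again, the figures above imply a simple scorecard (all inputs are the numbers cited in the text).

```python
# The transition scorecard implied by the figures in the text.

SPEND_USD = 5e12           # two decades of global wind + solar spending
SHARE_DROP_PP = 2          # hydrocarbon share drop, percentage points
HYDROCARBON_SHARE = 82     # percent of world energy from hydrocarbons
WIND_SOLAR_SHARE = 4       # percent (an upper bound; text says "less than")

cost_per_point = SPEND_USD / SHARE_DROP_PP
ratio = HYDROCARBON_SHARE / WIND_SOLAR_SHARE

print(f"~${cost_per_point / 1e12:g} trillion per percentage point displaced")
print(f"hydrocarbons supply at least {ratio:g}x what wind and solar do")
```

Roughly $2.5 trillion per percentage point of hydrocarbon share displaced, with hydrocarbons still supplying more than twenty times what wind and solar do.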
The data show that "energy transition" spending has so far delivered a poor return. We may choose to spend still more money on developing non-hydrocarbon energy machines, and perhaps that is where policy is headed. But neither the physics of energy materials nor the dominance of the main suppliers of those materials can be legislated away.
The 2023 COSM meeting showcased revolutions, some of them truly impressive and game-changing, in both computing and materials. But these are not revolutions in energy production; they are engines of expanding energy demand.
Fundamental change in the nature and scale of energy-producing machinery still awaits some as-yet-unknown discovery, insight, or serendipitous invention. Such progress is inevitable, but, to borrow Bill Gates's word, revolutionary progress of that kind is "unpredictable."
It is safe to assume, however, that AI infrastructure will keep expanding and that someone will soon find ways to mass-produce graphene. And given the geopolitical situation, we can also expect some common sense to return to the energy arena, perhaps sooner rather than later.