To stay up to date on all the news that you need to know, there's no better p...
And there's no better way to enjoy the DSR network than by becoming a member.
Members enjoy an ad-free listening experience, access to our Discord community, exclusive content, early episode access, and more. Use code INDSR26 for a 25% discount on sign-up at thedsrnetwork.com. That's code INDSR26 at thedsrnetwork.com. Thank you, and enjoy the show. Welcome to the AI Energy and Climate Podcast, a special series from the DSR network hosted by David Sandalow, inaugural fellow at Columbia University's Center on Global Energy Policy.
Join us as we talk with leading experts to explore the intersection between these critical issues that will impact the future of each and every one of us.
I am David Sandalow, and this is the AI Energy and Climate Podcast. The International Energy Agency in Paris just released a new report on energy and AI, and it's a treasure trove of information.
The report, which is called Key Questions on Energy and AI, builds on a landmark study released by the IEA last year, updating and expanding its analysis.
The new report is global in scope and looks at how much energy AI is using, where that energy is coming from, and how AI is transforming the energy sector. It's an important resource, providing authoritative information and diving deep into topics including the role of natural gas, nuclear power, and batteries in AI data centers; the strains caused by the clustering of data centers in certain locations; the future of physical AI, such as robotics and autonomous vehicles; and much more. The report was prepared by a large team led by two senior energy experts at the IEA, Thomas Spencer and Siddharth Singh.
I recently had a chance to talk with the two of them about their new report. I hope you enjoy our conversation.
Thomas Spencer and Siddharth Singh, it's great to see you. Congratulations on the release of your new report, Key Questions on Energy and AI, and welcome to the show.
David, great to be with you. Thanks for the invitation. And thanks, David, it's a pleasure to be here. Well, thanks for joining us. So your report covers a wide range of topics. You look at how much electricity data centers are consuming, where that power is coming from, what impact AI is having on energy systems, and a variety of related topics. What are some of your headline conclusions? You know, David, that's a dangerous question, because it could become a long answer.
If you ask an open leading question like that to two researchers, they will bang on for a long time, but we'll try and distill it down.
You know, I think the first key message from the report was really that AI is still moving incredibly rapidly.
It's not slowing down, and that's having implications for the energy sector. Whether it be in terms of, you know, the increase in capital expenditure by hyperscalers on data centers, progress in AI itself with the rise of AI agents and increasingly energy-intensive modes of AI, or improvements in energy efficiency.
It's just really moving incredibly rapidly. And then, you know, we see this, and this was the second key message coming out of the report.
We see this increasingly coming up against physical constraints. Some of these are in the energy sector, you know, in particular long wait times to connect to the grid, but also increasingly in energy technology supply chains, and we can get into that a bit more if you're interested, whether it be in terms of turbines or batteries or transformers. But also, you know, physical constraints in the manufacturing supply chain for some of the IT equipment that needs to go into data centers.
And so when we add those two together, we see a huge pressure for innovation to get around these bottlenecks.
So I think a key finding from the report was that AI and data centers are als...
You know, we put it in the report: while still an energy taker, AI is increasingly also an energy maker.
You know, the tech companies are becoming active participants in electricity markets through their procurement strategies, and really reshaping the market and technology mix in the electricity sector. And so if we zoom out a bit and think about the broader implications of this, it really does offer an opportunity for, you know, the countries and companies that capitalize on this to modernize the electricity system and to drive forward innovation in electricity technologies. And that's a big opportunity.
There's so much to dig into there, Thomas. Let's just start with how much power data centers are using.
I think you say about 1.5% of total global power demand, if I recall, is being used by data centers.
Will you elaborate on that, and on how that's distributed globally? You've got some pretty interesting observations on that. Sure. So David, as you rightly mentioned, about 1.5% of global electricity consumption currently goes to data centers. That's approximately 500 terawatt-hours as of last year.
You also rightly point out that it's actually not very evenly distributed spatially. So we did this geospatial analysis of all the commercial data centers globally, over 11,000 of them.
And we found a few interesting geospatial trends. Firstly, they tend to cluster around each other. There are obvious network effects and advantages that come from data centers being around infrastructure that caters to their needs, including infrastructure for telecommunications, for electricity, and so on. Data centers also tend to cluster around cities, unlike a lot of other energy-consuming infrastructure, which can be more spatially distributed.
For example, aluminum smelters or steel plants. Data centers can consume as much electricity as them and still be close to each other and close to big cities, which are already big centers of energy demand. So as a result, even though that 1.5% may seem small today, in areas where these data centers are clustering, it can be as high as 25 or 30% of electricity consumption going to data centers. That's where the challenge really lies. And we find that this clustering trend actually isn't slowing down.
All the data centers that are currently in the pipeline, meaning the ones that are either under construction or have applied for initial permits but not started construction yet, will create new clusters as well. So the clustering phenomenon of data centers only increases as we go forward from here. And these data centers, of course, are becoming larger and larger. A typical AI hyperscaler data center today might consume as much electricity as 100,000 households.
This increases to, you know, well over 2 million households for the largest ones that are under construction today.
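As a rough back-of-envelope check on those household equivalents (the per-household average draw used here is an assumption for illustration, not a figure from the conversation):

```python
# Back-of-envelope: express data center power in household equivalents.
# Assumed figure (not from the transcript): an average home draws roughly
# 1.2 kW on a continuous-average basis (~10,500 kWh per year).
AVG_HOUSEHOLD_KW = 1.2

def household_equivalent_mw(households: int) -> float:
    """Continuous power draw, in megawatts, of `households` average homes."""
    return households * AVG_HOUSEHOLD_KW / 1000.0

typical_site = household_equivalent_mw(100_000)    # ~120 MW: a typical AI hyperscaler site
largest_site = household_equivalent_mw(2_000_000)  # ~2,400 MW (2.4 GW): the largest builds
print(typical_site, largest_site)
```

On these assumptions, a "100,000-household" campus is on the order of 100 megawatts, and the largest sites under construction push past 2 gigawatts, which is why a single cluster can dominate a local grid.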
Yeah, I thought one of the very valuable parts of your study is that it's global, and you look everywhere. The figure of 1.5% of global electricity demand seems small, and it's initially not obvious why this phenomenon would be getting so much attention in some localities if it's such a small percentage of global power demand. You really bring out the importance of clustering: although it's 1.5% of global power demand overall, for data centers in a number of geographies it's very significantly more than that,
putting significant strains on electric grids. You had a very interesting discussion, I thought, about power density and the implications of technological advances in that area.
Something I'd never seen before is the statement that with the newest generation of server racks coming out of NVIDIA,
in particular with the Vera Rubin architecture,
you find that the amount of power being used by a server rack about the size of a refrigerator equals the amount of power being used by 65 households, which is, I think, a pretty striking finding.
I wonder if you have anything to add on that front. I think, David, this is a good example of a trend that is being driven by AI and that will have an impact on energy technologies and energy innovation.
First of all, why are data centers and AI data centers in particular becoming...
That's really driven by the needs of AI.
So the largest models today are too big to fit on a single chip.
They need multiple chips working in tandem in order to train them and run them. And those chips need to be spatially proximate to each other in order to reduce data losses and latency, so they can really work as a single computing unit. That means you're getting more and more chips packed close together in these latest AI server racks. And we show in the report that, you know, in 2020, before ChatGPT, the average power density of an AI rack was roughly 15 kilowatts per rack.
15 kilowatts is roughly equivalent to the power draw of about five ovens running at the same time. The next generation is going to move up to 600 kilowatts per rack, and the size of the rack hasn't changed. As you mentioned, 600 kilowatts is, you know, nudging up towards the peak power draw of 65 households. And this really has implications for the power delivery architecture within data centers.
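The rack-density figures quoted above imply a roughly 40-fold jump in about five years. The sketch below just restates that arithmetic; the per-household peak draw is an assumption chosen to match the 65-household comparison, not a number from the report:

```python
# Rack power density, per the figures in the conversation.
RACK_2020_KW = 15    # average AI rack circa 2020, before ChatGPT
RACK_NEXT_KW = 600   # next-generation rack (same physical footprint)

growth_factor = RACK_NEXT_KW / RACK_2020_KW  # 40x in roughly five years

# Assumed peak draw of a single household, picked so 600 kW ~ 65 homes.
HOUSEHOLD_PEAK_KW = 9.2
homes_per_rack = RACK_NEXT_KW / HOUSEHOLD_PEAK_KW
print(growth_factor, round(homes_per_rack))  # 40.0 65
```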
We need to move up towards higher-voltage delivery, and in some cases even move from alternating current to direct current within the data center. That has implications for the kind of transformers that are needed, but also for technologies like power electronics. Power electronics are semiconductors that are used not for information processing but for power control and power quality.
So we can use semiconductors to step the voltage of power distribution up or down, and so on. And as these data centers become more and more power dense, we will need these technologies to become more and more sophisticated. This has implications, you know; it could help to commercialize technologies like solid-state transformers, which can be very interesting elsewhere in the energy sector, whether it be in the next generation of distributed energy grids, with batteries and solid-state transformers at the distribution level.
But it also has implications for supply chains, because some of the power electronics that data centers need are semiconductors based on critical minerals like gallium, the supply chain for which is very concentrated in a single country, namely China.
So I think one of the big findings from our report is that we need to pay more attention to this electricity technology supply chain.
And the innovation stresses that data centers are placing on it, but also the kind of geopolitical risks that this could create.
We thank our sponsors for their support. We thank everybody who is supporting this podcast, and we look forward to it developing and growing over time, because the issue is so important.
A lot of the hyperscaler companies, so your Googles, your Metas, your Microsofts, and so on, have pretty ambitious targets for reducing the emissions from their operations. And so they are looking to contract with power suppliers that provide low-emissions electricity. Traditionally, that has been renewables, and that is continuing as well. They do procure a lot of renewables.
But in the last 12 months, we've also seen a number of power procurement contracts for nuclear power. What we've seen so far is largely existing nuclear power plants, particularly in the United States, where data centers are contracting to offtake the power, and in return for the demand certainty, the nuclear plant operators are able to make investments in extending lifetimes, in refurbishing the plants, even in reopening them in some cases. So we track about seven gigawatts of power purchase agreements that have been signed between data center operators and existing nuclear power plants.
And that's interesting because, you know, the IEA has always said that life extension of existing nuclear power plants, in those economies where they currently operate, is a very important strategy for energy security, and data centers are providing an impetus to that.
But we also see, you know, data centers really driving forward the growing interest in small modular reactors. So in our report from April last year, 2025, we tracked about 25 gigawatts of what we call conditional offtake agreements for small modular reactors. These are conditional in the sense that they're not legally binding on either party at this point in time. They can be an expression of interest, they can be a memorandum of understanding, they can be a commitment to procure power if certain conditions are met, because small modular reactors are still a technology going through the commercialization phase. That was as of April last year. The interesting thing is that since then we've seen the first firm power procurement contracts between data centers and small modular reactors: about 750 megawatts, where there is now a legally binding contract on both sides to deliver the power and where construction is underway. So that's quite a transformation. Of course, we don't expect them to come online in a big way until around 2030.
But the other development is just continued growth in the pipeline of conditional offtake agreements for small modular reactors. Compared to 25 gigawatts as of April last year, in April this year we now have 45 gigawatts of conditional offtake agreements, which is really significant because, you know, the whole promise of small modular reactors was that you can manufacture them in series and reduce the manufacturing costs. So if you can get a big pipeline of projects, that can really help to commercialize this technology.
Yeah, I think it's fair to say that this industry is driving the development of that technology, of SMRs, more than any policies have, and it's a very striking trend. You also talk about a very hot topic in my country, the United States, which is onsite generation using natural gas. Talk a little bit about that. Why is it getting so much attention here in the U.S.?
Yeah, so I'm happy to say a few words about, you know, some of the technical drivers for this trend and some of the challenges, and maybe Siddharth can say something about the geospatial analysis that we did. So you're right, you know, this is not a topic that we actually addressed in our report from April last year. It's something that's really emerged in a big way since then, with, you know, nearly 70 gigawatts of projects in the pipeline
in some form or another. And the big driver of this is, you know, first of all, grid connections take a long time, and for data centers, time to market is probably the key decision criterion. So data centers are looking for alternatives to grid connections, and onsite prime power is one option. They're also exploring hybrid models where you have
Some onsite power, a grid connection, maybe a battery and that supplies the p...
data center. Or they're exploring natural gas onsite power as a bridge strategy while they wait for a grid connection to come. There's no doubt that this is happening, but there are also a number of challenges that we highlight in the report. You know, data centers, particularly for AI, need very high uptime. And so, to get the reliability that data centers need with onsite
natural gas, you need to overbuild the natural gas facilities. So that if one turbine goes out,
you have some redundancy in the system that can make sure you maintain reliability. So we estimate that you would need to overbuild by roughly 30 to 70%, depending on the conditions, to meet the reliability requirements of data centers. And at the same time, you know, something we highlight in the report is that data center loads are increasingly variable with AI. They're not stable baseload anymore; they vary quite a lot. And so, this means that
complementary technologies are critical if onsite gas power is to power data centers.
They include energy storage in various forms within the data center, including batteries. And so we're finding that, you know, with this rise in onsite gas, the power couple, if I can put it that way, is really onsite gas plus batteries, to keep the level of reliability that is needed.
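The 30-to-70% overbuild mentioned a moment ago can be illustrated with a simple N+x redundancy calculation. This is a toy model, not the IEA's actual reliability methodology; the turbine size and redundancy level below are assumptions chosen for the example:

```python
import math

def units_required(load_mw: float, unit_mw: float, spares: int) -> int:
    """Turbines to install: enough to carry the load, plus `spares` for redundancy."""
    return math.ceil(load_mw / unit_mw) + spares

def overbuild_percent(load_mw: float, unit_mw: float, spares: int) -> float:
    """Installed capacity in excess of load, as a percentage."""
    installed_mw = units_required(load_mw, unit_mw, spares) * unit_mw
    return (installed_mw / load_mw - 1.0) * 100.0

# A 200 MW data center served by 50 MW turbines with two spare units:
# 4 units carry the load, 2 are redundancy, so 6 installed = 50% overbuild,
# squarely inside the 30-70% range cited in the report.
print(overbuild_percent(200, 50, 2))  # 50.0
```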
And then the final thing that we highlight in the report is that if onsite gas is a way to achieve time to market faster than what grid connections can provide, the problem may be the wait times for natural gas turbines. We highlight in the report that, you know, natural gas turbine orders increased 70% last year. They're now at their highest level since 2000, which was right in the middle of the global dash for gas. And wait times extend out to 2030.
So if you start developing an onsite natural gas project now, you might not get your turbine until 2030 or beyond. And so it's not necessarily the case that this would automatically be, you know, faster than waiting for a grid connection. All of this to say, you know, this is a trend that's happening, but these caveats also mean that it's not an excuse to slow down on reforming the way that we permit data centers to connect to the grid and on accelerating the build-out of transmission and so on, because onsite is not going to be a silver bullet either. I'll just add to what Thomas said. A couple of points. Firstly, drawing on the fact that gas turbine orders have really surged over the last couple of years: when we did an analysis of where these gas turbines are located, or where they're supposed to be located once they're eventually delivered, a large chunk of them tend to be in what we call data center hotspots. These are states that currently account for over 70% of data center capacity within the United States. But what's also interesting is that there's a non-trivial share of them where the locational information is unknown. As we tried to dig into why that was, it was interesting for us to find out that it was basically data center operators trying to keep the information on locations hidden, for a variety of reasons, including competitive reasons and so on. Nonetheless, we tried to draw on the analysis and do a little geospatial survey just to understand how quickly these onsite gas projects are progressing on the ground. And we find that, in fact, over the last year or so, the share of completed projects, in gigawatt terms, has remained pretty much constant, whereas over 80% of the total capacity that has been ordered is still in the pre-construction phases. They're still looking at land acquisition, clearing land, and so on. In other words, we find that even though gas turbine orders are surging, the action on the ground still, of course, follows physical laws and regulation and the need for permitting, and construction times tend to be much slower. You made an interesting point in the report, too,
that although there's a big backlog in natural gas turbines, most of the orders are not actually from the data center sector; they're from other parts of the industrial world. I think this is an important point. You know, the United States was about 45% of gas turbine orders last year. Elsewhere, data centers are not a demand driver. And even in the United States, you know, there are other reasons. There's generalized load growth. There's industrial reshoring. There's, you know, the fact that there's been a bit of a lull in investment in dispatchable capacity. So utilities are now seeing that their reserve margins are shrinking and they need more dispatchable capacity, also because they've been retiring plants. So there are multiple reasons behind this growth in order books for natural gas turbines. And the manufacturers are at pains to stress that, you know,
data centers are not the only driver here either. You mentioned batteries and how they're paired with gas turbines, but you have a longer discussion about batteries in the report, particularly their role in handling spikes in power usage at data centers and related topics. Want to say a word about that? Because I found it particularly interesting. Sure. So the key
driver for this is that AI workloads are spiky. If you're training a model, or if you're running it for inference, there tend to be periods of time when compute is very intensive, then periods when compute is much less intensive, as data is being exchanged between different processes or saved for checkpointing for the next part of the training run. And this spikiness is much faster than, you know, the electrical equipment within the data center can handle. And so it needs to be smoothed out to preserve the electrical equipment within the data center. We actually heard, for example, that one of xAI's data centers that was powered with onsite natural gas saw the gas turbines damaged by this spiky electrical load. And it was only when they brought
a battery onsite to help smooth that out that, you know, they were able to continue the training operations. So there is a kind of cliché that data centers have smooth baseload power demand. That's not what happens within an AI data center; its power demand is much more spiky. And smoothing that, both for the sake of the grid, because spiky load would disturb the grid, and for the electrical equipment within the data centers, means energy storage is becoming really critical, in different forms: short-term, millisecond-level capacitors to absorb the very fast swings, and then batteries for longer durations, but also for reliability and so on. So, you know, in a way, batteries are becoming the Swiss Army knives of data centers. They help with this problem of spiky load within the data center. They can help integrate data centers into the electricity grid, if you can shave some of the data center's demand during times of grid stress,
because you're running the data center on the battery. So we do expect this to grow as a trend in the next few years. Well, maybe as we start to wrap up: you have a whole discussion in your report about how AI is affecting the energy sector and the difference it's making in a number of different ways. Do you want to talk about that? Sure, David. In fact, I think
that's one of the aspects of the AI and energy nexus that's often forgotten and relegated to, you know, the latter half of these discussions, as it is even in this discussion today. AI, in fact, has tremendous potential to transform processes within the energy sector. But unfortunately, for a variety of reasons, these applications are currently only being deployed in a very piecemeal fashion across different sub-sectors, and especially in startups rather than by some of the bigger players. So we see, for example, that artificial intelligence applications can help improve weather forecasting, can help reduce the curtailment of renewables, can help optimize grid capacity by ensuring that more electricity is able to flow between any two given points, using different types of optimizations, and of course unlock various types of efficiencies in the system as well. So we estimated that, globally, the potential for AI to reduce energy demand through efficiencies can be as high as 13.5 exajoules by 2035. In 10 years, that's about as much as the energy consumption of Indonesia today. So we could pretty much avoid the energy demand of one of the biggest economies on earth just by deploying some of these AI applications
that we currently know are actually already being co...
but if we were able to really scale them up to the sector level. And of course,
there are a lot of barriers that prevent that from happening. You know, in our conversations with these energy companies, they often talk about the lack of appropriate skill sets that can bring together the most cutting-edge knowledge on artificial intelligence with the more regulated nature of the energy system. Bringing those two sets of expertise together towards the deployment of these solutions is where the biggest challenge lies. That's aside from the fact that data still tends to sit behind silos, so they're not really able to scale up the training of models that cater to these domain-specific applications. Those are the kinds of challenges that, if overcome, can really help unlock some of these benefits on a broader scale. So fascinating. And I certainly agree with your point about how this whole area doesn't get as much attention as it merits,
and AI is this cross-cutting technology that's going to have impacts in every part of the energy sector. It has the potential to transform many, many things. You get at this in your report, and yet, as in this conversation, it's often saved until the end. But it really deserves a lot of attention.

And David, if I can, sorry, just add: it's not just this conversation. Even in our own report, we put it at the end. There's a reason for that, of course. It's not to undermine the importance of the physical need for energy, and electricity specifically, that data centers will really be driving. But nonetheless, the broader point is that we must not forget some of the more productive applications on the other side.

One more question related to this: you talked some about the impact of AI robotics and, I guess, physical AI more broadly. Maybe say a word about that.

Yeah. In fact, we had been trying to understand a little better how AI can help on the question of competitiveness. That was the starting point of this analysis. We wanted to see where AI is currently being used in industries, and that broadened our approach to look at physical AI more generally. Physical AI, of course, deals with anything where artificial intelligence can be embodied in some way. This includes robotics, but it also includes drones, autonomous vehicles and so on. And we find that, as a result of automation and physical AI, especially on the industrial side, there can be a significant reduction in energy consumption for the most energy-intensive industries, enough to really matter at the bottom line for these companies, as energy costs are often significant for such industries. And second, we find that as the application of specialized robotics picks up, there may be implications for energy demand from these robots themselves. That's a broader question we are tracking. We have only very early information on the electricity consumption driven by these autonomous vehicles, drones, robots and so on, but it's an issue we will be tracking more closely. One of my soft predictions within the agency is that we will soon have to work on a specialized report on the intersection of robotics and energy.

Well, I look forward to that report. And this one is so rich; we've only started to scratch the surface of the topics you discuss in it. For anybody who's looking
for authoritative information on the intersection between energy and AI, I highly recommend it, and of course the landmark report that the IEA released about a year ago, which it supplements.

Thomas and Sid, we always close this podcast by asking each of our guests two questions.
And the first question is: how are you using AI in your day-to-day life?
Yeah, I wouldn't describe myself as a power user, David. I mean, we use it a lot at work. I think the best use is when you have a new topic that you want to learn about quickly; it's incredible for mapping that out for you and pointing to which resources to read more deeply. But I'm definitely not running my own AI agent in the background that is managing my groceries. So I'm not what the tech companies call a token maxxer.

Sid, how about you?

In my case, just like Thomas, it's what I like to call AI-enhanced search. I mean, we've always been searching, but as a way to really hone in and drill down to a specific issue, AI search engines help me get to that information much quicker. But I'd also use this opportunity to talk about how we have been using AI at work. As an agency, we've been quite proactive with these applications, while of course keeping in mind that we have security considerations and so on. Nonetheless, there has been some very interesting work carried out, including the work I mentioned on earth observation. Actually, it's not just earth observation; we use a layer of AI on top of that to sift through the vast amounts of data we are able to capture. So it's a bit of both. I would say that on a personal level, it's all the enhanced search, but at work we are doing some pretty interesting things.

Fascinating. Well then, our final question: could you please recommend three books,
reports or articles to our listeners? They can be on any topic, old or new, a favorite from your childhood or something you read yesterday. Maybe start with Sid and then close with Thomas.

Oh, wonderful. So I have three books to recommend, and I think they're all quite topical and thematic. The first one is Chip War by Chris Miller. It really drills down into how the development of the semiconductor industry took place and where we see the world today. The second is The ASML Way. This, of course, is about the company that makes the machines used to manufacture chips in the first place. It's written by Marc Hijink and highly recommended for explaining how this really surreal, advanced technology came into being and why it dominates chip production today. And the third is a book on the political history of technology in India. It's called Midnight's Machines, by Arun Sukumar. It provides a more overarching view of how independent India adopted technologies, and the state of play today.

Great. Could you spell out the surname of that last author, Sid?

Yes, that's Arun Sukumar: S-U-K-U-M-A-R.

Terrific. Thank you. Thomas?

Yeah, David, I love podcasts that
ask this question at the end. I always find people's answers so enriching. So for me: one book, one short story, and one article. The book is These Strange New Minds by Christopher Summerfield. He's an academic at Oxford, a neuroscientist, but he also works a lot on AI and linguistics. I find it the best book I've read on the history of AI and on how large language models really came out of nowhere from the perspective of linguistics. Linguistics was on a completely different track, and then large language models just appeared out of nowhere. It hits two topics that I'm very interested in, AI and language, and it's a fantastic book.

The short story is Understand, by Ted Chiang. Ted Chiang wrote the short story that the movie Arrival is based on, and I love that movie, but this short story, Understand, is probably the best description of what it might feel like to be an AI. The premise of the story is that the CIA has a program that can boost human intelligence, and it gets out of control, and it's this guy describing what it feels like to get more and more intelligent. It's absolutely fascinating to read. It was written in the early 90s, and if you read it now, you just can't help feeling, okay, that's what it feels like to be ChatGPT as it runs through a training run. I really strongly recommend it.

And the article is on a Substack called Crow's Nest, by a guy called Advait Arun; it asks whether information is a commodity. I find this article incredibly interesting in the sense that it tries to understand what information is as an economic commodity. If AI is going to improve our ability to generate and process information, then we need to understand what information is as an economic commodity, and whether just improving our ability to process information and take intelligent decisions is actually going to lead to economic transformation on the scale of some previous innovations. It's a really great read.
Thank you both for those. And I have to say, Thomas, I love asking this question because I learn so much; of the six recommendations you both just made, I've only read one, so I'm excited to dive into the others. I'm very grateful for the time you spent with us here, and delighted that you put out this report. For anybody interested in AI and energy issues, it's a really important resource. So thank you to you and your team, and to the whole leadership of the IEA, for putting it out. And again, thanks for joining us today. Thomas Spencer and Siddharth Singh, many thanks.

Thank you, David.

This has been the AI Energy and Climate Podcast, a special production of the DSR network. [Music]
