The Daily

The Workers Letting A.I. Do Their Jobs

6h ago · 36:30 · 6,609 words

Since the release of generative A.I., questions have been raised about how it would change our lives and jobs. Now, many software developers who were early adopters of the technology have outsourced s...

Transcript


I'm Dane Brugler, I cover the NFL draft for The Athletic.

Our draft guide picked up the name "The Beast," because of the crazy amount of information

that's included.

I'm looking at thousands of players putting together hundreds of scouting reports.

I've been covering this year's draft since last year's draft. There is a lot in The Beast that you simply can't find anywhere else. This is the kind of in-depth, unique journalism you get from The Athletic and The New York Times. You can subscribe at nytimes.com/subscribe. For the past few years, people all over the world have been asking how AI will change

their lives or affect their work. And the answers range from total salvation to absolute doom. At the front lines of all of this are software developers who are using artificial intelligence so much that it's already taking over many of their day-to-day tasks.

Today, I talked to Times Magazine writer Clive Thompson about his recent survey of the tech

industry to find out what it looks like when people invite AI to do their jobs. It's Tuesday, April 14th. Clive Thompson, legendary tech reporter, person whose work I have admired for a very long time. Welcome to the Daily.

Yeah, it's good to be here. So you are here because you've been covering extensively the question of how much AI is affecting the workers who are really the backbone of Silicon Valley: programmers, the people who write the code that powers every piece of software we use.

This is a group of people you know well, not least of all because you wrote a book about them.

You spent a lot of time talking to them in recent months, so what did you find?

Talk us through that reporting, what it entailed, and what it unearthed. Sure. I've been following the arrival or the advent of AI as a tool that can write code for a couple years now, but it started to accelerate a lot last year and I really just wanted to find out what was going on in the everyday trenches of software development.

So I just hit the road and I talked to about 75 different software developers all around the country. 75. Yeah. That's a lot.

That's a lot of them. I might have overdone it. But I really wanted to know what was going on kind of across the board, because different software developers have very different types of jobs. So I wanted to talk to people who are doing consulting work for regional banks in Tennessee,

people who are doing buzzy little startups, just two of them in Silicon Valley, trying to make something new. And then the people that are working at the big software giants like Google, Amazon, and Microsoft, where you've got tens of thousands of developers having to take care of

these code bases that have been around for 20 years, right?

And what did they tell you? Well, it was really interesting, because what I found was that a lot of the coders are writing a lot less code, and some of them are writing no code at all. They are having the AI write it for them. And this transition has happened really quickly.

I would say it began heavily in the last six months; it accelerated in the last three months, as these AI coding tools have just gotten a lot better and they have started to gain the trust of a lot of programmers, including ones that might have been a little skeptical before. And it's a really stark change from what things looked like even a year or two ago.

So you're saying coders aren't coding, is that right? Well, not all of them, it's a gradation, but of the people that I spoke to, a majority of them were outsourcing a lot of their day-to-day programming to AI. There are definitely coders who are writing very little to zero code. Wow.

It's a sea change. It's a big sea change. And how do they feel about that?

When I first started the research, I kind of wondered whether some of them were going

to be uneasy or unhappy about it, because I had known from decades of talking to software developers that they often derived enormous pleasure from writing lines of code. It was like solving a bunch of little puzzles, and it was just delightful when they did it. And so I thought maybe this was going to be deflating or demoralizing, but in reality,

the great majority of everyone I spoke to was really kind of jazzed and excited about the new powers that the AI was giving them: to be able to just say, in plain language, here's what I want created, and then, five, ten minutes later, have the working code back and be looking at it.

What they all said was that they've always loved building things.

That's the fun of being a developer: you take an idea you have, and through sweat and work, you turn these magic words into a machine that does things for you. And that feels like magic, right? That feels like something from Tolkien. Totally.

And they still feel that, even though they're not writing as much or any of the code themselves, they still feel like they're a sorcerer who's thinking about what needs to be made, and then using these tools to bring it into being really quickly. Some of them said they feel that loop of success more quickly, because the AI moves faster than they would have done.

So it was interesting, a lot of them were really, really pumped, really, really stoked. OK, I want to talk about that. I want to ask you about this excitement, about this magic, amid what also has to be a massive disruption in their industry.

But first, I want to just understand how big this shift actually is.

Yeah, if you want to understand just how big this change is, you sort of have to understand

a bit of the history of software development. So it's been around for maybe 50 or 60 years. It's essentially kind of a new field. Ever since we invented computers, we've been figuring out ways to talk to them, and making those ways of talking to them a little easier, a little closer to human language.

So what's happening now with AI is probably the biggest change it's undergone yet. So in the 1940s, you have the first computer, the ENIAC computer, here in the U.S. And the way that they programmed it, the programmers, it was a team of women, and they had to literally rewire the entire machine to do something different. So they're crawling around, sometimes crawling inside the machine, and rewiring it to create

a new logical system to solve a problem, right? Amazing. Labor intensive coding. Very labor intensive. Yeah.

And of course, when computers started to go into industry, it was impractical to require people to crawl around and inside them and rewire them for every single new problem. Yeah. So they started making computer languages that you could type these commands and they would get translated into the instructions that are essentially like digital wiring.

But honestly, the early languages, I mean, I've poked around in them and I've talked to people that had to write them, and they were really hard, because back in those days, for every single little thing you were asking it to do, you had to be very specific. Put this number in your memory here, and then put this other number in your memory here. Now you're going to add them together and put the sum in here. And so it was all at a very low level.

It was like juggling 900 balls just to get it to do the most basic piece of math. And then over the years, coders said, let's automate some of that labor. Let's make that easier. And they have a funny phrase for it. They call it adding a layer of abstraction.

So all these little finicky steps that slowed you down a lot in the '60s and '70s, those are kind of gone. I can write code much more quickly. It's much easier to write; it's kind of more like human language. Right.

Every time they realize that they wanted to accelerate the pace at which they wrote

code, they would add another layer of abstraction.
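The "layer of abstraction" idea can be illustrated with a toy sketch. Python is used here purely as an illustration (the era's languages were assembly and early compiled languages); the function names are invented for the example:

```python
# Low-level style: every step spelled out by hand, the way early
# languages forced you to. A toy illustration, not real assembly.
def add_low_level(a, b):
    memory = {}
    memory["slot_1"] = a              # put this number in memory here
    memory["slot_2"] = b              # put this other number there
    memory["sum"] = memory["slot_1"] + memory["slot_2"]  # add and store
    return memory["sum"]

# One "layer of abstraction" up: the bookkeeping is automated away
# and the code reads much closer to plain language.
def add_abstracted(numbers):
    return sum(numbers)

print(add_low_level(2, 3))      # 5
print(add_abstracted([2, 3]))   # 5
```

Each new layer hides the finicky bookkeeping of the layer below it, which is the pattern the decades of language design kept repeating.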

So things just kept on getting a little easier every decade. It sounds like what you're describing is a process where you move from actual human beings doing real grunt work to make coding happen, people putting wires into different slots in computers, to something much more sophisticated over time, but it's still sounding quite tedious.

You still have to write much of this and deal with your problems, et cetera. Yeah, absolutely. I mean, coding was really a grind. Okay. So what about now?

What's happening now is that people are using AI agents to write the code for them. So they've generally gotten sufficiently trustworthy that a lot of the programmers I was speaking to were just using them to automate a lot of their writing of lines of code. So with AI coding, they will have teams of agents. Like, you will ask the main agent, hey, write this feature for me.

Let's work on a plan. The AI will say, here's our plan. And when I say go, it will spawn other sub agents to do different parts of the task. You'll have one that's writing code and another one that is looking at that code and testing it.

And if there are errors, and there are almost always errors the first time, another agent will

look at those error messages and go, okay, let's change the code and test it again and keep on going in this little loop, like a little team of agents all working in a swarm. And only at the point in time where they're like, okay, it's passing its tests. It's not producing any errors, then it'll go back to the human and say, okay, here's the code.

Here are the tests we wrote. Here are the results; you can see our work. And that's really interesting. There are really very few people having experiences like that other than computer programmers right now. So what you're describing sounds like an enormous leap from the progress that we've seen
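The write-test-fix loop described here can be sketched as simple control flow. In this sketch, `generate_code` and `run_tests` are stand-ins for calls to a real coding model and a real test runner, not any specific product's API; the "bug on the first draft" is hard-coded just to show the loop converging:

```python
def generate_code(task, feedback=None):
    # Stand-in for the coding agent: a real system would send the task
    # (plus any test feedback) to an LLM and get code back.
    if feedback is None:
        return "def add(a, b):\n    return a - b"   # first draft has a bug
    return "def add(a, b):\n    return a + b"       # revised after feedback

def run_tests(code):
    # Stand-in for the testing agent: run the code and check its behavior.
    namespace = {}
    exec(code, namespace)
    if namespace["add"](2, 3) == 5:
        return True, None
    return False, "add(2, 3) returned the wrong value"

def agent_loop(task, max_rounds=5):
    # The swarm loop: write, test, feed the errors back, repeat.
    feedback = None
    for _ in range(max_rounds):
        code = generate_code(task, feedback)    # one agent writes
        passed, feedback = run_tests(code)      # another agent tests
        if passed:
            return code                         # only now report to the human
    raise RuntimeError("no passing code after max_rounds")

print(agent_loop("write an add function"))
```

The key structural point is that the human only sees the result after the loop reaches a passing state, which matches the "it comes back with the code and the test results" experience described above.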

over the course of time in coding.

How is it changing things practically for coders?

Yeah, well, it's making it a lot quicker to do a lot of things, right?

A lot of software developers will tell me that it's just dramatically accelerating the pace at which they work. And to give you a sense of just how fast it can be, if you're like a small startup, these two-person shops, and I visited some of them, they would tell me that they were moving up to 20 times faster than they would have if they were trying to build that company two years ago, right? Whoa, that's crazy. As some of them said to me, you know, a request from a customer for a new feature that might have taken like a full day could be done in maybe half an hour. That's wild.

20 minutes to write it, 10 minutes just to look it over and make sure it's good, right?

Now, I should say that this is true of a smaller company, where you're kind of writing entirely new code and you don't have to worry about breaking something that already exists. When you go to really big, mature companies, they're using AI, but their metabolism is a lot slower; they're being more cautious. And so when I went to Google, they were saying, like, at a small startup, you know, 100% of the lines of code are written by AI.

At Google, it's more like maybe 40 or 50%. And it has sped up their overall metabolism by really only like 10%. Although, as a developer at Google told me, like, hey, 10% for a company our size, that's a huge win. So the AI is making them faster at coding, in some cases doing a lot of that work for

them, but it sounds like they still have their jobs. So for the coders who aren't doing coding anymore, what do their jobs actually look like now? Well, their job, in a way, is to think about what the software ought to be doing.

Now, that was always their job, right?

You know, they always had to think about the shape of the software and then slowly, painstakingly make it happen. Now, because they don't have to spend so much time on the slow, painstaking part, they can spend more time experimentally iterating, right?

Like, I talked to a lot of developers who would say, now that I'm talking to the AI and it's doing the coding, we'll run through like 10 different possibilities, and I'll pick the absolute best one. They said it feels like being Steve Jobs, where you go to your minions, bring me nine designs of the iPod, and I will look at them and pick the best one, right?

Several of them literally made the Steve Jobs comparison. So they're kind of becoming less like construction workers and more like architects. But on a deeper level, what they're really doing is just talking. They're having a lot of conversations with AI. Yes, with AI.

They're having conversations with AI, and having to be very clear, the thing about AI is like, it will go off and do the wrong thing if you're not incredibly clear. And how did they find the process of communicating all day long? Are they good at it? Do they have to develop that skill?

Definitely, I had some of them say that they were surprised to find that they were becoming

better communicators, like better communicators in English, right?

Like, they're writing emails better. The fact that they've had to become better communicators to the AI has made them better communicators to the world in general. The bot has made them better at communicating and having human relationships, period? Well, I don't know if I'd go as far as human relationships, although... That's a bridge too far, got it.
Well, although there is one coder, Manoeber, a software developer with a small new startup, a company called Hyperspell. He sort of said, you know, kind of jokingly, but he said it, like, I wonder if we're

finally teaching all the nerds empathy, because they're having to do it.

I know, I know, nerds have empathy; I'm a nerd, I have empathy. His point was, you know, I think, correct: the job didn't use to require quite so much communication, and now it does. Talk to me about how Manoeber's work is changing. He's interesting. So Manoeber has been a developer for a long time, and he's worked

at very large companies, he's done his own startups. And when he first started using AI, he wasn't really sure about it. He was worried that it was going to hallucinate things that it would produce code that would be too flabby or inefficient. And what he said is that over the months, his concerns began to boil away.

He got more confident when he could see that it could do things reliably. So Manoeber does something that I actually heard several developers have settled on as an interesting technique, which is, when they want to write a new feature or a new function or improve some aspect of the code, they will essentially get into a conversation, like a Socratic dialogue, with their agent.

They'll say, okay, ask me questions about how this software feature should work. And the agent's like, okay, what is this going to do? Should it do it this way? Is it going to be written in this language or that language? By having it interview the coder, as it were, it got them to think about what the software should really be doing. And then it was off to the races with having the AI agent do things. The problem is, and I don't mean to anthropomorphize it, but it can sort of misbehave. Right. Manoeber would tell me there are times when the AI would go off and come back and say,

oh, well, I didn't do those tests. I didn't think they were that important. And he's like, wait a minute, those tests are completely important. And so they would have to figure out ways to sort of reprimand it or ways to control or punish it in some way.

How do you punish AI? Well, you yell at it, basically. What Manoeber would do is he would write this very stern list of instructions, like a Ten Commandments, and he would have this file.

He would say, every time you do anything, you look at this file first and you always

follow these commandments.

Very stern commands like you must test the code in this way or that you must do these things.

You must not do that. And several developers would show me these commandments. And they would be in upper case, like they're yelling at it, and they would repeat things over and over again, like they were trying to hypnotize the AI agent by sheer repetition. Or they would say things like, if you don't do these tests, I will be fired. They were really raising the stakes with very emotional language.
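A minimal sketch of that technique might look like the following. The wording of the rules and the `build_prompt` helper are invented for illustration; real tools keep such standing-instruction files in various formats, but the idea of prepending them to every request is the same:

```python
# A hypothetical "commandments" file, written in the stern, repetitive,
# upper-case style the developers described. The wording is invented.
COMMANDMENTS = """\
1. BEFORE YOU DO ANYTHING, READ THIS FILE AND FOLLOW EVERY RULE.
2. YOU MUST RUN THE FULL TEST SUITE BEFORE REPORTING ANY RESULT.
3. YOU MUST RUN THE FULL TEST SUITE. DO NOT SKIP ANY TESTS.
4. FAILURE TO RUN THESE TESTS IS UNACCEPTABLE AND EMBARRASSING.
"""

def build_prompt(task):
    # Prepend the standing rules to every request sent to the agent,
    # so the model re-reads the commandments on every single task.
    return COMMANDMENTS + "\nTask: " + task

print(build_prompt("refactor the billing code"))
```

The repetition (rules 2 and 3 say nearly the same thing) and the emotionally loaded wording are the point of the technique, not sloppiness.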

One of Manoeber's prompts would say that failure to do these tests is unacceptable and embarrassing. Embarrassing. Embarrassing, yeah. I asked him, I said, does that work?

Well, yeah. I was going to ask, aren't we supposed to be being nice to these AI's just in case the robots end up taking over the world? Yeah, exactly. Don't eat me.

I mean, in one sense, I understood why it works. It's because large language models, I mean, they're language machines. And so they understand the meaning of language based on the company it keeps.

So if they see the word embarrassing, they understand that like, oh, that comes from a bad

neighborhood, right? Like, there's bad things there; there's embarrassment, humiliation, you know, whatnot. And so that just helps raise the stakes so that it grasps the import of those words. So it turns out that actually emotional language, and stern and even harsh language, probably does have an effect, right?

Like it seems kind of nuts on the surface. But when you think about the way large language models work, it makes sense. OK. You have done a very good job of describing why coders like this new way of working. From what you said, it seems pretty obvious that there are many upsides of this for the

people doing this work.

It sounds like they just feel much more powerful overall.

Yeah, most of the coders I spoke to were just sort of astonished at how much more productive they felt. They would say things like, you know, I've had this to do list of things haunting me for years and I'm knocking it all off. I'm getting it done.

So overall, there is a real sense of excitement, even giddiness at times. But there are some concerns. They're worried about losing their skills if they've relied too much on AI. They're worried about jobs being taken away by AI, and they worry that they're moving too quickly.

We'll be right back. Hey, I'm Joelle. And I'm Juliette, from New York Times Games. And we're out here asking people about games.

Do you play New York Times Games?

Yes, every day. Do you have a favorite? Connections. Makes you think. I feel like it gives me elasticity.

It's four groups of four. Hmm. It's actually a pretty cool game. What's your favorite game? The Crossword.

The Crossword. I do it with my brother. We do it at the same time.

I feel like I'm learning. I feel like I'm accomplishing something. I like the do-do-do-do-do when you finish it. My family does Wordle with me. I have a huge group chat. Like, my grandma does Wordle. Your grandma does Wordle? Every day.

Yeah. Do you have a Wordle hot take?

You should start with a word that's strategically bad, to make it more fun.

NYTimes.com/games for a special offer. Okay, Clive, I want to talk about those fears. Can we dig into them? Are they right to be worried about the things that you laid out? Yeah, absolutely.

The concern about deskilling is really interesting, because a lot of the people that I was talking to for the story were a little more senior. So they're the generation that knows code really well, because they still had to do it by hand.

And so they would tell me that it's great for us to have the AI agents, because if they produce something wrong or flabby, we have the experience to look at that and go, that's not good, do it again.

And they would all say, well, you know, what about the next generation?

Are we going to discover, five or ten years from now, that the next generation of software developers simply don't have that deep code sense that lets them be really, really good engineers? And a related concern there is that, because they're not writing code so much anymore, or at all in some cases, they worry that they're losing a bit of that code sense, right? If you don't use it, you're going to lose it.

One person I talked to who was worried about the way AI coding tools were deskilling her was Pia Torian. She was a relatively new developer, and in some of her early jobs, employers were like, we want you to use Copilot, by Microsoft, to write code. And she was doing hundreds of prompts a day for months.

And she started feeling, wow, this is actually degrading my own knowledge of code. I feel like I'm losing my ability to code. That's what she told me. Okay, that sounds unsettling for her.

But why is it a problem that she's losing her coding skills if AI can do that for her?

I don't know how to typeset. I don't do calligraphy, right?

It's never been an issue for me.

Aren't there certain things that we can just say, okay, we've moved past this as a society? Is this one of those things? Well, this is the great debate and I don't have an answer for that. I can tell you that there are two camps of this in the world of software and they are heatedly opposed to each other.

There is what I would say is probably the majority of the developers who are a little less worried about that. They think that the AI is good enough and will continue to be good enough that it will actually be better at doing a lot of this code than the humans are because it won't make the stupid human mistakes.

People think of code as like this magical thing where you have to work really hard at it.

But a lot of it is just doing the same thing over and over and over again, right? Right. Extremely rote, tedious work. It's rote and tedious, and when humans do it, we make mistakes, and the robots don't. So you can make an argument, and this cohort of developers does, that the software will actually

be more reliable because the agents are doing all of it. But then there's a very strong argument by a smaller cohort that says, no, when you write new code, it's not just something you build and it stays up; you have to maintain it. As you add new things to it, adjacent to it, that interact with it, there might be interactions that are a bit weird.

So the code that the AI is writing might look good right now, but there's a potential that, down the line, it could cause really difficult or nasty interactions with other parts of the code base. There could be subtle bugs that we don't see right now that really start to pile up, and five years from now, you've got a huge mess.

So there's a cohort of coders saying, we shouldn't be using this stuff, not at scale, not the way they're using it right now. OK, I want to return to the question of job loss, which was another fear that you mentioned. Do you think that all of this means that junior coders right now are actually somewhat replaceable in this field, and that maybe, in the near future, there will be no such thing

as a junior coder at all, because that entire job category will disappear in the same way that the job of typesetter did? I think there's definitely a danger that demand for new junior hires is going to soften. And I think we've already seen that. You know, if you look at the research of Erik Brynjolfsson at Stanford, he analyzed job postings and found that for software developers, they were down by 16%, and that was already happening just in the last year or so. So if that's happening when the AI coding tools are really just going from a crawl to a walk to a run, what might happen when they're sprinting? And the other problem, of course, is, you know, this is capitalism, right?

I mean, all of these large firms are always looking for ways to save money, and at high-tech companies, some of the biggest costs are the salaries of these developers, you know. So the idea of, oh, we can replace even a chunk of them with AI, that's really compelling. And we're seeing this across all forms of white-collar labor right now, right? All the C-suite folks love the idea of being able to either lay people off because they can replace them with the AI, or threaten to do so, right? Because even if you're not replaced by AI, if deskilling devalues the job, it just gets easier for the owners to push you around. Clive, if coding is at the vanguard of AI affecting work, the amount of it, the quality of it.

What does all of this mean for the rest of us who don't work as coders? Will white-collar or blue-collar workers in other fields see AI take over in a similar way?

What developers are experiencing right now is something that maybe seems a little paradoxical, which is that they had spent years developing these very, very hard technical skills. And it turns out those are some of the easiest things to automate, right? The hard stuff to automate is like talking to our colleagues and our customers and figuring out what should we be building, right?

Setting priorities, setting strategy, AI can't do that, right? Those are still truly human skills. They are still truly human skills, and so I began to wonder if that's a pattern we might see in other forms of white-collar work. Like, back in the '80s, it seemed like chess was so hard to play, there was no way a computer was going to do it. But it seemed like speaking, that's easy; surely a computer should be able to speak. But it turned out that chess was actually easier for computers to do than learning to speak. They conquered chess in the '90s, and it took like two decades more to learn how to talk like a human.

So one of the things I think we see with AI is that things we thought were, "Oh, this is my big skill," turn out not to be. That's not really your skill; your skill lies elsewhere. OK, so the process that you're talking about is essentially one in which, in different fields, we kind of learn the thing that is not automatable, that a bot can't do, and people sort of focus on that.

How long do you think it's going to take for those transformations to hit jobs en masse? My crystal ball says that it's going to be longer than we think, for the following reason. What we learn from the history of computers is that it can take things a lot longer to have an impact on corporate life than we would expect. So back in the '80s and early '90s, you get the advent of the personal computer. A company can now, instead of having someone type memos on a typewriter, do it on a computer. And there's this assumption that it's going to dramatically increase the productivity of companies.

And at first it doesn't, and economists are kind of baffled by this. And it was because, to actually increase the productivity or efficiency of companies, the company had to reorganize the way it did business around the computers, right?

They had to start going, well, we don't need you to just treat the computer like a typewriter and print up a memo and then send it to everyone; just email it to everyone. That means everyone at all our regional offices can all be reading the same stuff at the same time. And so once they began to reorganize the way that information and decisions flowed around

the affordances of computers and the internet, then you began to see changes in efficiency productivity and GDP, but it took a long time. And my suspicion would be that it's going to be the same way with AI. Well, yeah, I want to ask about that, because it feels to me as though everything has been on a really accelerated timeline recently, I mean, the pace of innovation, it truly

feels like super speed on a different level. Is that wrong? Is that just me? I mean, it has, but a lot of stuff that has changed doesn't have industrial impact. Right?

Like, whenever someone lays off a bunch of people saying, we're going to replace them with AI, often they discover, if you follow up with them six months later, they had to rehire a bunch of them, because it just didn't work. So yes, there is a massive acceleration on a cultural level that can often be quite alarming. But if you go around and talk to companies, big companies, like the ones that are, you know,

moving the economy, it hasn't really happened there. And you can even see that, of course, in my reporting, right? Small little startup companies, yeah, they're moving really fast. Google, moving 10% faster. And I think that's closer to the impact you might see if you look at white-collar work, writ large. There is something comforting about that, I have to say, man. So we've been talking a lot about the shift in the work of the people who are making software. I want to ask you about what they're actually creating and how the innovation that we've

been talking about might change that. Look, what is the upshot of all of this for those of us who will mostly interact with it through the products that these people make? In other words, what is the upside for the rest of us of all of this? I guess one of the upsides is that it feels like we live in a world where there's tons of

software, right? Just stuff that didn't exist 20 years ago. Sure does. You know, text messaging. But if you look at the worlds of work, there is just a massive

amount of things that have really never been helped out by software at all.

I can't tell you how many companies I talked to.

They're like a $50 million, you know, concrete-mixing firm.

And they would like to have better software to run their company.

And they don't really have anything, which sounds weird for a $50 million company.

But they're not big enough to be able to hire like five software people to make custom software. They can't afford to do that. So they just toddle along, running their entire company on three Excel spreadsheets on a Windows XP computer that they're afraid to update, because that will break everything

that they're using to track all their expenses. An astonishingly large number of these mid-size firms are horribly underserved by technology. And if you work at one of these companies, you know what it's like. So if it becomes easier to write software, you could have a world where it's like, okay, you know, I've got my Staten Island concrete-mixing company, $50 million a year; I'd love software to make life better for my employees.

But I've never been able to have it, because I can't hire five people at $100,000 a year.

But what if one person came along and said, okay, for 60 grand, 70 grand, I can build that for you and maintain it, because that's just how much more productive I am as a software developer? You could start to see a lot of improvements, in these kind of actually good ways, in everyday life for a lot of people at work.

That's one area where I think you might see that happen.

And I guess at the highest level, what's going to happen is that software stops being something that is precious and rare. It reminds me maybe a little bit of what happened with, this is going to sound really weird, but with paper. So paper used to be incredibly rare.

You go back to pre-revolutionary Pennsylvania and the average person had access to like four pieces of paper a year.

And then suddenly it becomes a lot cheaper and all over the place, and you've got weird things, like Post-it notes, which are these really weird forms of paper that just transform the way that you live your life. Totally. Love Post-it notes, by the way. I wonder if people were scared of this proliferation of paper at the time.

Oh, they definitely thought it was bad.

I mean, that's why the Comstock laws existed, right?

They were worried about young people writing smutty letters to one another. So I think something very weird is going to happen as software stops being something that is special and difficult and becomes almost like a Post-it note, where it is ubiquitous. We call it into being for short-term reasons. It changes aspects of the way we communicate and the way we deal with other people in

ways that I can't really predict. But I do think that that is what we're looking at. But here's another parallel: word processing, back in the '70s and '80s. It was really hard to do. You had a machine, and someone spent hours designing a document, and you went through many

iterations to make sure it was right before you hit print. And then in the 1980s and 1990s, the Macintosh said, "Okay, anyone can make a document." And suddenly people are, like, creating really ugly flyers for their birthdays and zines, and there's this weird explosion.

And if you'd said to someone in 1983, what exactly is this word processor going to do?

You would not have predicted riot grrrl zines in 1996, right? Totally. And that is, I think, the kind of transformation that's about to hit us. I don't really have the ability to predict what that's going to mean for us. But I think it's going to be incredibly weird in the same way that those previous transformations

were incredibly weird. No, I mean, I think your answer is kind of, this is going to be both awesome and weird and potentially bad in some ways we really don't know. And that's just kind of the deal. You know, there's a phrase in the world of technology that more is not just more,

it's different; different behaviors emerge. And so that is something that I think we are likely to experience, socially, civically, as software goes from being something that is hard and difficult to something that is trivially easy to summon into being. What I'm hearing is that whatever happens, this change is going to reflect us.

We will get the technology that we deserve. Yeah, exactly. It is going to catalyze all of the human desires, the malign ones, the delicious ones, the terrible ones, the beautiful ones. We've got, you know, Shakespearean sonnets about your birthday.

We've got people who are using them to analyze and understand personal medical records. We've got disinformation and wholesale cheating on essays at college and high school. All that stuff is going to be on offer "at scale," as they say in Silicon Valley, as the AI coding revolution goes forward. Well, Clive, you know what is truly a human skill? Still being a great guest on The Daily.

So thanks for being here. Glad to hear that I still had a human edge.

We'll be right back.

Here's what else you need to know today.

Two congressmen, Representatives Eric Swalwell of California and Tony Gonzalez of Texas,

resigned within hours of each other on Monday. Both men had faced allegations of sexual misconduct and calls for them to step down or face expulsion from the House.

Swalwell, a Democrat, announced his decision after the San Francisco Chronicle and CNN published

the accounts of several women accusing him of sexual assault or misconduct on Friday.

His resignation comes on the heels of his decision Sunday to drop out of the California

governor's race. Gonzalez, a Republican, was accused of a coercive relationship with a staff member who later killed herself. Gonzalez first denied that there had been a sexual relationship, then later admitted to a mistake, and had been fighting calls to quit his post for months.

And a brewing conflict between President Trump and Pope Leo XIV has ratcheted up; the Pope has

been one of the most powerful critics of the U.S. war with Iran.

On Sunday, Trump lashed out at the Pope in a social media post, accusing him of being "weak on crime, terrible for foreign policy, and catering to the radical left." "I have no fear of either the Trump administration or of speaking out loudly about him, as I am simply preaching the gospel." The Pope responded on Monday, saying he wasn't scared of the Trump administration and

that he would continue to speak out against the war.

"Too many innocent people have been killed, and I think someone has to stand up and say, 'There's

a better way than this.'" Today's episode was produced by Diana Nguyen, Nina Feldman, and Michael Simon Johnson. It was edited by Brendan Klinkenberg with help from Paige Cowett, and contains music by Dan Powell, Pat McCusker, and Michael Simon Johnson. Our theme music is by Wonderly.

This episode was engineered by Chris Wood. That's it for The Daily. I'm Natalie Kitroeff. See you tomorrow.
