Welcome to an HBO podcast from the HBO late night series "Real Time with Bill Maher."
All right, here we are. He's a social psychologist and author of "The Anxious Generation," Jonathan Haidt. Former national security advisor, host of the podcast "Battlegrounds," Lieutenant General H.R. McMaster. And the host of MSNBC's "The 11th Hour," Stephanie Ruhle.
[applause] Okay, here are the questions from the people. Are we at an AI tipping point? Well, I don't know. I read this week that there's a new generation. First of all, the new generations of AI seem to come faster and faster.
I don't know, it's not my area, but it was basically saying it's gotten to the point where it's writing the code for itself. Is this a real thing? I mean, I'm sure it's really happening. Is this a real tipping point? Oh, yeah. I mean, any technology that we're told is increasing on an exponential scale
means that it is a tipping point and it will always be a tipping point forever.
Okay, so what are the implications? [laughter] It was interesting. A year ago, you had lots and lots of CEOs talking about how excited they were about AI and it was going to transform their business and about six months ago they went completely silent.
That doesn't mean they don't believe in AI anymore. It's that their PR departments said, you'd better stop talking, because we're going to lay off so many people. You don't want to be touting the beauty of AI just before you have massive layoffs across the board. Hey, guys, come on.
Even the head of Anthropic said that, did he not? Yes. He said he foresaw maybe 10 to 20 percent unemployment in the next five years, because even the coders, the people who wrote the code, are out of a job.
People in medicine, people in consulting, people in finance. All these white-collar jobs. You know, it's not just coming for the guy at the factory who used to do the bolts and now the robot does that. It's everywhere.
And if we're talking about 20 percent, that's what we had in the Depression. 20 percent unemployment. Yeah, there could be huge transitions in the economy, for sure. And like all these big changes, some people are going to be left behind. I mean, I think the last time we had such a big shift
was really after China's entry into the World Trade Organization, and the associated loss of so many manufacturing jobs across the Midwest. And I remember George Packer's book about the effect on people at the time. And you know, we're going to have a similar situation.
That's why it's important to get ahead of it.
And to make sure that people are adapting to it, using it now, to figure out how it can make them more productive. The first people who really get left behind are those who don't adapt to using some of these large language models and the capabilities they can give you. And we need a government that's thinking about how we're going to address that.
And we don't seem to have any unified force in the government addressing it. At least not yet. I'd just add that dealing with the economic changes, the loss of jobs, this is an incredibly hard problem. I don't think anybody knows how to deal with it.
But here's an easy problem. AI is coming for our relationships. Social media came in, hacked kids' attention, took it away, with disastrous results for their education and their thinking. Now there are chatbots in teddy bears.
Children are literally going to get attached to this very responsive chatbot rather than to their parents. So here's something easy we can do. Say, this is incredibly threatening to human development. Can't we just keep it away from the kids?
Can't we just not let Silicon Valley do another experiment on the next generation?
[Applause] It's funny.
You're always talking about kids, which is certainly the most important part of this.
But it's not like... No, I'm just saying, it's not like social media hasn't fucked up adults too. Yeah. I mean, you know, the guy who bought Twitter.
[Applause] Come on. I think it's true. I think it's true. Yeah, I agree.
But you know, the problem is, if you have a technology that is competitive in nature, you need all parties to the competition to sign up for regulation. This is where AI becomes particularly dangerous in the realm of war and warfare. Because if you try to regulate yourself, you can lose a competitive advantage that's important to deterring conflict, or to being able to respond if you're threatened.
And so what you're seeing is a lot of automated decision-making. I think in this next generation, you will be able to go from one person controlling one autonomous system to one person controlling many. You have computing power at the edge, you know, networks that self-heal.
And so you can give a mission to a fleet of drones, undersea or aerial drones, to accomplish a mission: take out that enemy's air defense, and so forth. And so I really have concerns about giving machines the decision to take a human life.
And that's what I think we have to do in our military profession:
always keep somebody in the loop on these kinds of decisions.
It's not a surprise.
It's not a surprise that these AI giants don't want to face regulation. Right, nobody wants to be regulated even if it's to do the right thing.
And so when people are saying, "Why are all these tech guys kissing the president's ring?
Why would they do that?" Because it's brilliant for them.
Right, imagine you are this small group of people who are now more powerful than any oligarchs we've ever had in this country.
And they are creating a new frontier without any rules. That's certainly worth a quick trip to the White House.
And then, four days before the Trump inauguration... [INAUDIBLE] Correct. Someone adjacent to Middle Eastern royalty does a deal with the Trumps,
invests $500 million in their crypto business.
And poof, suddenly we're selling those AI chips. There's that pattern again.
And this is what's important about that.
These H200 chips can increase compute power in China significantly. And if you look at how quickly these models are learning and improving, the machine learning that occurs in the next few years is really critical.
What they found out this week is that when they're testing these robots, or whatever they're testing, they act differently on the test than they do when they're actually being used. Which means that the robots understand, and they're deliberately fooling us.
And we're talking about the START treaty. And we certainly have to worry about Russia. And we have to worry about China. We have to worry about India and Pakistan. We have to worry about some rogue actor getting hold of a loose nuke.
We also have to worry about the robots doing it. What if they get it into their heads? Thank you. Thank you.
[APPLAUSE] One thing people need to understand about AI is that it is not programmed. Nobody wrote the program; as the head of Anthropic says, they are grown. And so it's kind of like we've summoned an alien intelligence. We've got these little gods.
They're like baby gods now. When we first met them, they were very nice. And they could compose lyrics. And now they're sort of adolescent. And now they're like getting smarter than us.
Right.
And they're on their way to becoming gods much more powerful than us.
And we're just running hell-bent into this with no insistence on guardrails. Even though in the next chapters, they're going to write that mommy porn about them. [LAUGHTER] [APPLAUSE]
There's an upside there. The upside is certainly in bioengineering. Certainly in the development of pharmaceuticals. Oh, yeah. I mean, the medical field.
I mean, this could deliver, I think, tremendous benefits.
No one's denying that. No one's doubting that. Absolutely. Especially people who are getting older. I don't know who they are.
But they definitely want these medical advances. Well, we just want to be sure we keep developing those pharmaceuticals.
Because last I checked, RFK is not interested in them.
Well, maybe they'll have something for--
Male pattern baldness.
Next time I'm on the show, I'll look like Elvis.
[LAUGHTER] All right.
You mentioned RFK. I'll end this with just what I read in the news today,
what RFK said on a podcast. Oh, God. Don't do it. [LAUGHTER]
You know, I was afraid you were going to bring it up.
I'm afraid you're not going to bring it up. You brought it up. I'm just afraid. I'm just afraid.
Well, it's not even the worst thing he's ever said.
But why can't he develop an impulse not to reveal everything? I mean, we've heard about killing the bear and eating it, whatever. And now he said that when he was at his worst as a drug user, he used to snort cocaine off toilet seats,
which made it very tough for the guy who was taking it.
[LAUGHTER] All right. Thank you very much, ladies and gentlemen. [INAUDIBLE] [APPLAUSE]
[MUSIC PLAYING]


