Will is not only all of the things you mentioned,
he's like the kindest person you can find.
This man is not only a genius, but is the kindest man I know.
“I think we'll be better humans to humans.”
We're not going to need some, like, artificial thing telling us how to be human. We're going to come to that conclusion in reality ourselves. Intellectuals philosophize and fantasize about technology and what it is going to do to us. But the common man somehow is not so conversant with it.
So, through the show, we are trying to bring literacy into the world. When we made stuff, we still had to control that stuff. We still had to operate that stuff. But, I mean, by 2029, no one's remote-controlling that.
Our special guest is going to share some of his incredible energy and advice
from my dear friend Shaco, who's going to interview him, from the Tomorrow Today podcast. And it's our honor to have you, Shaco, with us, and thank you for all your support for Abraham House. So, this is a very special moment for us. For many years, I've wanted to have him on this stage, and it's going to happen today.
“So, can we have a big scream, as loud as we can, to welcome the wonderful will.i.am?”
Hey, everybody. I think one of the things that Daniel did not mention is that Will is not only all of the things you mentioned, he's like the kindest person you can find. So, I actually had my team, after CES, go visit the projects where he lived and talk to a lot of people in the neighborhood.
You went to my neighborhood? Yeah, thank you so much. So, I think actually Sally can show you the video. We actually shot where he lived and where he played and all of the stuff that he did. And people talk, like, really highly of this guy.
So, this man is not only a genius, but is the kindest man I know. Oh, thank you so much. I was, I'm a product of my neighborhood, so the people in my neighborhood are special to me, but they're really nice. I grew up in an all Mexican neighborhood. And it was, it's a village.
It still is a village. I go there every Sunday when I'm in town. And it's like, it really, really means a lot to be here. It really means a lot to be here. And my neighborhood is so... I love... I'll start crying just thinking about, like, the people that I grew up with that are still there.
For example, a kid DM'd me on TikTok. I don't really check my DMs, but I was sleepless in bed, and I was scrolling. Let me read it to you. Hector Jr.: Hey will.i.am, I'm Hector from Hollenbeck Middle School. My grandfather's name is Leo. He would tell us he would call you Hollywood. And I'm reaching out to you because I knew you went to Hollenbeck Middle School and you love music and you like helping your community.
I'm hearing that my teacher at Hollenbeck might lose his job from minimal funds, and I feel sad. Today they had a council meeting at the library and they're talking about budgets, and I'm reaching out to you to see if you can help my school in any way. And so I replied to Jr.: I'd love to help. You know, your grandfather helped my family out a lot when we were really poor.
Leo would give us free ham and we didn't have money. I love your grandfather.
He was always nice to me and showed me so much love when I lived in the projects.
He took care of us with his kind heart, with his big grandchild smile.
“And he would say, hey, Hollywood, hey, Hollywood, where are you going?”
And I would say, I'm going to Hollywood to make my dreams come true. Your grandfather encouraged me with my dream, and he would see me practicing in front of my house and he would call me Hollywood. He would also see me come home late from the studios, coming from Hollywood. Your grandfather was an angel to me and my family, and I would love to help you solve your Hollenbeck budget situation. Anyways, I love my community.
I love it, love it, love it, love it. They love you, too. Yeah. They love you, too.
We actually... like, this is my second episode with Will. The first one we s...
And that's going to get aired with Arizona State University, the master's program you're going to be teaching. Sally already has that. So we have shots of where you lived, you know, all of that stuff. And let's get into the show a bit, Will.
So the way we've organized the show is, you know... this is rated the number three podcast on iTunes and basically also on Spotify.
We have three segments in the show. The first is really like an imaginary world in the future. And it's like a, you know, a short, Black Mirror-type video that we play. The second part of the show is basically deep-diving into the thing that we are talking about. Like, you know, what we showed just now.
“You know, what is the implication of the technology?”
And how does the society get changed or morphed, or what needs to happen to make it come to life? And the third is to actually go to the street and talk to the common man. Because we, as, like, you know, intellectuals, philosophize and fantasize about technology and what it is going to do to us. But the common man somehow is not so conversant with it.
So, through the show, we are trying to bring literacy into the world.
That's amazing. It's very much similar to what you do. So this video... I'll just play a few clips. I know that, like, you know, you don't like long videos, so we'll play the clips and then, like, talk about the topic. No? Okay, great.
Well, that was a great setup. So let's play the video. This is actually in 2470, an imaginary world where AI has taken over the world. So, Will? Yeah.
What do you think about the movie? Your initial reactions? What is that, 2470? 300, 350 years from now?
“The first thing was the material of the orb would not be that.”
Good one. Yes. The clothes of the kid wouldn't be that. Then... then, AI is like the human
overlord. Like police. A system deciding, like, what's best for us. I don't like that. Then we've been reduced to, like...
That's not humanity. Humanity is, like... I think something else happens.
“I think we'll be better humans to humans.”
We're not going to need some, like, artificial thing telling us how to be human. We're going to come to that conclusion in reality ourselves. Because humans are awesome. Like, we've tamed the wildest beasts.
We've domesticated powerful animals.
We've understood the weather. We're freaking great. And I don't think it's going to get to a point where it's like, that's the new god. That ain't it. I don't think that's it.
I think we're going to arrive at a heightened, elevated, spiritual self. But it's going to take that to wake us up. So you think that would be the instigator for us to wake up? Because of everything that we've seen. Like, there was something that created...
We were inspired. We are inspired by the world. Mother Nature was creating and still is creating. And that forced us to figure it out. To the point where, when we see dragonflies, we make helicopters.
And we do idiotic things and hurt each other with the things that we make. We see eagles soaring and we make airplanes. And then we do stupid things to one another with the things that we make. Like, we look up to the sky at a lightning storm. And we're like, wow, and ponder, like, what is that?
It's plasma, and then we figure out how to harness that stuff.
And then we do stupid stuff with the stuff that we figured out.
And we figured out how to fight gravity.
“And then we do stupid stuff with the stuff we figured out.”
So what makes you think that we're not going to do the stupid thing again in 2470 — give up the power to AI? Because of all the things up to this point. Electricity is not reasoning. That's the difference between where we are now
And where we were then. Like when we made stuff, we still had to control that stuff. We still had to operate that stuff. We had the agency. Now these things.
Like, in 2026, that thing — somebody's remote-controlling it from somewhere.
But by 2090 — I mean, 2029 —
no one's remote-controlling that. I saw somebody walking behind it, and it was like... But eventually, that thing's going to be walking around. Eventually, one of those guys is going to be sitting here. That tripod holding that camera is not going to be there.
There are going to be positioned bots that have cameras, broadcasting, sitting amongst us. And then we're not going to like that. We're going to be like, hey... the French are going to do the French thing. Thank God for the French. They're going to protest.
And then they're going to freaking like, nah. Come on. You know that's what's going to happen. Yes.
French are going to be like, nah, nah, nah, nah, nah, nah.
No. And then a new liberty will come. And a new, like, enlightenment of humans. We're going to elevate.
“At least that's what I hope, because it doesn't end well.”
It doesn't end well. It doesn't end well, like, with the way that it is going. Because for that to have happened, like... for that to have happened, there was some crazy war that happened.
Is it not already happening? No, bro. That is happening. Why are we fighting about, like, land, countries, identity? Like, who we are, all that stuff.
Why is that happening? Why is that crazy shit happening right now? Why did it happen in the past? Just greed. Inhumanity.
Like, hate. For differences. But it wasn't... was it, was it the indigenous people? Turns out it wasn't, wasn't those cats.
They were the ones that just got, like... you know, they got, they got handled. What, the indigenous people didn't do that? AI didn't do that. That was just, like, human, like...
Something dark. And... yeah, so... and we can't let that happen. We can't let, like... that can't be the new, like,
conquistador. Well, I hope it can't be the next, like, colonizer. Well, like, a lot of people are actually talking about the colonization of algorithms in the future. In the sense that, like, you know, we were fighting for land. We were fighting for resources.
“We were fighting for like all kinds of crazy shit, right?”
Like, you know, I want to have a bigger empire or whatever it is. And people have come and gone. But I think what is the persistent theme in the future, with the algorithms and AI and the intelligence layer that is getting created — it is going to remain.
And it's going to remain invisible amongst us. I'll give you an example, okay. Let me walk you back through my example. Subconsciously, we do a lot that we don't know we are doing, right? So, I very much grew up in slums, like your projects, right? Like, you know, I grew up in slums.
Till a certain age, like, I used to walk. My brother and I used to hold hands, and education was like a temple for us. We used to go... it's like going to a temple every day, because my mom fought for 365 days to get me into the school. She stood in front of the headmaster's office. And so we used to walk 45 minutes each way, and then come back, 45 minutes. Okay. And then people thought, you know, these kids, like, out of the slums, you know, they are working hard.
Someone, like, you know, like... a guy took pity on us and said, like, you know, let's buy this kid a bicycle.
So I said, okay, like, great, you know, more efficient for me, not 45 minutes...
Like, bike, go, come back.
So the same point A to point B is now 15 minutes versus 45 minutes. Made me a little more efficient, okay. Then came the world of ride sharing before technology, which is an auto-rickshaw in India. You just wave your fricking hand and then you get into an auto-rickshaw with, like, ten other unknown people. Every day it's different. And then you go from point A to point B.
And you're making choices then — that's your mind, your fricking mind making the recommendation. I'm lazy today. Let me not, like, take the bike. Let me, like, give five rupees and hop on and go. Then I came to the United States. I have to just hold my steering wheel, sometimes brake and sometimes accelerate.
More efficient. And then I get into a Tesla. I don't have to use my fricking brain. It's controlling me. So gradually we've been giving up the agency of our mind to algorithm.
What makes you think that's not going to happen across everything? Because it's happening, by the way.
“Like, you have a manager who probably is coming and telling you what you need to do.”
No, no, I don't do that. You don't do that? I manage myself. Okay, great. But...
But that's been that way my whole career. Yeah. But I, I've co-managed a piece of it. I'm a control freak. But, but it's a way to make sure you get to a destination, your destiny, without compromising your dream.
But, like, you know, you see this everywhere. Netflix, you know, keeps giving you movies. Like, you go to an app.
It keeps giving you the ads that you want to see.
And more and more of that. So subconsciously, we are, like, leaving our brains to the side. And we are believing that the algorithm somehow knows better than we do ourselves. How do we rectify that, Will?
Is it making sense to you? Yeah. I don't think it's, like...
“You, you, you fast-forwarded through a lot of stuff.”
Yeah. But, I mean, 42 years. Now you're going to see accelerated brain rot. Now there's, there's accelerated brain rot. That's, that's popping off.
There's going to be, like, psychological conversational dysmorphia. Because people are going to be introverts. Because they can't perform at the speed at which they've been communicating with some agent, some, you know, village bot. We're all out there right now sharing the same...
We're sharing the communal AI. And giving ourselves to a company's system. All the while. And then you're going to go and communicate with humans. And you're going to get impatient.
Because humans are, like, wait, wait... and, uh, uh... and you're like, hurry up, give me the goddamn answer.
“Because you're so used to talking to, like, this rapid, um, you know,”
non-human entity that's spitting out things fast. And humans don't communicate that way. So people are going to get impatient with humans. And because of that same impatience they have with human responses, they are going to feel, like, inadequate and have psychological conversational dysmorphia.
Because they, too, can't respond as fast as the agent they've been talking to, or the AI they've been talking to all the while to solve their, you know, anguish, their depression, their, you know, company banter at work that's making them feel a certain way. Right.
We haven't yet grappled with the problems that we're going to have in the next two or three years that are taking shape now. Uh, and then in walk the robots. Um, those ones are cool, they're metallic. Eventually they're going to be silicone.
And then the moment they're silicone, that's going to mess up relationships. And it's going to make you feel like a king or a queen. And we have to grapple with that in 2029. And there's going to be people that are like,
we're getting married. And their parents are like, what are you talking about? It knows me more than you know me. It knows me better than you do. You know, and then, like, bam.
But who... how do we question that? Absolutely. Because we can't even question, like, the human thing. We're too politically, like, configured that we can't even have that conversation. Who, who makes babies?
That's a, that's a politically touchy conversation in some countries. Like, you know, what's a woman?
That's a, that's a political like.
Wait, what? How, how did that happen?
And the next thing you know, it's just going to be having babies with an artificial womb. Give it, like, the next 20 years. Before I am 70 years old, that will be the reality. There's going to be some silicone thing
that knows everything about everything, that has an artificial womb, that's married to someone. You know, they're like, wait, what? And then some company's going to lobby to make sure that it has the same rights as us. Then it's going to walk by some homeless human.
And then we're going to be like, what? How the hell did we get here? And is that the world you want? Because when I saw that video, I'm like, oh, all that had to have happened. For the... see, the little kid with the little orb, welcome to a freaking, like, limestone cave,
like somebody put that flag here.
“How am I, where's the worry about no flag about the reality?”
You just said, yeah, yeah, it gets weird real fast.
So do you think we need nations in the first place?
Do we need a nation in the first place? Our identity is a nation or a house or a tribe or a community or whatever it is. Do we need nations? Or are we better off without them? Okay, this question sounds like a deep one now.
Okay... ooh, what's the right thing to say? Hell, yeah, we need nations. Why do you need that? Because you don't want them to make the nations worse. We need nations.
We got here because of nations.
We are here at Davos, 2026 in Switzerland, by the way, which is a beautiful nation.
And I love Brazil. That's a beautiful culture, too. And with all the trials and tribulations that Brazil had to go through to become Brazil. Unfortunately, they went through some pain. And now Brazil is a joyful place.
And it's going to get through its problems, too. You know, you could ask that question: do we need seasons? Humans are complex that way, just like nature is complex. And certain countries have four seasons, some have two, some have one.
“And yeah, you need nations. You need tribes. The question you want to ask is, do we need to be more empathetic?”
Do we need to be more collaborative? Do we need to appreciate more? That, yes. While we have rich, freaking cultures. Because the concept of nations has changed. Because now when I go to freaking... I remember going to Amsterdam, their money used to look so nice.
It still does. But I remember going to Europe, and every single country had a different currency. That was like, wow — these Dutch francs, these French francs. And then the euro's nice, don't get me wrong. I'm not complaining. But when that happened, now everything's Starbucks.
And I like Starbucks, but I used to like the local coffee. I liked that local restaurant. That's what makes traveling awesome. So yeah, you need that differentiation. Just like when you go to, like, the Philippines, there's, like, certain fruit
that's there that isn't over here. The moment you get rid of identity and, like, you go to this place and get that same thing, then you're going to expect the same from nature. And nature's diverse, and so is humanity. And all the things that make us tick and how we click and how we stick together.
It's a beautiful thing. But we got to, like, appreciate it the same way we appreciate going to the desert for vacation. Who went to the desert for vacation? And now that's, freaking, like, one of the best spots to go to. Like Dubai, the UAE — that's a beautiful place, 40 years old.
That's younger than my mom. That thing rising up to be what it is now — that's inspirational.
“That's inspirational for Nigeria to learn from, for Kenya to learn from, for Ghana to learn from. How did they do that?”
They benefited from the resources on their planet, in their country. Well, the Congo should be able to do that too.
Because what's in here came from there.
How come they can't be as, you know, vibrant as the UAE from their resources?
So yeah, nations are great. And nations benefiting from their rare minerals and resources is even greater. So which is better, the world as a nation, or nations themselves? Wait, what? Yeah, you said, the world as a nation.
Yeah, like, you know... let's assume that there's no boundaries. There's a song, it's like, One Nation Under a Groove. That was Parliament-Funkadelic. George Clinton — not Bill Clinton, and not the Parliament from the UK. The funkiest funksters of them all, One Nation Under a Groove.
So what was the question again? So is one nation good for the world or like multiple nations?
Why do we need 195 of those? Because we're fighting for those these days.
We don't have to fight — like, that's the right answer. But somehow, like, the human rationale says we've got to fight for it. Okay, so you think by having one nation, automatically, magically, there would be no fighting? I don't know. I'm just asking you. I know.
You're the futurist. I'm not the futurist.
“I mean, have you seen a beautiful orchestra in action?”
There's a lot of instruments there. You have the conductor. So in your case, there's one conductor. Is that your vision of, like, one nation? One orchestra. The one guy, and then everyone is playing the music to it.
That's the video.
The AI saying what to do, you don't like that.
No. As much as I love AI, that's not the way. That's not the AI I like. I like the AI in Star Wars. And how does that feel? There's Jedis, bro.
The Jedi isn't thinking about R2-D2 and freaking C-3PO. Are you telling me in your movie that R2-D2's got that? No — Jedis, bro. Super heightened spiritual human beings at their highest frequency. Like, do not...
As much as we are technocrats, do not surrender your neural network, and how awesome the human being is as far as power efficiency. That thing needs a nuclear plant for power. The brain? 12 watts, bro. Absolutely.
Excuse me? Come on, man. You can't beat that one. Why are we going to give up now? We haven't even started here. You see, you know, bro. I hope that ain't the one we try to do.
That ain't the one. I love tech. But I love humans more. So how do you build compassion and empathy and everything that makes a human a human?
“Like, because that's what makes an extraordinary human.”
Because you're not capturing that. Like, the world is filled with a lot of garbage data — rants on Reddit and, like, you know... like, what happens? Like, someone wrote something on Reddit, like, two years ago: if the cheese is falling off the pizza, then put some glue on it. And everyone thought that that is actually the answer. And Google started recommending to everyone that you should put glue on so that the cheese doesn't fall off the pizza.
Oh, that works when you're taking photos of cheese to put on the internet. Yeah. But that doesn't work if you're eating pizza. True. But we assume that is intelligence these days because that's what, like, AI is telling you.
Whoa. So, see that again? So, the world's wisdom is the garbage lying out there, the data which is sitting out there. And we are assuming that is the human identity. No, that's a corporate marketing campaign for you to think that that is the most intelligent thing in the world.
It is the most, like, awesome calculator in the world, pattern matcher in the world. Intelligence is imagination. Einstein was the person who told you that the highest level of intelligence is the imagination.
They're great pattern matchers.
“You think they can imagine one day? That you can teach imagination to them one day?”
We're imagining all day that they'll get to that. I, I, I... what, what will we not give to the machine? That's a better, that's a better question. Yes, you tell me. What are, what are the things that you would not want to optimize?
There should be some things that are sacred to us.
What?
Like the imagination.
And what about your heart?
Love? Hmm. It will never love. Hmm. And it can argue back to us.
You have never loved.
“And then we will say, let me show you.”
And then that's when we are going to activate that power. Bring in that song. When I said "that power," for some reason "The Power of Love" went through my mind real fast.
[laughter] But yeah, we get it. We are... it's getting goofy right now.
Based on what's happening in the society.
What we tolerate. How we've been desensitized. The things that we do to one another. You know, it's going to learn from everything we've done. And are we going to tolerate it repeating the same things we do to one another?
And in the name of like, you know, gain and, and like profit.
“Or are we going to be like, but it's a petro-dollar company.”
It's a petroleum... it could drill a whole lot of good money. [laughter] Company. And are we going to be okay with that? If it repeats the same things we do to each other?
I hope that isn't the case, you know? Yeah, I think so. But at the rate that it's going, it could be. And we would be like, yeah, I got shares in there. Yeah.
Because look at what we do, look at what we allow ourselves to do to one another. That shouldn't be. That shouldn't be the case. It can't be profit over people. That's, like, that's the wickedness of the wicked.
Absolutely. Absolutely. So let me ask you the futurist question, because you're the futurist. You know, I've seen, I've seen your music, like, you know, songs in 2007 where you were talking about AI then. Like, no one was talking about AI creating music and all.
So, so you've been always thinking 20 years ahead.
What is your aspiration for technology?
“And where should it like you mentioned a lot of these?”
Like, now imagine a world that you can create with technology. What would that look like? What is the will.i.am version of it? There's a lot of problems that are not solved. And the problems that are not solved are in cities where, you know...
underdeveloped communities, and that is in the Global South. And folks that live in the Global South seem to have all the resources, but then are riddled with a whole lot of problems where they can't benefit from the resources. And now technology could help them solve those problems once and for all. So that's using technology to solve problems.
Right now, AI to make music is awesome as it is. There's really no problem in music for that to be the use for, like, AI in music. And I like it. I think it's dope.
Wait, what? It's cool, it's cool. Hey, see? There. So this is 2026.
So we could be using all that processing power, all that energy, all that water, all that power to be solving some real problems. But every time, like, there's millions of people, like, prompting to make music. And it's awesome as it is. There was nothing wrong with humans making music.
It wasn't like, "Man, there's a problem, man. Right now, ain't no more new music, man.
If only there was an AI to make me some music right now." That wasn't the case. So why is all that power,
all that water being utilized, when you could use it to solve real things, to solve real problems? It's compute. Compute is compute. You could use that compute to solve some real problems.
We're not even using our power correctly.
We're prioritizing the compute — and it's a lot of compute — to use that stuff
to make images. And images are awesome. I like it. It saved me a lot of time.
“But then there's still a lot of people that could utilize”
not only that electricity, but that water, and use that compute to solve problems. So we're not even acknowledging the problems that you could solve, that are going to create new industries. Because every time you solve a problem, new industries sprout, new jobs are created. And as a matter of fact, while we're creating this new technology for problems that weren't even problems,
folks that were living pretty awesome lives are losing jobs. So you have this mass joblessness, where people were making a living — that went to school for it, for crying out loud: marketers, designers, illustrators — when there was really no problem for AI to come and take their fucking jobs.
Meanwhile, water and power is not even being used for the compute to solve real problems.
So now you've got problems on top of problems. And it's only January 2026. Yeah. A whole lot of jobs got obliterated in 2025. Yes.
Absolutely. Pretty skilled people. The world was pretty cool in 2018. Remember that? But for a moment, shoot.
Things seemed pretty fresh. Apple was, like, king of the freaking mountain. Google was still cool. Porsche was Porsche. No one saw Xiaomi. How did these guys
outdo Samsung and Porsche during COVID? It's amazing, bro. What's happening in China? Wowsers.
“How could the whole world learn and be inspired by how China's rocking?”
Just take some pieces, the good pieces, and apply that to Congo. Apply that to freaking Rwanda, apply that to Nigeria. Apply that to Uruguay, Paraguay, Chile, Nicaragua. You name it. Apply the best pieces that are coming from China to areas where you're creating dignity.
Awesome factories that are building, regardless of the amount. Maybe those factories have some employees. But why can't those autonomous factories that are making awesome things that people want be done in Chile? Or Brooklyn, the Bronx?
You know, Chennai, Bangalore. You know, China's doing some pretty awesome things. And the robots — those robots are flipping, bro. They got flipping-ass robots, damn, yo. Their robots flip.
Ninja poses, all types of dope-ass amazingness.
“I want that, I want that to come from my neighborhood.”
You know, but at the same time, we can't lose our humanity. Because I know humans that can flip. Don't just be like, oh, the robot's flipping. I know you can flip — sit there, Bobby.
Look at the robot flip. Mom, I can flip too. Yeah, yeah, yeah. I'm going to spend $20,000 on a flipping-ass robot. Meanwhile, Bobby can't even get a job.
Flipping. Sorry. Sad. So this was a truly inspiring conversation. Because I started off with the obsolescence of nations using technology.
And what basically came out of this is augmentation of nations using technology.
Is that a good summarization? Because you're talking about, like, lifting up economies and countries that could actually benefit from technology. Yes. Yes. You want to supercharge individuals and their communities and their nations.
But still, starting with the person. You got to — because the companies are going to just, you know, data-scrape them anyhow. And then the next thing you know, there's some version of you in their, like, configuration. Because you got access to some free app.
And then there's like some modified version of you. And it sells the way you sell. You know, you worked at Nordstrom for 15 years.
You hit your quotas.
And to save costs, because all C-level, you know, executives' bottom line is equal.
“So the CSO's bottom line is equal to the CFO's, and the CFO's bottom line is equal to the CTO's.”
And the chief of staff is like, you know what, I'm going to reduce the staff. And the CTO is like, yeah, I'm going to reduce the staff too. And now, when everyone's bottom line is equal, to save costs, people's jobs are lost. And replaced with efficiency. And this is just the beginning.
It's going to get, like, turbulent by 2030. We're going to be, like, scratching our heads. We should have been talking about it since 2018. We should have kept the conversation going in 2017, here at the WEF, when we had the AI council.
So in 2016, 2018, I sat on the AI council here at the WEF.
We had the same conversations then. And here we are. Yeah, again, we're talking about the same thing. Talking about the same thing, but not in granular detail. And so, yeah, the individual needs to own their data.
“Up to this point, my data is in this phone — how do I actually actualize it?”
This phone was made for me to swipe and tap and use the camera. And now, companies have access to my camera, my mic, and my GPS location without me even knowing. I can't even say, hey Siri, who's got access to my mic and my camera and my location right now? Siri don't know.
And if Siri was to tell me, I couldn't say, like, okay, disconnect them.
Because it's not configured that way. Because it's a layer on the OS. And so we haven't even been given a product to leverage our data for us. And so, in this world that you're talking about there, that kid's data is in that machine. That kid does not have... or is that, is that that kid's machine?
In this story that you showed, it's a replica of the guy, of the kid. The data exists in the bot. But who owns that? The world owns it. Okay.
Oh, man, I'm lactose intolerant. I'm vegan. My stomach hurts. Can you do me a favor? Yes.
Can you digest for me? You can't. You can't. Why? Because my digestive system is mine.
Yours is yours. My immune system is mine. Yours is yours.
“No matter how much you care about me, you couldn't loan me your immune system.”
So in this world, my AI system should be mine. Not the country's. I love America, but it doesn't own my digestive system. The only system that a country tries to own is a woman's reproductive system. And that shouldn't be the case.
Women should own that, too. Absolutely. So from that perspective, your AI system, your data, should be yours. It's a human right, like all your other systems. It's just that no one ever told us that we were going to have this other system
that you receive when you enter this society. You know? It's math. And the math that I make and configure as I'm living my life should belong to me. Just like my particles.
I don't own them, but they make up my physicality. My particles create my atoms, and my atoms create my cells. And it's tied to my DNA. My fingerprint here: I'm accountable for my actions.
And you can trace it back to my fingerprint.
It's still me.
And my digital fingerprint is still here in this digital verse that we're in.
“And if that's the case, why can't I own it?”
Why does it go to some companies? And why is it going to go to these... That ain't the world you want. I know. Hopefully not.
So you guys have seen the real deal. This guy is not just a great artist; he's intellectually on a different plane. No. You are.
And I'm just curious. And creative. And I use my imagination. But to put it that way? No.
Well, the audience also feels exactly that. Don't you guys think so? Thank you. Thank you.
I'm not, I'm not making it up.
No, because when people say, like, genius and things like that, I know some real geniuses. Like, Demis is a real genius.
I sat there watching him today. I'm like, wow. Dude's amazing. And Demis is a nice guy.
And that's the genius. And humble. He has good intentions. And yeah, we've got to use that word, like, correctly.
Clever. I'm clever. I'm creative. I'll take that one. But genius?
Demis is a genius. So I'm getting, like, the call from your team saying that you've got to run to your next appointment. So we love the conversation. And it's going to air really soon. Oh, wait.
Yeah. It's going to go on. Yeah. Oscar. [ Laughter ]
Thank you so much, Will. Hey, what company is this? Well, that was pretty dope. Yeah, that's dope. Do they flip?
Yeah, those are dope. Those are dope. And this is a real dog. This is not, like, a robotic dog.
That's cute. That's a cute dog. But thank you guys so much for listening to our convo. Thank you. Thank you, Will.
[ Applause ] [ Music ] Oh, Will.i.am, you know, him and my cousin were really close friends.
“Honestly, my cousin used to be the mail carrier here where we're at.”
They still live here at the projects. He grew up right here, you know, in the projects. And, uh, from what I know, he has actually sent taco trucks, you know, ice cream trucks.
Presents, you know, for the little kids at the Boys and Girls Club here, contributing a lot to the Boys and Girls Club. So he has done a lot, like special education. A lot of people don't really see it because he doesn't like to show it off.
He does it more, like, in his own way. I've been a part of College Track for the past four years, one of his programs. And I think the biggest thing is how easy it has made, like, my process of applying to college.
Since most of the program is about, like, being ready for college, it's, like, exposure to different programs that normally we wouldn't really know about. It's, like, good exposure to different things. One of the biggest things they did was my sophomore year.
They took us to Northern Arizona University, where we dormed for about three, four days. I know about his interactions with the community due to the fact that I coached at Roosevelt High School for a year, about three years ago.
So, his College Track program. I saw the way it inspired some girls, how they made sure they got to the program, and how the program was an enrichment in their lives for their college careers.
And also what it offers after. I mean, tutoring. And obviously, the financials, but they made sure they got to it all the time. I had to accommodate it in my schedule.
I think the full name would be i.am College Track.
“And I think I didn't know he actually ran it.”
And so one day I saw him there.
But I think it's an amazing program.
There's probably about, I'd say, give or take, 100, 200 kids that go. You can see everyone from freshmen out to seniors. And for kids that don't have clothing, they have sponsors, like for the prom and all that stuff.
Now, from what I've heard,
I haven't been in that program,
but the program he put out there has a lot of benefit for kids that can't afford it. And you know, nowadays especially, it's hard to pay your way through college. And you know, this program, from what I've heard, has worked for a lot of kids.
So not only do they help with college, they also ensure that there are experiences, like making sure that the kids are not only going down one path, but they're branching out to whatever they want to go and achieve. And he's done a lot of great things.
He's just, of course, a great person. He comes from a great family. To give him the props, you know, he deserves it.
“Honestly, this guy, you know, has made it,”
has made it, you know, big and put us out there, especially, you know, coming from where he came from.
I mean, from what I know, a lot of people don't even know
that he grew up in this area of the projects. And just to know, you know, he made it big, that really made us happy. So he didn't just leave the community. He's still within the community.
And he's helping out all the kids within the community. I believe he has a robotics program, too. Right here in Estrada Courts. My major? I want to go on to major in finance and accounting
so I can go into investment banking. And every time he comes by, I mean, he passes by. I mean, obviously he doesn't tell anybody that he's going to pass by, but when he passes by,
you can tell straight away that it's him. We know because of the car that he drives. And the, well, I don't want to say paparazzi, but sometimes he does pass by in one of his cars.
“That's why, you know, he talks to everybody.”
Like, you know, just like a normal person. And, um, then an hour later, you know, you see a park right here, like, oh, free tacos for everybody. Oh, next thing you know, he sends a pupusa lady
to the other side of the projects, on the corner, like, oh, free food for everybody. And everybody's making a line for it. Like, oh, who sponsored this? Oh, it's anonymous.
But we already know who it is. So, I think God created the most perfect machine with us. But we choose how to use it. We chose to get into certain things. And on that AI part, I can't rely on a machine.
Personally, I think AI is great. But my point is, like, everything electronic is going to be our downfall someday. It's great, but everybody's just getting lazy and lazier. Because I'm a coach,
I'm a mentor. And I see the way we rely on it. We rely on it too much. And I can see the generations that I coach now. They're totally different than before because they want everything
right in their hands. Yeah, AI took over a lot. I've been leaning on it, too. It's helpful.
“But at the same time, I believe that is making people lazy.”
They fail to do certain things. I think there is a place for AI in our society. But it needs balance. It cannot take over society, or else people won't have jobs.
We won't have work for anybody. And people will go hungry. And that's a big problem
for a society that we are not ready for.
It's what everyone's scared about right now, right? Like job security with AI, right? Like, how do I live, right? Like, if you went to school all the way, then you have a job now, but that job is not something that's guaranteed to last.
Or like even our parents, our immigrants, right? They came here and became nurses. And even that is not a stable job anymore. I don't think AI should replace us in governing ourselves in any jurisdiction or form.
I think it would remove the humanity we have and the freedom, and the responsibility we have to dictate and govern ourselves. The way AI works, it just feeds from everything that people created. I don't know what the margin of error would look like on something that delivers justice. Like that,
there's a lot of factors that go in. You know, even mercy, too, which I feel like an AI wouldn't even be able to comprehend, showing mercy on anybody. No machine could just tell me something like that.
That's what I feel. If AI could do that, bring justice to people that deserve freedom... A machine would be very factual, but I don't know. Sometimes instinct, humanity, comes involved in making decisions. So I can't see how a machine would be able to do that effectively.
So, yeah, I don't know. I don't see where an AI-controlled government would ever be something I could entrust myself to. Like, if I pledged allegiance to a machine, to a country that was run by a machine? I wouldn't want that. Even if AI was in control, I think you'd still have the idea of having, like, a nation as your personality or something.
You'd keep your culture. I use AI. I definitely, like, use it for all of my school stuff. Like, I trust AI. But I don't know if I would trust it in, like, that type of way.
There's a difference between trusting AI and trusting it with control. Yeah, with control. And I think one of the biggest things, psychologically, is that
any human being who has a lot of power ends up being corrupt,
for all sorts of reasons, and I think that's the tough part.
“And I think people are trying to put AI in charge of that,”
so that we don't have that corruption. But then it's like, do we trust this AI that's there? Obviously, there's just, like, a lot of complications that come with it.
So I don't know if you could put everything into that at the same time,
because it's still new. It's still something that's developing. And at, like, any moment, something could go wrong and then, boom, it's over. Yeah, that's the thing with, like, trusting it.
Because you don't know what's behind it, you know?
If something that's not human can solve humanity's problems, I'd be down for it. But I don't think I'm at the place yet where I would trust that.
“You know, I think there's a lot of, you know, I do think there's like benefits of AI,”
but I do think there are still a lot of issues with it. And I still think, you know, a lot of the problem with AI is the black box: that concept that you aren't fully sure how it's getting to certain answers and how it's getting to certain things. But I do think people put way too much weight on AI.
“It's just like, it's not going to be able to solve world peace.”
It's not going to be able to, like, cure anything, humanity-wise. Yeah, I just don't think so. I don't trust AI. Even though I know AI is not working in the way that most of the people want,
for me, it's like a new humanity doing things for our world. Because, I don't know, but I was reading about AI in Japan, that they have AI robots taking care of the older people, but the older people were saying that the robots cannot listen with feelings. Now they are working on making robots that can cry with the people that they care for.
So it's something, you know, very crazy. But if they can do things that we don't want to do anymore in this world, it's okay. I think it's worth it.


