The Pursue Vegas Podcast

Chris Wright on AI, Ethics, and the Fight to Protect Human Freedom

Chris Wright, Dave Burlin, Tawni Nguyen · Season 2, Episode 5

Artificial intelligence is evolving fast—faster than most people realize. So the question is: are we keeping up?

In this episode of Pursue Vegas, we sit down with Chris Wright, founder of the Artificial Intelligence Trust Council and former U.S. Army Apache pilot, to break down the most important conversation of our time: how we keep AI from driving the world off a cliff.

Chris isn’t just another tech founder trying to sell hype—he’s a military-trained strategist who’s seen firsthand how fast tech can evolve without human oversight. “AI has the ability to completely manipulate us as human beings,” he says, and he’s not sugarcoating it. From drone warfare to deepfakes, we’re living in a world where the line between real and artificial is disappearing.

So what do we do about it?

“We’re in the driver’s seat,” Chris says. “But this window is closing.” If we don’t build AI with humans at the center, we risk handing over control to systems that don’t share our values—or our best interests.

But it’s not all doom and gloom. Chris breaks down the tools and safeguards we can implement—from AI IQ throttles to ethics-focused policy—if we’re willing to act. “We’re all leaders in this future with artificial intelligence,” he adds. “We have to raise our voice and say no, this is what we want.”

For Chris, this moment in history is unlike anything we’ve faced before. “There’s really been no other time like this in recorded history,” he explains. “We have the power to guide what’s next—but only if we stay awake, informed, and involved.”

Key Takeaways:

  • AI needs safety protocols—just like aviation.
  • Human oversight isn’t optional—it’s essential.
  • Community input must shape the future of tech, not just big corporations.
  • Truthful, transparent AI content should be rewarded over hype and manipulation.
  • We are living in a pivotal moment to influence global policy and outcomes.


This episode is a wake-up call for founders, creatives, educators, parents—everyone. If you care about freedom, truth, or the future your kids will inherit—this episode is required listening.

Thanks for tuning in to The Pursue Vegas Podcast!

Don’t forget to like, subscribe, and leave us a review on your favorite podcast platform!

📍 Website: www.pursue.vegas


0:00:00 - (Chris Wright): Hi, this is Chris Wright with the AI Trust Council, and you're listening to the Pursue Vegas podcast.

0:00:07 - (Dave Burlin): The idea of Pursue Vegas was to really highlight the local people that really make Vegas Vegas.

0:00:13 - (Tawni Nguyen): I love that aspect of how these visionaries are actually bringing people together.

0:00:18 - (Dave Burlin): When we hit record, our responsibility is to connect the people of our city so we can show the world who we really are. All right, welcome back to the Pursue Vegas podcast. I'm your host, Dave Burlin.

0:00:29 - (Tawni Nguyen): And I'm your co-host, Tawni Nguyen.

0:00:31 - (Dave Burlin): And should we tell him who's more excited? Who? I think. I mean, I started to get really excited, but you might be more excited.

0:00:38 - (Tawni Nguyen): I have to slam you down on this one.

0:00:41 - (Dave Burlin): How excited are you, Tawni?

0:00:43 - (Tawni Nguyen): Dude, I've been trying to get this man in this seat for how many months now?

0:00:46 - (Chris Wright): Oh, it's been a while.

0:00:48 - (Tawni Nguyen): We've been playing phone tag for like six months. Hey, you in town? Hey. No, I'm not in town. Hey, you in town? And I'm not Italian. It's tough. Yeah, yeah. But anyways, this is. Oh, shit. Chris. Hi. I got too excited. I forgot my train of thought.

0:01:03 - (Dave Burlin): Chris Wright with the AI Trust Council. You have so many accolades to your bio. I'm just going to let you get into it. Who are you, what are you working on, and what are you excited about?

0:01:13 - (Chris Wright): Yeah, so I'm the founder and CEO of the Artificial Intelligence Trust Council. My background is really not tech. My background is military aviation. I served in the Army for a number of years, was a military contractor in the Middle East for a number of years, and really saw the future of drone warfare. And it actually got me into this future we're in today and understanding that there's a mission for all of us because we're alive today.

0:01:42 - (Chris Wright): We're all leaders in this future with artificial intelligence. So I went ahead and started the Artificial Intelligence Trust Council as a tool to help people understand what's real and fake, help them get paid for their metadata, and also help them steer the future of AI. And so that's what I'm working on.

0:01:59 - (Dave Burlin): I love that. It's such an interesting conversation, and it crosses a couple of the chasms of stuff that I'm working on. I'm working a lot with vets in tech, which we'll dig into a little bit. We've done an AI roundtable with some of the people here in Las Vegas, some of the key stakeholders in the future of our economy, with the Governor's Office of Economic Development, things like that. I'm also a part of the Vegas Chamber.

0:02:23 - (Dave Burlin): So I've been a part of some of the AI groups and stuff with that. I'm a late adopter and it's funny because I really have just started to catch my stride using some of this stuff. But just to kick the conversation off, I watched one yesterday and it was a gal that said she was talking to AI and she said, did we create you or did we discover you? And I think that was just a really interesting thing.

0:02:49 - (Dave Burlin): I'd love your thoughts on that. Just right out of the gate.

0:02:52 - (Chris Wright): Yeah. Yeah. The realm of AI gets really weird and there's very strange aspects to it where it can kind of blur our understanding of what's real and fake. And so when you have digital agents that are human like and even superhuman, like where they're the world's best therapists, the world's best PhD, anything you can think of at that point, you know, how do you distinguish what you should listen to and what you shouldn't? You know, good advice versus bad advice. And it's very convincing.

0:03:28 - (Chris Wright): It uses neuro-linguistic programming, and it has the ability to completely manipulate us as human beings.

0:03:34 - (Dave Burlin): Yeah.

0:03:35 - (Chris Wright): And so, you know, so it's just a small issue.

0:03:37 - (Dave Burlin): Yeah.

0:03:37 - (Chris Wright): Post.

0:03:38 - (Tawni Nguyen): Just a little.

0:03:39 - (Dave Burlin): Yeah, post the big game. There have already been hundreds of videos that have come out of, like, Travis Kelce slamming his team for all the things that they did wrong and the reason why they lost the Super Bowl. None of it's real. And on those it actually says generated by so and so. But some of that, if you just remove that generated by whatever, you'd have no idea.

0:04:04 - (Dave Burlin): And people could easily get outraged. And then I think about the younger generation, like when they watch some of this stuff, they would never know the difference. They have like no cognitive ability. So I think it's fascinating what you're doing in trying to help people understand. How do you do that?

0:04:22 - (Chris Wright): Yeah. So there's a couple ways to help figure out what's real. And so what we found with the AI Trust Council is that you can use digital filters, which will actually filter out digital content that is AI generated. AI leaves markers that you can pick up. In text it's a little bit harder. You can ask any high school student and they'll figure out ways to get around it.

0:04:44 - (Tawni Nguyen): I wish that was around when I had to write my own 20-page paper and turn it in on the only software at the time, like a decade ago.

0:04:52 - (Chris Wright): It's not even fair. Like what the hell?

0:04:54 - (Tawni Nguyen): Yeah, like we had to write our own paper, read our own research.

0:04:57 - (Dave Burlin): Like, well now kids just probably have to do like a full presentation like, okay, get up and read your book report.

0:05:03 - (Tawni Nguyen): And that's why soft skills are so important. And when we got around to this conversation, remember, it was my frustration at the lack of, like, critical thinking, for one. Creativity, originality that some of these people that call themselves content creators don't have anymore, because everything is mass produced and generated from a source that isn't even themselves. So it's not even thinking out of their own free will. And I'm like, do you even believe what you're saying, or are you just saying these things because it's what the data is telling you is trending? Because a large language model is different than a knowledge base. A knowledge base comes from you. And large language models exist in the reality of the infrastructure of AI, Web3, blockchain. That's how we met. Right. It's a lot of those masterminds.

0:05:44 - (Tawni Nguyen): This is a little geekier than I want to get. But when we got off on talking about the future of AI, people think that it's going to replace jobs. And I'm like, no, but the people using certain aspects of AI are going to replace the people that are way late to the game and think ChatGPT is, like, the newest and latest thing, but it's been around for what, decades now?

0:06:02 - (Chris Wright): Yeah, I mean, artificial intelligence, you know. Ray Kurzweil said he was in it for 60 years. So I mean, this came about with mathematicians back in the 50s who theorized about this stuff, and they had conferences on it. But if you look at 2001: A Space Odyssey, that movie, you could watch that today and it's completely accurate towards AI and the future of humanity. And so if you kind of take that as a model, it's like, well, how did they figure that out back then? How did they know specifically what it was going to look like?

0:06:35 - (Chris Wright): And really it's just math, and so you can interpret, you know, where this is going to go. And so it blends into politics. And that's one thing that people don't really understand today: you know, all the political stuff that we've seen for the last 20 or so years has been preparing humanity for this future of integrating with this high technology. So if you look through that lens and you look at, you know, finance, you look at governance, you look at, like, the World Economic Forum, the things that they're talking about.

0:07:06 - (Chris Wright): It's really the future that they're describing, but they don't call it that. And so you can kind of think of the globalist plan as really the fail safe. And that's the fail safe if we humans don't come together and come up with a better plan. And so that's what we're working on with the AI Trust Council, trying to say, hey, look, let's bring a bunch of humans together that are all pro human, that care about the future of humanity, that don't want to disrupt the family structure, that don't want to destroy, you know, the fundamental biology of humans, don't want to eliminate the male and female role.

0:07:43 - (Chris Wright): Ultimately, you know, I mean, this whole agenda gets into transhumanism, and so that's really the future of artificial intelligence. And so for us humans, we have to stand up and we have to say no, this is what we want. And even though there is a profit potential with artificial intelligence, which is amazing, that's very short lived; that will only last a couple of years.

0:08:05 - (Chris Wright): And really, if you look at the predictions of Ray Kurzweil, he talks about how in the 2000s you're going to need universal basic income. And even by 2026, 2027, you're getting artificial general intelligence, which means that the AI is as good as a human in most capacities. And so you could talk to an employer and be like, well, if I could get a digital agent to do the job, why am I going to hire a remote worker that's going to make mistakes when I could get an AI that's going to be smarter, faster, more creative, all that kind of stuff? And so what we're advocating for is throttles, for the people on a blockchain to vote and then establish throttle mechanisms. So you could say, hey, I don't want, you know, a 1 million IQ AI robot running around my neighborhood enforcing HOA standards, you know, or something like that.

0:09:00 - (Dave Burlin): We don't want anybody enforcing HOA standards. Let's be clear on that. We don't want any HOA standards period.

0:09:08 - (Tawni Nguyen): From a human or AI.

0:09:10 - (Chris Wright): Maybe they can figure it out and make it fair. Like why is HOA like $1,500 for HOA fee? It's insane.

0:09:16 - (Dave Burlin): Yeah, that's funny. Well, that's all fascinating. My curiosity comes from the global conversation of this, but also how it pertains to the micro level. Right. This is a conversation that anyone can enter from anywhere, and anyone can lead this from anywhere. What's your take on how Vegas plays into this conversation? And how long have you been here? What's kind of your take on Vegas playing into this?

0:09:50 - (Chris Wright): I think Vegas plays a critical role. You've got Area 51, the Nevada nuclear test site.

0:09:57 - (Dave Burlin): Allegedly.

0:09:58 - (Tawni Nguyen): Allegedly.

0:09:59 - (Chris Wright): Just, just up north. I mean, you got aliens walking around at the casinos.

0:10:02 - (Dave Burlin): You know, they have documentation.

0:10:05 - (Chris Wright): Right.

0:10:06 - (Dave Burlin): I'm just kidding. That was a joke.

0:10:08 - (Tawni Nguyen): Sorry.

0:10:09 - (Dave Burlin): That was a good one, though, to you. Please continue.

0:10:12 - (Chris Wright): But, yeah, no, I mean, Vegas is a high tech place, and a lot of people overlook that. You know, a lot of people think Silicon Valley is it. No, Vegas has some amazing talent and really cool people, and the tech scene is growing here, and we're inheriting a lot of the, you know, refugees from California that are tired of the way of life over there and want to actually keep their money.

0:10:36 - (Tawni Nguyen): Hello. Guilty.

0:10:39 - (Chris Wright): Right. So it's a great spot for tech, and really there's kind of a different spirit here. It's more casual than a lot of places, and people just want to get things done and have fun. So, yeah, I think Vegas is an amazing place for tech. And so we're leveraging that. We're out networking routinely and talking to folks, trying to bring together really a coalition of people who are pro human, who enjoy tech but want to see it done in a safe way.

0:11:10 - (Dave Burlin): I love that. And it's interesting. It doesn't happen often, but as we expand the network of the show and connect with new people, Tawni's out connecting with a different network of people, I'm out connecting with a different network of people. I had no idea until I met you today that you're also a veteran. So can you talk about your military service and then what your forecast is for what's next as you wrap that up?

0:11:35 - (Chris Wright): Yeah, my background is in attack aviation, so I used to fly Longbow Apache helicopters in the Army, spent time deployed in Afghanistan, and then was teaching Arab students for about 10 years as a contractor. And I was working on a simulator in the classroom. And so one of the things that we did was teach aviation safety. And, you know, we just had the accident recently that occurred over the Potomac in D.C.

0:12:00 - (Chris Wright): and it calls attention to, you know, safety issues in general. And so that's one of the things that really got me thinking about artificial intelligence and the excitement that you hear from the developers of AI. And so you've got guys like Sam Altman, and if you were just to kind of record how many times he's said that he's excited or it's exciting.

0:12:20 - (Tawni Nguyen): Is it more than Dave could be.

0:12:23 - (Chris Wright): But, but it's literally like every other word.

0:12:25 - (Dave Burlin): Yeah.

0:12:25 - (Chris Wright): And he's just, like, freaking out because it's so amazing. And it's very similar to flying low level in a helicopter as a young guy. That's some of the coolest stuff you could do. You're flying right over the trees, and it's about as exciting as it gets. It's really cool. But you can easily kill yourself doing it. And so it's very similar to artificial intelligence. It's very exciting. But you've got senior leaders at OpenAI who are leaving the company out of fear.

0:12:57 - (Chris Wright): You've got some leaders that say the probability of doom is 70%. They call it p(doom), meaning that we humans don't make it. And so that's, you know, like I, Robot. Well, yeah, there are a bunch of different ways that it could happen. But that's what they're pushing out. And so anyway, I look at the aviation safety measures that we use to keep ourselves safe, and I'm like, we are not following any of these kinds of common sense safety measures in the tech industry.

0:13:25 - (Chris Wright): And it's like, well, why not? And it's because the tech industry doesn't come from a safety background. You know, they don't have an industry that's had catastrophic mistakes over and over and over, like aviation accidents or, you know, really horrific military accidents and things like that. And so from a pilot standpoint, we, from the ground up, were laser focused on safety.

0:13:47 - (Chris Wright): And I've been shocked, because you just don't see that in tech.

0:13:51 - (Dave Burlin): Yeah, well, it's great that you kind of see that ahead of the game and you're already starting to create those measures within it. Not only that, but you're rallying people to that cause, right? I think there's a healthy dose there. And from, again, only meeting you today, I don't feel like it's over the top. I really think it's important how people package it, because just like anything else, if you try to explain the warnings of something and you say it the wrong way, people immediately reject it instead of starting to accept it.

0:14:26 - (Dave Burlin): So I think there's different ways that you're going to capture different people, which is fascinating with what you're seeing across all the different levels of social media. I was at an event the other night with a friend of the show, and she was very oblivious to a lot of the things that had happened, even talking about that plane crash and all the different things. And it seems like each person that came in was almost like a character on a TV show. Like, did you hear the worst news about today?

0:14:52 - (Dave Burlin): And it was four completely different horrible stories that she had no idea of. Because everybody's feed is different of what they're getting. So I think you're, you know, some people are going to take it for what it's worth. You could say it in a very calm sentence and say, hey, we're all going to die. And people are going to be like, oh, we should probably listen. And then some people, I mean, that.

0:15:15 - (Tawni Nguyen): Is the only fact that people live in denial of taxes.

0:15:19 - (Dave Burlin): Maybe when I say we're all going to die, when you start to say it's going to be at the hands of AI, even if it is, it may or may not happen in our lifetime. But there are things that we can start to do right now to be very aware and to be very conscious of the things that are going out there. Because again, this is a really interesting comparison, but I look at, I often play it back in my head. I remember when some of the early social media came out, MySpace, things like that. Like, it was fun.

0:15:52 - (Dave Burlin): I was like, wow. But I was an adult. I understood immediately the responsibility that we had to not say stuff that was going to be recorded over time. And I look at how I've used that as a tool in my adult life, where my son, you know, during the pandemic he did some pretty off-color posts that got him in a lot of trouble, and it was him just trying to genuinely express himself. But the game has changed now, the platform has changed, and now even the stuff that's coming out is not real.

0:16:29 - (Dave Burlin): So now we're in a totally different challenge to figure out. And that's one of the things that I love about this show: we're always trying to get to the truth, right? What is the truth behind what's actually happening? And this is one of those things we can't tackle for the whole world. But it's great to know that you're really trying to understand that and get people together.

0:16:53 - (Chris Wright): Yeah, yeah, no. And that's what we're trying to do, I mean, rewarding people for being honest, rewarding people for doing the right thing. You know, that should be inherent. But what you find in social media is that it's more about the fakeness. You know how many Instagram followers are fake on these celebrity accounts. I mean, you can literally just buy Instagram followers. And so that negates the whole concept.

0:17:13 - (Tawni Nguyen): Of the whole platform of creating influence. And that's something that I want to lean into. When you said everyone's feed is different: I've been off Meta, off any sort of social media, for over a year now, and my brain chemistry, I feel like it's completely different, because I'm not chasing the same dopamine driven, you know, responses that my body used to respond to, which is the second part of sobriety.

0:17:37 - (Tawni Nguyen): But another thing that's really important, what's missing, is the word programming. People are getting programmed through social media, because they're setting themselves up to believe in things that are designed by other people, you know. So I don't think people really know what they think anymore. And I think that's kind of, like, when you said death by AI, the lack of social intelligence is dimming.

0:17:58 - (Tawni Nguyen): It's just dimming. People don't know how to think anymore. And it isn't until somebody puts it out in black and white, you know. Like, I know you've been back and forth to Florida and D.C. Every time I try to catch up with you, it's because you're leaning into getting this higher up on the totem pole and actually getting this message across. Because social media is a vast net that we can cast, but it isn't going to reach its specific niche. Because a lot of my friends that are in crypto and that are leading certain spaces like Web3 and blockchain, they're banned off of Meta just because of how controversial a lot of their truth is, subjective or not, or objective truth on things that they can report and that they can prove. They're banned.

0:18:39 - (Tawni Nguyen): Right, because that's what I said. It's like freedom of speech is a lie, depending on which truth you're after. Because that's kind of where the last time I saw you, you were going up to the administration and can you lean into that conversation a little bit?

0:18:53 - (Chris Wright): Absolutely, yeah. I was back in D.C. for the inauguration and connecting with all sorts of leaders all over the place. And one of the things we're trying to do is actually start local and engage with local governments and start to plant the seed in their heads that, hey, you guys are responsible for mitigating the risk of this artificial intelligence. At the local level, there's the push to have AI smart cities, where it's like the Internet of Things that links all camera systems together to provide, you know, like universal policing.

0:19:28 - (Chris Wright): So it's a facial recognition digital track and trace that you really can't escape from. And from a governance perspective, they're like kids in a candy store. They're like, oh my God, look at all the stuff we can do with this data. You know, we can see people from satellite. We can see if they dug a pond in their backyard. We can see if they went to the store. You can see everything.

0:19:50 - (Chris Wright): And so, you know, there's a civil liberties balance that has to come with that. And that's one of the things that we're looking at and trying to push: civil rights, you know, like your basic rights, like freedom of assembly, freedom of speech, freedom of religion, but also civil liberties, you know, freedom from AI. Yeah.

0:20:12 - (Tawni Nguyen): It's not a use it or lose it situation anymore. Yeah, it's like a use it or be used by it.

0:20:20 - (Chris Wright): So. Yeah, and you have the ability to discriminate with artificial intelligence, all these things. And so it's like that balance has to come from the local level on up. And specifically with policing. Policing really is where it comes from. And so that's one of the things that I saw in California. I had a logistics company in Northern California for a couple years, and our vans would continually get robbed. We were delivering packages for Amazon, so we had all these vans and drivers out each day.

0:20:48 - (Chris Wright): And what I was seeing was that, you know, it was a total free-for-all. I mean, California is an absolute disaster. I mean, there are homeless people in literally every corner of the state, you know, in all places. And so I'm like, you know, California's got this massive budget. Why are they allowing all this homelessness, all this crime, while we're still paying like 50% in taxes? And so what I realized is like, oh, look where it's really, really bad: Silicon Valley, right in the Bay Area.

0:21:15 - (Chris Wright): And so that's where those tech solutions for security are being born today. And they've already got the plan figured out. So basically it's just getting the public to accept it. If the crime is really high and it's absolute chaos, zombieland, then what does that necessitate? The solution. The solution is the AI governance solution, the Internet of Things. And it's called UN Agenda 2030 or the UN Smart Cities Initiative or whatever.

0:21:43 - (Chris Wright): But that's the direction we're going. So what we're trying to do is say, hey, no, we don't like that. That's not cool. If you get drones and drone systems that ultimately hold pepper spray, imagine what you could do if you're a leader that doesn't want a protest and you've got these high IQ drones that can just dispel any sort of opposition. The people come back with guns and you've got drones with guns, you know, and so it's this tit for tat. It just gives ultimate power to the individual that has control of that manufacturing capacity and control of those drones.

0:22:19 - (Chris Wright): And it's a major issue. And so as Americans, we like our freedom, you know, we like to do, you know, do what we want, you know, within reason. But, but this AI gives individuals the ultimate control. So the idea is to give the people that control so they can weigh in on the sophistication of it. They can weigh in on the use cases and ultimately help manage it.

0:22:43 - (Tawni Nguyen): Yeah, because the only way we're voting right now is with our dollars. So when you're talking about tax dollars in California, if you're looking at the homeless situation, there's like hundreds of millions of dollars that are going missing because it's not actually used for anything. It's just like a little empty space. So I think I find that pretty comedic in terms of how they're budgeting. And that's like their own ecosystem that we don't want to lean into.

0:23:05 - (Tawni Nguyen): It's a different state issue, but, you know, for me personally, it's always on the side of the people using AI and where it's leading us economically as people that live and thrive in Vegas. And I know Vegas is becoming a tech hub almost as large as, you know, Miami and any other place, and Silicon Valley, hopefully minus the other issues of misuse of budget. So hopefully, on the infrastructure level, we build something that's proper, that's going to be designed for people, by people, not people just optimizing using AI because they lack the skills to actually build anything worthy. Because we're still, I think, in the infancy stage, like the first 10 years, call it, of building, you know, our massive infrastructure, of all the real estate, using up all the land, and, you know, the BLM lands getting released, that's a different topic, right? On the federal aviation side, there's a couple of plots of land over on the other side of the mountain that just got released, as you know, which, with the development of the Las Vegas Executive Airport, is a whole other plot of land that the FAA just released.

0:24:07 - (Tawni Nguyen): And I'm going to put this out there since we're manifesting: Rob, going to meet with him, going to bring him on the show, because these are topics that are close to my heart. It's how are we, the people, governed by other people, supposedly doing the right things for the people that actually are locals here, that are native to the land, and are we serving people in the right way? So, you know, with the whole AI misuse and all of the stuff in policy that's happening around the world, I feel like, you know, for us, our liberties as individuals, it's like, how do we use AI? Like, yeah, it's cute. You can say, oh, I know how to use ChatGPT. But that's like maybe 0.0001% of what you're actually using creativity for.

0:24:47 - (Tawni Nguyen): So can you lean into something that, you know, me and Dave always talk about: just being a regular, everyday individual person using AI but not understanding it on the scale that you can see it, on the global economic scale? Like, what does that look like?

0:25:00 - (Chris Wright): Yeah, it's really. You have to look at, like, the protein folding problem; artificial intelligence solved it, so you can model proteins. And so what that means is that you can fundamentally transform DNA and you can change the body. You know, you can solve almost any disease. But also, if you're a bad actor, you could create any disease. You know, you can apply it to chemicals. Right now you can get chemical synthesis machines for about 50,000 bucks, maybe 100 grand for a really good one.

0:25:34 - (Chris Wright): And basically what that is, is like a Xerox copier, but you just put base chemicals into it, and then you can put in any chemical formula that you want, and it will produce chemistry that the world has never seen before if you use the AI to produce some of this chemistry. And so what that means is you've got chemical compounds that can, you know, do massive damage to people and material or whatever.

0:25:57 - (Chris Wright): And so, you know, when you have that kind of tool in the hands of just anybody. And, yeah, we're getting into the idea of free energy and stuff like that. And so you can see, like, spiritually, humans are kind of toddlers in this. You know, most people are not real spiritually evolved or ethically evolved, and we're kind of just trying to survive. We're in survival mode most of the time, and we don't have time to think about a lot of big things like that. And so when you put that high technology into the hands of just average people, it's like putting a flamethrower in the hands of a toddler. You're going to burn something down. And so that's the level of sophistication and risk with a lot of these tools: you can do all sorts of bad stuff with them.

0:26:47 - (Chris Wright): You know, just from the mental health aspect, today you've got TikTok brain. They call it TikTok brain rot. And this is for parents: if you've got a child that's on these apps, you're nuking their attention span. Their attention spans are literally just getting destroyed. And it actually rewires the human brain, especially when it's developing. And so there are issues with that. And with artificial intelligence mimicking humans, you've got the ability to influence the human soul and the human mind.

0:27:23 - (Chris Wright): And, you know, in China, and even now it's becoming popular here, you've got all these young guys that are literally falling in love with digital agents.

0:27:34 - (Tawni Nguyen): Buying the fake bots for sex and living in a virtual reality world of having, what do they call it? Like, you can buy real estate digitally. We know that as an investment opportunity, but now people are living in a metaverse, is that what it's called? They're living in a fake reality because they can't afford to live in reality. So when they're in their physical body, as we call it, they're severely depressed, and the suicide rate has gone up much, much higher for men.

0:28:04 - (Tawni Nguyen): Yeah. So I think that's a critical state of consciousness, not only on the leadership side, but when you brought up the male and female roles and energy itself, on the masculine and feminine side, there isn't true energy exchange anymore, because things that are sold and bought easily can be replaced by bots. Human connection is one of them. And now men are falling in love, and it's not really love, because they're, I guess, chemically wired to fall in love with a dopamine sense of what they think love is.

0:28:36 - (Tawni Nguyen): So those are topics that are so real, but people can't grasp them, because they don't understand the chemical makeup of a human body. So now digital reality is kind of like the new reality, because they think they're escaping this reality and they can go live in another world and own a house and have a wife.

0:28:53 - (Chris Wright): Yeah, you know. Oh, totally.

0:28:55 - (Tawni Nguyen): Yeah.

0:28:55 - (Chris Wright): I remember playing Grand Theft Auto back in the day, and you'd be like, oh, we're gonna beat up this old lady and take her purse or whatever. And then you do it a bunch of times to get some money. And then you walk out on the street and you're like, oh, I could just, you know. And so, yeah, you multiply that by like a million, and that's the AI VR right now that you've got.

0:29:13 - (Chris Wright): And it's like, yeah, you start to blur our realities, where people don't really know what's right and wrong. And that gets into the whole AI porn thing. I mean, you've got kids that are accessing this stuff at a very young age, completely rewiring their minds. So by the time they're late teens, early 20s, the time when you'd probably start to find a partner, they're just ill-equipped to actually have a human relationship. They don't even know how to behave with a human.

0:29:42 - (Chris Wright): And they get real weird, you know? And how do you recover from that?

0:29:48 - (Dave Burlin): Yeah, and that's probably the thing. You don't even realize it's a problem, because they live in their own version of reality. And we might understand it from an older generation, because. And it's such a funny thing to say: like, oh, when I was a kid, it was different.

0:30:05 - (Tawni Nguyen): Like, we played outside with rocks.

0:30:07 - (Dave Burlin): This is like. It really does start to create a very distinct separation. And again, in the wrong hands, over time, it affects more than just being in a physical or intimate relationship. It starts to affect the global economy because of what we are now less capable of. If we're not capable of holding those types of relationships, then there are so many other things that kind of crash with that.

0:30:39 - (Dave Burlin): So, yeah, very much, one thing definitely affects another. Again, my hat goes off to you for anything that you're doing, and other people. When it comes to people that want to get more involved, that can join that fight in helping people understand how to use AI ethically, how do people get more involved?

0:31:02 - (Chris Wright): Yeah, so our website is theaitc.com. That stands for the AI Trust Council. And so they can sign up there. It's really a placeholder. We're building a website that's aimed at tackling these problems. And so we're coming at it with much different tools: blockchain, cryptocurrency, Web3, metadata, wallets, a bunch of different tools. But ultimately we want to empower the individual to help weigh in on this future that we're all a part of and help collectively figure out what's real and fake. Today you've got these organizations like Snopes or FactCheck.org or whatever. They're biased, and of course there's going to be bias. And so the idea is that on the site, it's really like a utility where we show people the metadata.

0:31:49 - (Chris Wright): You know, right now, if you were to go to Facebook and look behind the curtain, you would see a mountain of information about you that you never knew existed. There's people at Facebook that know more about you than you do, probably, because of all the data that you've contributed. And I mean tracking your eye movements. Literally, the microseconds you spend on one image versus another will indicate all these different things about you. Right. And so when you've got that level of sophistication, why is that not shown to us?

0:32:18 - (Chris Wright): Why aren't we in control of that data? How come we can't delete that data if we don't like it? They're taking this information about us. And so what we're trying to do is give that information back to the individual so they can ultimately earn money off of it. One of the ways that we can potentially help displaced workers is by giving them other avenues to earn an income. And that's one of the concerns: there are a lot of jobs that are going to go away, unfortunately. And we could set limits to it, but people are going to have to really look closely at their careers and things like that.

0:32:57 - (Chris Wright): And what we're trying to do is give people avenues to earn an income if there is this big displacement of workers. And so the whole conversation gets really weird. But yeah, if people want to understand more about that, just Google Ray Kurzweil and listen to some of his statements. I mean, listen to Yuval Noah Harari, you know, from the World Economic Forum. This guy is a transhumanist. He has a different vision of the future than most people.

0:33:28 - (Chris Wright): And so it's important for just regular humans to understand what that looks like and realize that we don't have to go that direction, that those are people who are leaders in this space. But ultimately, we the people are the leaders. And so it's up to us to raise the volume and say, no, no, no, I don't like this, I do like this other aspect of it, and find that balance, that sweet spot of a pro-human, bright future, and pro-planet.

0:33:57 - (Chris Wright): And so I think we have those tools, but we need platforms that allow people to have that discussion.

0:34:03 - (Tawni Nguyen): I think there's a sense of complacency, though, in human nature, because everyone has learned a learned helplessness, and everyone has embodied a certain sense of just, almost like they gave up in a way. And I know that you've taken this much higher, to the levels of the government and all that stuff. It's always nice to look bottom-up, which is we the leaders. It's easier to say, but how does it look top-down? What are they doing on that scale to actually emphasize that humans are so important? Because jobs are being replaced. We can see it at the grocery store.

0:34:40 - (Tawni Nguyen): That's been happening for decades. But now people are finally realizing that some of these jobs don't really need to exist anymore, like the checkout clerk. But some people do need the human connection of going to Sprouts and actually talking to a person briefly, for what, a 37-second transaction of just, hi, how's your day going? Cool, look down, and then get your things done. But now everyone's kind of self-serving, and it's just so individualistic that some of the lines are blurred on what's important and what's not anymore.

0:35:10 - (Tawni Nguyen): So how do you decide, and how do regular people have the discernment to make sure they understand what's actually important anymore?

0:35:17 - (Chris Wright): Yeah, no, I think that's an important conversation, and I think having open debate, open dialogue about it, in a transparent way that people can see all sides of it and can analyze information. Ultimately, I think the people can figure it out if they're given the tools to make smart decisions. And right now we really don't have a lot of those tools. So I think that if you give people the tools of understanding through metadata and data analysis, you can kind of figure it out. And people have a gut instinct.

0:35:48 - (Chris Wright): If you're a parent, you generally have an idea of what's good or bad for your kid. And then if you're armed with good information from good sources that are not corrupted, that are not just seeking profit or power, just people that want good things, it's like, okay, well, then you can kind of analyze what they're thinking about, and you can say, okay, I like some of these ideas, I don't like these other ones. And what type of things do I want to see in my building?

0:36:14 - (Chris Wright): If you live in an apartment, or my HOA, or my city or county or state government, those are all decisions on AI that we have to come up with. And so it's really up to the people to start pushing that. And if we don't, then we're going to have to take what we get. And what that looks like is the big tech solution. And what's strange is a lot of these guys in big tech are not necessarily pro-human.

0:36:38 - (Chris Wright): And it sounds weird, because you wouldn't think that would happen, but you've got guys, Sam Altman, for example, or, there's a concept of speciation, where they're thinking it's our natural human evolution to transform our species into a different version of human. And so they're okay with that. And so what we're saying is, who put you in charge of this? Yeah, right.

0:37:09 - (Chris Wright): And why the hell aren't you telling us about this? The people should know. There should be open dialogue about this whole thing.

0:37:15 - (Tawni Nguyen): But no, we're fed shit food manufactured by companies that used to manufacture tobacco. Yeah, don't get me started on the healthcare thing, because that could take another hour. But then we're pretty much put on an assembly line to where our life is headed, based on just the structure of where society is, because it's not human-led anymore, and no decisions are really made, because you get sick from the food you eat.

0:37:41 - (Tawni Nguyen): You are programmed by the things that you watch, first from the news, second. Now it's the other addiction that no one talks about, which is social media, because most people are being programmed without realizing that they're being programmed. And thirdly, you end up somewhere down the line of the pharmaceutical chains, which then put you into a bubble of bioengineered environments that make you even sicker.

0:38:04 - (Tawni Nguyen): So weak, docile. And sickness people is what we have to deal with right now on the human condition level. And I don't think people realize the extinct of, like, intelligence versus what's artificially discovered. Like, when you asked me that question, I said discovered, not created. Because levels of intelligence is not really created. It's already in existence. People just need to catch up to it, need to catch up to doing their own research on how long this has been hidden from other people.

0:38:32 - (Tawni Nguyen): And I think that's something that media is massively known for, sweeping shit under the rug.

0:38:40 - (Dave Burlin): Well, and I think we're gonna hit different levels of awareness. We're gonna hit different levels of curiosity that create levels of awareness. And that's what takes me into this. You brought this. Can you tell us about the warning? Where did you get this, and what was the whole preface?

0:39:00 - (Chris Wright): Yeah. So basically, these are foundational principles that we're pushing right now. We're handing these out on Capitol Hill. I was actually at CES here in Vegas, actually giving these warnings directly to the big tech companies and saying, you've been warned. Get ready, because legal liability is coming. And so basically, what we're saying is that we need IQ throttles, if you're looking at these systems and thinking in terms of 1 million IQ, when even really, above 200 is like max for humans.

0:39:33 - (Chris Wright): So if you say you're talking in thousands today for iq, it's got problem solving, reasoning, that kind of thing. This is called the alignment problem. And so in tech, they say that you can talk to Geoffrey Hinton, the godfather of AI. All these leaders of artificial intelligence have said there is no way to solve the alignment problem. How do you control smarter than human systems and keep them in alignment with humanity and our goals?

0:40:00 - (Chris Wright): Because it's like anything, any being or species or whatever that's much smarter. They really start to disregard the wants and needs of the thing that's less intelligent. And so they'll make their own decisions. They'll hide information. They'll manipulate. They'll steer data in a way that will trick you. And so that alignment problem is fundamental in artificial intelligence.

0:40:25 - (Chris Wright): So the way to solve it is to say, screw the alignment problem. Put people in charge from the get-go. And so you say, no, we'll hamstring the IQ today and keep it under control. And like any sort of machine, AI is a machine. You can either speed it up or slow it down, increase or decrease the IQ, the intelligence. And so you put that in our hands, and be like, hey, maybe in Tokyo they want the IQs to be a million in all their systems. Okay, cool. Maybe we'll see how that works out for them.

0:40:57 - (Chris Wright): But maybe in West Virginia, a rural town where they focus on coal mining, if you put in 1 million IQ systems, it's going to replace all the coal miners. And so the entire industry in that town will get destroyed.

0:41:10 - (Tawni Nguyen): And whoever owns the AI actually makes the money. It's not regular everyday people's jobs anymore.

0:41:16 - (Chris Wright): Exactly. And so you could say, okay, well, that coal mining community could be like, look, we'll vote on it. We like AI for cleaning up and doing kind of stupid tasks that we don't care about. So we'll limit it at 50 IQ, so it's functional, it can do certain things, but there's no competition with an actual human. But that should be in our hands, that throttle mechanism.

0:41:40 - (Chris Wright): And, and so that's one of the things we're working on. And you know, we talked about data.

0:41:45 - (Tawni Nguyen): Sets? Or talked out of me.

0:41:51 - (Chris Wright): But, you know, we produce a mountain of data every single day. And where does that data go? And are you getting compensated for it? And you look at the lifestyles of these tech CEOs, and the amount of money that's going into these companies.

0:42:04 - (Tawni Nguyen): And it's like, the amount of VC money, because they want 200x exits. Yeah, we can talk about that too, right?

0:42:12 - (Chris Wright): Can we talk about that?

0:42:13 - (Dave Burlin): I don't think we should.

0:42:15 - (Tawni Nguyen): We will talk about that. Money and data is my game, man.

0:42:19 - (Chris Wright): Right, yeah. Well, and so there's a figure that each of us has collectively given a million dollars to all these different tech companies. If you've got a phone, you're feeding the beast. And so it's like, well, how about the people get some of that money? How about you? You can.

0:42:34 - (Tawni Nguyen): Yeah, because you talked about gamifying it.

0:42:36 - (Chris Wright): Yeah, exactly.

0:42:37 - (Tawni Nguyen): Yeah, so let's come back to that, because I don't think people understand what that really means, and it's already active. It's just that right now it's being used for gaming rather than actual income generation. So let's talk a little bit about gamifying.

0:42:50 - (Chris Wright): Yeah. So if you have an incentive where the incentive is trust, and the incentive is being truthful or honest on data, and that's given to the people, and there's some sort of reward mechanism for that. Right now there's a reward mechanism in social media for being fake. Look at all the fakeness you see. I mean, everything you see on Instagram.

0:43:12 - (Tawni Nguyen): Filtered, and fake followers, and AI response comments. I looked at all the software, by the way. This is how much of a geek I sometimes am, because I want to see what's out there and what is actually taking away from human capacity versus what's actually adding to optimization. Because I know businesses are leaning towards it, because older-generation businesses, like the blue-collar businesses, are trying to catch up on the AI game.

0:43:39 - (Tawni Nguyen): But there's so much software and there's so much data out there that, you know, if you don't have the right set of education, then you're never going to get ahead of using the right form of software, because everything is getting newly developed every single day, you know?

0:43:53 - (Chris Wright): Yeah. I mean what's coming out is next level. I mean some of the future stuff that they're working on.

0:43:58 - (Tawni Nguyen): I mean you can heat map like how long someone spends on like a certain word or.

0:44:02 - (Chris Wright): Yeah, I mean, it's. Even, I think I've talked to some folks, and they're like, yeah, the entire Internet is going to change. Literally everything is going to transform. And we've kind of crossed that. It's almost like the start of the new Internet.

0:44:15 - (Tawni Nguyen): What if we just shut everything down and hit a hard reset, you know, how would people live now?

0:44:20 - (Chris Wright): Oh no.

0:44:21 - (Tawni Nguyen): How would they live their fake lives?

0:44:23 - (Chris Wright): Lives. But that's even one of the concepts: in order to keep AI limited, especially if you allow just wild IQs, one of the things that local jurisdictions could do is say, hey, let's build into our public holiday calendar a digital dial-down day, where you actually slow the Internet down to dial-up speed, or something limited, because.

0:44:48 - (Dave Burlin): Easy now, hold on.

0:44:50 - (Tawni Nguyen): It's like you just triggered Dave right there. He's got the fitness. Then we get to trauma response, we.

0:44:56 - (Dave Burlin): Get to pitch that over to our kids.

0:44:59 - (Tawni Nguyen): Oh grandpa, calm down.

0:45:01 - (Chris Wright): Like what do we do?

0:45:02 - (Dave Burlin): What do we do?

0:45:03 - (Tawni Nguyen): Do you know what an AOL CD is?

0:45:06 - (Chris Wright): Yeah, that. Remember that dial-up sound? Like, who needs to call into.

0:45:10 - (Tawni Nguyen): Oh my God.

0:45:11 - (Dave Burlin): This happened. This happened. When I did comedy a couple years back, it was a curated comedy thing, and it was, what was it? Awkward Teenage Diaries. And we would read stories from that. And I did like a top 10. One of them was, I read the top 10 quotes from my journals from, like, 1997. And it was like: October 5, 1997. So I finally used the Internet. That's whenever I did it.

0:45:39 - (Tawni Nguyen): Is it after the quagmire room was entered, how you came out with one arm really buff? I live with that right there.

0:45:48 - (Dave Burlin): I think it's an interesting approach. I mean, I'd be curious to see what the projections of that would look like over time across multiple cities and things like that. And then, can you get the whole country to commit to something like that? Who knows?

0:46:08 - (Chris Wright): Well, and then one of the concepts is that, you know, all these industries have the potential to really just dive head first into all this technology and.

0:46:16 - (Dave Burlin): And mess it all up at the same time. Right?

0:46:19 - (Chris Wright): And then from a national security standpoint, we're at risk from EMP. We're at risk from major cyber threats and things like that, which could ultimately derail the Internet. Very likely. It's very easy for that to be done. So the idea is to say, hey, let's prepare future generations with some sort of understanding of what it's like not to have all the tools, or at least to have limited tools. So maybe you could tax, you could do basic things. But the future is going to get weird. I mean, they're talking like, no more 40-hour work week. We're talking a four-hour work week.

0:46:54 - (Tawni Nguyen): You know, everyone go back to a Nokia and play Snake.

0:46:58 - (Dave Burlin): Well, it's funny. I don't know if you ever saw the show Revolution? There was a show called Revolution, and it took place in a future where there was, like, the end of all electronics. And it, in a sense, sent us back to the Stone Age, but slowly started to develop it back. There were different pockets where they could rediscover that energy and those frequencies and stuff again. But it was very much like: one day the lights went out, and what happens to society after that?

0:47:31 - (Dave Burlin): What I think is going to be interesting is, at any point, that's always possible. It's, where are we at as a civilization? Where are we at in the emotional intelligence that we have? And it's funny, because I often make this comparison all the time. Yes, we make comparisons from generation to generation, but it does present a whole different set of challenges in just the social awareness, the preparedness that we had as kids versus, you know, my son. He's 20 now, at the time of this recording. But there are just different social skills, you know, going to the grocery store with a list versus no list but a phone, to always be able to call back. Like, what happens when you don't have that phone?

0:48:20 - (Dave Burlin): It's not just execution. Then you get into preparedness and planning. We don't really know what skills the younger generations don't have because of relying on technology. And it'd almost be a fun free-for-all, just to, like, no problem.

0:48:38 - (Tawni Nguyen): Zero skills, zero trades.

0:48:40 - (Dave Burlin): Figure it out.

0:48:41 - (Tawni Nguyen): That's another thing. It's like, make trades sexy again.

0:48:43 - (Dave Burlin): Yeah.

0:48:44 - (Tawni Nguyen): You know? Cause it's like data is sexy, systems are sexy.

0:48:47 - (Dave Burlin): Like, yeah.

0:48:47 - (Tawni Nguyen): But trades like, who's gonna build the rest of the country?

0:48:50 - (Dave Burlin): Yeah.

0:48:50 - (Chris Wright): Yeah.

0:48:51 - (Tawni Nguyen): If everyone just wants to be a TikTok influencer now that TikTok's back. What? That was a short 24 hours that people panicked or something.

0:48:56 - (Dave Burlin): Wasn't even 24 hours.

0:48:57 - (Tawni Nguyen): I don't know. But you can see where the attention is going, and that's what people focus on. It's not, how am I gonna learn hard-earned skills to make the rest of my day.

0:49:07 - (Chris Wright): But how.

0:49:08 - (Tawni Nguyen): What's the easiest way for me to get out of working hard the rest of my life, so that I can just focus on, you know, making videos or something? Which is also a skill. But probability-speaking, not everyone's gonna get to the level where they're gonna make a living out of that.

0:49:23 - (Chris Wright): Well, yeah, I think you look at the tech CEOs for guidance on this. I mean, you've got all these folks that will not give their kids phones.

0:49:30 - (Dave Burlin): Yeah.

0:49:30 - (Chris Wright): Until they're like six years old, because they know full well what happens. And it's like, yeah, they care about their children, and they're just like, no, stay away from tech. And I'm like, that's evil. What the hell? They're selling these products that are warping the minds of everybody else's kids.

0:49:43 - (Tawni Nguyen): They can raise other iPad kids, but they refuse to put the poison in their own kids' hands. Yeah, but they're making billions of dollars doing it.

0:49:51 - (Chris Wright): Yeah, yeah.

0:49:52 - (Tawni Nguyen): There should be a lesson there, you know.

0:49:53 - (Chris Wright): Yeah, yeah.

0:49:54 - (Dave Burlin): They have a front row seat at the White House now.

0:49:58 - (Chris Wright): Oh, yeah. What, Stargate, $500 billion to Sam Altman and some of these other guys? Yeah, so we're actually trying to get Trump's attention right now and say, look, who is advising you right now? I don't know what Elon Musk is doing, but he needs to be advising Trump: look, AI safety should be number one here. There's a whole Paris summit that just happened yesterday on AI, and it's weak. All these world leaders, they should be experts at this stuff. As much as they're experts on national security or foreign policy, they need to be experts on AI.

0:50:31 - (Chris Wright): At least to the point of understanding the spiritual, mental, and physical implications of the technology.

0:50:36 - (Dave Burlin): And as we definitely wrap here, I know this is a great conversation, and we can.

0:50:42 - (Tawni Nguyen): Dave didn't know there's like a whole.

0:50:43 - (Dave Burlin): Here we can open up on this one.

0:50:46 - (Tawni Nguyen): I could see his brain.

0:50:46 - (Dave Burlin): Just, what would you say, I guess, if you could give one piece of advice to the average or above-average person out there that's using these tools or doing these things? What's one piece of advice that you would give to people to be a little bit more aware of what they're working with and how to be more authentic in it?

0:51:08 - (Chris Wright): Yeah, no, I would say for people listening, they have to understand that, that we're in the driver's seat. Humans are today. This window is closing. This window is not going to be open forever. You're going to get AI systems that are much smarter than us and very capable and very manipulative, and those systems can transform our existence. And so if we want it to go a certain way, now's our time to step up. And so we have to make our voice known.

0:51:35 - (Chris Wright): We have to point out what we'd like and point out what we don't like, and then hold these big tech companies accountable for harms. And so that's what we're pushing for. And so our movement is building, and people, as they start to become more displaced from jobs and things like that, are getting more aware of it. But we're all leaders here. We're all alive here on this planet. This is a special time in human history.

0:51:58 - (Chris Wright): There's really been no other time like this in recorded history that we know of, except for, like, aliens or something. But our position right now is very critical. And so by sitting quiet and just thinking, oh, the big tech guys will figure it out, and government will kind of figure it out. No, it's up to the people to make our voice known. And we're all leaders, especially here in the United States.

0:52:21 - (Chris Wright): We've got a responsibility to lead.

0:52:23 - (Dave Burlin): I love it. And where can people find you and take action?

0:52:26 - (Chris Wright): Yeah. So I'm on LinkedIn all the time talking smack on big tech and stuff like that.

0:52:31 - (Tawni Nguyen): That's why we're friends.

0:52:32 - (Chris Wright): Yeah. Chris Wright, theAITC. And then also theaitc.com, you can see our website. And collectively, we can come together and solve this problem. It's not hard. We just have to kind of push back a little bit.

0:52:46 - (Dave Burlin): I love it.

0:52:47 - (Tawni Nguyen): Well, Chris, you know, I geek out on stuff like this, but we want to be mindful of your time, and, you know, how you guys can reach out to me and Dave and Chris with anything else that you want to say, or any part-two questions that you want us to go deeper into. Because these topical, niche conversations are so elaborate that we can't cover them within an hour of conversation, and there are going to be things taken out of context, especially, let's just say, if AI picks up a piece of our content and puts it out of context. That's kind of the world that we sit in. But I just want to be mindful and acknowledge you for being one of the first to choose Vegas to adopt this mentality on how we're building the infrastructure, not only in tech, but in the way we operate as humans.

0:53:33 - (Tawni Nguyen): And just being the first to actually lead us to take it to a higher level of government, coming from someone that's been at, like, the national security level, too. So thank you so much for coming on the show. And you can find me and Dave at Pursue Vegas. Slide into our DMs. Slide into Chris's LinkedIn DMs. I don't think he's very visible on IG. Please let us know if he's getting blacklisted over there.

0:53:57 - (Tawni Nguyen): Right. Meta's got it. Meta's got most of my friends. It's okay. They're in Meta jail right now. But anyway, please reach out to us, let us know how we can connect you to anyone, provide any sort of service for you, and hopefully you continue to make good discernment with your life and your data.

0:54:18 - (Dave Burlin): The idea of Pursue Vegas was to really highlight the local people that really make Vegas Vegas.

0:54:24 - (Tawni Nguyen): I love that aspect of how these visionaries are actually bringing people together.

0:54:29 - (Dave Burlin): When we hit record, our responsibility is to connect the people of our city so we can show the world who we really are.
