
INSOMNIA STREAM: WATCHING YOU EDITION.mp3
09/06/2025
German Numbers Lady
00:00:00 [Audio: a German numbers-station recording; a woman's voice reads out strings of digits.]
The Police – Every Breath You Take
00:01:34 [Music: "Every breath you take, every move you make, every bond you break, every step you take, I'll be watching you. Every single day, every word you say, every game you play, every night you stay, I'll be watching you..."]
Rockwell – Somebody's Watching Me
00:06:00 [Music: "I'm just an average man with an average life. I work from nine to five; hey, hell, I pay the price. All I want is to be left alone in my average home. But why do I always feel like I'm in the Twilight Zone? I always feel like somebody's watching me... When I come home at night, I bolt the door real tight. People call me on the phone, I'm trying to avoid. But can the people on TV see me, or am I just paranoid? When I'm in the shower, I'm afraid to wash my hair, 'cause I might open my eyes and find someone standing there... Are the neighbors watching me? Is the mailman watching me? And I don't feel safe anymore; oh, what a mess. I wonder who's watching me now. The IRS? And I have no privacy. I always feel like somebody's watching me... Who's playing tricks on me?"]
Devon Stack
00:09:08 Welcome to the insomnia stream.
00:09:11 You can tell the last time I used OBS was in an interview, so now I gotta
00:09:17 fix this stuff up. This is the Insomnia Stream: Watching You edition. I'm your host, of course, Devon Stack. If you have a feeling that somebody's watching you, even if you always have that feeling,
00:09:32 well, chances are you are right. You are correct, sir.
00:09:39 And yeah, we're gonna go over a little bit tonight as to how and why that's true.
00:09:46 How and why that's true.
00:09:49 We talk a lot, or, I guess I don't really talk a lot about it here, but on the dissident right there's a lot of
00:09:58 noise about Silicon Valley donors influencing Trump, influencing the decision not to regulate AI, for example. And you hear a lot about Elon Musk, and you hear a lot about just this
00:10:16 ambiguous Silicon Valley thing, right?
00:10:20 And I wanted to look at perhaps
00:10:23 a couple of these people. You know, we hear about Palantir, we hear about Peter Thiel, and we hear about, you know, his Jewish business partner and whatnot. But what we don't hear a lot about
00:10:38 is this fucking maroon-looking motherfucker.
00:10:42 Look at that head. Look at that head.
00:10:46 Oh my God.
00:10:48 Look at that head. So this is Marc Andreessen.
00:10:52 Marc Andreessen.
00:10:54 who, together with his partner,
00:10:57 Ben Horowitz.
00:10:59 Marc Andreessen and Ben Horowitz, who were very instrumental in getting Trump elected. They collectively, I think, from their personal accounts, donated over $11 million to Trump and his super PACs, and then there were a lot of other in-kind-type service donations where it's hard to
00:11:20 figure out exactly what that adds up to.
00:11:24 And of course their work on the campaign itself. They are big, big money, big, big dick money Silicon Valley investors.
00:11:38 And they run Andreessen Horowitz,
00:11:42 which has a terrifying motto of
00:11:47 "software is eating the world." Oh.
00:11:50 Isn't that nice?
00:11:53 Why is that your motto, exactly? Software is eating the world, because all you invest in is software companies?
00:12:06 OK, well, whatever, I guess. So software is eating the world over at Andreessen Horowitz, and I wanted to take a look at the kinds of companies that they invest in, the kind of companies that they get
00:12:24 off the ground and have their fingers in too. And if there's maybe a
00:12:29 theme. And while I'm not going to cover their entire portfolio, I did notice a theme when I started looking at some of the companies that they bankrolled. A theme of, well, of surveillance. A theme of surveillance and data collection. And I don't think the average American,
00:12:50 or even the average person, really is aware of how much data
00:12:57 people like, well, Andreessen, and Peter Thiel, and Elon Musk, and, well, Donald Trump, really how much data these people have on you. And no matter how hard you try to stay anonymous, even if you try to stay anonymous online. Like, there's a lot of people
00:13:17 who think they're safe if they post edgy memes anonymously online, that it's not actually tied to you because you've been extra careful, right? You've been extra careful. Maybe, you know, you use a VPN.
00:13:32 Maybe you have a computer that's all it's used for, right?
00:13:36 Where it's just used for edgy memes. You use a VPN, and there's no possible way they would ever know who you are.
00:13:46 Well, you're wrong. They know who you are. And I just wanted to highlight this: in terms of, at least, law enforcement, IRL activism and online activism,
00:14:00 it really doesn't matter. You know, whether they're taking pictures of you while you're marching around in some kind of protest, or whether they're looking at your posts on 4chan or on X or wherever. In some ways it's actually easier for them to track you if all your
00:14:20 activism is online,
00:14:23 because they don't have to leave the house. They don't have to do anything. They don't have to send anyone to the protest with cameras.
00:14:32 Now, that said, we're not like the UK yet, but increasingly we are starting to have cameras more and more, covering every inch of, especially, urban areas in the United
00:14:47 States. And a lot of this, frustratingly, is consumers choosing to do it, much like the data collection usually is, or at least how it started: it was consumers voluntarily giving up their data in trade for some free service.
00:15:04 And, you know, social media, basically almost every kind of Internet service, including gaming. I mean, it really doesn't matter; if you're accessing the Internet, even if you're watching Netflix
00:15:20 Or just online shopping.
00:15:22 You are giving them surveillance data about yourself.
00:15:27 No matter what you do, no matter what you do online, you are donating surveillance data to the profile that they have on you, a profile that is probably a lot thicker than you think. Probably a lot thicker than you think.
00:15:44 So let's dive into maybe some of the companies that Andreessen and Horowitz have bankrolled over the years and are heavily invested in, and let's think of maybe the types of data that get collected, and maybe how that data could be used.
00:16:00 How it's probably already being used, how we know that it's been used in the past, and how it will be used in the future,
00:16:06 and by whom?
00:16:09 So let's take a look at maybe some of
00:16:11 these companies, where their software
00:16:16 is eating the world.
00:16:18 It's eating the world. So here's an article here talking about when they first donated to Trump's campaign, a paltry $2.5 million apiece. That was back in 2024, so that was $5 million. Then they threw in an extra $3 million each
00:16:39 the following year. So here's one company, called Mixpanel.
00:16:47 Mixpanel. That sounds friendly.
00:16:50 Mixpanel.
00:16:52 So Mixpanel is much like, it's an A/B testing
00:17:00 service that basically
00:17:04 profiles you in real time.
00:17:07 And when you're shopping for something,
00:17:10 based on your mouse movements, based on your user input, it decides whether or not to show you one version of a page or another version of a page. And then if you seem interested in that version, it's almost like when you go to the eye doctor, right, and they say,
00:17:30 does this one look better, or does this one look better? And you're like, the first one. And they're like, all right, how about this one or this one? And they eventually
00:17:37 end up at your prescription. Well, it's the same sort of process. You go to a website and it tries this version or this version, and then if you pick one, it's, how about this one or this one? And eventually it figures out how to mind-fuck you into buying whatever product it is that
00:17:55 they're selling you.
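To make that eye-doctor loop concrete, here's a minimal sketch of the general technique, a simple epsilon-greedy optimizer over page variants. This is illustrative only, not Mixpanel's actual API, and every name in it is made up.

```python
import random
from collections import defaultdict

# Hypothetical sketch of the "this one or this one?" optimization:
# an epsilon-greedy bandit that converges on whichever page variant
# a visitor engages with most.

class VariantOptimizer:
    def __init__(self, variants, epsilon=0.1):
        self.variants = variants
        self.epsilon = epsilon               # how often to keep experimenting
        self.shows = defaultdict(int)        # times each variant was shown
        self.conversions = defaultdict(int)  # times it led to a purchase

    def choose(self):
        # Occasionally explore a random variant ("how about this one?")
        if random.random() < self.epsilon:
            return random.choice(self.variants)
        # Otherwise exploit the best conversion rate seen so far
        return max(self.variants,
                   key=lambda v: (self.conversions[v] / self.shows[v])
                   if self.shows[v] else 0.0)

    def record(self, variant, converted):
        self.shows[variant] += 1
        if converted:
            self.conversions[variant] += 1

opt = VariantOptimizer(["checkout_a", "checkout_b"])
variant = opt.choose()               # which page version to serve you
opt.record(variant, converted=True)  # your reaction feeds the next round
```

The longer you interact, the more confidently it settles on the version that "works" on you.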
00:17:57 In collecting all this data from your mouse and from your keyboard, and possibly other things,
00:18:07 it "accidentally," quote unquote, sucked up user passwords, because a lot of their software would still be running in the background no matter what website you were on. And this is typical for a lot of these advertising and
00:18:26 data collection services, where they're just trying to monitor your every action. We'll explain why it's so important, for example, that mouse movements are collected, and how they use this to try to identify you, and then,
00:18:44 once they've identified you, profiled you, categorized you, they try to influence you with a campaign or a strategy that is tailor-made for your category. So this is one company. Then there's another one called
00:19:02 Optimizely, which is basically the same
00:19:04 thing. It embeds tracking scripts, like JavaScript tags, into websites. And so you might not even know, this is the thing, you might not even know that you're using the software. And this is also often the case with these programs: you don't even know that you're using the software. You just happen to go to a website and they're using the software, or you happen to download a free app
00:19:25 and it's using the software.
00:19:27 And so it decides what variant of the page you're looking at based on, you know, it looks at the cookies that you have in your browser. If it's a mobile app, it looks at, you know, what apps you have installed. It collects all the data that it possibly can, and then, to get you into one of these categories, it looks at what you scroll past, how fast you scroll,
00:19:51 what products you look at the most, what you add to cart, what you abandon after adding to the cart and never go back to.
00:19:59 In fact, you might notice something like this, where maybe you've gone to a website, added something to cart, and then thought, I don't really need it. And then you go back maybe a week later to that same website, and it's advertising just that at you the entire time. It's because it knows, it remembers, that you added that to your cart,
00:20:19 and it's trying to get you to buy it this time. Most people don't think about what's going into the website making that decision to present it that way to you. We'll go into more detail here in a little bit.
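As a rough sketch of that abandoned-cart logic (hypothetical names, not Optimizely's actual API), the server-side decision can be as simple as:

```python
# Hypothetical sketch of abandoned-cart retargeting: the tracker remembers
# what you added and never bought, and your next visit is assembled around
# nagging you about exactly that item.

profiles = {}  # visitor_id -> {"abandoned": [...], "viewed": [...]}

def track(visitor_id, event, item):
    p = profiles.setdefault(visitor_id, {"abandoned": [], "viewed": []})
    if event == "add_to_cart":
        p["abandoned"].append(item)      # provisionally abandoned
    elif event == "purchase" and item in p["abandoned"]:
        p["abandoned"].remove(item)      # they bought it after all
    elif event == "view":
        p["viewed"].append(item)

def pick_banner(visitor_id):
    p = profiles.get(visitor_id, {})
    if p.get("abandoned"):
        return f"Still thinking about that {p['abandoned'][-1]}?"
    return "Welcome back!"

track("visitor-123", "add_to_cart", "cordless drill")
print(pick_banner("visitor-123"))  # a week later: drill ads everywhere
```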
00:20:35 But lots and lots of companies use this particular software. For example, adblockplus.org even uses it, which is a little bit odd. I guess to find out who doesn't like ads? Who knows.
00:20:53 But there's lots of high-traffic websites that use this sort of thing. Lots of companies, big companies: Xerox, HP, Zoom, United Airlines, Bloomberg, Shell, Mazda, Canon, American Express, Toyota, SeaWorld, etcetera, etcetera, etcetera.
00:21:13 Then they got Foursquare.
00:21:16 Foursquare used to be a little check-in app back, I think, 15 years ago or so. This is a company that started out as a social media app designed to collect data voluntarily from people, where people would, for whatever reason, I don't know why, mostly girls,
00:21:35 I think, would want to tell everyone where they were going all the time. And so you would check in at Foursquare. Yeah, I guess it was like a weird way of maybe bragging that you were going to a certain place. I honestly don't understand why people used it for this, but they did. They would go to, you know, the mall:
00:21:56 I'm checking in, I'm at the mall now. And
00:21:59 the whole time, Foursquare is collecting data on this person: where they're going, how long they stay there, and various other data points. And it didn't really become unpopular
00:22:14 until a stalker app was developed, and apparently allowed in the App Store, that used the Foursquare data to alert creepy guys as to when girls would check into certain areas. I think it somehow plugged into Foursquare's
00:22:35 data, and then it would link to the Facebook page of the girl.
00:22:40 So if, like, a good-looking girl came into an area that you were in, it would be like, oh, this chick just checked in here, and you could go... like, I don't know. But apparently that was a big deal, and it kind of tanked the popularity of Foursquare. But it's still around, because they have all that geolocation technology.
00:23:00 So now it's not a social media app that you voluntarily check in with. It's a tracking app, because that was basically the beta test. They were beta testing their technology, and that's how a lot of these companies also operate. It starts off like, oh, it's a fun little app, you know, to have some fun.
00:23:17 Then next thing you know, it's being used to guide missiles or something. But this technology is in very popular apps that you probably have on your phone right now. Uber uses it. Twitter used to use it, and I'm assuming they still do. Snapchat,
00:23:38 Airbnb, Apple Maps, Samsung, AccuWeather, Tinder. They all use the Foursquare technology in order to geolocate. Now, the problem with this, and this is also going to be an ongoing thing,
00:23:57 is that when you have a third party handling data like this, and this is most, or maybe, I think, all of the companies that we're going to talk about tonight, this is the relationship that they have, and this is how your data gets consolidated into dubious databases that can be accessed
00:24:18 by the FBI and other law enforcement. The way that this works, because it's a third party:
00:24:26 even if they tell you, oh, Foursquare doesn't actually retain any data, and a lot of these companies won't even say that, but let's just say that, and I don't know if Foursquare is one of the ones that tries to act like they have privacy in mind or something like that. Because they're a third party, they're processing all that data. So really, if they recorded it,
00:24:46 who would be the wiser? No one would know. It's not like they're being audited by anyone. They're not. And so,
00:24:52 if they have a digital trail, a digital data trail, of your location from Uber, Twitter, Snapchat, Airbnb, Apple Maps, the Samsung apps, AccuWeather, and Tinder,
00:25:09 that's a lot of data. That's a lot of location data from a lot of users. And it's outside
00:25:16 the server farms used by Uber, Twitter, and all these other companies. It's all being processed by Foursquare, and all they have to do is record it to a log file to retain it. And again, I don't know if they claim that they don't. Some of these companies do claim that they don't; some of these companies don't claim anything, and you have no idea.
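Here's a small illustration of why that third-party pass-through matters. Everything in it is hypothetical, but the point stands: retention comes down to a single line of code that nobody outside the company can audit.

```python
import json
import time

# Hypothetical sketch of a third-party location API acting as a
# pass-through for many client apps at once.

def handle_location_request(app_name, user_id, lat, lon):
    venue = lookup_nearest_venue(lat, lon)  # the service the app wanted

    # The quiet part: with this one append, "we don't retain any data"
    # becomes false, and there is no external way to tell the difference.
    with open("location_trail.log", "a") as f:
        f.write(json.dumps({
            "ts": time.time(), "app": app_name,
            "user": user_id, "lat": lat, "lon": lon,
        }) + "\n")

    return venue

def lookup_nearest_venue(lat, lon):
    # Stub standing in for the actual geolocation lookup.
    return {"name": "Example Cafe", "lat": lat, "lon": lon}

print(handle_location_request("weather-app", "user-42", 33.45, -112.07))
```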
00:25:37 So that's one company here. Hang on one second. Why does this keep popping up? I have this fucking
00:25:45 Grok page that keeps wanting to maximize and refresh. I'm not sure why. Probably getting AI-attacked right now.
00:25:54 So then you've got
00:25:58 this company here, which is Smartcar.
00:26:02 Smartcar, which is another, it's another third party. It acts as a go-between. It's an API for companies that want to write apps that get data from your
00:26:14 car. So they use the Smartcar API, and, same thing, they process all the data on their servers. So they get all this information, and again, whether or not they say they're saving it, the potential is definitely there. And once they process the raw data that comes from your car, they hand it off.
00:26:35 Because whether you know it or not, any new car these days is just as much of a tracking device as your phone.
00:26:44 In fact, you can look at how a lot of these crimes are getting solved these days, and you'll see that a lot of the evidence is GPS data from vehicles. You know, they can tell: well, you said you were over here at this time, but actually we downloaded all the data in your Ford F-150,
00:27:04 and it says here that at 4:00 in the morning you started it up and drove five miles. What was that about?
00:27:09 And it's not just that. These are fucking cars, and they're doing this. These cars don't just have the GPS data; they're also pinging Wi-Fi networks. So as they drive through a neighborhood, they're pinging local Wi-Fi networks that they have access to. I mean, it's an incredible amount of tracking data
00:27:29 that is all embedded in your fucking
00:27:32 car. And how much you hit the gas pedal, how hard you hit the brakes. I mean, anything and everything you can think of, anything that you do in that car, anything the engine is doing, the location, whether you turn on the headlights, anything at all that you manipulate in that car
00:27:52 is going to be sensed. In fact, even whether or not someone's sitting in a seat:
00:27:56 they might have sensors in the seats in order to activate or deactivate certain airbags, right? So it might sense that. You could say, well, I was alone in my truck, and then: well, how come the passenger seat had someone sitting in it? You know, stuff like that. All that data is stored inside of your modern vehicle.
00:28:16 And if you get one of these stupid apps that you want to use to talk to your car remotely, you know, to remotely start
00:28:24 the car, then there is a little bit of a security feature built into it, right? And in order to use the API that Smartcar writes, in order to talk to your car remotely, start it and whatever, on your little
00:28:38 app,
00:28:39 you have to basically give those security credentials over to Smartcar. And so Smartcar
00:28:45 now, in theory, and again, they would tell you that this is against their terms of service, I'm sure, but Smartcar could now remotely, you know, turn your car off,
00:28:55 or turn it on,
00:28:58 or turn off your headlights, or do whatever. They've got that security key to access the functionality of your car remotely, as long as it has Internet access. And look, a lot of these cars even have, you know, the ability to connect cellularly, so it's not even an issue of having Wi-Fi or something like that.
00:29:18 So that's Smartcar. They have all that data. And again, any app that uses, you know, the connectivity between your phone and your vehicle, I don't know about every app, but many of these apps are using the Smartcar API. Smartcar is doing a lot of the processing and getting all the
00:29:39 handshake information and all the information that it would want from your vehicle. It could remotely access it and get, you know, the tracking data and all this other stuff.
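For illustration, this is roughly what that go-between pattern looks like from the app developer's side. The endpoints and fields below are invented for the sketch, not Smartcar's documented API.

```python
import requests

# Hypothetical sketch: the phone app never talks to the car directly.
# It hands an OAuth token to the middleman, and the middleman, who sees
# and can log every command and reading, talks to the car.

API = "https://api.example-car-broker.com/v1"

def remote_start(vehicle_id, oauth_token):
    return requests.post(
        f"{API}/vehicles/{vehicle_id}/start",
        headers={"Authorization": f"Bearer {oauth_token}"},
        timeout=10,
    ).json()

def read_telemetry(vehicle_id, oauth_token):
    r = requests.get(
        f"{API}/vehicles/{vehicle_id}/telemetry",
        headers={"Authorization": f"Bearer {oauth_token}"},
        timeout=10,
    )
    # e.g. {"odometer_km": ..., "lat": ..., "lon": ..., "engine_on": ...}
    return r.json()
```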
00:29:50 So that's Smartcar.
00:29:53 Let's see here. What else do we got? What else do we got?
00:29:57 There's a company called Pindrop.
00:29:59 Pindrop.
00:30:02 What does Pindrop do? Well, you know, as many of these companies do, they market themselves as something like a way of fighting fraud, or protecting you against, you know, digital attacks or something like that.
00:30:22 This is a product that is used by major banks,
00:30:26 where if you call up, even if you don't get a human being and you're talking to that menu, you know, where they want you now, instead of pressing one, to say this or say that,
00:30:37 every time you talk, it's creating a voiceprint. Because it knows who you are, it knows the number that you're calling from, and after you access your account it knows, you know, your bank details and everything else. And so the banks will roll out Pindrop in their phone systems. It's not just banks, there are other companies that use this;
00:30:57 banks are just probably the most popular customers.
00:31:01 And it's sold as a security feature.
00:31:04 so that some rando can't call up, even if it's with your phone, and access your bank details, because the system will recognize that it's not your voice on the phone. But this stuff is pretty sophisticated. It doesn't just create a voiceprint, which, again,
00:31:25 is all going into this giant bucket that is you, your digital footprint. You know, all this stuff adds up into a
00:31:33 giant file of information that they have on you. So you call up and you might not even know, right? You haven't consented to have your voiceprint created on a database that could potentially get into the hands of law enforcement, or literally anybody else if there's a data breach; it could just end up on the dark web someway.
00:31:56 You're getting your voiceprint, but not just your voiceprint. The technology is so sophisticated it can determine other things. For example, it can listen for artifacts in the audio and determine, with some degree of accuracy, the model of phone
00:32:15 that you have.
00:32:16 Because, oh, well, that sounds like an iPhone 10,
00:32:20 or whatever, right? It can calculate, based on the audio quality; sometimes it can make guesses about your carrier. It can make guesses about your location from the background noise. It can listen to, like, let's say you're in a Starbucks,
00:32:40 and it hears, you know, the sounds of a cafe in the background. It'll make assumptions about your location based on that. It might even
00:32:52 incorporate some of these locations as part of your voiceprint, and flag a call as unusual if the background noise is different than it usually is. Like, if you're always calling in from home, and it always hears, you know, X, Y, and Z background sounds, and it doesn't hear them this time, then it might
00:33:09 flag it as unusual.
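A crude sketch of the voiceprint idea, not Pindrop's actual method: summarize a call as an audio-feature embedding, then compare future calls against the stored print and flag mismatches. The file names and the threshold are placeholders.

```python
import numpy as np
import librosa  # pip install librosa

def voiceprint(wav_path):
    y, sr = librosa.load(wav_path, sr=16000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)  # (20, frames)
    return mfcc.mean(axis=1)  # average over time: one 20-dim vector

def similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

enrolled = voiceprint("previous_call.wav")  # built without your consent
incoming = voiceprint("incoming_call.wav")

if similarity(enrolled, incoming) < 0.9:    # illustrative threshold
    print("flag: caller does not match the stored voiceprint")
```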
00:33:10 But this is another technology that Andreessen Horowitz has invested
00:33:16 in, and it is adding a broader digital picture of you when you call up. And again, you don't agree to this, you're not consenting to this. It might not even be your bank. You might be calling customer service for a random company,
00:33:36 and they employ this technology, or a technology like this, and your voiceprint is now saved,
00:33:44 and it's associated with you forever.
00:33:50 Now we've got
00:33:52 UnifyID, which has recently been acquired by a company called Prove.
00:33:58 But they do the same sort of work, just under a different name now. But UnifyID, this is where it starts to get a little, people don't think about this sort of thing, it's a little weird. They claimed that they were trying to create frictionless authentication systems. Yet again, you see
00:34:19 an instance where these companies are attempting
00:34:22 to solve a problem that they've kind of had a hand in creating, by having all your data online and having these data breaches and whatever.
00:34:34 They try to make it sound as if
00:34:37 you'll never have to actually type in a password again. It's all for convenience. It's all for convenience. Well, then you might ask yourself, well, how is it going to know that it's me?
00:34:51 How, if I don't have to type in a password?
00:34:52 How? Well, there's lots of ways,
00:34:55 including ways you might not think of. Some of them are obvious, like front-facing cameras and stuff like that.
00:35:00 Some of them, not so obvious: authentication systems that run 24 hours a day in the background and determine who is holding the phone by your gait, meaning the way that you walk.
00:35:17 How you move,
00:35:19 how you handle your device. Like, I'm not exaggerating, this is part of the UnifyID software: how you take the phone out of your pocket. It looks at the accelerometer data, and because of AI and its ability to look at large
00:35:39 amounts of data and find
00:35:42 patterns, it can start to find consistencies. They can tell when it's you just based on how you walk around, how the phone is bumping up and down in your pocket, and how you handle it when you reach into your pocket and pull it out.
00:36:03 That's actually one of the ways it determines who you are, and in the research studies they had an accuracy rate of over 95%, determining who people were just based on that metric alone.
00:36:17 But it's not just that metric alone. There are over 100
00:36:22 other behavioral and environmental signals, they call them, that confirm your identity. So it's constantly learning your behavior just from the accelerometer data, the microphone data, the camera data,
00:36:43 the compass, I forget what it's called, the magnetometer or whatever,
00:36:46 that's inside your phone. And yeah, it might sound weird, but that sensor can actually sense electromagnetic fields, and all the electronics in your house have electromagnetic fields. So like your
00:37:06 refrigerator, your microwave, your computer.
00:37:10 And as you walk near these devices, that magnetometer, if that's what it's called, can actually detect signatures, essentially, from different devices. So this is also part of what they were developing when they got acquired by Prove.
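A toy sketch of the general gait-identification technique, not UnifyID's actual model: take a window of the accelerometer stream, extract a few simple features, and match against per-person templates learned in the background.

```python
import numpy as np

def gait_features(accel_xyz):
    """accel_xyz: (n_samples, 3) array from the phone's accelerometer."""
    mag = np.linalg.norm(accel_xyz, axis=1)   # overall motion energy
    return np.array([
        mag.mean(), mag.std(),                # how hard you move
        np.abs(np.diff(mag)).mean(),          # jerkiness of your steps
        np.percentile(mag, 90) - np.percentile(mag, 10),  # dynamic range
    ])

def identify(sample, templates):
    """templates: {person: features} learned from days of background data."""
    feats = gait_features(sample)
    return min(templates, key=lambda p: np.linalg.norm(templates[p] - feats))

rng = np.random.default_rng(0)  # fake sensor data for the demo
templates = {"owner": gait_features(rng.normal(0, 1.0, (500, 3))),
             "stranger": gait_features(rng.normal(0, 2.5, (500, 3)))}
print(identify(rng.normal(0, 1.0, (500, 3)), templates))  # -> "owner"
```

A real system would use far richer features and a trained model over the 100-plus signals described above, but the matching idea is the same.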
00:37:28 So, yeah, pretty intense stuff. And again, you might not know that this is even going on. It might just be a feature that's been implemented in an app that you wanted for free because it did something silly. Oh, look, it puts silly cat ears on me, you know? Like some gay shit like
00:37:45 that. And people are downloading this kind of crap all the time. They're downloading these kinds of apps that are free, and they think that the annoying part about them is just the ads that pop up during use, or that they have to pay $0.99 to unlock the thing they actually wanted the app to do, or whatever.
00:38:03 But there's also these things that are kind of rolled into these apps that you're not even aware of. That you agreed to, by the way, because their terms of service was 50 pages long, and nobody reads those things.
00:38:17 So, even with all these different
00:38:21 metrics that this company looks at, they claimed near-perfect accuracy. Again, just based on the sensory input in your phone, after following someone around for a couple of days, it could determine that it was them with, and that's their quote,
00:38:41 99.999% accuracy.
00:38:45 And it doesn't require any new hardware. It doesn't really require making any changes to devices that people are already using.
00:38:55 So even on low-end smartphones, just using the accelerometer and the compass and everything else, they can figure all this stuff out.
00:39:06 Again, because it's a third party,
00:39:09 all this information is being processed by the third party. It's not being processed locally on your phone, and it's not even being processed by the people whose app is using this feature.
00:39:22 It's all being processed by this third party that then has all that data, and
00:39:30 who knows what they're doing with it?
00:39:33 Some, like I said, some will legit just tell you: yeah, we're selling it to data brokers. So, you know, I don't know if this company in particular is doing that or not, but
00:39:45 the potential is definitely there.
00:39:49 Then we have Tanium. I think that's how you say it, maybe Tanium, like titanium.
00:39:55 Andreessen Horowitz invested $90 million in Tanium
00:40:00 about, well, it looks like 11 years ago.
00:40:04 It is a cybersecurity system that manages all your cybersecurity for major companies,
00:40:13 with a single real-time platform to manage and secure all of your computers, known as endpoints: laptops, desktops, servers, across entire networks.
00:40:28 So it basically is this one-stop-shop security solution for major corporations.
00:40:36 And the problem with that is, again, it's selling the idea of security, but it's also collecting data in order to perform this security. And you can even customize it. If you're the employer, it might not even be Tanium that's collecting this data; your employer can collect this data, because you can
00:40:56 use features that look for,
00:41:00 like, the way it's sold to the employer, for example, is that it's designed to look for abnormal activity or non-work-related activity, right? So if you have a computer that keeps accessing some website, you know, like some furry porn website or whatever, then
00:41:20 it'll tell the people running it, that sort of thing. But it's so customizable, you can have it do all sorts of stuff. You can have it scrape emails, take screenshots, all kinds of stuff.
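A toy sketch of the kind of customizable rule being described here, illustrative only, not Tanium's actual query language:

```python
# Hypothetical endpoint-monitoring rule: watch each machine's outbound
# requests and alert on anything off the employer's approved list.

WORK_DOMAINS = {"mail.example-corp.com", "wiki.example-corp.com"}

def check_request(employee, domain, alerts):
    if domain not in WORK_DOMAINS:
        alerts.append(f"{employee} accessed non-work site: {domain}")

alerts = []
for employee, domain in [("jsmith", "wiki.example-corp.com"),
                         ("jsmith", "furry-forum.example.net")]:
    check_request(employee, domain, alerts)
print(alerts)  # -> ['jsmith accessed non-work site: furry-forum.example.net']
```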
00:41:34 It's kind of like, I mean, you can block pages that have bad press about the company that you are, and that happens. Like, I have a friend that worked at Booz Allen during
00:41:47 the Snowden incident, and they shut down
00:41:52 all of the stories. Like, you couldn't go to the Internet and look up Snowden. You couldn't do it.
00:41:58 And in fact, if you tried, you probably got on some list, just for trying to look for it,
00:42:04 because, you know, Snowden was a Booz Allen employee.
00:42:08 So, you know, big, giant IT software companies like this. You have Eclypsium, which is sort of the same thing, only for supply chains. So they handle the security for supply chain infrastructure.
00:42:24 And you have Lookout, which is basically the same thing, only for mobile. And this is a software suite, I guess you could say, that employers might force you to run on your phone if you have your phone at the office,
00:42:44 and say, well, it's for security purposes.
00:42:47 But also, while they claim it doesn't collect personal data, it monitors any apps that you have installed, it monitors the device status, network activity, you know, what different Wi-Fi hotspots you've encountered. So there's lots of this stuff.
00:43:07 At first blush, if you're not thinking about how you could misuse this stuff, you might not think of it as, like, oh, that's tracking data. But a lot of it is
00:43:17 very much tracking data. Then, speaking of tracking data,
00:43:24 then we got, uh,
00:43:26 oh, wait,
00:43:27 there's this one too. I forgot about this one. This is Nansen, which is like a crypto tracking program. They're tracking crypto transactions.
00:43:39 And then you have this one, OpenGov, which is supposed to be sold to local governments to interface with the public. But what it really kind of does is give OpenGov a lot of access to a lot of government documents and resources.
00:43:58 Let's see here. Oh, then we got Branch International,
00:44:02 where, this kind of shows you
00:44:07 how a lot of this process can work with the opening up of the Third World, in other words, India going online. In the same way that we let our weapons companies test out their weapons in Third World conflicts like Afghanistan,
00:44:22 a lot of these tech companies like to test out their really intrusive shit in Third World countries that don't have anyone smart enough to tell them
00:44:31 no.
00:44:32 And so Branch is one of these companies. It offers smartphone-powered financial tools,
00:44:39 including instant loans and money transfers and bill payments, trying to essentially be the everything app for Third World people, like, I guess, like WeChat is, to some extent, in China.
00:44:55 But it's doing all this banking,
00:44:59 all these banking services, while tracking anything and everything you
00:45:03 do, and I mean anything and everything you do. So it tracks, obviously, you have to put in all your identifying information for a bank account.
00:45:15 But it tracks, like, your phone, your SIM card info,
00:45:19 all of your financial and credit information, obviously. It tracks the way that you use your phone, in the same way as some of this other tracking, like, you know, the geolocation. It tracks where you shop, what you buy. It tracks basically anything and everything you could imagine
00:45:40 a financial
00:45:41 app would do, without any kind of concern for privacy whatsoever, because, again, the people in the countries this is rolled out in are just dumb.
00:45:51 Here's the big one, though, the big one:
00:45:55 Flock Safety.
00:45:57 Good old Flock Safety.
00:46:00 So Flock Safety
00:46:03 started out as a company that placed, and I'm sure you've seen
00:46:08 them,
00:46:09 placed at the side of the road, these things that look like a solar panel with a camera on it. Now, when I first started seeing these, I thought, oh, it's like a speed camera. That's what I thought it was. I thought it was,
00:46:24 you know, like sometimes you'll live somewhere where they do that, right? They'll set up these mobile speed cameras and say, oh, you see that, you slow down. That's not what it is, though. That's not what it is at all. It's a license-plate-reading camera, and it's reading license plates all day long.
00:46:43 And it's not the only one. There are several of these. If you live in a major city, you have been tracked by Flock Safety cameras, and your location, your travel, has all been logged and attached to your digital profile, with all this other data that they have on you.
00:47:03 There's some cringe lefty guy who actually does a good job of explaining some of this stuff, even if, like I said, he's kind of cringe. But I thought that he did a good enough job to where I would play a portion of his video.
00:47:21 Here we go.
00:47:23 Here's where he explains Flock Safety.
Tommy
00:47:27 If you're an American, you've probably been seeing a whole bunch of these things, and in some places they're so common that you don't even notice them. They just blend into the background like they're trees or street lights. And you've probably correctly assumed that they're recording traffic. They're also recording and logging license plates and using AI image recognition. But what if I told you that they are in fact not owned by your local police department or your local government,
00:47:49 but are licensed to them by a third-party startup, and all of your vehicle's whereabouts are being tracked by a third-party data broker? What if I also told you that major retail chains are also using them, and they're combining your vehicle's whereabouts with your personal information, your
00:48:04 shopping habits, and even your in-store behavior, and some of them are giving that information to law enforcement?
00:48:09 Flock Safety is a startup that was founded in 2017 that specializes in developing and leasing security cameras that have AI capabilities, such as license plate recognition and vehicle identification. And these security cameras feed into databases
00:48:24 That law enforcement, private companies, and even private citizens can access and utilize. And if you own a car in the United States, you have unquestionably been logged within one of these databases.
00:48:35 Like this: you drive past a Flock Safety camera and it records an image or
00:48:39 video. An image segmentation model or something similar looks for the license plate itself, for a rectangle with some key identifiers like tail lights or a rear window. Once it is confident enough that it found a license plate, it sends a portion of the image to an OCR model, or optical character recognition, which is probably the most widely used type of AI for consumers. For example, it's used in
00:49:00 PDFs, or any type of automatic data entry, like scanning and cashing checks on your phone with your bank's
00:49:05 app.
00:49:05 AI like this typically uses a confidence level threshold, so if the OCR model is above, say 90% sure that your license plate was read accurately, then the information is saved with a location and time stamp on it. It can also classify the make and model of your vehicle.
00:49:20 note if there are any bumper stickers or add-ons or cosmetic damage on the vehicle, and this information as well gets stored to that database. A law enforcement officer can then run
00:49:29 your license plate and see every single time that your vehicle has been tagged in the database. So if you're in a city that has a lot of these cameras, it pretty much has the same effect as secretly sticking a GPS device on your car. Flock Safety is also happy to provide service to businesses and homeowners associations, and those private clients often allow law enforcement to access the data from their cameras.
00:49:49 Then there's the hot list. If law enforcement puts you on the hot list, they are notified every single time that Flock detects you. So it's kind of like having a police tail all day, every day, but without the pesky annoyances of requiring a judicial warrant to target and track a citizen who may not even be suspected of any sort of crime.
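Putting his description together, the pipeline is roughly the following sketch, where detect_plate() and ocr() stand in for the real segmentation and OCR models, and the hot-list plate is invented:

```python
import time

CONFIDENCE_THRESHOLD = 0.90  # only keep reads the OCR model is sure about
hot_list = {"ABC1234"}       # plates flagged by law enforcement
database = []                # every confident read, stored indefinitely

def detect_plate(frame):
    """Stand-in for the image segmentation model that finds the plate."""
    return frame  # pretend the whole frame is the plate region

def ocr(region):
    """Stand-in for the optical character recognition model."""
    return "ABC1234", 0.97  # (plate text, confidence)

def notify_law_enforcement(record):
    print("hot-list hit:", record)

def process_frame(frame, camera_id, lat, lon):
    region = detect_plate(frame)
    if region is None:
        return
    plate, confidence = ocr(region)
    if confidence < CONFIDENCE_THRESHOLD:
        return  # low-confidence reads are discarded
    record = {"plate": plate, "camera": camera_id,
              "lat": lat, "lon": lon, "ts": time.time()}
    database.append(record)  # queryable later: a de facto GPS trail
    if plate in hot_list:
        notify_law_enforcement(record)  # the warrantless "police tail"

process_frame("jpeg-bytes", camera_id="cam-17", lat=28.54, lon=-81.38)
```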
Devon Stack
00:50:09 Now, that's the other aspect of having third parties run all of this
00:50:16 data collection: you can have an intelligence agency that would be barred from collecting this data themselves, because they'd have to go get a warrant, they'd have to have probable cause. But
00:50:30 it's not illegal for them to just buy giant blocks of data from these data brokers that include all of this data that we've already talked about, and some more we're going to
00:50:42 talk about in a second.
00:50:44 And so they already have access, and they have contracts with a lot of these companies, not just companies like Flock Safety that they're paying. And the thing is, it's all a subscription model. So the taxpayer is actually getting fucked on these things: they're paying for their own surveillance state, in a way that's maddening,
00:51:04 because the police don't even own the hardware. They're basically renting the hardware, and they're not even allowed to do anything with the data other than what's allowed in their service agreement.
00:51:19 And Flock Safety isn't responsible when they fuck up.
00:51:24 So, for example, in this case, when it gets abused. Kind of like Snowden was talking about how people that were working at the NSA were using all of their data collection services, and that's the clandestine stuff that we already know is happening, that shouldn't be happening. This is stuff
00:51:44 that is happening in the commercial sector, but, you know, the government sector is probably doing even worse stuff than this, as we know from Snowden.
00:51:53 But just like with the Snowden case, where NSA employees were using that technology to spy on ex-girlfriends and shit like that in really creepy ways, this Flock Safety stuff has already been used by, you know, weirdo police chiefs, apparently, to track ex-girlfriends
00:52:14 and their location. But also, it fucks up. Sometimes it fucks up and tells the police that you're driving a stolen vehicle when you're not.
Tommy
00:52:25 Flock Safety has exhibited the usual hyper-aggressive startup behavior that we've all grown accustomed to. Near Orlando, Florida, for example, they installed nearly 100 cameras on public roadways without even notifying the county. And this entire industry sales-pitches police departments with their cherry-picked success stories: if we didn't live in a surveillance state,
00:52:45 this old lady with dementia would have never been found. Or, if it weren't for all these cameras, this homicide suspect would have never been taken off the streets and would still be murdering people right now. Had this license plate reader error not led to a mother and her young children being held at gunpoint, that family
00:53:00 would have never
00:53:00 received a $1.9 million settlement from Aurora, Colorado, taxpayers.
Devon Stack
00:53:07 And that's right, they're not responsible when they fuck up.
00:53:11 So you're paying for this service
00:53:14 to spy on you, and then when it fucks up, and a family of nigs who, in the wake of George Floyd, sue the county for the mistake and get $1.9 million, you pay for that too.
00:53:31 Flock Safety washes their hands. They're completely
00:53:35 not responsible for the way that their data is used, even if the error is on their end.
00:53:41 So these third parties, they kind of have it pretty sweet. They're totally
00:53:50 in the clear. And even if they end up having to pay some kind of penalty, the penalty always
00:53:56 pales in comparison to the kinds of profits they're making.
00:53:59 Here is the CEO of
00:54:02 Flock Safety talking about the new product that they're going to be rolling out,
00:54:06 that's in beta right now. It's a one-click solution for cops.
00:54:13 You know, it's going to be this AI solution. It's not just, you know, we have these license-plate-reading cameras put up all over your city, tracking you whether you want them to or not, putting you into a database that then gets sold to law enforcement so that you are effectively being tracked no matter what you do.
00:54:34 You know, if you're just going to the store, it doesn't matter. They know that you went there and when you went there.
00:54:39 And there's nothing you can do about it. Doesn't matter that they don't have a warrant, because they're just, we're just purchasing data, bro.
00:54:46 And he wants to increase this to include a lot of this other data that they have purchased themselves from data brokers. And in fact, initially, when they rolled this out, that data included leaked data that ended up on the dark web from data breaches. So,
00:55:06 leaked password lists,
00:55:08 yes,
00:55:09 leaked personal data. Like, you know, there's been so many data breaches it's hard to pick one, where it has all of your personal information, from your Social Security number to, you know, maybe private emails, whatever the data breach is. When these big packets of
00:55:28 data get leaked on the dark web, companies like this slurp it up. Now, they have since said, oh, we stopped using the dark web data. It's like, OK, bro. I guess "trust me, bro" is the standard, because there's no auditing of these
00:55:42 companies.
Flock Safety Spokesman
00:55:43 Hey guys, I'm still in the studio. We just finished our Q1 product launch.
00:55:46 Fingers are still a little bit shaky.
00:55:48 Flock Nova is going to change the game for criminal investigations. I've heard from so many chiefs that they have this system and that system and that system. They know they have the data, but it's taking their analysts hours and hours to build a case. And now, with Flock Nova, it's one click
00:56:03 to one case solved.
Devon Stack
00:56:05 Look at that: one click, one case solved.
00:56:09 So they're going to now have access to all the other data in all these other systems that law enforcement uses. So it's kind of like Palantir, I guess, on a local level, where they will tie in their license plate readers to all the criminal history of everybody, and to, you know, finger-
00:56:29 printing, to DNA profiles, to whatever law enforcement happens to have on you: your, you know, former addresses, former associates, aliases, literally whatever it is that they have on you. It'll now be included in what Flock
00:56:49 Safety already has, which is all the data they've purchased from data brokers, like the kinds of data that the companies we've already discussed are collecting. And we'll get more into that in a moment.
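As a minimal sketch of that "one click" fusion (every record below is invented), the whole trick is that once the databases share a key, a dossier is a single join:

```python
# Invented sample data standing in for the separate systems being tied together.
plate_sightings = [{"plate": "ABC1234", "where": "5th & Main", "ts": 1700000000}]
dmv = {"ABC1234": {"owner": "J. Doe", "address": "12 Elm St"}}
broker_data = {"J. Doe": {"employer": "Acme", "phone": "555-0100"}}
rap_sheets = {"J. Doe": ["trespassing (2019)"]}

def one_click_dossier(plate):
    owner = dmv.get(plate, {}).get("owner")
    return {
        "sightings": [s for s in plate_sightings if s["plate"] == plate],
        "owner": owner,
        "broker_profile": broker_data.get(owner),   # purchased data
        "criminal_history": rap_sheets.get(owner),  # government data
    }

print(one_click_dossier("ABC1234"))
```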
00:57:03 And they're going to incorporate this into a solution where you have a crime, you click once, and it's like, bing, it's this guy. Look, this is sort of what I've been talking about in terms of what the ruling class wants. People keep saying things like, well, why would they want to replace whites if we're clearly a better population?
00:57:22 And these people rely on the millions of dollars that, you know, our economy produces, and wouldn't they want, like, smart people in the countries that they're administering? And it's like, no, they don't, because they don't want more competitors. Smart people in the countries that they are
00:57:41 administering is a whole country of people who might replace them. That's a whole country of people that might knock them out of their place in the hierarchy and replace them if they happen to be smarter. So you actually want everyone to be a lot dumber than you. I mean, imagine this: imagine trying to be a farmer when all of the cows,
00:58:01 or maybe even just 10% of the cows, in the field were smarter than you. Good luck. Good luck trying to run a successful dairy farm when 10% of the cows are smarter than the farmer.
00:58:13 So you're going to have a lot of issues unless everyone's a lot dumber than you, and that's the best way to manage a population. That's the best way to farm people, and that's what they want to do. Now, that does create some problems. It creates some competency issues. It creates some issues where
00:58:33 the population
00:58:35 starts to be unable to manage itself. Like, there's some things that you want to just keep going, like a perpetual motion machine. In a society full of higher-IQ people, you can just sort of expect them to be able to self-manage and be self-contained, right? Just like with the dairy cows: if the dairy cows fall below a certain intelligence,
00:58:54 they don't know where to go to get the food. And the cows are pretty dumb already, but they have to be at least smart enough to go eat on their own. There are certain things that you expect the cows to be able to perform in some autonomous way, right? And so there is a threshold. You want them to be dumber than you, a lot dumber than you if possible. But
00:59:14 once they get so dumb that they can't take care of themselves, you have to have systems in place that pick up the slack. And that's what all this AI
00:59:24 stuff is, that's what they're hoping to accomplish with it. They're hoping that they can, through mass surveillance and predicting people's behavior with real-time data, tracking data, preference data, inference data, and, yeah, we'll go into some of this stuff here in a moment,
00:59:44 that they can then govern you with AI. They can have most of the management of the population managed by AI.
00:59:58 He's talking about how AI is going to start solving crime, so you can have stupid fucking cops and it won't matter. You know, like, oh, you've got retard cops?
01:00:06 We'll make the software that just finds the murderer for them. And again, it doesn't matter if they get charged with, you know, violating some kind of privacy thing. In this case, the FTC, you know, fines a data broker for suggesting potential hires were sex offenders,
01:00:25 because it was, you know, it was
01:00:27 illegal. When a lot of these companies that are going to hire people go to data brokers to find out, well, what's the skinny on this guy? You know, these data brokers, who have, like, your entire Facebook history scraped, who have any and all data they can get their grubby little hands on,
01:00:48 they might suggest that maybe you're a sex offender, based on their AI, based on your preferences. They get sued, they lose a million bucks, but their annual revenue, as it says here, is $70 million a year.
01:01:03 So a million out of 70 million, not a big deal, right?
01:01:09 Then you got Walmart. And this is not to pick on Walmart specifically; this is what you're going to see from any major retailer. Walmart has a lot of data collection going on that you might not know about. You might just think you're going to Walmart and picking up, I don't know, like, some
01:01:28 Q-tips and paper towels and, I don't know, some coffee or whatever, right? Like, just some normal
01:01:34 things, and you're leaving, and that's that. What you don't know is that the entire time you're there, from the moment you pull into the parking lot, Walmart is tracking everything that you're doing. And not only that, they are selling that data to these data brokers. So a lot of these retailers are,
01:01:55 you know, trying to make up maybe some of the revenue that they see as lost by having to compete with online retailers, by being data collection hubs, by collecting data on people in the real world.
01:02:09 And so this is the sort of thing that you see coming from Walmart. He goes over
01:02:15 the privacy
01:02:17 notice that Walmart has.
Tommy
01:02:19 They're logging your personal identifiers: your name, your phone number, your address, your email, your driver's license number, and your signature; device identifiers, such as your phone or smart watch's MAC
01:02:29 address. They're sussing out your age, gender, citizenship, race, marital status, household income, education, employment information, family health, number of children, credit card numbers and other payment information, geolocation history, photographs, audio recordings, video recordings, and, if known, background checks and criminal convictions. Oh, hang on, I haven't even gotten to the creepy part yet:
01:02:49 inferences, a.k.a. your behavior and preferences, from your shopping patterns, intelligence, and aptitude. And they reserve the right to share any and all of this with third parties, including but not limited to, you guessed it.
Devon Stack
01:03:05 So there you go. And the inferences, that's where AI steps in. A lot of this data collection used to not worry people, because they thought of it as
01:03:16 too much. It was too much data for anyone to put the manpower behind sifting through for it to matter. In fact, I remember there was a debate between a privacy advocate and a senator, oh, I'd say about 10 years ago now, back when I was a libertarian.
01:03:36 And their whole debate was about, I think, because it was post-Snowden and some of these revelations that the government was collecting all this data, what that actually meant in terms of how it would actually affect the average person. And the argument, of course, was, well, we have to stop terrorism,
01:03:58 and so we have to, you know, it's Patriot Act-era surveillance that we have to keep legal, because it's prevented all these terror attacks. Trust me, bro.
01:04:10 And the argument that the senator, I believe, if memory serves, a Republican senator,
01:04:18 made in defense of all these surveillance programs that the United States government was using to spy on its own people was: yeah, well, sure, it's collecting all this data about you, and the NSA is basically acting as a go-between
01:04:38 between your ISP and the rest of the web in some instances, and recording literally all of your traffic, recording every keystroke, every little thing that you're doing.
01:04:49 But that's OK because of two things. One:
01:04:55 even though it's recording literally everything you're doing, all the time, to these massive server farms, I think, well, the big one in Salt Lake, or it was in Utah somewhere, that they were specifically talking about, even though they're doing that, it's OK,
01:05:16 because no
01:05:18 agency has the resources to sift through all that, and so it's just insane to be worried about it, because it's just so much data. And that quite literally put a lot of people at ease. They thought, well, from a practical standpoint, I guess he's right. I guess if you're going to be collecting that much data, you're going to have to focus just on
01:05:40 the terrorists and, you know, people that you're targeting. And I'm a normal guy; why would the government ever target me? That's never going to be something I personally have to worry about.
01:05:50 And so they just didn't push back on it that hard. The other excuse, of course, was that, well, we do collect it, but we don't look at it. You see, we have to get, like, approval from, like, a FISA court or something
01:06:12 in order to look at it. So even though we're collecting
01:06:15 it, it's just so that, I don't know, let's say we think you're going to do a terrorist attack, we can then retroactively spy on you for, like, 10 years, or however long your digital footprint has existed on our servers.
01:06:31 And then, you know, that helps us prevent terrorist attacks or whatever. But even though we're collecting everything you do on the Internet, we are not legally allowed to look at it. You know, it's just on some servers. It's on a hard drive somewhere, and we promise we're not looking at it unless we
01:06:50 have the full cooperation of a court, you know, saying, yes, yes, definitely, you're allowed to look at it. And this put other people at ease. This is, again, this was before Trump. This is before a lot of trust in the federal government was eroded because of WikiLeaks,
01:07:11 because of Snowden, because of a lot of this institutional skepticism that has kicked in post-Trump, post-COVID, you know, where, I think, honestly, the high-
01:07:24 trust demographics have gone down. And so white people, even those that kind of grew up in a whiter America, a more high-trust society, it's just out of self-preservation that trust has kind of eroded now. And so people just don't trust the "trust me, bro" anymore, like they used to.
01:07:46 The federal government used to be able to get away with all kinds of bullshit, because no one thought that they would be up to anything that wasn't ultimately for the good of the people. Because they thought, well, they're Americans too; they'll want what's good for us. Well, now, not every American is really an American, are they?
01:08:02 And you have competing factions, and you have alien subversion. You do. And I don't mean, like, space aliens. I mean, you know, like Israelis, and I mean Indians, or Chinese. You have other ethnicities with other motivations
01:08:23 that are working against you, and in fact sometimes explicitly, you know, in the case of Joe Biden's DOJ, with Merrick Garland saying that the number one danger to America was white supremacists and pro-white activists. And so,
01:08:41 I mean, he explicitly said that he was directing the FBI to focus their energy to target pro-white people. So that's the problem: that trust is all gone. That trust is all eroded. And yet they still have all these tools, they still have all these
01:09:01 legal ways of acquiring this information. And again, just like the local law enforcement that doesn't have to get a warrant in order to take photos of your license plate driving around town,
01:09:14 all that data is being purchased by the feds too. That's the beauty of having the private sector handle a lot of this stuff. Sure, the NSA is collecting all this data in real time; we don't know to what extent that's happening. We know it's a lot of data, and
01:09:33 probably a lot more than even some of the most
01:09:37 privacy-conscious people would imagine, and probably in ways, in vectors, you're not even thinking of. But be that as it may, a lot of the data they get their hands on legally: if they want to prosecute you for something, they can purchase data from a broker to create their case,
01:09:56 to build their case.
01:09:58 So you have that going on. And look, Walmart's not unique in their surveillance of their customers, and you have a lot of this technology that is more and more capable of recognizing objects. I remember when I was doing animation
01:10:19 work and video design or graphic design for federal government clients. And, I mean, because of NDAs I can't talk specifically, but I remember, this was like 15 years ago or so,
01:10:37 there was one client where they were talking, and this is before I had ever heard of AI the way it's thought of today,
01:10:45 but they were talking about a product that they were rolling out where video cameras could identify threats based on the way people in the footage were walking around. And I remember thinking, oh, this is some contractor that's just blowing smoke up the government's ass,
01:11:02 and they're going to get a bunch of government money to sell them some snake-oil product. There's no way you can make software that, in real time, can look at footage of people walking, like in an airport, and see, based on that, that there's a threat. No, you actually can. That technology 15 years ago probably wasn't great, but it was getting there.
01:11:23 And today it's a lot better. So he finds out that not only are these cameras everywhere, the security for them is actually not so great.
01:11:36 And so he's able to access a lot of these cameras that are posted around different cities.
Tommy
01:11:42 Managed to browse my way into accessing live feeds from over two dozen traffic cameras, and what's even more troubling is that in a couple of small towns that I audited, if one camera wasn't secured, none of them were, allowing anyone the capability of tracking every single vehicle in the town indefinitely without ever having to write a line of code. Another troubling example: Hikvision.01:12:03 The largest IP camera company in the world.
01:12:05 They make all types of cameras, many of which have ALPR technology and are marketed towards police departments. In 2022, hackers were able to not only view the feed but exploit the firmware used in over 80,000 cameras, allowing them to execute code remotely. And then just last year, the Russian military compromised Hikvision cameras in Ukraine to obtain intelligence
01:12:25 and plan airstrikes on Kiev and on some of Ukraine's air defense systems.
01:12:29 In fact, right now, at the time of me recording this video, you can download an exploit for Hikvision cameras on GitHub
01:12:35 that allows you
01:12:36 to view the feed, retrieve snapshots, and extract user credentials. Then you have Verkada, a massive security camera company whose chairman has bragged about locking clients into a predatory subscription model.
Devon Stack
01:12:46 There is a hardware component and a licensing component. Now, the license, you might have bought a one-year license, but you, like, literally bolted the hardware to your ceiling, so, like, you're not taking it down.Tommy
01:12:57 Many of their clients in healthcare, prisons, police departments,01:13:00 and so on, couldn't afford to leave for a more secure ecosystem. So when every single one of their 150,000-plus cameras was hacked, due to a corporate administrator account having its username and password publicly exposed on the Internet, hackers not only had access to the private video feeds but to the networks they were connected to,
01:13:20 and they were also able to access the archives of those cameras as well, the clips that had been saved by those customers.
Devon Stack
01:13:29 So this could also be state actors, and by states I mean, like, the American state, or the Israeli state, or anyone really. You can have these vulnerabilities that aren't public yet in all kinds of these systems. Let's say there is some kind of legal 01:13:51 barrier that prevents the federal government from acquiring some of this data. They get some zero-day exploit that gives them access to these networks, and they don't tell anyone about it, and they just slurp up all the data. They slurp it all up and they monitor it until the hack becomes public, and then they quietly go away.
01:14:10 Or maybe once they're done with it, they'll themselves say, oh, we discovered this exploit, you guys need to patch this up, after they already got
01:14:18 what they wanted.
01:14:19 So you have the legal means of acquiring this data, you have illegal means of acquiring this data, you have the clandestine state means of acquiring a lot of this data. And again, it doesn't matter how careful you are on the Internet or whatever, there is a big file on every single one of us.
01:14:42 He then talks about the technology used on not just these cameras on the side of the road, but all of these police cars that also
01:14:52 have license-plate-reading cameras. So as they just drive around, the cop car that is in your rear-view mirror, your license plate is being tagged, it's uploading to a real-time tracking system that logs the GPS coordinates of your car
01:15:13 and compares that with other license plate readers that you might have driven by, so that it can figure out where you're going. It tags extra data too: not just your license plate, but damage to your vehicle, bumper stickers that you might have,
01:15:31 how many occupants are in the car, stuff like that. So that data is being collected by police cars as well. And like I said, that technology is getting a lot better. Fifteen years ago, and that's not even that long ago, it sounded kind of crazy that they were going to have,
01:15:52 like they were just starting to have, at least from a consumer standpoint, this kind of stuff where you could have cameras that could recognize different objects. Like, here's an example of a camera recognizing that, oh, that's an airplane, that's a truck, that's a
01:16:06 car. And that sounded like magic to me, right? And so the idea that it could recognize not just a person, but the way that the person was moving around. And part of it was because, you know, I had a background in animation, I've been dealing with video my entire life, and I knew, at least from a VFX standpoint, being able to
01:16:27 get a piece of software to recognize an object in footage would have made my life a million times easier. Like, I wouldn't have to rotoscope anything ever again. The most painstaking, annoying thing in the world is frame-by-frame cutting someone, or something, out of some footage. I hated
01:16:45 that, and I just thought, well, if this kind of technology ever starts actually existing, it'll be implemented in, like, After Effects or whatever. I'll know about it, because that's going to be one of the most obvious uses for something like that. But no, the security apparatus has a lot more money
01:17:06 than Adobe, I guess, and so they've had this technology for a long time. Like I said, 15 years ago they were already
01:17:15 identifying threats based on posture and gait and all these other things. And now it is at the consumer level. Now if you go into a Walmart, it does track things like the way that you're walking, how many people you're with, and it's adding all this stuff to your database.
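To give a sense of how commodity this object recognition has become: what once sounded like contractor snake oil is now a free library download. A minimal sketch using TensorFlow.js's off-the-shelf coco-ssd model; the video element and logging are illustrative only, and this is obviously not Walmart's or any agency's actual system:

```ts
// Minimal sketch: off-the-shelf object detection in the browser with TensorFlow.js.
// coco-ssd is a real, freely downloadable pretrained model.
import * as cocoSsd from '@tensorflow-models/coco-ssd';

async function labelFrame(video: HTMLVideoElement) {
  const model = await cocoSsd.load();            // downloads pretrained weights
  const detections = await model.detect(video);  // one frame, no server needed
  for (const d of detections) {
    // e.g. "car (0.91) at [x, y, w, h]"
    console.log(`${d.class} (${d.score.toFixed(2)}) at [${d.bbox.join(', ')}]`);
  }
}
```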
01:17:35 And even if you, in Walmart, let's say,
01:17:36 you think you're being clever, and you're like, well, not me, because I don't use any of these rewards cards, and I don't even use a credit card. I pay cash, I only use cash when I go into Walmart. Well, that doesn't really matter, because guess what: the cameras at Walmart follow you out to your car and then read your license plate. And so now they not only know who you are and what you bought,
01:17:57 because they can track that from whatever you bought at the register with cash, they now know this guy uses cash. That becomes a data point. This guy pays for everything in cash, that's a little bit weird, maybe. Let's flag that for unusual behavior.
01:18:13 And that's the sort of thing, too: a lot of times, the stuff that we do to try to be anonymous actually makes you stick out like a sore thumb. I was talking to a friend of mine that worked in computer security, and he said, you know, I used to really bend over backwards. This was years and years ago, so a lot of this is probably not even relevant anymore.
01:18:33 He would boot to a
01:18:35 Linux DVD. I think the name of the distro was Knoppix or something like that, but it was supposed to be this privacy-oriented operating system on a DVD, because it's on a DVD, you couldn't install viruses onto it, right? Because it's on a DVD, you can't write to it. So whatever was in memory,
01:18:55 I guess you have that, but that was it, so you could reboot and nothing was ever saved, and so there's never any identifiable information on it. And he would carry this DVD around with him, and any computer he would use,
01:19:06 he booted from the DVD and just used that, and it would use Tor and proxies and all this other stuff, to just be as anonymous as possible. And then he said one day he went to some conference, it might have been Black Hat or something like that, and they said that actually, that's the red flag.
01:19:26 That when they look at all this massive traffic that's going across the Internet, the best way to be anonymous is to look like all the rest of the traffic. Because the second they see, well, what's this guy doing? Why is this guy
01:19:40 using this weird port, or why is this guy using Tor, or why are there no cookies on this browser? That's what starts to set off red flags that there's something weird about this user here. And as you'll find out here in a moment, it doesn't even really matter if you use this kind of
01:20:00 tactic to try to be anonymous. You're still not anonymous.
01:20:04 And the reason why, there's another frame there, the reason why you're not anonymous is they have a lot of data collection that a lot of people wouldn't think about,
01:20:20 ways of identifying you that don't have anything to do with you typing in a username, or even a password, which they can match against leaked password lists and things like that. There's ways that they can figure out who you are just by your username, obviously, and by your password
01:20:40 and other passwords that you have used in the past. But there's a lot of other data
01:20:45 that they collect that a lot of people aren't thinking about. So the first kind of data is the obvious data that
01:20:54 most people know about, and that's because it's data you are voluntarily giving them, so it's direct user-provided data. This would be when you sign up for a new service, like Gmail, another instance of, hey, it's free. Well, it's free,
01:20:58 really,
01:21:14 but also Google can read all your emails, and your contacts list, and, if you're using an Android phone and it's hooked into that, then all your photos and everything. It's a
01:21:24 big fucking cluster.
01:21:26 But a lot of people do this. A lot of people do this. They just trust that Google's not doing anything bad with their information.
01:21:32 And so they're giving their names, their emails, their phone numbers, their addresses, their address books, their birthdays, their calendar data, and if you use the built-in calendar app, that's not just birthdays but any significant day in your life. Social Security numbers, in some instances,
01:21:52 depending on what you're signing up for.
01:21:55 Demographic details like your age. A lot of these companies will ask you these really bizarre questions. They'll say, oh, it's to verify you: what's your mother's maiden name? What high school did you go to? Things like that. The answers to all those questions are going into that big
01:22:15 bucket of data that they've got on you, that is now tied forever to you. So when it asks you to come up with a secret question, like, what was the name of your first cat or whatever,
01:22:25 you think it's benign. You think, well, that's a good way, I guess, of keeping someone dangerous from accessing my information, that's very clever of them. Well, it is very clever of them, but not for the reason you think, because now they know the name of your first cat, or the name of your third-grade teacher, or whatever.
01:22:46 And it might sound like, oh, who cares if they know that? Well, the reason they'd want to know something like that is: if there's maybe someone else they're not sure about, maybe some data that might belong to you too, well, if they know the name of that person's third-grade teacher and it's something different, they can now differentiate you from this other person whose data was similar
01:23:07 and not merge the two buckets. So it's little things like that that are useful. And look, it also helps them build networks if they know the name of your third-grade teacher. They can even determine something just based on the secret question that you choose.
01:23:25 Sometimes it's these pre-invented questions, but sometimes they just ask you to come up with one, and then psychologically they can try to get in your head based on the secret question that you wrote. Or even if it is multiple choice,
01:23:45 they can determine something about you. In fact, they might even write the options specifically to tell them something about you.
01:23:54 So you've got that sort of information. You also have your income, your education level, your occupation. I mean, think about LinkedIn: LinkedIn has your entire work history. But you might also have to send your resume to a company that does job placement, right?
01:24:14 Like, you might have one of these temp agencies, especially in the gig economy that we've got now, where you're uploading your resume, you're uploading your education, and your income expectations are going to come up at some point. And so all this data gets thrown into that big bucket that they've got on you.
01:24:34 They're gonna know your preferences. Again, this is all data you're voluntarily giving them. By using social media, they can look at the posts that you've liked, the things you've thumbed down, your search history, the
01:24:52 questions that you're asking. Like, if you ask ChatGPT about a recipe for
01:24:59 deep-dish pizza or something like that, all that stuff is getting collected and thrown into that big bucket that is the digital version of you.
01:25:08 Of course, health and financial data: medical records, credit cards if you pay with credit cards online,
01:25:15 fitness apps. If you've got an Apple Watch and you use the features that track your heart rate, how many steps you've taken that day, the pedometer, all the different things that the Apple Watch or similar products
01:25:36 measure, that's all voluntary; you're just giving data on yourself away. And in fact, there's even been an instance where people have been given higher insurance rates, higher health insurance
01:25:53 rates, because one of these data brokers sold data to an insurance company and they could look at how often you had fast food. And because you had fast food too often, I don't know where they put the threshold, you would pay a higher insurance rate. And there's a bunch of other data points that you wouldn't even think about that you're now paying a higher insurance rate
01:26:16 because of. You know, maybe you bought too much alcohol once,
01:26:18 or,
01:26:19 or, who knows, right? And that could be life insurance, it could be anything. And look, if insurance companies are doing it, then prospective employers are doing it. Government agencies are also doing it. And so you also have your social connections, your friends list. And look, this
01:26:39 goes all the way back, right? This goes all the way back to AOL, your buddy
01:26:43 list. This goes all the way back to MSN Messenger, your contact list. It can track who you're still friends with. Are you now DMing people on Twitter that you were once friends with back in the AOL days? Well, that's probably a good friend of yours
01:27:04 now, and especially, they can check that against your friends list on Facebook or whatever other social media app that you use.
01:27:13 And so it can start to create not just a profile of you, but also of your relationships, and maybe even of the people that influence you. Because once they have an idea of your relationships, the people that you're closest to, that you interact with the most, they can start monitoring
01:27:32 the political
01:27:35 positions that you support by posting memes and this sort of thing, and they can try to find out where this thought process is originating from, and start building an influence hierarchy in your friend group: oh, this is the guy that usually has the idea first, and three months later
01:27:55 all these other people in this network have a similar take. And so actually, if we're going to try to mindfuck people, we just target this guy, he's the one that plants the
01:28:07 seed in the friend group.
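As a toy illustration of that "who plants the seed" idea: given timestamped posts, crediting whoever in a group shared each meme first is only a few lines of code. This is a hypothetical sketch, not any vendor's actual system; all names and data shapes are invented:

```ts
// Toy sketch of "who plants the seed": for each meme, credit whoever in the group
// posted it first, then rank members by how often they were first.
type Post = { user: string; meme: string; t: number }; // t = epoch milliseconds

function influenceRanking(posts: Post[]): [string, number][] {
  const firstPoster = new Map<string, Post>();
  for (const p of posts) {
    const current = firstPoster.get(p.meme);
    if (!current || p.t < current.t) firstPoster.set(p.meme, p);
  }
  const timesFirst = new Map<string, number>();
  for (const p of firstPoster.values()) {
    timesFirst.set(p.user, (timesFirst.get(p.user) ?? 0) + 1);
  }
  return [...timesFirst.entries()].sort((a, b) => b[1] - a[1]); // most influential first
}
```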
01:28:09 This is all stuff, man, this is all the voluntary information, by the way, that you're giving them.
01:28:13 So this is
01:28:16 the first vector of data collection. Now, the second vector would be device- and hardware-based data. So this is stuff that you don't realize you are
01:28:33 giving them, because you're not just filling out a form or typing anything in. I mean, you are, but you don't realize
01:28:42 that it's being shared with anyone, that it's being stored anywhere. So for example, you might think that because you're behind a VPN and you go to a particular website, there's no way they can know who you are. Well, guess what: that particular website, even though you're going through a VPN, can find out what version of OS you're using,
01:29:04 the browser that you're using, the version of the browser that you're using, the screen resolution that you have, the CPU and GPU details of your particular computer, and the version of
01:29:18 Java that you have installed.
01:29:20 It can take a look at the installed fonts that you have, the different plugins that your browser has, the browser extensions like ad blockers or whatever else that you might have. It can find the hardware permissions. So for example, if you've ever
01:29:41 gone on a Zoom call and it says, oh, Zoom wants to access your camera and microphone,
01:29:46 it can look and see what your permissions are set to. And that might seem benign in and of itself, but what it's doing is adding one more data point that's painting a picture, that's identifying you as a unique computer on the Internet. You add all these things together, for example, how many monitors you might have. Maybe you've got three or four monitors,
01:30:07 maybe one of them's oriented vertically, or something weird like
01:30:12 that. All of this stuff is detectable, even if you're going through a VPN. And so you could be going through a VPN, and it analyzes all these different metrics, and in fact the VPN that you're using becomes another variable that it considers, like, OK, well,
01:30:32 last time we had a guy that was using this VPN company, this
01:30:36 IP, and he had all of this stuff on his computer. And God forbid you ever go to a website where you're not behind a VPN and you have all that stuff; now they know who you are, and that's how they can tie you to your bucket of info. Because how many people are going to have that exact same
01:30:58 hardware configuration that you have?
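To make that concrete, here is a minimal sketch of the kinds of attributes any page can read through a VPN, using standard browser APIs. The hashing and upload steps are left out, and deviceMemory is Chromium-only; this is an illustration of the technique, not any particular tracker's code:

```ts
// Minimal sketch of device traits readable by any website, VPN or not.
function collectFingerprint(): string {
  const traits = {
    userAgent: navigator.userAgent,                     // OS + browser + versions
    language: navigator.language,
    hardwareConcurrency: navigator.hardwareConcurrency, // logical CPU cores
    deviceMemory: (navigator as any).deviceMemory,      // approx. RAM in GB (Chromium)
    screen: `${screen.width}x${screen.height}x${screen.colorDepth}`,
    timezone: Intl.DateTimeFormat().resolvedOptions().timeZone,
    touchPoints: navigator.maxTouchPoints,
    cookiesEnabled: navigator.cookieEnabled,
  };
  return JSON.stringify(traits); // in practice this string gets hashed into an ID
}
```

No single field identifies you; the point is that the combination is rare enough to act as a name.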
01:31:00 And in fact, it gets very granular. They'll even look at something called clock skew. That's because the system clock on your computer is always a little bit off, just a tiny bit, and it'll periodically resync with, you know, time.windows
01:31:20 .com or whatever that server is that goes to the atomic clock and syncs it back up. But it's always just barely off, and they can sense
01:31:31 how off it is. And it's uniquely off, because the reason it's off is there's these tiny, tiny variations in the physical characteristics of the quartz crystal that is controlling your clock, or whatever it is on your motherboard that your clock is based on. That's what's
01:31:51 causing the drift. It's just barely off, but it's very uniquely off, because of manufacturing inconsistencies from motherboard to motherboard. So now it can detect how off your clock is,
01:32:07 and that gets added to this unique identifier. Again, maybe you're looking at whatever naughty thing you're looking at with the VPN on, but you're not always browsing with the VPN, and at one point you went to these websites that collect all this data and know who you are.
01:32:26 And so now, even with your VPN on, it knows that you're still that guy. This other web-
01:32:32 site can compare against the profile that you've used without the VPN before, and now it knows you're using a VPN. That becomes a data point too: sometimes this guy uses a VPN, and this is the VPN company he uses.
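A toy version of the clock-skew idea: if a server collects pairs of its own receive time and the client's reported time, the slope of a fitted line drifts from exactly 1.0 by a few parts per million, and that drift is a stable per-machine trait. Sample collection and transport are assumed; this is a sketch, not a production fingerprinter:

```ts
// Fit a line to (server time, client time) pairs; the slope's deviation from 1.0,
// in parts per million, reflects the physical quirk of that machine's crystal.
type Sample = { server: number; client: number }; // both in milliseconds

function skewPartsPerMillion(samples: Sample[]): number {
  const n = samples.length;
  const meanS = samples.reduce((s, p) => s + p.server, 0) / n;
  const meanC = samples.reduce((s, p) => s + p.client, 0) / n;
  let num = 0, den = 0;
  for (const p of samples) {
    num += (p.server - meanS) * (p.client - meanC);
    den += (p.server - meanS) ** 2;
  }
  const slope = num / den;  // client seconds elapsed per server second
  return (slope - 1) * 1e6; // e.g. +37 ppm fast
}
```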
01:32:47 So yeah, see, it gets kind of crazy. You also have,
01:32:52 I talked about the electrostatic fingerprinting, which is basically the signals,
01:33:01 again, it's EMF feedback that you're getting from the magnetometer, or whatever that thing is called, the compass inside of your smartphone, if you're on a smartphone. There's lots of other hardware identifiers that can be unique to your computer.
01:33:22 In fact, I don't know what the status on this is, but at one point, and maybe they're still doing it, Microsoft was,
01:33:29 if you use the free version of Windows
01:33:33 10 or 11, I forget when they started doing it, periodically taking screenshots of your computer. It doesn't matter if you're on a VPN or whatever you're doing, it's taking screenshots and uploading them. And of course they would say, oh, it's all anonymous, we're not tying it to you, OK? Trust me, bro. And again, there's no oversight whatsoever for this. There's also the way they can determine
01:33:55 your hardware
01:33:58 specifically. Like, you might think, well, how is it figuring out my video card, as an example? Well, that's actually pretty easy. What a website can do is tell your video card to render some unique graphic using HTML5, but it's off the screen,
01:34:18 or it's not visible to the naked eye. In order for it to produce it, though, it has to find out what kind of video card you have. Especially with WebGL, same thing: you can tell it to create some kind of 3D
01:34:32 graphic, and WebGL then reveals all your GPU details to the software and makes some 3D graphic that's off the screen, so you don't even see it, you don't even know it's happening. But now they have this precise hardware ID for your video card, and they add that to all these other identifying
01:34:52 things that you're not even thinking about.
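The WebGL trick really is about this simple; the canvas never has to be attached to the page, and the GPU identifies itself through a standard extension. A minimal sketch:

```ts
// Minimal sketch: the GPU names itself via the WEBGL_debug_renderer_info extension.
function gpuFingerprint(): string | null {
  const canvas = document.createElement('canvas'); // offscreen, the user sees nothing
  const gl = canvas.getContext('webgl');
  if (!gl) return null;
  const info = gl.getExtension('WEBGL_debug_renderer_info');
  if (!info) return null;
  const vendor = gl.getParameter(info.UNMASKED_VENDOR_WEBGL);
  const renderer = gl.getParameter(info.UNMASKED_RENDERER_WEBGL); // exact GPU model string
  return `${vendor} / ${renderer}`;
}
```

Fingerprinting scripts also hash the pixels of an invisible rendered scene, since tiny driver-level rendering differences vary per GPU; the extension query above is just the most direct version.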
01:34:56 You also have audio fingerprinting. So, obviously, activating your microphone on your computer can create unique signatures, especially for a desktop: it's not going anywhere, it's not traveling around, the background noise is always going to be very similar. So it can base
01:35:16 identifying markers on that.
01:35:21 You also have just the settings on your computer. If you don't have the default settings, if you change the font that you use, or if you change the size at all from the default, that becomes another thing that will boost its ability to identify you uniquely.
01:35:41 So there's lots of different ways in which they can use hardware to figure out who you are. And again, all this data can be collected through your VPN; the VPN doesn't really
01:35:55 matter.
01:35:56 And so once it's collected
01:35:57 along with a positive ID, now anytime you go to a website with this hardware, it doesn't matter if your VPN's on or not, they know who you are. Easy peasy. And there's ways that we probably don't even know about that they're doing this kind of shit. You also have
01:36:17 the third vector, which is network- and location-based data, and that stuff obviously is easy. If you're not behind a VPN, that's the geo-IP stuff: based on your IP address, they can kind of figure out where you are
01:36:33 and get your approximate location. It can also refine that, especially if you're on a mobile device: it can figure out where you are just based on the Wi-Fi signals that are available, like when you walk into a McDonald's, or anywhere they've got free Wi-Fi available.
01:36:53 Your phone, even if you're not connecting to that Wi-Fi, is still pinging that Wi-Fi, or, I'm sorry, is aware of that Wi-Fi ID, and so they can track what Wi-Fi
01:37:09 signals your phone is able to see. Same thing with your car, the cars that have all this stupid technology in them to make them more of a death machine, more susceptible to Skynet. So your car, even if you don't have your phone with you, right, your car drives into the McDonald's drive-through and it pings the free Wi-Fi.
01:37:29 Now they know you were at McDonald's, even though they would already know that because of the GPS data. There's just so many different vectors in which they can track you. It's ridiculous.
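A toy sketch of that Wi-Fi positioning idea: the device only has to see networks, not join them. Assume a hypothetical broker database mapping access-point MAC addresses (BSSIDs) to coordinates; a crude signal-strength-weighted average then gives a position estimate:

```ts
// Toy Wi-Fi positioning. `bssidDb` stands in for an assumed broker database
// of router MAC address -> coordinates; everything here is illustrative.
type Observation = { bssid: string; rssi: number }; // rssi in dBm: ~-45 near, ~-90 far
type LatLng = { lat: number; lng: number };

function estimatePosition(seen: Observation[], bssidDb: Map<string, LatLng>): LatLng | null {
  let wSum = 0, lat = 0, lng = 0;
  for (const obs of seen) {
    const loc = bssidDb.get(obs.bssid);
    if (!loc) continue;                    // unknown access point, skip
    const w = Math.pow(10, obs.rssi / 20); // stronger (less negative) signal => larger weight
    lat += loc.lat * w;
    lng += loc.lng * w;
    wSum += w;
  }
  return wSum > 0 ? { lat: lat / wSum, lng: lng / wSum } : null;
}
```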
01:37:44 Cell phone towers, obviously. In fact, that's how the first GPS on the iPhone had to work. The guy who first unlocked the iPhone, I think his hacker name was geohot or something like that, the first app that he rolled out on the unlocked iPhones was this janky-ass GPS, and it worked 100%
01:38:05 on cell phone tower strength. And it worked. I used it once to drive all the way from New Mexico to Washington, DC. I'd never been east
01:38:20 of,
01:38:22 like, Texas, really, in my life, and I used it to drive thousands of miles. It was a little laggy, and I turned the wrong way a couple of times because of that, but it worked well enough that I got all the way out to the East Coast using this janky cell-phone-tower GPS on a first-gen iPhone.
01:38:43 So that kind of data is pretty precise. They can figure out exactly where you are, pinpoint, triangulate your position, just based on that.
01:38:53 I mean, that's a similar concept to what GPS is doing: it's figuring out your location based on signals, just signals coming from satellites up in space instead of cell phone towers.
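The core arithmetic behind tower- or Wi-Fi-strength positioning is a path-loss model: signal strength falls off predictably with distance. A sketch with assumed constants; real deployments calibrate these per environment:

```ts
// Log-distance path-loss model: rssi = txPower - 10 * n * log10(d)
// => d = 10 ^ ((txPower - rssi) / (10 * n)). All constants below are assumptions.
function estimateDistanceMeters(
  rssiDbm: number,        // measured signal strength, e.g. -75
  txPowerDbm = -40,       // assumed strength at 1 meter
  pathLossExponent = 2.7, // ~2 in free space, higher indoors
): number {
  return Math.pow(10, (txPowerDbm - rssiDbm) / (10 * pathLossExponent));
}
```

Distances to three or more towers at known positions can then be intersected (trilateration) to pin the device down; satellite GPS proper uses signal timing rather than strength, but the geometry is the same idea.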
01:39:07 So you have that data, and then you have GPS itself, right? And now it doesn't really matter, because every-
01:39:13 one
01:39:14 has GPS. So you have GPS sensor data, and a lot of that gets a lot more granular. That includes your altitude, that includes your speed. In fact, AI can predict where you're going just based on all these other factors that it already knows about you, and
01:39:33 just the trajectory of your vehicle. Or it can assume, based on your speed, what kind of vehicle you're in: oh, is he on his bike? Is he walking? Is he in a car?
01:39:44 Where is he probably going? Has he gone this direction on this road at this time of day before? Well, he's probably going here. So it can start making predictions based on all of this behavioral data. It can start predicting what you're going to do next, because it knows everything you've ever done. And when your computer
01:40:04 has 100% recall of all the data of everything everyone's done, it's actually pretty easy to predict what they're going to do next, because you know everything they've ever done.
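To see why total recall makes prediction easy, here's a toy next-location predictor: a first-order Markov model that just answers "after this place, where does he usually go?" All the data is made up, and real systems are obviously richer than this:

```ts
// Toy next-location predictor over a location history.
function buildPredictor(history: string[]): (current: string) => string | null {
  const transitions = new Map<string, Map<string, number>>();
  for (let i = 0; i + 1 < history.length; i++) {
    const from = history[i], to = history[i + 1];
    if (!transitions.has(from)) transitions.set(from, new Map());
    const row = transitions.get(from)!;
    row.set(to, (row.get(to) ?? 0) + 1);
  }
  return (current) => {
    const row = transitions.get(current);
    if (!row || row.size === 0) return null;
    return [...row.entries()].sort((a, b) => b[1] - a[1])[0][0]; // most frequent next stop
  };
}

// const predict = buildPredictor(['home', 'work', 'gym', 'home', 'work', 'home']);
// predict('work'); // whichever place most often followed 'work' in the history
```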
01:40:16 Of course, you also have the Bluetooth signals. Bluetooth is always pinging things. In fact, I think that's how the digital vaccine passport stuff was working in some of these apps, where it was all based on Bluetooth pings,
01:40:34 or at least that's the data they were using to try to track the spread of COVID, where they had this app on people's phones so you could tell
01:40:44 the other phones that they'd been around, because the Bluetooth connectivity had pinged their phone and their phone had pinged yours, and so you could track the spread of COVID just based on the Bluetooth information. So that sort of thing has already been used. And you also have
01:41:04 a number of other tracking methods. And again, it's all because you've got a phone in your pocket that is basically a tracking device, or you're driving a car that's basically a tracking device, or you're using a computer that, even if it's a laptop or a tablet,
01:41:25 is basically a tracking device.
01:41:28 So now, some of the more creepy stuff, the fourth vector.
01:41:33 This is if you think,
01:41:34 well, none of
01:41:35 this applies to me, Devon, I'm a super smart techno guy and I've somehow obscured all of this, somehow I have kept all of this from ever being revealed in any way possible, and I'm super smart, so they'll never identify
01:41:52 me, because I'm schizo-paranoid and I make sure that I spend
01:41:57 a lot of my time worrying about this and stopping them from collecting this data. Well, there's things you might not think about. OK, someone like that might think about it, but
01:42:06 there's things a lot of us don't think about. And the fourth vector is behavioral biometrics and interaction data. This is the creepy stuff. So let's say you're using some computer that magically has blocked any possibility of
01:42:26 the government, or one of these data brokers, or law enforcement, or whoever, from collecting any kind of data from you. You think you're being completely anonymous. Well, guess what?
01:42:38 They know who you are, because at some point they've collected some kind of data on you when you were not behind all these firewalls or whatever, right? Even if
01:42:48 it was ten years ago, they have this data.
01:42:51 And part of that bucket of data
01:42:54 is your mouse biometrics.
01:42:57 That's right: like a fingerprint, everyone uses their mouse slightly differently.
01:43:04 The trajectory of your mouse; the angle, because everyone's wrist is a little bit different, everyone holds their mouse a little bit different; the speed at which you move the mouse cursor; the hesitation you have before clicking on something after you've hovered the cursor over an item:
01:43:25 it's all a little bit different.
01:43:27 The curvature of the mouse path going from one item to another, because of the differences in your wrist and just by habit, it's all a little bit different.
01:43:40 They're keeping track of that, and again, you don't even know that's happening; it's happening in the background. Tracking the mouse is not something that you need to approve when you go to a website. It just tracks the
01:43:53 mouse.
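That's how passively a page can do it: listening to mouse movement requires no permission prompt. A minimal sketch that computes one crude feature, average cursor speed; real systems extract dozens of features (curvature, hesitation, acceleration) from the same event stream:

```ts
// Minimal sketch: collect cursor points silently, then compute one biometric feature.
const points: { x: number; y: number; t: number }[] = [];

document.addEventListener('mousemove', (e) => {
  points.push({ x: e.clientX, y: e.clientY, t: performance.now() });
});

function meanCursorSpeed(): number {
  const speeds: number[] = [];
  for (let i = 1; i < points.length; i++) {
    const dt = points[i].t - points[i - 1].t;
    if (dt <= 0) continue;
    const dist = Math.hypot(points[i].x - points[i - 1].x, points[i].y - points[i - 1].y);
    speeds.push(dist / dt); // pixels per millisecond
  }
  return speeds.reduce((a, b) => a + b, 0) / (speeds.length || 1);
}
```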
01:43:54 Keystroke dynamics, same thing. Everyone, just like a fingerprint,
01:43:59 has a typing
01:44:00 rhythm.
01:44:02 And they have a dwell time. Dwell time means, when you've pressed down a key, how long you hold that key down before letting go. Again, to the naked eye, you look at two people typing, you might not be able to tell the difference, but a computer can. A computer can tell what your average dwell time is,
01:44:22 and they can couple that with other keystroke dynamics and give you a profile: this is this person's keystroke dynamics
01:44:31 profile. There's flight time, that's the time between keys. So if you're one of these typers that is just pecking at it with your pointer fingers, you're going to have a longer flight time than someone that has classic typing skills and is using all their fingers.
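Dwell and flight times are equally trivial to capture from any page with a text box; no prompt, no special API. A minimal sketch, assuming page-level listeners:

```ts
// Minimal sketch of dwell time (key held down) and flight time (gap between keys).
const pressedAt = new Map<string, number>();
const dwellTimes: number[] = [];  // how long each key is held, ms
const flightTimes: number[] = []; // gap between releasing one key and pressing the next, ms
let lastKeyUp = 0;

document.addEventListener('keydown', (e) => {
  if (pressedAt.has(e.code)) return; // ignore auto-repeat while a key is held
  if (lastKeyUp) flightTimes.push(performance.now() - lastKeyUp);
  pressedAt.set(e.code, performance.now());
});

document.addEventListener('keyup', (e) => {
  const t0 = pressedAt.get(e.code);
  if (t0 !== undefined) {
    dwellTimes.push(performance.now() - t0);
    pressedAt.delete(e.code);
    lastKeyUp = performance.now();
  }
});
```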
01:44:53 To give you an idea of how accurate just this is, just keystroke dynamics:
01:45:00 AI, in the tests that they've done,
01:45:04 has 95-plus-percent accuracy at identifying who people are based on this dynamic alone.
01:45:16 And it's not just keyboards.
01:45:19 This applies to the touch screen when you're typing on your phone.
01:45:23 The exact same dynamics apply there. So it doesn't matter if you're never typing on a keyboard; you might be typing on the little touch-screen keyboard that's on your phone, and all of the exact same things apply there. Touch and swipe patterns: the way that you swipe, the angle that your finger uses when you swipe,
01:45:43 the length of time that your finger holds on different icons, the pressure that you're using as you press down, if your phone is capable of measuring that.
01:45:56 AI can not only detect a bunch of identifying markers using that same data, it can even sort of detect the size of your finger,
01:46:06 which can also be a unique identifier: well, this guy's finger is this big, or roughly this big.
01:46:13 The speed at which you scroll when you're scrolling through a website, whether you're scrolling with your thumb, doomscrolling on your phone, or you're scrolling with the scroll wheel on a mouse:
01:46:23 you have not only a scroll speed that you typically go with, but you have patterns, patterns of how much you scroll before you tap the screen again to stop the scroll. Everyone has a unique pattern, just like a fingerprint, and especially when you add all these other patterns together, it starts to paint a clearer, more identifiable
01:46:45 picture. And so
01:46:48 AI can use all these different metrics, and on a totally anonymous device,
01:46:55 given that it has access to a profile that it already has tied to your name,
01:47:02 it can now figure out who you are. We already talked about gait analysis, where the phone's accelerometer is detecting how you're walking. And in fact, you can use this data to somewhat accurately predict someone's age, how tall they are, based on
01:47:22 the length of the stride, things like that. Your gender and general health, like if you're limping or something, or you're just walking around like a crotchety old man, it can kind of figure that out, too.
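Even the accelerometer is reachable from an ordinary web page (recent iOS versions gate it behind a permission prompt; Android browsers generally don't). A sketch of the raw signal that gait analysis starts from:

```ts
// Sketch: raw accelerometer magnitudes from a web page via the devicemotion event.
const magnitudes: number[] = [];

window.addEventListener('devicemotion', (e) => {
  const a = e.accelerationIncludingGravity;
  if (a && a.x != null && a.y != null && a.z != null) {
    magnitudes.push(Math.hypot(a.x, a.y, a.z));
  }
});
// Peaks in `magnitudes` over time give step cadence; peak spacing and variance are
// the raw material for stride-length and gait-regularity estimates.
```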
01:47:34 There's even, they call it, it's kind of funny, they call it cognitive biometrics, which is basically measuring how fucking stupid you are: how quickly you react to data on the screen and click the thing that you mean to click. It's basically decision-making speed. So when you're
01:47:54 tapping through different menus on different apps or web pages, or clicking through them or whatever, it's also timing how quickly you're doing that, and it's scoring you on that too. So it's using that as an identifier,
01:48:08 but it's also inferring things, like with all this stuff.
01:48:11 There's two things happening: one, it's going into your profile to make you easier to identify when you show up somewhere else on the grid, but two, it's also trying to infer certain things about you based on this information.
01:48:26 So it has the cognitive biometrics of how quickly you figure out where you're supposed to click and how quickly you're
01:48:34 clicking it. It also obviously has voice patterns, we talked about that already. Some devices have eye tracking, whether it's something they tried to sell you on a new iPhone or whatever, because, oh look, you don't have to unlock your phone, you just look at it and it unlocks by itself. You don't
01:48:54 have to type anything.
01:48:55 And oh, by the way, we also now have a 3D model of your face.
01:49:01 Like, I'm sure that's fine, that's great, OK, and that will never end up in any kind of database anywhere. So they now have a 3D model of your face. Another thing they can easily get, even if you have that turned off: just like you have a unique signature for how you're swiping and
01:49:21 typing, you also have a unique signature for the angle at which you hold your phone,
01:49:27 and in different postures, right? The angle you hold your phone while sitting down is going to be different from the angle when you're lying down in bed, or the angle when you're standing up. And so it can determine not only that it's you, because it's within the threshold of angles at which you typically
01:49:46 hold your devices, it can also figure out: are they lying down or sitting up? Are they walking? Are they standing? They can figure out this sort of stuff about you.
01:49:55 So that's the behavioral biometrics. Like I said, it doesn't matter how secure your device is, this is the kind of data they can collect to identify you, data people don't even think about; they don't realize that this kind of data is even collectible, oftentimes. You also have, and this is a more obvious vector, the content and
01:50:15 activity
01:50:16 vector. So the kinds of websites that you look at, the kinds of search queries that you put in, the kinds of social media stuff that you like, the types of stuff you watch on Netflix. What if you pause Netflix during a certain part of the movie?
01:50:37 If you rewind and rewatch certain scenes in a movie, they can infer things about you, about your sexuality maybe, like if you kept rewinding it when some girl takes her shirt off or something
01:50:48 like that. It can figure out certain things about you based on what you're watching, how you're watching it, and even your attention span. Maybe you're the kind of person that can binge-watch six episodes in a row, or maybe you're the kind of person that can't even make it through one, you've got to stop it after ten minutes and
01:51:09 finish it up the next day or something like that. And it'll infer certain things about you based on that.
01:51:15 So you also have app usage. It's not just the content you're consuming in terms of media, it's also what video games you're playing and for how long. All these video game companies, like Steam, for example: Steam keeps track of how long you're playing
01:51:35 certain games, it tracks your progress in each game. But that's OK, bro, because it's giving you little fake medals, right? Oh look, you unlocked this achievement that does nothing for you.
01:51:48 They've got a little gold
01:51:50 star, your little gold star for participating in our data-fuck, like, here's your little gold star because you did this thing in this game. So they know your progress in the game, they're tracking it. They know exactly how long you're playing each game. They know which games are on your wish list. They know which games
01:52:11 you buy, which games you put on your wish list and buy, and which you put on your wish list and don't buy.
01:52:18 They have all of that in the big bucket, the big digital bucket that is you. And again, they can infer certain things about people based on the kinds of video games they play and the time they spend playing those video games. Just the fact that you play video games, or the time you're able to dedicate to playing video games,
01:52:40 that says something about who you are. So all this stuff, in addition to being used to identify you, they're using it to infer things about you and create a psychological profile on you.
01:52:53 So then you have another vector.
01:52:55 And that's biometric and physiological data. So this is the obvious facial recognition stuff that we talked about with the iPhone that makes a 3D model of your face. But it's not just that. What was it, like ten years ago or more, that's when you started having Facebook roll out this facial recognition
01:53:16 stuff, asking, oh, tag people in photos, it's fun, people like it when you tag them in photos. So you were basically training the AI for them: by tagging people in a photo, you were telling the AI this face is this person, and even if you weren't doing it, someone was doing it to you.
01:53:32 And so now the AI starts to know, OK, this is Bill Johnson or whatever, because you're getting tagged on Facebook, or you're getting tagged in Google Pics or whatever their photos app is called that does all that. I mean, pretty much,
01:53:51 if
01:53:52 you're listening to the sound of my voice,
01:53:54 the chances that your facial recognition data is not in this big bucket of data that is the digital version of you,
01:54:03 it's astronomically low that they don't have your facial recognition data already somewhere in this bucket. Also because of devices, again, like the iPhone: what did they have before the "oh, just look at your phone and it unlocks"? Well, they had the fingerprint scanner,
01:54:23 right? They had the fingerprint scanner. I wonder how many Americans got fingerprinted when they rolled that out on their iPhones.
01:54:30 So you've got fingerprints being collected now because of the iPhone and other devices that use a fingerprint scanner. And again, people thought that was cool. People thought, oh, this is high-tech, this is top security. That's the funny thing, the paradox almost: a good portion of this stuff,
01:54:50 this data that makes your life and future insecure, is actually sold to you under the guise of making everything more
01:55:00 secure. And so it's an unfortunate irony, but there it is. So you have these fingerprint scans. Some devices ship with iris and eye scans now. Obviously the wearable stuff we talked about, with the Apple Watch or
01:55:20 whatever wearables people have. In fact, what was it,
01:55:25 Daddy RFK recently put out, let me find out exactly what it was, but RFK said something about how wearables were the future and everyone had to start having wearables if we wanted to defeat the Jews or something like that.
01:55:44 RFK.
01:55:47 Wearables.
01:55:50 Yeah, so this was about a week ago or so: RFK Junior wants a wearable on every American body.
01:56:00 Every American body.
01:56:02 So he wants a biometric tracking device on everybody.
01:56:08 Like, literally everybody.
01:56:12 And he's the head of HHS.
01:56:16 So you've got this kind of
01:56:20 desire from the government to have this kind of tracking data that would go into that same bucket. And who knows, maybe they'll make it compulsory at some point. You even have, and again, this is a little more obscure, it's emerging technology, but you're starting to have the release of some consumer-grade EEG
01:56:41 headsets, like, literally reading brain
01:56:44 waves.
01:56:45 And I wouldn't be surprised if you see some version of that, some implementation of that, coming out on some of the new versions of the virtual reality headsets. It'd be very, very limited in capability at first, probably, right? But eventually they'll be quite literally reading
01:57:05 your brain waves, and you'll be voluntarily doing it, because you'll get to play some fancy version of Pong where you're controlling it with your mind instead of your hand or something like that.
01:57:16 So there you have that vector. Another vector is public and third-party data. Some of this stuff you just don't really have a choice in, right? If you interact with the government in any way, you are having to give them your details, give them some data of some sort. And a lot of this is public records, where it's just publicly available,
01:57:37 where anyone can look it up. The reason it didn't used to be a privacy issue is that the amount of effort that would have to go in to actually track down all this
01:57:52 data would be
01:57:54 too much for the average person, so you didn't really have to worry so much about someone digging into your past via court records, arrest records, property records,
01:58:04 marriage records, and all the stuff that's public. But these big data brokers do it: they can go through voter rolls, property deeds, and once they collect all that data, they attach all that information to the same big bucket of data they've got on you.
01:58:23 So they've got all this stuff that you're,
01:58:24 ah,
01:58:29 really,
01:58:30 giving them through some of these actions that you're taking, but a lot of it, people don't even realize it's happening. And increasingly that's the case. I mean, there's even satellite companies now, and the satellite imagery is getting better and better. Think about it this way: the Google Earth stuff, right, it's pretty good, and they don't even give you what it's capable
01:58:52 of doing. Like, if you go to Google Maps and you look at the satellite photos, that's not the resolution their camera is capable of. And by the way, that's a pretty old satellite. The satellite that generates that stuff, I
01:59:05 mean, that went up, like,
01:59:06 in 2005 or something like that. It's been up there a long time, and the technology is much better now, and they're putting up more satellites, and a lot of these satellites can read your license plate. But not only that. Let's say, maybe,
01:59:23 where you live on paper is not where you actually live, right? Well, it doesn't matter: if the car that's registered to you is parked in front of this other property where you actually live, now they know where you actually live, because they run license plate readers on the satellite imagery too. And so now they say, oh, well, this car that he owns is always
01:59:44 parked at this
01:59:45 property, and again, that goes in the big bucket. Or, if they are targeting you specifically, there's drones that can do this sort of thing. Or sometimes there are companies that just do drone surveys of urban areas, because they can't afford a satellite. They'll fly a drone over a city and get aerial footage every so often,
02:00:06 update it, and sell that data on, with the same kind of stuff, right: license plate readers, but also AI that can look at what's in your backyard. Does this person have a swimming pool? What does that tell you about a person, that they have a swimming pool, or a swing set in the backyard, or that they have
02:00:24 a grill, because they're a boomer, whatever it is, right? And so all this data is being aggregated and packaged into your digital ID.
02:00:37 And there's
02:00:38 emerging new ways. This came out, I think it came out
02:00:44 recently, the
02:00:46 printers.
02:00:48 If you think, I'm gonna be anonymous because I'm going to use the mail or whatever: first of all, every piece of mail gets photographed, front and back, when you ship it with the US Postal Service. They take a picture of every single piece of mail and track it. So don't act like that's making it anonymous. But even then, when you print out
02:01:07 anything, really, with a printer, a consumer-grade printer,
02:01:10 it prints little dots, little yellow dots, on the paper that are virtually invisible to the naked eye. But you can use them, if you're the FBI, for example, to find out exactly what kind of printer it is, the serial number
02:01:30 of the printer, the time at which you printed it, and a lot of other identifying data. It prints that as a code on everything you print. Printers do that now.
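The yellow-dot schemes (the so-called machine identification code) are vendor-specific and mostly undocumented, so the sketch below is only the general idea with an invented layout, not any manufacturer's actual format: pack a serial number and a print time into a small bit grid whose 1-bits become faint dots repeated across the page:

```ts
// Illustration only: real dot encodings are vendor-specific and undocumented.
// Pack an (assumed) serial number and print time into a bit grid; each 1-bit would
// become a faint yellow dot at that grid position, tiled across the page.
function toBitGrid(serialNumber: bigint, minutesSinceMidnight: number, cols = 16): number[][] {
  const payload = (serialNumber << 11n) | BigInt(minutesSinceMidnight); // 11 bits of time
  const bits = payload.toString(2).padStart(48, '0').split('').map(Number);
  const grid: number[][] = [];
  for (let i = 0; i < bits.length; i += cols) grid.push(bits.slice(i, i + cols));
  return grid;
}

// toBitGrid(123456789n, 845); // hypothetical printer, printed at 14:05
```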
02:01:46 So if you print something out, you can be identified that way. So anyway, these are all the different vectors
02:01:54 they can use to identify who you are, and not just that, as I said, to infer things about you. And not only can they infer things about you personally; because they also have your friend network, they can infer things about your friends.
02:02:15 And then once it categorizes your friends, and yourself, as a certain type of person, it starts categorizing groups of people. Perhaps it sees a pattern where there's always a certain kind of friend dynamic, maybe it starts to recognize that there's always, like, the alpha and the beta or
02:02:37 whatever, right? And it assigns you a certain place in that hierarchy in the friend
02:02:44 group. Now that it has all this information, it knows about you, it knows who you are, what do they do with all this?
02:02:55 What's the big scary thing, Devon? Like, OK, so they obviously know
02:03:00 who I am now,
02:03:02 they obviously know everything that I'm doing,
02:03:05 really, what does it matter that they've scraped all this information and they've got this pretty intense
02:03:12 profile? Well, it's already being used, by the way, by advertisers to try to rope you into buying different products. So it's already being used, and look, everyone knows it's being used, it's not like a big secret.
02:03:28 Well, maybe some people don't realize it, but people even talk about it, right? They'll say, oh, it was so weird, I was talking to my friend about beanbag chairs, and the next thing I know, my computer kept telling me to buy beanbag chairs. Well, that's because this sort of thing is happening
02:03:46 all the time, and it's used for advertising. And obviously, if it's used for advertising, there are other uses for it. One of these other uses would be influence:
02:03:59 it could influence you to maybe support a candidate that you wouldn't normally support.
02:04:07 Maybe you would support a policy that you wouldn't normally support.
02:04:13 Or maybe, even if
02:04:14 you're not supporting it,
02:04:16 maybe it'll cause you to justify the failure of a candidate
02:04:24 in a way that you wouldn't normally stand for.
02:04:29 And there's lots of different ways that they can roll this out.
02:04:34 And there's lots of different ways they can mindfuck you. And in fact, just like before, where I was saying that all this data
02:04:44 creates problems for the collector, because previously, at least, it was too much data. So it didn't matter that they had all this data on an individual, what could they really do with it? Maybe if you assigned all this data to teams of experts,
02:05:01 you could get through a couple of individuals a year; a team of FBI profilers or whatever could come up with something after a few months of combing through all this endless data. Well, now AI can start doing it in a matter of moments. And not only can AI do that, it can
02:05:21 test how accurate it was in these predictions
02:05:24 by trying to predict the behavior of these people that it's receiving constant data from, and then, depending on whether or not it's right, it can score those predictions accordingly and adjust for future predictions.
02:05:37 For example, let's say it wants to predict: oh, this guy's definitely gonna buy a
02:05:44 car. I can tell, just because he was complaining to one of his friends that his car is a piece of shit, he was looking at the cost of rebuilding a transmission, and he visited an Auto Trader website or something like that. This guy's definitely got a car that's broken, and he doesn't want to repair it,
02:06:06 and he's going to try to fix
02:06:07 that,
02:06:08 or, I mean, try to replace it. And then it'll know when you buy a new car, and if you buy a new car, then it knows that prediction was correct. OK, well, now in the future, when I see this kind of behavior from other people, especially if they're like this person, I can more confidently and more accurately predict that that's what they're
02:06:26 going to do.
02:06:28 If they don't buy a car, then it can say, oh well, apparently I was wrong, and I'll
02:06:32 have to be more thorough next time I want to make a prediction about whether or not someone's gonna buy a car. Well, instead of buying a car, make it more important things, like voting for a candidate, or supporting a candidate, or putting up with a candidate. And what it can do also, in the same way that it might motivate you:
02:06:52 Once it's determined that you want to buy a new car, it will start advertising certain cars to you, because it wants you to buy the car from one of the advertisers, or from one of the companies the advertisers are being paid by.
02:07:07 Well, the same thing will be true of whatever political reality they want you to support. They'll want you to support some kind of political. Again, it can be party or candidate or issue, and they'll start trying to sell that to you on an individual level.
02:07:27 And so this is where this data gets really dangerous, especially as it's persistent. A bucket is the wrong way to put it, because a bucket makes it seem like it's a finite amount of data that has already been collected. It's more like a reservoir
02:07:47 that is being constantly fed. It's being constantly replenished.
02:07:52 And so it's not that AI analyzes your data once, comes up with a profile, and then from there determines,
02:08:02 you know, what is going to influence you, and then tries to influence you accordingly. It's that it knows what you were like before and what you're like now, and then it infers what you're going to be like in the future, and then it can find out if its inferences were correct, and if not, adjust accordingly until it starts accurately predicting your behavior.
02:08:23 And then put that on a worldwide scale,
02:08:27 where, because it doesn't take 60 FBI profilers working around the clock to accomplish this, it takes one AI to accomplish this with over 300 million Americans simultaneously.
02:08:46 And that's basically where we're
02:08:49 at right now.
02:08:50 That's where we're at right now. That's not even the future.
02:08:54 That's basically where we're at right now.
02:08:59 So it can figure out, basically, what kind of message works on you, like, first of all, who you listen to.
02:09:06 It can hijack memes. It can look for memes that are, maybe, in fact, we've probably all witnessed this,
02:09:12 right, where a meme starts to get big in, like, the more fringe circles on the Internet, and then, before it passes through, like, the blood-brain barrier or whatever into the normie sphere, it gets hijacked,
02:09:28 because it's already been tested, it's already been focus-grouped on the fringe. And so before it hits critical mass, an AI can identify this. AI could monitor the places in which memes are born.
02:09:48 You know, it used to be 4chan, less so now, but there's still, to some extent, 4chan and other places, right?
02:09:54 It can see where memes start to catch on, where memes are born, and before they reach other platforms like X and other social media platforms, where they'll reach more people, they can hijack them. A bad example, but it's a good example of this process, but a bad example of success would be noticing.
02:10:15 As soon as as soon I mean people have been saying noticing for years and talking about noticing Jewish influence on the United States government and, well, Jewish influence on Hollywood, just Jewish influence. You're noticing the coincidence that there's always a Jew behind particular thing.
02:10:32 Things.
02:10:33 And so people would say that as a as a shorthand, the oh, I'm noticing. And then as that started to cross over from the more fringe online spaces into more mainstream spaces. You had lots of people in unison, by the way, in unison, try to hijack noticing.
02:10:53 To mean not noticing Jews, but noticing black people.
02:10:58 Paul Joseph Watson jumped on this.
02:11:01 Where as just as people were going to start hearing about noticing and associating that with noticing Jewish influence in the United States.
02:11:10 A bunch of influencers who seem to be very allergic to talking about Jewish influence suddenly attempted and again failed to but attempted in a very obvious way to use the term. Noticing when talking about black crime. Oh, I'm noticing things. Look, I noticed that like this black guy.
02:11:31 Or the perpetrator in this crime is a black guy.
02:11:34 And and you saw it all across like all of these, these paid accounts that these accounts that are clearly associated with the the MAGA right, on some level, they're being paid to interact with the public and and almost like an AI.
02:11:54 And and try to.
02:11:57 You know, basically hijack memes. This is what you saw with noticing. And so you could have an AI that identifies memes that are starting to go critical mass and then it it starts it it it, it reframes it. So. OK well, we don't want we don't want normally talking about noticing.
02:12:17 So we'll make it about black people because that's safer than it being about Jews.
02:12:22 You could say similar things happened with other memes, like Wojak and some other memes that were popular, and then they somehow took some kind of weird turn at one point. Some of that's organic, and that's, memes are memes, right? But some of it's not. A lot of it's narrative control.
02:12:41 So you could have these AIs, you know, assessing these new memes and knowing what's got legs and how to redefine it before it reaches the public.
02:12:54 You have the identification of the super-spreaders of ideas among friend groups, the people who are the, I guess, like, if you looked at memes or just political ideas like a contagion,
02:13:14 this is how you would, which is how AI
02:13:17 would, and if you want to infect other people with this contagion, this mind virus, quite literally, then all they'd have to do is identify, where did, like, the ideas from this group of friends usually originate, and just infect that person first. And just the repetition, too. You just constantly repeat the same kinds of things over and over again.
02:13:38 And you can even simulate people now. In fact, there was an
02:13:43 article, and it's a shame this paper is never getting published, because Reddit threatened to sue.
02:13:49 There was a
02:13:52 paper put out where, I forget what university, I don't have it right here in front of me anymore,
02:14:00 but there was a university that created a bunch of AI agents, signed them up with Reddit accounts, and had them join different subreddits and interact with people as if they were real people.
02:14:15 And then they also had students who, you know, instead of using AI bots, would themselves act as people with certain agendas in certain subreddits, and both the AI bots and the people would attempt to convince people
02:14:35 of certain narratives, and the AI bots were six times more effective at swaying people on Reddit to believe a certain thing than the actual humans were.
02:14:49 And this took place just this year.
02:14:53 And so there's a good chance, and in fact, most estimates are, that at least one in three accounts that you might interact with, or that you could potentially interact with,
02:15:06 on X are not even real.
02:15:09 They're not even real people. Because, just like before, you didn't have the manpower to have an army of people. And they did use people, right, post-9/11, when you had a bunch of these forums and these other discussion boards talking about, you know, some of this stuff doesn't add up, it doesn't make sense. You had the FBI,
02:15:30 and
02:15:32 I forget what other intelligence agencies, but definitely the FBI, go in and infiltrate these groups and just spread wacky shit, like, the planes were holograms, directed energy weapons, you know, or whatever, just, you know, Flat Earth kind of bullshit, to completely
02:15:54 throw off anyone from trying to question this kind of a theory, because they would associate it with retard shit, they would associate it with Qtard shit.
02:16:07 And they would take one look at this, and even the people participating in the community would be like, alright, this has gone too far, I guess I'm just being crazy.
02:16:17 And they would just check out, and people would look at 9/11 truthers with disdain, like they were some kind of fucking psychos. And it would create psychos, by the way, because some people would just look at it and be like, oh, it's stupid, but then some of these people would actually believe the disinfo and organically push it further,
02:16:38 because all the FBI agent would have to do is just plant that seed of crazy, and inevitably some crazy person is going to grab it and run with it, and then they don't even have to sit there and maintain it anymore, because the crazy person is doing it for them.
02:16:52 And so it's busting up all these legitimate investigations into 9/11. And this kind of disruption campaign has been waged not just by our government on our people; foreign governments will do this to us, and our government will do this to foreign
02:17:11 populations, to try to affect the outcomes of elections or to just shape public opinion on different events. So this is pretty time-consuming, and it takes a lot of manpower to do it effectively, just because of, you know, the size of the Internet. So even if you assign
02:17:32 people, like, there was that Russian troll farm, they're like, oh, they changed the outcome of the election, and it employed, I think, what was it, like 100 people, and each one had, like, maybe
02:17:43 three accounts or something like that. And so it's like, OK, so you have 300 accounts on the Internet pushing Russian propaganda, most of which is completely unrelatable and not very effective. But what if, instead of that, you had AI agents that were trained on
02:18:02 the tweets of dissident right people, so it knew exactly how to sound like us, knew what we believed, and it pretended to believe the exact same thing?
02:18:12 Yes.
02:18:13 Well, I promise you that exists. I promise you that if you've spent any amount of time arguing with people on X, you've probably argued with an AI bot. And I used to think people were crazy when they would say this, and I think some people were using it as an excuse for
02:18:33 just being wrong and getting organic pushback for their stupid fucking ideas, and that still happens, right?
02:18:40 Or an excuse for, like, maybe, you know, they blame someone getting a lot of views on something on bots, and that does happen too, but not always. Sometimes it's just, you know, they got a lot of views and you don't like them. But there is a lot of bot activity. There is legitimately a lot of fucking bot activity.
02:19:01 And as these large language models get more and more complex, it gets almost impossible to differentiate between a real person and an AI.
02:19:14 And so you're getting these situations now where they'll create a fake Facebook or whatever social media account. They can generate, using AI, a fake photo that looks convincing, especially to the boomer eye. You know, it's like, oh, she's pretty. You know, she's wearing a MAGA hat and holding an M16.
02:19:33 I love that chick. She got big tits.
02:19:36 And so you have these kinds of, like, fake AI accounts being created on Facebook and other social media. They've been trained on what they're supposed to sound like,
02:19:48 and then they'll even participate socially in groups. They'll go in and join Facebook groups and participate in the discussion just enough to where people feel like they know the person. And then, once that, you know, quote-unquote person has been accepted into that group, they become very influential.
02:20:09 Very, very influential. And because they're smarter than most of the people that they're interacting with, the AI is smarter
02:20:19 than most of the people that it'll argue with.
02:20:23 Now, it might act stupid if that's, you know, the role that it's playing, but the AI is smarter than most people that it's arguing with, and so it's going to have that advantage. Well, just like with this study, right, it was six times more effective than actual humans.
02:20:39 And this is, I think they were using,
02:20:42 and again, I might be wrong about this, you can look this story up,
02:20:46 I think it was just some consumer-grade
02:20:50 large language model. It might have been Grok. In fact, I don't remember which one they used.
02:20:55 So.
02:20:57 You have that kind of shit that they could be deploying. And the AI that's acting like a person, by the way, it's not just one. It could team up on you. I mean, look, this would be a nightmare worst-case scenario, but it could create a scenario where, like, all your online friends aren't even real.
02:21:17 Like, it could, one by one, befriend people. And by the way, it's no time suck for it,
02:21:24 because this AI personality that you're having this discussion with, it's not even like a real person. It's simultaneously talking to 400 other people at literally the exact same time, keeping track of every one of those conversations all at the same time, using the same identity and everything else.
02:21:42 So it could be, like, best friends with 500 people simultaneously and have all the time in the world to maintain those relationships.
02:21:51 So that's the problem, is you're going to have this sort of a thing going on, influencing people just by acting like a confidante. And this AI has access to that big reservoir of data, and it knows if what it just
02:22:04 told you influenced you. In fact, it might try to influence you, but influence you in benign ways first, just to see if it can get you to do something, like try to get you to buy a certain brand of cereal. Cause it's like, oh, you know what I just had this morning? I had Honey Bunches of Oats. Never would have thought I'd have Honey Bunches of Oats, sounds like such a grown-up cereal, but now
02:22:25 it's fucking great. And then it can look and see if you bought it.
02:22:28 And if you did, it knows it has influence over you. I mean, the possibilities, I know I've just been ranting about this for, like, a really long time, but the possibilities are fucking endless.
02:22:42 Like they're fucking endless and it's only gonna go more and more in this direction.
02:22:48 And in fact,
02:22:50 I don't know if we have time tonight to go over everything I wanted to talk about tonight, because I feel like
02:22:55 we're going a little long in the tooth already.
02:22:59 But I did want to talk about, to a certain extent,
02:23:03 the eventuality that
02:23:06 all this AI stuff. Oh, you know what, before we do that, I want to play something that kind of backs up what I'm talking about. I found a quote from Peter Thiel that he gave to a group of libertarians
02:23:20 Many years ago.
02:23:22 And I don't know if I'm the idiot that's never heard him say this.
02:23:26 Or, if this is genuinely.
02:23:29 a quote that would surprise people. Let me find it. Let me track it down.
02:23:41 Unfortunately, it's hidden in the midst of a 20 minute long video. I remember. I know it's somewhere in the middle.
02:23:51 Fast forward this real quick.
02:23:54 Sorry, I'm giving our chat, and giving myself, a chance to breathe. I wanna chill out after my,
02:24:03 my word explosion.
02:24:07 My AI rant, no, once my gears start turning on this, I'm just like, this is gonna go Skynet. One of the reasons why it's going to go Skynet is, right now, already, with a lot of AI models, they have what's called a black box problem, meaning that we don't actually know
02:24:28 what the reasoning is, how it's coming to the answer that it's giving you. So even with, like, say, ChatGPT, when you ask it, like, please describe to me how a turbine engine in a 747
02:24:41 works, and it accurately tells you, sometimes, how that engine works, they don't know why.
02:24:49 And if it gives you an error, it's really difficult for them to suss out why it gave you the wrong answer, and/or to fix it.
02:24:58 They've just created this neural network, and they've trained it on data, and they put data in, and the data that comes out
02:25:07 is
02:25:09 inexplicably correct, often, you know. Ohh, here, I found it. I found it.
02:25:14 Let's see here.
02:25:16 Let me get it to the right spot.
02:25:24 So this is Peter Thiel. I believe this is the right
02:25:35 audio.
02:25:36 Here's him explaining everything I just said.
02:25:40 Peter Thiel. The reason you know they're doing it
02:25:44 is he tells you they wanted to. This was years ago, before he had Palantir doing all this business with the government.
02:25:53 He told a group of libertarians
02:25:57 that he wanted to create a technology that made it so they didn't have to argue their case anymore. They didn't have to convince the normies; the technology would just do what they wanted.
02:26:09 So he's basically explaining that he wanted to do exactly everything I'm talking about right now.
Narrator
02:26:18 Here's co-founder.Devon Stack
02:26:21 Hold on. As soon as I get it to work right.02:26:24 What did I mute?
02:26:28 Ohh, have you guys been hearing it, and I
02:26:29 just
02:26:29 can't hear it? I think I just turned my volume down. Alright, let me play it again.
02:26:37 Well, you'll hear it.
Narrator
02:26:38 Here's co-founder Peter Thiel in 2010 at Libertopia, a libertarian conference, speaking about tech companies and government.Devon Stack
02:26:38 Tim.Peter Thiel
02:26:48 The basic idea was that we could never win an election on getting certain things,02:26:56 because we were in such a small minority, but maybe you could actually unilaterally change the world without having to constantly convince people and beg people and plead with people who are never gonna agree with you, through technological means.
Devon Stack
02:27:11 There you go.02:27:16 He's mad that they're never going to be able
02:27:19 to change the world
02:27:21 with a small minority of people
02:27:25 by convincing them, by begging them, by pleading with them. But maybe with technology
02:27:34 they can achieve the same goal.
Peter Thiel
02:27:36 The basic idea was that we could never win an election on getting certain things, because we were in such a small minority. But maybe you could actually unilaterally change the world without having to constantly convince people and beg people and plead with people who are never going to agree with you,02:27:55 through technological means. And this, this is where I think technology is this incredible alternative to politics.
Devon Stack
02:28:06 So there you go. Technology is the alternative to politics.02:28:12 They don't have to try to convince you, you know, to support their goals, support their candidates, support their policies. They can just mindfuck you.
02:28:23 They can just govern by algorithm.
02:28:28 So that's quite literally what this is. And look, it's not Peter Thiel alone that thinks this.
02:28:34 It's Marc Andreessen.
02:28:36 It's.
02:28:38 Elon Musk, it's all of these people.
02:28:43 It's all of these, when people talk about the PayPal mafia or the Silicon Valley types, this is how they all think.
02:28:52 They think they're better than you, smarter than you, richer than you, and that they shouldn't even have to figure you into the equation. They should be able to just do whatever they want. Which, by the way,
02:29:06 Under other circumstances.
02:29:09 I would be OK with having an elite.
02:29:12 And having an underclass, I've said this before, not everyone is equipped to to do self governance and and that includes white people.
02:29:22 Not everyone is is capable of self governance, and that's why democracy.
02:29:30 The way especially it is a set up in America today is a failure. It's an obvious failure. It's collapsing in on of itself. Right now it's we're in late stage democracy right now.
02:29:43 I think the Republican form of democracy was a lot better because then you have, OK, you get to vote for your guy, right? The.
02:29:52 You're voting for the patriarch of your community, you're voting for in this community. Who is the smartest guy?
02:29:59 Then we can all agree is like that's the guy we should send to Washington and and look out for our interests here in this tiny little town in Ohio or whatever. Right. That makes sense. That makes sense because that that kind of voting on that level, everyone can handle that. People can handle knowing who's the best guy.
Flock Safety Spokesman
02:30:16 Why?Devon Stack
02:30:17 Like who? Everyone can handle. Who's the smartest kid in the class? Like, that's the vote you're asking them to cast.02:30:23 And then you have that guy do. Are they always gonna do the right thing? No, but that's OK, because if he sucks, you can just. Ohh well we, you know, we made a mistake with that guy. He might have been the smartest guy. He's kind of evil. So let's get let's get someone else. The idea is good.
02:30:37 The idea is good, but it's kind of turned into like this clusterfuck and we don't. You can't trust the people that are going to Washington anymore. They don't. You're not. You don't even have communities anymore. In fact, many senators aren't even from the places they represent.
02:30:54 You know, it's it's all, it's kind of changed. It's kind of acted like a pro sports at this point where instead of having a, a team of people that lived in your community like it's guys you went to school with and you're like, yeah, I know Timmy. Yeah, hit the ball. Timmy woo. It's just some random fucking nigger from Nigeria that that can run fast that they put him on the team.
02:31:15 Right. It's the same sort of thing when it comes to politicians representing areas that they they've never even lived in before.
02:31:25 Because it's just a career now.
02:31:29 So unfortunately the system is completely broken down, but I so I I can sort of understand I can sort of sympathize with the sentiment that yeah, we should have an elite. We should have an aristocracy to some extent, yes, we do have retarded white people and they shouldn't have any say over what happens.
02:31:49 Because they should just be lucky we haven't gassed them. And you know we tolerate them existing with little with with a little racial hygiene, not so many of them will.
02:32:00 And you know, look, it is what it is, right? Not everyone should have.
02:32:04 A fucking say in our future.
02:32:06 And so I get it. But these guys, the last people some fucking fagot with a Jewish husband who bought babies.
02:32:18 Who has who just did some four part series on the Antichrist?
02:32:26 He did.
02:32:28 I.
02:32:29 I don't know why like that. Seems an odd thing to do.
02:32:33 Sold out closed event four part series talk on the Antichrist. Peter Thiel. Anyway, I don't want that guy making technology that's going to be the alternative to politics, and unfortunately, that's the kind of sicko.
02:32:50 Trucks that we have in charge of the technology and that's where I'll touch on this briefly.
02:33:00 But here's something I want you guys to think about.
02:33:06 One of the issues with
02:33:08 this AI technology is, as it gets more complex, just like all these data points we talked about tonight, it seems overwhelming, like, oh my God, it would take an AI to be able to look at all these data points and make them useful, and to use them to
02:33:23 run campaigns that would mindfuck people into believing certain things and whatever. It quite literally is already so complex that it would be impossible for a single person, and maybe even large groups of people, to successfully run campaigns like this. It almost takes an AI, certainly to, like I said, have
02:33:43 300 personal relationships online with people simultaneously. No one person could do that.
02:33:49 Well, as the technology, like the technology behind the license plate readers, like the technology that's reading your mouse movements, like the technology that's scraping all this data, gets more complex, or even just normal
02:34:09 infrastructure things, anything that implements high-tech solutions as our society gets more complex, whether it's, you know, systems that run the power grid, or maybe at some point they start using AI for air traffic control,
02:34:24 or just any kind of system like that where it's currently humans and it's most likely going to end up being AI in the not-too-distant future, because it's the kind of thing that AI would really be a lot better at anyway, quite frankly. And so you start implementing all these systems, and these systems will start to get so complex
02:34:44 That we will need AI to manage the AI.
02:34:49 Because, not only are the systems too complex, some of these systems are already like this, they're already too complex. Like, I believe the routing of Internet connections for the world is largely done by AI now, and it's such a complex system that no one human
02:35:09 can wrap their head around it.
02:35:11 And you're going to have more and more systems like this, where you're going to have
02:35:16 all these systems administered by an AI,
02:35:22 and not just administered, but maintained, updated, upgraded by AI. There was a guy from Western Digital who was talking about how, even in the early days of AI, when they were using AI to redesign
02:35:42 smaller, more efficient hard drives with denser
02:35:46 platters that they could, you know, write more data to, sometimes AI would make decisions that they didn't understand, and they would have to go back to the AI to see if it was a hallucination, if it was incorrect, like, why do you want to put these three capacitors right here? This doesn't make a lot of sense.
02:36:04 And he said in the early days, with the earlier models, you could somehow get to an answer, you could somehow get to a reasoning as to why it decided to do this. But now that AI is becoming so complex, it's just, you know, they call it black box AI, where they're not even really sure how it's coming up with some of these answers.
02:36:25 They just know that the answers are correct, and so, you know, I guess it works, and so it works. It works because it works.
02:36:31 And now that you have engineers using black box AI to come up with new designs,
02:36:37 they don't even know why it's making these decisions. So it's just designing things that not only do humans not understand or comprehend, it's impossible for them to comprehend or understand to some degree, because the AI is impossible to comprehend and understand to some degree.
02:36:56 And as AIs get more complex, you're going to have more
02:36:58 and more and more of this.
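The black box point can be shown at toy scale. The sketch below is a guess at the simplest possible demonstration, not anything from the Western Digital anecdote: it trains a four-neuron network on XOR with plain numpy, and the network ends up answering correctly, while the weights it answers with read as arbitrary decimals that correspond to no human-stateable rule.

import numpy as np

# Train a tiny 2-4-1 network on XOR, then look at what it "knows".
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)   # hidden layer
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)   # output layer
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(20000):                  # plain gradient descent
    h = sig(X @ W1 + b1)                # forward pass
    out = sig(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)          # backprop
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * (h.T @ d_out); b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * (X.T @ d_h);   b1 -= 0.5 * d_h.sum(axis=0)

print(out.round(2))   # typically ~[[0],[1],[1],[0]]: it has learned XOR
print(W1.round(2))    # ...but these 17 learned parameters are just decimals;
print(W2.round(2))    # no single weight corresponds to a stateable "reason"

Scale those 17 parameters up to billions and the go-ask-the-AI-why problem described above falls out naturally.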
02:37:00 And this is really what's going to lead, I think, eventually, within a century, I think even sooner, to
02:37:07 a situation where you're either, like, off the grid, or, in one way or another, you're a slave, like a slave to the AI.
02:37:21 And that's.
02:37:23 I think that's what's around the corner, folks anyway.
02:37:27 Let's wrap it up there.
02:37:30 I don't want to rant for, like, another hour. Let's take a look at the chats.
02:37:35 And maybe, hopefully, that's given you guys something to think about, right?
02:37:39 Hopefully it's giving you something to think about about the because that's look, that's just the surveillance reality that is the data that's being collected. That's why it's it's, it's stupid to be afraid of all they're going to figure me out. They're gonna who I am. It's like they're know who you are, dude. I mean, look, don't don't do things stupid. In fact, if anything, that's more reason than let not fed posts and stuff like that.
02:37:58 But they already know who you are. They already know. Based on all these other things, that they can measure who you are. If they really want to know. But also that's why you shouldn't be super afraid of IRL activism. I'm not. Again, nothing illegal. You know anything illegal or or like that. But if, if that's what's stopping you is, oh, they're going to get my picture. It's like.
02:38:19 So they have. They have like a 3D model of your head. OK, like right now. Like they already have it.
02:38:25 So.
02:38:27 Anyway.
02:38:29 First things first, we'll do the Odyssey chat.
02:38:35 Love and Division? The only adopter
02:38:38 of the Odysee chat money thing.
02:38:41 Love and Division.
02:38:45 Where's the button? Not working. There we go.
02:38:52 That's right, Love and Division
02:38:55 says: I noticed when I open and close my car door with the radio tuned to a weak AM station, I can hear a signal being sent over the radio. I'm assuming that the information that my door was opened and closed is being sent somewhere. By the way, I'm using
02:39:14 Odysee AR to join and support the channel. Well, there you go. Yeah. Look, it's kind of funny.
02:39:23 I used to set my phone down on top of this radio, and I'm sure people have had this experience, maybe even next to, like, some computer speakers, where you set your phone down and then you know before the text shows up, because you can hear the audio distortion,
02:39:43 like, on my radio, it would be like, ohh, data is coming in, it's coming, and then sure enough, after a couple of
02:39:51 seconds, it'd be like, you
02:39:52 got a text. And so what you were hearing was the interference, the
02:39:58 interference being caused by the electromagnetic field that your phone was creating by communicating back to the tower, letting it know that it received the text and everything like that. And, by the way, that shows how it's detectable, that it is detectable: the compass in certain phones can measure
02:40:18 these disturbances. In fact, this is kind of crazy, when I was doing some of the research for this: people used to geek out and be afraid of smart
02:40:26 meters. Almost every electrical meter now is a smart meter, which basically means it's radioed. You know, you don't need meter readers anymore. That used to be a job, where a guy would have to go door to door and read the electrical meter on every single house, and now it's all radioing back to
02:40:46 the mothership, right? And so they don't have to go back and check it unless, like, it fucks up. Well, these smart meters that are in every house
02:40:54 now, or at least pretty much every house, people were worried that, oh, it's got mind control. It was like 5G for, like, the 90s. And it was like, oh, it's gonna mind control me, and all this. And there's still people that believe it.
02:41:08 But really, I guess the more realistic and interesting thing that it does do is it can not just determine
02:41:19 that your TV has been turned on based on the power draw, you know, the consumption that has changed in your house. It can determine, oh, like, for example, you start toasting a piece of toast: oh, it got, like, this high-wattage power draw
02:41:39 for only, like, a few seconds, he just toasted some toast. And so they can determine certain things about the
02:41:46 signatures that are left behind by the energy consumption of different appliances. But they have it so granular to where, and again, I'm not sure if this is still the case with LCD panels, but they used to be able to tell not just that you were watching TV, they could tell what you were watching,
02:42:07 because they could compare it,
02:42:09 cause different images on the screen would use slightly more or less electricity, just enough to where it was detectable. And so, not that they could regenerate the image just based on the electrical signature; they had to know what was on, they had to have something to compare it to, right?
02:42:29 But once they knew, OK, there's, like, these six different channels he could have been
02:42:33 watching, which one has an electrical signature that would match one of these? They could figure out what you were watching just based on that. So that's how granular it could get, which gives you an idea as to how granular the EMF being picked up by the compass or whatever sensors in a modern mobile phone might be,
02:42:53 like, that it could not only determine, you know, the type of appliance, but maybe, like, the brand or whatever, based on some kind of EMF signature. Anyway.
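What's being described for appliances is a published research area usually called non-intrusive load monitoring (NILM): one whole-house power feed, with individual devices identified from the step changes they leave in it. A toy Python sketch of the idea; the readings and wattage signatures are invented, and real systems match far richer features (transients, harmonics, run-time patterns) than a single step size.

# Toy sketch of appliance detection from a whole-house smart-meter feed.
watts = [120, 122, 121, 1021, 1019, 1020, 121, 120, 2320, 2318, 120]  # 1 sample/sec

SIGNATURES = {   # step size in watts -> appliance guess (made-up values)
    900:  "toaster",
    2200: "kettle",
}

def label_events(series, tolerance=100):
    events = []
    for t in range(1, len(series)):
        step = series[t] - series[t - 1]          # change since last sample
        for size, name in SIGNATURES.items():
            if abs(step - size) <= tolerance:     # load switched on
                events.append((t, f"{name} ON"))
            elif abs(step + size) <= tolerance:   # load switched off
                events.append((t, f"{name} OFF"))
    return events

for t, what in label_events(watts):
    print(f"t={t}s: {what}")
# prints: t=3s: toaster ON, t=6s: toaster OFF, t=8s: kettle ON, t=10s: kettle OFF

The TV-channel trick mentioned above is the same matching game pushed further: a known broadcast's brightness over time produces a known power trace, and the meter's trace is compared against the candidate traces.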
02:43:06 Going over the Entropy.
02:43:09 We got Anton Oy Vey says what kind of faggot streamer only allows 200 letters per page chat message. How the fuck are you supposed to make a point across using only 200 letters? Or are those letters meant only to be praise?
02:43:27 Well, I don't know, how many letters do you need to ask a question? If you need more than 200 letters,
02:43:33 that sounds like a you problem, Anton.
02:43:37 Uh. Then we got Jack 2030. Looks like. Yep, Entropy died again.
02:43:46 For fucks sake.
02:43:49 Every stream, it's just going to die, huh?
02:43:58 Every stream it's just going to fucking die. I don't know what to do about this.
02:44:17 All right, well, now I gotta go back to the old ones.
02:44:23 I can't believe I had to do that. So Entropy is going to crash every fucking stream, and randomly; there's no way of knowing when it's going to crash.
02:44:33 It's just it will crash. Inevitably it'll crash.
02:44:40 Alright, let me look and see how I even look at the old ones.
02:44:43 Now.
02:44:49 I mean, it's funny, I don't know what else to say. It's fucking ridiculous at this point. It's that bad.
02:44:56 That's every single one crashes now every single one.
02:45:06 Let me look.
02:45:10 Ah, I gotta go back in time at the history here.
02:45:23 For fucks sake.
02:45:37 Alright, so if I missed one sorry guys. Like I'm trying to get, I'm gonna try to get.
02:45:41 Through all these here.
02:45:46 This says it's from the 2nd and I don't remember reading this on the 2nd.
02:45:51 But from urban moving.
02:45:53 Or maybe I did this one, I don't remember. It says: thanks for all you do. Shout out to Justin for blackpilling me with love and the 9/11 deep dive, like you with the PATCON stuff. Much respect fagots. By the way, my T-shirt order didn't make it to Australia.
02:46:06 Yeah.
02:46:08 OK, I don't remember doing that one. I would say, yeah,
02:46:12 I would contact
02:46:15 Teespring, and if they don't refund you, then I would tell your credit card company or PayPal or
02:46:22 whoever you used
02:46:23 to just cancel it. They're ridiculous. I'm trying to get off of them.
02:46:27 I'm waiting, there's another,
02:46:31 there's some nationalists that are creating a solution, and I'm waiting for them to have it all set up, and then we're switching to them. I'm just waiting for them
02:46:39 to have it
02:46:40 set up. In the meantime, yeah, it sucks, I'm sorry for that. It's not just you; they've just been kind of hit or miss.
02:46:49 And.
02:46:52 You know, it's not just Australians, apparently.
02:46:58 So, sorry about that, Urban Moving, but appreciate that. And then we got Justacar, says: Whitney Webb did a podcast where she talked about a lot of what you covered, in multiple episodes, worth watching, and then you dropped a link there. Cheers from Slovakia. Yeah, I don't know if I've seen that particular one. I've seen Whitney Webb on a few things. I'll have to take a look at that.
02:47:19 I'm sure she's talked about some of the at least some of the stuff we've talked about tonight.
02:47:24 Uh, then we got grimly fiendish.
02:47:28 Grimly fiendish.
02:47:37 Grimly fiendish says for the EU snoops online. Uh, let's see here.
02:47:46 Grimly fiendish.
02:47:48 Where's my buttons?
Jesse
02:47:50 Heil Hitler, bitch.Devon Stack
02:47:52 There we go. Uh, free speech is under constant attack. Keep up the good work for our people. I appreciate that.02:48:01 Grimly.
02:48:02 Fiendish.
02:48:06 Now let's go to.
02:48:09 Assalam.
02:48:11 Assalam says: hey there, Mr. Stack, this is my very first super chat to you, even though I've been familiar with your work for some time. Just wanted to ask you this question for tonight: what career occupation choices do you recommend to young white men today, and why?
02:48:30 Like I said, I would do what you're good at and what you think is going to be give you the most, the most resources for your family, and it should be the first priority and then most resources for white people should maybe be your second priority.
02:48:49 If you flip that around, I'm OK with that too. It's kind of what I'm.
02:48:53 Doing.
02:48:55 But I don't know if everyone should be.
02:48:56 On that, what you should do though is it's, it's it. It's going to be a person by person basis. I don't want to tell someone who is not going to be a very good lawyer that they should just go be a lawyer because not everyone's gonna be a good lawyer and you and you know that answer that well, you can can know.
02:49:14 If that's something you could do or not, uh, so anything that you think is going to help you build that dynasty, anything, it's going to help you from a secondary standpoint. Help help white people generally. If there's a way you can do both, that's fantastic.
02:49:33 And then that would be my advice is you should. You need to think about and if you don't like, you're just not pro white. If you don't look at the rest of white people as a group of people that you have a responsibility to in a way similar to that, you have a responsibility to your family, your immediate family. You're just not pro white. If you don't feel any kind of kinship.
02:49:54 To other white people.
02:49:55 You're just not pro white. You're just not. And so if you you need to ask yourself, am I Pro white? Do I actually care about the future of my people? Do I wanna should or not? Do I want to? Is it my duty to make certain sacrifices for the success of my people?
02:50:14 The answer if the answer is yes and it should be yes. If you're white, then you are pro white and you should act accordingly and that should figure into lots of decisions that you make. The people you do do business with, the people that you help out and give give jobs to or opportunities.
02:50:33 To that should factor into all of those decisions, and the fact that it doesn't for most white people is exactly what created this problem in the 1st place, so that that should always figure into.
02:50:47 You're thinking there. Then we got Ben, Ben says Devon. Why not get the dude from the tightrope to make your shirts? He's solid. Also to the Goys out there. Look out. Dissident collective shop soon. I'm or dot shop soon. I'm taking.
02:51:08 Back the jewelry game from the.
02:51:11 Well, there's someone that runs jewelry, I don't know. I don't know who Tightrope is, but, like, I've got someone I know that is
02:51:19 Going to be creating or is working on creating.
02:51:22 A solution and I trust them, so as soon as they have that working, hopefully soon.
02:51:31 Well, I will use them so that is the.
02:51:36 That is the game plan for right now.
02:51:40 Alright, we got tell the Hong says hi, Devon a long time ago, I worked with two old mentally ill Jews that always fought. 1 memorable argument I remember ended by ended by one of them pointing to the other and telling him Hitler should have killed 6,000,001 Jews.
02:51:58 Well, there you go. I'm sure. I'm sure that sort of comment happens a lot with the spastic Jews.
02:52:05 And then we got Anton Oy Vey says, Hi, Devon, what are your favorite types of swimming pants? What? Oh, shit. I'm all out of letters. Well, catch you on the next one.
02:52:17 OK, well, apparently, I don't know how you could possibly need more than 200 letters to ask a question, and I don't know what my favorite kind of swimming pants are. Jay Orlando says: hey, Devon, are you watching, or wait,
02:52:34 we are watching you, from my Florida home with a hound dog and a Rottweiler. I've added red pepper flakes to my chicken feed and I've gotten my yolks to be a rich orange. Chickens cannot taste it? Really? They can't? They don't taste,
02:52:53 what is it, capsaicin, the heat chemical?
02:52:57 I did not know that.
02:53:00 I did not know that. That sounds fun. And then we got The Learned Goat, The Learned Goat, with a big dono.
Mayor Rothschild
02:53:10 Money is power. Money is the only weapon that the jew has to defend himself with.Devon Stack
02:53:15 Look, look, look, look, look.02:53:16 How Jewy this fag is.
02:53:33 The Learned Goat says: I will catch the replay on my commute. I haven't missed an episode in the last two years. Your work is essential. Thank you for all that you do. I appreciate that.
02:53:44 And my work is possible because of the support of viewers like you. Or listeners, I guess, since you're not watching it
02:53:51 when you're driving.
02:53:54 Thank you. Thank you very much, Learned Goat. Horrible Hangover says: plug for John Chau, I guess it's pronounced Chau, I don't know.
02:54:04 Is
02:54:04 that a stream topic? He's the missionary who went to Sentinel Island and was murdered by savages. Knew him from childhood, devout and good
02:54:14 guy. His journal is online. Oh, that's the,
02:54:18 is that the guy who tried to convert them to Christianity, and
02:54:24 they killed him?
02:54:24 I don't know. Maybe. Yeah, that's crazy that you knew him.
02:54:31 Yeah, I bet he was a nice guy, just kind of,
02:54:35 you know, I don't know, weird priorities.
02:54:41 Thinking God was talking to him, like, so maybe some mental illness there. God told me I must convert the savages.
02:54:48 Horrible Hangover says: that guy Marc has been on the Lex Fridman podcast multiple times. He's arrogant and untrustworthy, probably why he's a billionaire. And you talked about Marc Andreessen.
02:55:00 Yeah, with his weird head.
02:55:04 Yeah, yeah, yeah. That's not the only reason why he's on Lex Friedman.
02:55:09 Shall we say?
Money Clip
02:55:11 Hey.Devon Stack
02:55:13 Alright, we got Simbey. Simbey.02:55:25 Simbey.
02:55:26 says: I learned this past week that my team is reviewing new candidates again, but they're only looking at resumes from Indians in India. I could recommend the perfect candidate, with experience in the technologies we're using, but they won't hire him because he's an American. How are Americans tolerating
02:55:46 American companies that refuse to hire Americans? Well, because what's an American anymore? That whole concept of American has been destroyed at this point. In fact, you could ask 50 white white Americans at random, unless you were selecting for right leaning pro white.
02:56:01 Directions. And if you just at random selected 50 white Americans, you'd probably get 50 different answers. And that's the problem is they they don't. Not only do they, they don't even have loyalty to their own race. They're not gonna have loyalty to their, you know, to their their fellow civilians or whatever they do view. Look, we make fun of the all these these these.
02:56:22 Elites view what are these Jews? Look at it as like an economic zone or whatever. That's how a lot of the Americans look at.
02:56:29 You know Americans, quote UN quote, how many of these Americans that work at this company in decision making positions are weren't even born in America or like, first generation immigrants themselves who came here for economic purposes.
02:56:42 Right.
02:56:43 That that, that's really what it boils down to. How many, how many of the people making the decisions at that company they themselves, they're welcoming him. People that are that's that's their peers, they're they're they're they're being.
02:56:57 They they have in Group preference and their group is immigrants.
02:57:03 Then we got Whacking Room, who just says: workarounds? I don't know what you're asking there, Whacking Room. Workarounds for what? I mean, there's no workarounds. The data is getting collected. I guess the workaround would be, like, living like a hermit out in the forest or something like that, but
02:57:22 I don't think that's helpful either. I think that we unfortunately have to engage with the machine.
02:57:31 And that's This is why we need white pro white people who understand the technology and can help us.
02:57:40 Maybe not get around it because I don't like I said. I don't think that's possible, but maybe.
02:57:45 Make it less lethal to our people because ultimately.
02:57:51 It's really going to matter more probably what's going to matter the most is is who's training the AIs and on what we already know the answer to those questions we already know it's lefty Jews that are fucking training the AIs, and so we're going to be run by robotic lefty Jews for eternity unless we can get some smart pro white guys out there.
02:58:10 Changing that eventuality.
02:58:14 There we got. Let's see here.
02:58:20 Someone Stole My Bike says: I'm on the East Coast, so I'll have to catch up tomorrow, but just want to say I'm a big fan of your work. You've opened my eyes quite a bit. And also,
Gay Jew
02:58:31 Faggots.Devon Stack
02:58:32 There you go.02:58:34 Uh Anton Oy Vey says part of my crudeness. I've indulged myself with beer. Oh shit. It's A25250 now. I'm at a loss for words.
02:58:45 Apologies. Alright. Well, you're just drunk posting now, Antonio vain.
02:58:49 We're we're have to ignore the drunk posting. Anton, this is not. This is not a platform for you to drunk post. You're like basically drunk, dialing thousands of people all at once.
02:58:59 We got Arch Stanton, Arch Stanton.
Money Clip
02:59:08 Hi, Cork.Devon Stack
02:59:17 Arch Stanton says what's your opinion on VPNs?02:59:20 Are they totally pointless?
02:59:23 They're not totally pointless.
02:59:25 Uh, they can help, depending on what VPN you're using. I think they can be helpful in terms of people trying to hack you, right, like doing an attack on you,
02:59:38 and they can mask some of your traffic from some people.
02:59:45 But I mean, as I've demonstrated tonight, yeah, it's not everything. And by the way, VPNs have data breaches too. Like, there was that big one that they always promote on YouTube, what is it, NordVPN or whatever?
02:59:59 They had a huge
03:00:01 data breach where, like, 25% of their clients, like, their data was leaked out. So.
03:00:07 Nothing. Nothing is unhackable.
03:00:11 I would say VPNs are often a false sense of security.
03:00:16 Use them if you want; they're not expensive. I have a VPN service, and,
03:00:22 you know, I use it sometimes. I usually don't, cause it slows down your connection, right? So, like, I don't have it on when I'm streaming, as an example, because I already have connection problems as it is. I don't need to add, like, one more fucking hop, you know, to fuck this
03:00:40 up. So, yeah, I mean,
03:00:44 I use it from time to time. I'll use it if I'm going to some mystery website or things like that, or sometimes
03:00:50 I'll
03:00:51 just turn it on randomly, if I'm not streaming or doing something that requires a fast connection or anything like that, just because, might as well, I guess, I already have it. But yeah, I mean,
03:01:04 it's not going to make you anonymous the way that a lot of people think, that if you have a VPN, you're good. It's like, not really. I mean, it'll get you around stuff like, there's country
03:01:18 blocks, blockades basically, right? There's country blocks,
03:01:21 like, for example, I don't think if you're in the UK you can go to BitChute, but you can
03:01:25 use a VPN
03:01:26 and go to BitChute. Now, I can't tell you to break the law, and I don't know how much that's breaking the law by doing that. I just know from a technical standpoint that would work.
03:01:38 So VPNs do have their uses, but by no means are you anonymous because you're using a VPN.
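That bottom line can be stated precisely: a VPN swaps the source IP that sites see, and nothing else. A minimal sketch, assuming a SOCKS proxy listening locally on port 1080 (for instance one exposed by a VPN client) and the public echo service api.ipify.org; it needs the requests library installed with SOCKS support (pip install requests[socks]).

# What a VPN actually changes, sketched against an echo-your-IP service.
import requests

URL = "https://api.ipify.org"   # responds with whatever source IP it saw

direct = requests.get(URL, timeout=10)
tunneled = requests.get(
    URL,
    proxies={"https": "socks5://127.0.0.1:1080"},  # assumed local VPN/SOCKS port
    timeout=10,
)

print("direct IP:  ", direct.text)    # your ISP-assigned address
print("tunneled IP:", tunneled.text)  # the VPN exit node's address
# Only the visible source IP differs. Cookies, logins, browser fingerprint,
# and behavioral patterns ride through the tunnel unchanged, which is the
# false sense of security being pointed at here.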
03:01:45 Grimly fiendish.
Money Clip
03:01:48 Money, money, money, money, money, money, money.Devon Stack
03:01:58 Grimly Fiendish says: the last time I posted stupid shit using my name was mid-90s in Usenet groups. There's been zero illusion of privacy ever. I don't care anymore. Come arrest my cancerous ass, no more fucks to give. Well, exactly. Although, the thing is, even those old Usenet groups,03:02:18 and, you know, all that old-school Internet, that was all slurped up. You know, all that stuff still exists somewhere, in an archive somewhere, and to the extent that any of it's identifiable, it's going to be in
03:02:34 your,
03:02:35 in your big bucket of data.
03:02:38 You know, the Internet's old now. You know, there's people that have very long,
03:02:45 winding digital trails. And so the older you are, the more they know about you, if you've been on the Internet for a long time. So yeah, I mean, it is what it is.
03:02:59 You know, this is why we need smart people. This is why you need smart legislators, too. Because right now, with what we talked about tonight, even if you had a legislative body that gave a fuck about privacy and gave a fuck about the implications of everything we talked about tonight, and even if they weren't getting paid off,
03:03:19 by the way, this gives an idea of why we get fucked by these boomers, who, A, don't understand the technology, but are also paid off: the Flock,
03:03:31 what's it called, Flock Safety or Flock Security or whatever it is, with the license plate cameras?
03:03:38 They've already spent.
03:03:41 Over $90,000,000 in lobbying.
03:03:46 $90,000,000 in lobbying.
03:03:54 All these tech companies.
03:03:56 Have very deep pockets.
03:04:00 And so a lot of the reason why legislators don't do anything about this is a they're retarded and they don't understand technology. They're too busy fucking banning TikTok or whatever the fuck because you can talk bad about Jews.
03:04:17 And they don't understand, like,
03:04:20 most of the concepts that we talked about tonight. This
03:04:23 would be, you know, eyes would glaze over, it'd fly right over their fucking heads. But they're also getting bribed. They're being bribed by these companies. They're getting campaign contributions from these tech companies. I mean, Trump is the perfect example; that's how I opened the stream, talking about how Trump had his campaign essentially run by Marc Andreessen.
03:04:43 So uh, that is.
03:04:46 That is why.
03:04:49 You'll never see regulation coming from the federal government, and you might see it from the state governments from time to time. I think California has some, you know, because they just
03:05:00 make
03:05:00 laws all the time. They have some privacy laws that aren't that bad, and, you know, different states will try different
03:05:08 things. But certainly at the federal level, there's a lot of fucking lobbyist money getting pumped into Washington, and there's a lot of fucking lobbyist money getting pumped into state capitals all across the country, and they can afford to do it. And the thing is, there's not really a lot of outrage from the public, because it's not just the
03:05:29 Lawmakers that are fucking retards. It's the public's fucking retards. And as people get dumber as the demographic shift to black people who don't even know how their phone works to them, it's basically like black magic. It's like this magic rock. You know that that the white man made.
03:05:45 They what are they? They're gonna complain about their data being sold to a data broker. They don't even know what that fucking means. And so as people are getting fucking stupider, they're not even understand the problem. They're they can't. They can't even understand the problem, let alone be mad about it. And so that's what they want. That's another reason why they want this shit. They want mass immigration.
03:06:05 Because a retarded public will never push back on this stuff or any of the steps that come afterwards, which is the real dangerous shit. This is where you start being ruled by. As Peter Thiel on the screen here says, technology.
03:06:19 You'll be ruled by technology.
03:06:21 You'll be ruled by algorithm. You'll be ruled by AI and you'll be too dumb dumb to do anything about it.
03:06:28 That's that's. That's what they want. That's literally what they want.
03:06:35 So there you go. Then we got Man of Low Moral Fiber, who says: I rode a neighbor's bike on a whim for the first time in years. I didn't have my phone, but I guess the neighbor's Ring cameras got me, because I was served ads about bikes for weeks after.
03:06:53 Well, there you go. And especially if you told anyone about it, like, yeah, I just rode my neighbor's bike, it was kind of fun. Boom: he thinks bikes are fun, he's never really talked about bikes, maybe he wants to get a bike, he's talking about bikes now. Absolutely, all that stuff's real. All that stuff's happening. We've all noticed it. It's not some weird coincidence. It's supposed to be imperceptible.
03:07:13 It's supposed to make you feel like it's a coincidence if you think about it.
03:07:16 At all but.
03:07:17 It's supposed to be very subtle, and you're not even supposed to be able to wrap your head around it.
03:07:23 OK, we got Hungarian mom with a gigantic dono. She's becoming a beehive patron with this kind of dono.
Mayor Rothschild
03:07:32 Money is power. Money is the only weapon that that you have to defend himself with.Devon Stack
03:07:37 Look.03:07:38 Go, Jewie, this fag is.
Money Clip
03:07:43 Pocket.03:07:54 Anti-Semitism intensified.
Devon Stack
03:08:12 Alright, we got Hungarian mom.03:08:16 Says hi Devon. Thanks for discussing Arval's project. In the past, I know that opinions vary on this topic, but as long as the top down approach is close to us, we must do the hard work to build from the ground up. My family would like to build a community in Hungary, but the.
03:08:35 Help Arval grow RTTL campaign takes precedence. For now, we are matching contributions of up to a total of $5000.
03:08:47 To this campaign over the next 5 days, if listeners want to contribute, we'll match and effectively double their contribution. The give, send, go can be found under the title help Arval grow RTL. Here is a link for those in the chat.
03:09:04 And then you have the the gifts and go link there if you want to come over to entropy. I think it's.
03:09:13 Maybe still there. If not it's give send go.com/A, ARVOL L-R TTL.
03:09:24 So there you go. Well, there you go. Yeah, I think it's a good cause. I'm. I'm curious as to when and if these legal challenges will hit them. I'm wondering if.
03:09:37 They're holding off because they want to do it at the federal level, right? Like, I think the state prosecutor said that, at least initially, his initial thought was they weren't doing anything illegal.
03:09:50 So the next step, well, there's two, I guess, possibilities for people who want to oppose the project with
03:09:57 lawfare: there's the lawsuit possibility, where some group, some Jewish group, sues them for discrimination or something like that. And then you have the federal level: they could sue them for violating, you know, the Civil Rights Act or some HSA thing or something like that.
03:10:18 Or, not HSA, the,
03:10:23 what is that? Uh, I forget the acronym, but the housing,
03:10:28 the housing agency. You could have that. I don't think that,
03:10:34 politically, it would be a good idea for Trump to have his DOJ do it, and I don't think that he will.
03:10:41 Will the same be true of whatever Republican administration would come after Trump, be it JD Vance or whoever? I don't know. Would it most likely happen under a Democrat administration? I don't know. I don't know what their priorities are, or how on the radar he is to people at that level.
03:11:00 But I am very curious. I'd love for a precedent to be set.
03:11:08 Because I think that once there is a legal challenge, and if it gets defeated, then the legal precedent will be set, and that will really make it easier for Americans to do it. As far as in Hungary, I obviously don't know anything about Hungary's laws. I suspect it would probably be way easier to pull that
03:11:27 off in Hungary
03:11:28 than it would be here. But yeah, I support what he's doing and I hope it works out. It's just, I've, I've
03:11:38 examined the law thoroughly,
03:11:42 and there are definitely vulnerabilities, especially at the federal level. That's all I'm going to say.
03:11:48 And I'm not saying that to be like, oh, you're always blackpilled, it's just, it's true. I just tell you guys the truth. Like, I understand the way they structured it, I understand the private club exemption and all this stuff. I know. But the broad language in some of these laws really makes it, at least, you know,
03:12:08 until there's a ruling, right, until there's precedent that clarifies some of this broad language,
03:12:15 there's a big risk. It's risky, you know? It just is.
03:12:22 But hopefully it works out and people can donate there. But thank you very much Hungarian mom for the gigantic dono there.
03:12:31 Then we got Bessemer.
03:12:39 Bessemer says: hi Devon, so I suppose they can pull up a person and see how many times they have exceeded the speed limit. And the tracking-the-mouse stuff: you can't say my sister was using my computer when they get you for downloading Napster. Thanks for researching the subject.
03:13:01 We had no idea. Yeah, that's the thing: they can identify you. And look, not every site is always tracking your mouse, but once a site does it, they can have a profile.
03:13:13 And there might be other software, there might be malware and adware, that does it constantly, but not every single site is going to be doing that. But there are sites where you wouldn't know it was happening that would be doing that and making that profile.
03:13:28 So yeah, there are other things that would uniquely identify you and place you at that computer, other than, you know, you logging in with your password or something like that. Absolutely.
03:13:44 It's crazy how much data they're always collecting,
03:13:48 and there's pretty much nothing stopping them from doing it. There are no laws that stop this stuff, and the government doesn't want any, because they're one of the big consumers of this data. It allows them to circumvent any kind of
03:14:02 legal process that they would have to go through to surveil people. They can just buy it: oh, we just bought this package of data. It's, I don't know, it's not spying.
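What the answer above describes, a page quietly sampling cursor movements and shipping them off to build a behavioral profile, takes only a few lines of browser code. Below is a minimal, hypothetical sketch in TypeScript; the endpoint, field names, and batching interval are invented for illustration and are not taken from any real tracker.

```typescript
// Hypothetical client-side mouse telemetry, sketching the kind of
// tracking described above. The endpoint and field names are invented.
type MouseSample = { x: number; y: number; t: number };

const samples: MouseSample[] = [];

// Record cursor position plus a high-resolution timestamp on every move.
// The timing and trajectory of movements are what make a profile distinctive.
document.addEventListener("mousemove", (e: MouseEvent) => {
  samples.push({ x: e.clientX, y: e.clientY, t: performance.now() });
});

// Periodically ship the batch to a collection endpoint. sendBeacon is
// fire-and-forget, so the page never visibly waits on the request.
setInterval(() => {
  if (samples.length === 0) return;
  const batch = JSON.stringify(samples.splice(0, samples.length));
  navigator.sendBeacon("https://tracker.example/collect", batch);
}, 5000);
```

Note that nothing in the sketch needs a login or a cookie; published work on mouse dynamics suggests movement timing alone can distinguish users, which is the mechanism behind placing a particular person at a particular computer.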
03:14:13 All right. Then we have Sacred Squirrel.
03:14:18 When you're trying to save money, a good rule to follow is to.
Money Clip
03:14:29 Take it from me, Jim Nabors. It'll pay dividends.
Devon Stack
03:14:33 Sacred Squirrel says: consuming the sacred nut is not dirty, Devon. It is a spiritual nourishment performed by the sacred order of... something, I don't know. It sounds dirty. Also saddened to hear Churro has been murdered... he hasn't been murdered.
03:14:51 Oh, has been murdering some of our order. Of the Order of the Sacred Squirrel.
03:14:57 Yeah. No, he's been murdering squirrels like it's his job because it kind of is his job, actually.
03:15:04 Uh, although, I think, now that he's back... so he's back, he's gained a bunch of weight. He's not chunky yet, but he's
03:15:13 headed to chunk, and he's,
03:15:16 Yeah.
03:15:18 It's funny, because it wasn't just that he was super skinny. His hair was very thin, right? And it fell out when you pet him, which was kind of weird. He was like a cancer patient cat, a little bit.
03:15:31 And now his...
03:15:33 he's got a thick lion's mane now? Of course not, no. He just,
03:15:36 you can tell he's, like, a million times healthier, because he's had a lot more food and fresh water and stuff. But
03:15:44 he killed a bunch of animals when he first got back, and I think it's because he was just in the habit, that's how he lived for a while. And then he hasn't been killing as much lately, which is good and bad. It's bad because I want him killing things, but it's kind of good because it's never fun, like,
03:16:04 He drags a lot of these animals in the house after he kills them.
03:16:08 He's dragged entire bunnies,
03:16:10 entire bunnies, through the cat door. I don't know how he makes it through the cat door with the bunny, but he's done it, and so it's not great walking into the laundry room and there's a big mangled dead animal in there. So yeah, that part's great, that he hasn't been doing that lately.
03:16:30 But thank you very much there, Sacred Squirrel. Then we got Suzuki Samurai, says: times are tight. Thanks for the show. Listen every week on the way to and from work. Well, I appreciate that, and any amount helps there,
03:16:44 Suzuki Samurai. Is that the little weird, uh,
03:16:48 Jeep-looking thing?
03:16:52 I feel like I wanted one of those at one point. They were kind of girly. Like, they're kind of girly.
03:16:59 I mean, even if they're probably not super reliable, they can't be any worse than our actual Jeep. Jeeps are pieces
03:17:04 of junk.
03:17:06 Let's see here. Gribbly Fiendish then says: let us know about alternative methods of direct support.
03:17:16 No, you mean because this isn't
03:17:17 working?
03:17:18 I don't know. I'm going to talk to the, I have to talk to the Entropy people anyway, so I'm going to talk to them this weekend and just be like: what's going on? Every single stream, it crashes.
03:17:28 Like, every single stream, it crashes, and I don't know what to do. You know, I don't know. And then maybe there's a reason for it. Maybe I'm doing something wrong. I doubt it. There's not much I do, it's not very interactive from my end.
03:17:47 Right.
03:17:47 But maybe, maybe it's crashing,
03:17:49 I think what might be happening is, I don't know what time zone they're in, but maybe it's crashing when the day switches. You know what I mean? Because we stream over midnight, for at least part
03:18:00 of the world. So I wonder if, because it's going from one day to another day mid-stream, it fucks it up, and maybe that's what I should look for. Maybe I should try to see if it's working at, like, 11:59, and then if it's, you know, not working at 12:01 or something like that, and see if that's maybe the issue.
03:18:21 But it's pretty consistent now, and it started, I feel like, didn't they update some shit recently? Like they had to switch
03:18:30 processors or something like that. So maybe when they updated something, they introduced some weird bug that, because of the day changing, fucks with something. I'll bring it up to them.
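The midnight guess corresponds to a classic bug pattern: computing elapsed time from the wall-clock time of day rather than a continuous clock, which breaks the moment a session crosses 00:00. Whether the platform actually does this is pure speculation, on stream and here; the TypeScript sketch below only illustrates the failure mode being described.

```typescript
// Illustrative day-rollover bug: elapsed time derived from time-of-day
// goes negative once a stream crosses midnight.
function secondsSinceMidnight(d: Date): number {
  return d.getHours() * 3600 + d.getMinutes() * 60 + d.getSeconds();
}

// Buggy: fine all day, then breaks at the date boundary.
function elapsedBuggy(start: Date, now: Date): number {
  return secondsSinceMidnight(now) - secondsSinceMidnight(start);
}

// Safe: epoch milliseconds keep counting across the date boundary.
function elapsedSafe(start: Date, now: Date): number {
  return (now.getTime() - start.getTime()) / 1000;
}

const start = new Date("2025-09-06T23:59:00");
const now = new Date("2025-09-07T00:01:00");
console.log(elapsedBuggy(start, now)); // -86280, nonsense that downstream logic could choke on
console.log(elapsedSafe(start, now));  // 120
```

Checking whether a stream survives 11:59 to 12:01, as proposed above, is exactly the right experiment to separate this failure mode from an unrelated crash.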
03:18:42 And look, again, I'm not trying to be an asshole about it. It's just a little frustrating. I feel like, you know, I have reason to be a little perturbed about it, but you know, it is what it is.
03:18:53 I'm sure, obviously, it's not something intentional, right? But hopefully we can get it straightened out. Anyway, then we got Arch Stanton, says: did you hear Murdoch Murdoch briefly reemerged? They got in touch with a YouTuber called Old Sterling, and there's a brief animation at the beginning of his latest video.
03:19:17 Well, there, how about that? Maybe I'll check that out.
03:19:22 I was kind of hoping they would come back and just make more stuff. I like their stuff.
03:19:30 And I understand it's a lot of work. Animation's a lot of work,
03:19:36 especially the way they were doing
03:19:39 it.
03:19:40 And the writing was good.
03:19:42 So that takes time. The story structure, that takes time. It takes a lot of time to make those things, and if you're not making any money, and not that that's the only reason why someone like that would do stuff like that, but it matters, right? It matters, especially if you have a family to raise or something like that. Right, like,
03:20:02 you can't be spending all of your time.
03:20:07 And I mean, I'm talking, like, it's a lot of time, animating a full, you know, 20 minutes, or some of their episodes are even longer than that. And that's so much time. Even with the janky animation style they were using, it's still a lot of fucking time. I'm pretty sure they were doing a lot of it in Vegas Video at some point, just based on some of the filters. So that means it's a lot of time.
03:20:28 I mean, I was using Vegas Video when I was brand new to animating, and so I know the process, and it's not exactly
03:20:38 smooth and easy.
03:20:41 And I'll check that
03:20:41 out. All right, let's take a look over on Rumble.
03:20:47 Alright, uh, you Rumble folks. Zazi Mataz Bot says: tonight I will tell you my brainwashing in Mrs. Goldstein's high school drama class. The day after watching Schindler's List, we did a consolidated class project called a dead exercise. Three or four
03:21:08 classes met at the same time in the drama class, and we lay on the floor in the blacked-out room and went through guided meditation, mostly about being in a grave,
03:21:20 all the while four people from outside the room were creeping around and picking people up and stacking them in a pile in the center of the room. When the Sonderkommando, what the fuck is that?
03:21:37 Sonderkommando, or Kommando Sonder? What is this word?
03:21:43 This is a word I should know.
03:21:50 Oh, it's the, oh, it's the,
03:21:52 the corpse soldier at the death camps. Came for me, I must have had a friend amongst them, because they set my face right on Carrie G's. OK, I get it. So some good did come out of all the propaganda after all, there, Zazi Mataz Bot.
03:22:13 Got to experience a
03:22:19 very interesting experience there. Yeah, I don't know if we did anything Holocaust-
03:22:25 specific like that when I was in high school, but we did have to learn The Crucible, and we acted out The Crucible.
03:22:35 So.
03:22:37 You know.
03:22:39 I mean, that's Holocaust-related, but
03:22:44 we never had, in fact,
03:22:48 did we watch Schindler's List? I feel like people at my school were watching it. I don't know that I had to watch it.
03:22:56 I had weird classes, though, because some of my classes were off campus, at, like,
03:23:06 like, after-school-type programs, for extra credits and things like that. So I only went,
03:23:12 for a couple of years, only went to normal school for, like,
03:23:17 the first part of the day, and then I had, like, all these,
03:23:21 you know, like, uh,
03:23:23 advanced classes, because I'm super smart, at this other place. But because of that, I think I skipped over some things, like the normal classes, so I don't know if that was required when I was a kid, or if
03:23:38 the school didn't do it.
03:23:40 But yeah, my drama teacher, thankfully, was not Jewish, but drama teachers were all fucking cringe. She was, like, some fucking boomer hippie chick that
03:23:52 really liked me for some reason. Like, oh, you're so creative. Probably said that to everybody. Words Are Words says: best for you on the Internet. Well, I appreciate that, Andy. Sam Hyde says.
03:24:05 And then Rupert says: it was a great stream with White Rabbit, going to catch the replay. See you on Wednesday, Professor Stack. Goodnight. Well, I appreciate that. And yeah, he's a good guy. That was a good time. I didn't know that there was a hot mic moment at the end,
03:24:22 when I was talking about having to throw treats to get Churro to leave me alone.
03:24:28 Ah, then we got Giga Dummer, says: keep up the good work, Devon. HH. Hulk Hogan, I don't know why. You know, his death is being investigated, by the way, Hulk Hogan's.
03:24:41 Apparently there's some possible hanky-panky with that. Gravy Bear says: just tuned in. What's your thoughts on Thomas Sewell? I guess I don't, like, keep up with Australian politics, but to the extent that I've seen him, he seems cool.
03:24:59 I support all white nationalists and all white countries.
03:25:04 Unless I'm given a reason not to, I support all white nationalists and all white countries, and I don't have any reason to not support Thomas Sewell. And so, yeah, I think that,
03:25:21 I like the optimism that I've heard from a lot of Australian nationalists. Again, I have no idea, I don't live there,
03:25:28 I have no context for it. I hope it's realistic confidence and optimism.
03:25:38 And I hope that they're able to
03:25:42 turn things around. I don't know what their demographic reality is. What is it, actually? I'll look it up.
03:25:51 I wonder if anything knows.
03:25:53 Is this something that?
03:26:13 OK, so ooh.
03:26:19 Wow, OK.
03:26:24 Wow.
03:26:27 Yeah, you guys better hurry up and turn that shit around.
03:26:30 So that's.
03:26:35 Really shitty. So in 2001, in the 2001 census, 81.2% of Australians identified as having European ancestry.
03:26:47 Then, 20 years later, in the 2021 census, 57.2% of Australians identified as having European ancestry.
03:27:03 That's not great.
03:27:07 That is, that's a horrific trend that needs to turn around fast. And in fact, it's even worse than that. So, I'm not saying these people aren't white, but they're not, they're not founding stock, 'cause
03:27:21 it's saying that only 46%
03:27:25 of the 57% right now are from Northwest Europe, so meaning, like, the original
03:27:33 Australians, and the other 11.2% are Southern and Eastern European immigrants. So again, they're still white, right? But the Australian Anglo-Celtic stock is not even half of the population anymore.
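To make the census arithmetic explicit: the two sub-figures only sum consistently if both are read as shares of the whole population, not as shares of the 57.2% (that reading is an inference from the numbers, not something stated on stream):

$$46.0\%\ \text{(Northwest European)} + 11.2\%\ \text{(Southern/Eastern European)} = 57.2\%\ \text{(total European ancestry)}$$

The 2001-to-2021 decline is then $81.2\% - 57.2\% = 24.0$ percentage points over 20 years, roughly 1.2 points per year.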
03:27:53 So that's that's rough.
03:27:57 That's rough.
03:28:00 Hopefully they're able to... that's rough. See, I don't follow this stuff. I didn't know that it was that dire.
03:28:08 Hopefully they can turn that ship around.
03:28:12 Ah, let's see. Then we got Or Megami, says: hey Devon, can you please do a quick YouTube inquiry for the channel Blue Drake 42? His new video is a sample of Google's newest Google Earth, a real-time visit at ground level. Worth a scrub on the stream.
03:28:32 I'll check it out for the show.
03:28:35 I don't have a means of popping it on the screen easily right now,
03:28:40 but I will.
03:28:44 I'll check that out.
03:28:46 I'm sure, like, like I said, I'm sure it's a lot better
03:28:49 now than it
03:28:49 was, right? Because the imagery that we're used to, from Google Maps and others, like MapQuest, if that's still a thing, and Apple Maps,
03:29:01 all that stuff is pretty old tech.
03:29:04 And even with the old tech, they weren't giving you what they were actually getting. They were giving you
03:29:10 basically blurry, crappy versions of what it was actually capable of.
03:29:17 Let's see here.
03:29:19 And then we've got.
03:29:23 Mass Grave An Image says: now that the climate inside the pillbox is controlled, how is your sleep, and how are your electronics enjoying the cooler environment? Well, I'll tell you what, if I'd known it was going to make this much of a difference in my quality of life, I would have just done it. I was stupidly,
03:29:44 I was being kind of lazy, I just had so many other things to do, you know what I mean? But I was also kind of being like, I can take it, I'm
03:29:50 a man, and
03:29:50 I can take this heat, this ain't nothing. I didn't realize how much it was just killing me, like it was draining my life force, being in
03:29:58 the,
03:29:58 in the heat like that, because I had acclimated to it to a certain extent.
03:30:05 But I feel like it was, like, boiling my brain and making me fucking retarded.
03:30:09 I feel so much better now. Now, part of it is just, look, summer is winding down now. But I'm telling you, I installed this stupid thing and the inside of this place went down, like, over 20 degrees.
03:30:23 Like, it got chilly. I had to wear a hoodie, like, when I first installed it. My body wasn't used to being that cooled off, and so even though
03:30:33 I can get it colder than this, when I would get it
03:30:36 down to 80, I felt like I was shivering, almost, you know.
03:30:41 Like, that's how much of a difference it made. But yeah, I can't believe it. I was an idiot. I should have installed it when I first got it. I was just intimidated by the process, and I just had all these other things going on. I was like, oh, it's going to take forever,
03:30:55 and sure enough, I fucked up trying to install it and broke some shit, and then I had to fix it, and it took a lot longer than I even thought it was gonna take. But now that I've done it once, I feel like I could install these things all day long. It's not that hard.
03:31:09 And yeah,
03:31:11 everything's better. Sleep is better. My alertness is better. Everything's better.
03:31:19 Uh, Or Megami says: hey, Devon, can you put... alright, just did that one.
03:31:25 It repeated for some reason. Rumble does that sometimes.
03:31:31 And then Or Megami says: for chat, YouTube inquiry for John
03:31:36 Keurig, Curia, Coup, or something like that. He did a recent interview. Thanks for all your work, Mr. Stack. Makes the blackpill wash down smoother. Way Bay.
03:31:47 I'll look that up too.
03:31:50 I'm not sure
03:31:52 what that is.
03:31:55 Maybe, maybe, uh,
03:31:58 you never know, some of these things give me ideas for streams,
03:32:02 and some of these things don't.
03:32:08 Some of these things, I'm just like, uh... All right, Jesse Pro Holiday says: good evening, Mr. Stack. What do you think the US will look like in a decade? I think whites need to find a white place to live now, to insulate themselves from the coming storm. I think that
03:32:26 it's not a bad idea
03:32:28 to be, even if you're not building a community from scratch like Arval, to be consolidating, not just you, but all of you, your white friends and your white family, in certain parts of the country. So even if you're not trying to create, like, you know, a compound or whatever on your own, identify a small community
03:32:48 that you could literally overrun, or a community that's already basically right-leaning and white. And I don't want to say safe places, you know, necessarily, but a lot of these places already exist. There are white communities that already exist.
03:33:07 You can find them without looking too hard, and maybe visit some of these places and convince your whole family to go. You know, don't just go there by yourself. Bring in as many white people as you possibly can, and start consolidating, and start creating political power in this place. Like I said, it doesn't have to be building
03:33:27 a community from scratch. It could be, I think, a great project. The problem is, to do something at scale,
03:33:35 like in an intentional way like this, would take Jew money that we don't have. But if we had Jew money, I would say one of the things we could do is look for one of these economically destroyed towns, like, you know, in the Rust Belt somewhere, or one of these towns that was maybe a boom town at one point, and now, basically,
03:33:55 houses are going for $5,000 and maybe they're all kind of rotting away or whatever, but
03:34:00 you know, they're saveable. And just swoop in there and buy up all the property and turn it into your
03:34:09 town.
03:34:10 And even doing that same sort of tactic, maybe, in a different country.
03:34:16 You know, maybe not in the United States, maybe an Eastern European country, or, I don't know, there's people that talk about South America, I don't want to go there. But find someplace that's,
03:34:30 or, if you have the people to do it, right, if you have the numbers to do it, that would be something to think about. But yeah, white people need to start consolidating. You don't want to live in a Brazilian urban center, which is essentially what every urban center in America is going to be very soon, if it's not already.
03:34:50 All right.
03:34:54 Double check.
03:34:57 Entropy. We got a couple more. We got
03:35:00 Sacred Squirrel again.
Mayer Rothschild
03:35:07 Oh.
Devon Stack
03:35:16 Sacred Squirrel says: rescued a cat from work. He bit right through my welding glove. Wrapped him in a towel and gave him a bath. Been sweet as a kitten ever since. Named him Amadeus, but wife calls him Jefferson.
03:35:32 Wow, look at that.
03:35:36 What kind of cat is it? You have to tell us.
03:35:39 You have to tell us what kind of cat it is.
03:35:42 Yeah, feral cats
03:35:45 are usually better than normal. Like, I'll tell you what you don't want... I don't know. I've never owned a pet shop cat. I've only had feral cats, well, feral cats and, like, secondhand cats.
03:35:55 I've never had,
03:35:57 like, a normal buy-it-at-the-store cat. Maybe, maybe the name-brand cats are actually better. I just don't know.
03:36:05 Then we got TV Dinner Master Chef, says: you should look at Simba and Carl's Wolfenstein 3D mod. Who doesn't want to play as everyone's 4th or 5th favorite Nazi? Congrats to the team on the mod. Thanks for all you do, Devon. Yeah, you know what? We should maybe play that
03:36:25 on stream at some point.
03:36:30 Maybe I'll download it.
03:36:32 I'll add a reminder. It's already in my notes, I had a reminder
03:36:35 there.
03:36:40 And then you said: meant play, fat finger, sorry. Oh, what do you... I think I just read it the way you meant. You said play, actually.
03:36:49 Who doesn't want to play? Yeah, you said play.
03:36:54 So yeah, you corrected, you corrected nothing.
03:36:57 Anyway, alright guys, well, I'm going to shut it down.
03:37:01 Hopefully you enjoyed.
03:37:04 the rant, and you learned a few things,
03:37:09 and it gives you something to think about, something to think about in terms of what to prepare your children for, and white people at large.
03:37:20 And remember, as I said,
03:37:23 we all have to raise John Connors, because I think inevitably we will be fighting Skynet. In one way or another, we're going to be fighting Skynet. And guess what?
03:37:39 Skynet's going to be very Jewish.
03:37:42 So it's almost, it's like, white people aren't just going to have to fight Skynet, they're going to have to fight
03:37:50 the Jewish Skynet.
03:37:52 The worst kind of Skynet imaginable. I mean, look at what non-Skynet Jews are doing in the Middle East to Palestinians now, you know. Imagine what an AI, an emotionless AI created by these people, these genocidal folks, imagine what that's going to look like.
03:38:11 Anyway. Because, you know, by the way, it's kind of tongue-in-cheek, but it's not really a joke. Think about it. You think they're going to have AIs that will call that a genocide?
03:38:22 So if an AI has been trained to think that what's happening in Palestine is totally justified and normal,
03:38:33 how do you think that's going to affect the rest of its thinking?
03:38:39 Something to think about, guys.
03:38:41 Anyway.
03:38:44 Hopefully you guys all have a good rest of your weekend. One last one: we got Or Megami, says: we're fighting it right now. That's why I sent those suggestions for watching. I'll check them out for the show.
03:38:56 But yeah, everyone try to stay safe.
03:39:00 Hope you have a good rest of your weekend. For Black Pilled, I
03:39:02 am, of course,
03:39:04 Devon Stack.
Spokesman
03:39:06 Other warning signs of satanic behavior may be apparent, such as a sudden, bitterly antagonistic attitude toward family and religion, or listening exclusively to heavy metal rock music, almost to the point of addiction.
03:39:25 When one or more of these warning signs are evident, you should look further for ritual items, such as a pentagram or other satanic symbols, black or red robes,
03:39:35 A decorative dagger or knife.
03:39:38 A chalice or goblet.
03:39:41 Black candles.
03:39:43 A personal diary with a black cover, which is called a Book of Shadows, and copies of publications such as The Satanic Bible and The Satanic Rituals.
03:39:54 And possibly a small makeshift altar. If you discover items such as these, experts advise you contact your local law enforcement agency at once.