Discover more from From the New World
It's not perfect, but shockingly better this time!
Timestamp Speaker Transcript
00:02.95 cactus chu All right, welcome. So Roon, what are you famous for?
00:07.60 rooooon Good question, I don't know. I think I might be one of those people that's famous for being famous. I mean, I make jokes about technology. I think I try to satirize it in a way that.
00:15.79 cactus chu Yeah, maybe so.
00:25.65 rooooon Most people don't try to do it because they're either bad at it, they have notes, exactly, yeah. Um, and you know, I had a few inflection points that got me a lot of notoriety, including the obvious wordcel whatever one.
00:27.10 cactus chu Yeah, you seem like a real brother in the technology space.
00:45.23 rooooon You know, but um, but yeah, and now we're here. Mm.
00:45.78 cactus chu Yeah, so I think three months ago to the day, you invented a new word, and it has since gone, let's say, viral. So the word is wordcel, and I just kind of want to know what that means.
00:56.42 rooooon Ah, yeah, yeah. I don't think I invented it three months ago, I think it got really famous three months ago. I had been saying it, my friends had been saying it, for at least a few months before that. But um, yeah, I don't know, I guess I was making fun, and my fans were making fun, and other people were making fun of a set of people that we thought were really lost in the sauce. Like, they might be philosophers by training or something, where they've kind of lost connection with reality, where the only thing that they operate on is words. Like, they shuffle around the essays that they're writing, or they're journalists and they're shuffling their newspaper articles. They have a minimal connection with reality. Um, it leads them to believe all sorts of wacky shit. Sometimes it's, you know, possibly true and interesting. Like, this is a thing: I love wordcels in general, I love following them, I love reading what they write, but they're also super easy to make fun of, and a lot of them just are kind of useless. And the guy that we initially were describing was this dude named Logo Daedalus, who you may have had a run-in with on Twitter. But yeah, so yeah, and it's certainly high-IQ, right? It's like high-IQ rambling about.
02:31.77 cactus chu I don't think I've interacted with him but I've definitely scrolled through a few of his threads before today.
02:47.75 rooooon Shit where the referent is not anything meaningful in reality. It's, I don't know, it's metaphors on top of metaphors on top of metaphors, and how does this relate to that. And yeah, so it was funny, I guess, watching him come into contact with tech Twitter, engineer-brain Twitter, which is oftentimes quite literal and focused on results and hard numbers and empiricism and all that. So yeah, I guess that's where it started, but it also was just niche internal squabbles about which kind of math is shape rotator, or like which kind of math prefers visual-spatial IQ versus verbal intelligence. This is also like.
03:41.32 cactus chu I want to put a pin on this because this is like a Marc Andreessen thing later. Um, but the cel in wordcel, for the people who are just listening and have never seen this in text before, it's c-e-l, the single L, and it's short for celibate, and the image that I just get in my head.
03:43.42 rooooon Interesting. Yeah, um.
03:57.90 rooooon And.
04:01.10 cactus chu Um, is that you have these people who are kind of trapped in their own world. They've kind of climbed up a large ladder and then pulled it up behind them, where everything they do is just obscured in all this random jargon and metaphors and stuff, and there's kind of a disconnect there, right? That's the image that you're giving us. Yeah, and the natural counterpart to the wordcel is the shape rotator, and I think you talked about that a little, but can you give us more on it?
04:20.00 rooooon Pretty much. Yeah.
04:27.50 rooooon Well, yeah, I don't know if I actually meant it as a dichotomy, but it became that way, obviously, um, I think especially because Andreessen really went ham on that.
04:44.39 cactus chu Yeah, this is like deep human nature, you know, every outgroup has to have a corresponding ingroup. Yeah, but before we get on that, what is the shape rotator?
04:46.23 rooooon But um, exactly, yeah. Um, yeah, I don't even fucking know, really, but I think back in the day we were using it to talk about people who had high visual-spatial IQ, that might be engineers by training or something like that. Like, quantitative. It was honestly a euphemism for quantitative, which is not necessarily the same thing as visual-spatial, but you get the point. I stopped using it a long time back because it got grating very quickly. I don't even think I liked the meme, but other people liked it so much that I said it a lot. I definitely do like the wordcel terminology, though. It's a funny insult, and quite fitting for a lot of people. So yeah.
05:43.43 cactus chu Yeah, I think some of those people we might share a common enemy with. Like, who do you think is kind of the most aptly judged wordcel?
05:53.91 rooooon I don't know, good question, who is the wordceliest of them all? I mean, it's probably Logo, because he was the first person in recorded history to be called the word. So, so.
06:03.37 cactus chu Yeah.
06:11.56 cactus chu Yeah, I think the word really describes it, but he's kind of like a cool guy. I don't think he has that kind of bitter culture to him that I think wordcel later became associated with, right?
06:13.68 rooooon And wrong. Ah.
06:24.90 rooooon Wait, are you sure? No, he's a very bitter guy. Like, I don't know, he tweets incessantly, probably hundreds of times a day or something, and it's usually just unhinged screeds about something or other, like.
06:36.98 cactus chu He.
06:43.20 rooooon I don't know, like, um, tech people are worshipping Satan or something, you know.
06:49.10 cactus chu I didn't know. He gives me kind of a, he's like a fun guy, right? Like, there are certain people, like, okay, I mean.
06:57.15 rooooon Let me put it this way. Some people are not trying to be fun, but they're fun anyway, you know? So that's definitely him, I would say.
07:10.67 cactus chu Yeah, I can definitely see that. But I hope to have Marc Andreessen on, like, he follows me on Twitter, he should come on the podcast. But if he did and I asked him the same question, we would probably have the same person in mind.
07:22.37 rooooon Ms. Lorenz? I don't know. Yeah, but I mean, the journalists as a class are definitely.
07:29.16 cactus chu Yeah, the fakest fake journalist.
07:38.98 rooooon Like wordcels, quote unquote. They don't know much about technology or anything they write about, really, but they're also not interesting targets in my mind. Like, they're all NPCs and we kind of know it. They're slaves to the market function, including Ms. Lorenz, like, Taylor. Yeah, she doxes people, but it's because there's such a massive audience that wants to see people doxed. You know, there are people reading this shit, and it's not as interesting to me for some reason as would be, like, calling a product manager a wordcel or something. But yeah.
08:18.61 cactus chu Yeah, I think it's kind of like where you're proximate to, right? If you see a lot of damage done by bad product managers, it's like, okay, this is the thing that I want to throw my kind of verbal spears at. Um.
08:23.49 rooooon We are yeah.
08:30.60 rooooon 1 on. Yeah.
08:36.48 cactus chu I don't know, though. I had a kind of mini audio essay in my first episode where I talked about basically, you know, the kind of technological progress movement, like the kind of Tyler Cowen people. Um, they seem like, yeah, exactly, exactly, okay, I wasn't sure if you knew that word. But yeah, basically.
08:38.96 rooooon Written.
08:45.75 rooooon Progress studies.
08:55.94 cactus chu For the audience, they're a bunch of very smart people who basically talk about very simple, or not simple, but very clear and straightforward technological steps that we can take in order to kind of metaphorically get the free lunch: get a lot of development for basically nothing. And they kind of have this vibe where it's like, oh, we're all friends, there are no enemies of progress. And in my first kind of mini monologue, I basically pointed out, wait a minute, guys, your entire movement has a very obvious set of enemies, a very obvious set of detractors that are actually doing a lot of damage, and that's exactly these kinds of people: these kinds of people who are very interested in the distraction, very interested in the emotional manipulation, and actually, like.
09:42.40 rooooon Um, ah.
09:51.16 cactus chu I know logically it's the most straightforward thing to say, like, oh, these people are petty, these people aren't, like, there's not a lot of cognition there, you know. But I think attacking them actually serves a very good purpose, because they're doing a lot of damage. They're doing a lot of damage to good things that we could actually coordinate on.
09:59.47 rooooon Yeah.
10:05.80 rooooon Yeah, yeah, right, right. But I also don't think that Taylor Lorenz is an anti-technology.
10:10.13 cactus chu And even, like, not coordinating things, just banning technologies and stuff like that. It's totally a threat.
10:22.81 rooooon Ideologue. Like, it's giving her too much credit, right? Like, these people, yeah, yeah, right, they're detrimental whether they know it or not, right? Is that what you're saying?
10:24.60 cactus chu Yeah, no, exactly. There doesn't need to be any cognition going on here, right? Yeah, the point is, you can kind of see it as an amplifier, where you take these emotionally salient things and you just kind of distribute them. And the thing with emotion is that it clouds judgment, and the thing with clouding judgment is that it means you lose out on a lot of this free value.
10:38.54 rooooon Yeah. Yeah.
10:50.62 rooooon Ah.
10:52.68 cactus chu So you kind of ask yourself, where is this massive cloud of bad judgment coming from? Um, and it's exactly from these kind of, like, probably high-empathy narcissists, right? That's the kind of trait, um, of these people.
10:56.53 rooooon Um, oh.
11:07.20 rooooon Yeah, yeah, yeah, yeah, yeah, yeah.
11:11.72 cactus chu And I mean, you've probably heard Balaji talk on this as well, right? Balaji Srinivasan. Yeah, the point he gives very clearly is that even if we say we have kind of a decentralized system, if you have a strong enough incentive push in one direction, like you do with.
11:23.27 rooooon Ah, okay.
11:29.86 cactus chu Um, politics right now, then it's de facto centralized, and it's de facto centralized around this kind of legacy class, right? Yeah, yeah, exactly.
11:34.50 rooooon The cathedral, yeah, sure. But, yeah, I guess, um, what do they call it, anarcho-tyranny? Anarcho-tyranny, yeah. But in, in.
11:47.11 cactus chu An arcco tier in the yeah.
11:52.58 rooooon Any of those cases, it still doesn't ascribe free will to individual journalists. Like, they are doing what gets clicks because it gets clicks. Why do people click on it? Because, you know, that's the culture. People want to see negative sentiment about technology. There's a hunger for it. People, in my opinion, think too much about who is funding bad memes instead of why people want to see these bad memes so bad. The demand side, right? So I don't know. Yeah. But I overall agree with you, I think you're right. Oh, I think it's pretty obvious, you know. Like, if you publish something negative about powerful rich people.
12:30.40 cactus chu I mean, that's a very good question. Do you have an answer? Like, why do people like this stuff?
12:47.93 rooooon Everybody wants to read that. That's just human nature. Ah, if you write bad things about the finance industry, people will read it, back when, obviously, finance was more powerful. Now it seems like a little bit of a backwater, but.
12:58.65 cactus chu I think people still love it. I think people would, sorry, go ahead.
13:04.93 rooooon Yeah, but not at the same emotional valence as when they were, say, exploding the housing industry or something, right?
13:12.35 cactus chu Yeah, I mean, in that case it's not even too wrong, right? Like, you kind of want to figure out how these bastards did it.
13:21.69 rooooon No, but it was sensationalized, for sure. Again, people are like, oh yeah, these evil actors were doing X, Y, and Z, and then when you actually look at it, it's like, yeah, a bunch of impersonal market forces got out of hand, and, you know, these dark markets and whatever. But it's not like there were bankers sitting in dark rooms who knew everything that was happening and were like, we need to juice this to make all the money and then bankrupt the American people. It doesn't make sense. None of them wanted that. They didn't even understand until their models were going awry and their defaults were going up, and they were like, okay, I didn't realize this would happen, and then all of a sudden had to unload everything overnight. Because you've seen that movie, I'm sure, Margin Call or whatever.
14:14.39 cactus chu No, I don't really watch movies. You want to explain it?
14:17.70 rooooon Fair enough, fair enough. But, you know, yeah, so there are, I guess, two schools of thought on the 2008 crisis. There's the kind of Margin Call school of thought I'm talking about, in terms of pop culture (in terms of financial economics and market structure it's way more complicated), but there's the Big Short model and the Margin Call model. So The Big Short is kind of the populist telling of it, right? It's a way more fun movie and it was an absolute blockbuster. It's like, ah yeah, the bankers did extremely irresponsible shit because they don't give a fuck about you, and they knew it was going to go bust one day, but they had moral hazard and they were making fat margins, bonuses, management fees, etc. Um, and then there's the other school of thought, the Margin Call school of thought, which is like, yeah, they were using statistical models that made zero sense at the end of the day, and it kind of went to their heads, and, you know, maybe they had an inkling of what was going on. But when you're making a lot of money, you don't look that hard, right? You try not to examine the assumptions of your models, for sure, and.
15:38.11 cactus chu Yeah, you don't look that hard, and the small things you just round off. But as the kind of Taleb case shows, you can't just round these things off.
15:43.18 rooooon Exactly, yeah, yeah, right. And Margin Call is the story of one big bank, I think a fictional big bank, learning that this is happening, that their models are extremely fucking wrong, and then unloading all of their bad securities over the course of one day, one trading day or whatever. Um, and you know, while it's ruthless, it's mostly a story of scared single actors doing exactly what they have to do in order to survive. So yeah, I mean, the point being, the financial press is not going to run with that story. They're especially not going to talk about how the moral hazard of banks is all correlated, and how the banks are de facto parts of the state, you know, because the number of regulations that we put on them to keep them in line with the public interest means that they really kind of take marching orders from certain people in government for many things. But yeah, nobody wants to hear that story. They're going to hear the story like, the CEO of Goldman Sachs... well, yeah, I mean.
17:02.43 cactus chu I mean, I want to hear that story. That sounds like an interesting story.
17:10.47 rooooon I don't know, I'm not prepared to talk about it all that much because I'm not really an expert, but it seems a lot like the banks all take the same exact risks in a coordinated fashion, um, because they think that if everybody screws up, then the government.
17:21.88 cactus chu Yes.
17:30.23 rooooon Will basically bail them out, because they have to. They cannot risk the destruction of the banking industry, and, you know, right, right. And so that basically means the banks are an extended arm of the state.
17:36.39 cactus chu Yeah, this is what too big to fail means, right?
17:47.86 rooooon Like, they do what is allowed by regulation, no more and no less. So it's hard to assign them agency for anything, really.
17:55.57 cactus chu Is it all regulation, though? Like, here's the thing, and I feel like you probably have some inside look at this as well, but how a lot of these companies work, I'm not sure if it's completely this way in finance, actually, because while there's a lot of money to be made if you're.
17:58.83 rooooon Ah. Yeah, you.
18:15.23 cactus chu If you're a contrarian and you're right. But especially in a lot of legacy companies, what actually happens is that you have a quote-unquote industry standard, and then people just go along and they conform on purpose. There's this conformity that's not always.
18:15.50 rooooon Oh.
18:24.60 rooooon Yeah, right.
18:33.19 cactus chu Due to kind of state coercion. It's sometimes due to state coercion, but not always, and this creates these kind of too-big-to-fail dynamics just on their own, and it's a huge problem. And I don't think it's completely fair to just say, like, oh, it's the state's fault.
18:35.70 rooooon Yeah, yeah, yeah of course, yeah.
18:51.60 rooooon It's not the state's fault in the case of many industries, for sure. Like, why are they failing in a correlated manner? But for the banks, I think it is, because banking is necessarily, like, extremely... it has to be regulated, right? Like, especially consumer banking. The government has, yeah, yeah, has assigned a de facto oligopoly to several big banks in return for some very onerous regulation that they have to follow, and I'm not taking a.
19:10.67 cactus chu Yeah, consumer banking. Certainly.
19:26.89 rooooon Negative stance on that regulation, I'm just stating, um, not normatively, just, this is how it is. Um, and I don't know, maybe the big banks, or the big investment banks, have a little more freedom, but I think, especially now post, um... what was that law called? I don't remember, but you know, there are a lot of restrictions on what they can and can't do.
19:50.30 cactus chu Ah. Yeah I Know what you're talking about it was the it was the bailouts right? It was the.
19:59.97 rooooon Yeah, it was the Obama-era banking law, Dodd-Frank. Yeah, but even.
20:02.93 cactus chu Yeah, the one that basically said they would stop issuing new banking licenses. Yeah, I think there's kind of a proxy battle going on between us here, and the proxy battle is that I'm actually very optimistic about the world, and you seem, if not doomerish, then like you believe that the world is very non-agentic. No? Okay, okay, awesome. So, a good place to kind of test that techno.
20:25.31 rooooon No, I think that certain parts of the world are very non-agentic. I am an extreme techno-optimist, I think. Yeah, but, ah, go ahead.
20:41.90 cactus chu Techno-optimism. Like, what the fuck is Elon doing, man? Yeah.
20:42.21 rooooon Oh, with Twitter. Um, well, I think that Elon truly loves Twitter, you know. Um, I think that people like Elon have a hard time making human connection, but they have an easy time making, like, humanity connection, if that makes sense. Like, they want to be, exactly, yeah, they want to be lecturing to the masses.
21:10.24 cactus chu Yes, the crowd versus the person. Yeah.
21:19.15 rooooon And they love the adoration of the masses, and that's the only real connection that they can feel, when, you know, maybe the people in their lives are disappointing to them, because they are these hypercompetent freaks of nature, etc. I don't know, like, I read a story once where Elon's first wife was complaining to him in an argument, like, Elon, I'm not your employee, stop bossing me around like this, whatever. And he says something like, yeah, if you were my employee, I would have fired you by now. Right? Like.
21:54.78 cactus chu Even yeah.
21:57.18 rooooon Implying that, you know, she's just not up to speed. And first of all, it's kind of depressing, but it may just be like that for him, where he is holding other people to insane standards, where maybe nobody meets his bar. It's kind of shocking how few friends he has from his companies, it feels like. Um, the biographer said the same thing (I forget his name, Ashlee Vance, something like that). He said the most surprising thing interviewing all the people around Elon was how few of them would call Elon their friend. Like, he just doesn't hang out with them outside of work or something. So he clearly has some difficulty. It's lonely at the top, right? Um, but he has this outlet where he gets his feeling of connection to the masses, and I know this because I get that feeling too sometimes. It's like putting on Cerebro, you know. Have you seen X-Men? Or, you don't watch movies, but you might have heard of it. Yeah.
23:05.16 cactus chu I think I've seen it, like, a long time ago, but it's kind of wiped from my memory.
23:10.88 rooooon Yeah, it's like Professor X. He puts on this brain thing and it lets him read the minds of millions of people around the world and understand the collective conscious or whatever, right? And it's like.
23:26.71 cactus chu No.
23:30.22 rooooon Twitter is kind of the best approximation we have of that right now. It's this never-ending firehose of the collective consciousness of people on the internet, humanity, some approximation of humanity anyway. And he knows that, he respects it, he has talked about it at length. Um, and he covets it, right? He wants to own it. Like, he wants more of that. Um, and he thinks this is a good way to do it. The free speech thing may be part of it, but I think it's maybe cover for the fact that he just absolutely adores Twitter and what it means to him, and he wants to own it, right? Ah, what's up?
24:17.28 cactus chu Yeah, so what's the endgame here? What does Elon do after he buys Twitter? Yeah.
24:27.10 rooooon That's a good question. I think he's going to struggle really hard on the free speech front. I've been meaning to write something about this, but, look, I don't know, too busy, too tired, whatever. But, um, exactly, exactly.
24:37.16 cactus chu Yeah, that's why you come on podcasts, to make your case. Let's go, let's hear it.
24:43.75 rooooon Like, I think that many smart people have been tackling this for a very long time. I think that Elon really succeeded by showing up in industries that are dominated by boomers, midwits, ancient grandpas, that are not running things at all optimally. And that's not to say that he and Tesla and SpaceX aren't brilliant. They are, they've done the impossible many times over, but it is in part because they brought these new-age management practices and new ways of thinking and high-IQ talent to dying industries and completely revamped them. But the free speech problem is different. It's not about technological competence. It could be, I mean, I hope that there are things I'm not thinking of where you can solve these problems technologically, and it turns out there are a few cases where that happens, maybe I'll get back to it. But um, Facebook and Mark Zuckerberg, I think, are an extremely competent, very intelligent organization. I don't think that Zuckerberg is personally super politically skilled, but I think he's a genius in his own right, a business genius for sure, and he has a clarity of vision that most people do not have. Um, and they tried. They tried their best. I think that Zuck actually started out as more or less a free speech absolutist. Um, but eventually they became beholden to a lot of different interests. Like, there's an equilibrium to strike between the sensibilities of the viewers, the governments, the advertisers. The advertisers want certain things, like, they cannot be seen with their content showing up next to something violent, something pornographic, something hate-speech-related. And so, yeah, go ahead, you're going to say something.
26:54.60 cactus chu I'm going to push back on the last one. Like, okay, here's the thing: hate speech is kind of a completely fake concept. Um, you can have things that are threats or calls to violence, and that's already illegal. Like, flat-out illegal, calls to direct violence.
26:55.87 rooooon Um, yeah, yeah, yeah, yeah.
27:11.20 cactus chu Like, yeah, we have First Amendment case law for a reason. And, okay, here's the thing: every category of hate speech is equivalent to basically saying to someone, your god is fake. It's a deep attack on a belief system, not anything else. And so, like.
27:12.21 rooooon Yeah, yeah.
27:21.43 rooooon Yeah, yeah, but yeah, yeah, but.
27:30.72 cactus chu The correct way to view this, in my opinion, is just to see it as a religious war, and this is actually a white pill, because in most religious wars, if you're a heretic, you just get shot, or you get stabbed, or burned at the stake. Um, in this religious war, you maybe get banned from Twitter. Maybe. Ah, and that's actually just kind of pleasant.
27:33.16 rooooon Absolutely.
27:41.15 rooooon Um, yeah. Yeah, it's like it's too fine. A point you know of course hate speech is fake but like who cares we we the population of Facebook has certain moral standards that.
27:50.80 cactus chu It's quaint.
28:05.15 cactus chu Yeah, but is it the population of Facebook? Okay, here's the thing, right? So there's this Pew survey, ah, this Pew survey that basically categorized people on their kind of political views, and progressives are 6% of the population. 6%. And.
28:06.99 rooooon If they see them breached, um. Yeah, yeah, yeah, yeah.
28:24.54 cactus chu This just redounds to what Nassim Taleb calls the tyranny of the vocal minority, right? Um, or the tyranny of, I think, the intolerant minority, something like that. Um, and so, for the audience, there's like.
28:29.20 rooooon Of course, of course.
28:40.35 rooooon Listen.
28:43.90 cactus chu There's this dynamic where, if you have a large group of people who don't have a particularly strong preference, and you have a small group of people who are willing to basically make a huge fuss over something, like, really just be an asshole over really a trivial issue.
28:47.55 rooooon Um, go to War. Of yeah, ah.
28:59.74 cactus chu Then that small population can end up dominating the larger one, even though that causes a lot more dissatisfaction, and in a lot of situations causes a lot more damage. Um, and this is exactly that, right? So I think when you have this framing of, like, oh, it's just a market, it's actually not.
29:08.48 rooooon Right.
29:17.29 cactus chu You can kind of will these things into existence by having a very organized activist class, and you can equivalently unwill them from existence by having a countervailing activist class, or by sufficiently stigmatizing that kind of minority of beliefs.
29:20.37 rooooon Um.
29:33.33 rooooon I disagree. I think now you have another activist class that also needs to be pleased. You know, it's like a super multipolar spectrum of extremists, I guess, but there are like.
29:36.22 cactus chu Really why.
29:44.72 cactus chu Yeah, but it can be less bad. Okay, like, if you have a lot of free speech absolutists, that's actually much less of a problem than having basically a lot of people who believe in a conspiracy theory about mass racism or whatever. Like, having a very dedicated activist class.
29:52.46 rooooon You know who.
30:02.81 rooooon Sure yeah.
30:08.83 cactus chu Who are free speech absolutists is a much better problem to have. It might still be a problem, but, like, I don't care.
30:13.60 rooooon Yeah, but I agree, and it's clear that the value of free speech has been upheld throughout the history of America especially, because there's a small elite that believes in free speech, right?
30:28.95 cactus chu Yes.
30:31.31 rooooon It's not like anything else. It's not something systematic. It's just that, again, it's a vocal minority that wants something. So I agree with you, of course, it's much better to have this vocal minority that cares a lot about free speech than a vocal minority that cares about, you know, censoring things for whatever reason. But there are so many poles to the spectrum. Like, I remember Elon was tweeting about this a few days ago, he was like, ideally Twitter should piss off the 10% most-left and 10% most-right people equally. Um, I don't know what he meant by that exactly, but the point is that those people are, um, the linchpins of what finally gets to be said on Twitter, and it's just a classic Overton window thing.
31:08.40 cactus chu Yeah, yeah.
31:25.91 rooooon Where, I don't know, you're going to run into this no matter what, is my feeling. Um, and the class of people that really respect free speech, they're like a reactionary party, especially now. They've only come into their relative status and power because we've swung so far toward censorship, and they're going to disappear probably as soon as Elon takes over, whatever. Um, I.
32:00.12 cactus chu Right, this is the line, right? Every civilization is one generation away from tyranny, something like that. It was by Reagan, I think.
32:03.74 rooooon Yeah, sure, yeah, but it certainly does feel like a market on Facebook or Twitter or something, like some equilibrium being struck between the extreme right, the extreme left, advertisers, etc., etc. And yeah, I do think that someone like Elon can push the needle, especially since, for example, he has alpha in that he can fire all of Twitter's current employees and get new ones for cheaper who are maybe more free-speech-oriented, because people love Elon, especially engineers, especially Silicon Valley technologists. Ah, and in that way he can shift the equilibrium, in that the employees may not be as ideological in one direction.
32:51.16 cactus chu Yeah, totally.
33:01.40 rooooon But in terms of expanding revenue, if you want to run Twitter like a real business, he will run into the same problems that Facebook did at that scale, like they're going to cause genocides by accident and things like that. And I don't know about "cause," I think that's way too strong a word, and people shouldn't blame communication. Yeah, I agree.
33:20.46 cactus chu Yeah, it's also, that narrative is also just very blown out of proportion. The thing that actually triggered that happening was they took over all the radio stations, and literally the ads on Facebook were a very small minority of the messaging when you actually added everything up.
33:41.56 rooooon Absolutely, I agree with you. What I really meant is that you're going to get blamed for causing genocides rather than...
33:49.34 cactus chu Yeah, here's the thing. I don't think you put enough blame on... and I don't like ascribing with a broad brush, so I'll just say some journalists.
33:56.87 rooooon Um.
34:05.82 rooooon Um, yeah.
34:07.91 cactus chu Not all, but some journalists, you've got to blame them more. And let me see if I'm getting you right on this: your articulation is that they're just driven by incentives, and they're not really agentic, right?
34:15.35 rooooon Yeah, I think they're mostly NPCs, you know.
34:26.28 cactus chu But here's the thing, right? You can change those incentives, and you change those incentives by treating them as agentic. Do you think that's right?
34:34.30 rooooon Maybe a little. I think we did change their incentives in a massive way by destroying their ad monopoly, right? Or their business in the classifieds. Yes, well, newspapers for a long time had a monopoly on the classified page.
34:55.27 cactus chu Right, and by "we" you mean technology in general, right? Yeah. Tell the story, what happened there?
35:13.30 rooooon Of, you know, selling things locally, advertising local things, like home listings, whatever. Or even broader, like Coca-Cola takes out a whole full-page ad in the New York Times, or even the local newspaper. And that business model dried up basically as soon as Craigslist was invented, as soon as Facebook was invented, Yahoo, whatever, where we found much more efficient ways of doing classified ads, of connecting eyeballs to their intended target. So now the news business is this race to the bottom for clicks, where they really are competing not on journalistic quality or integrity but just on raw revenue metrics of what they're writing. And so while maybe a generation ago the New York Times could be run as this sort of philanthropic project to purely document exactly what's going on or whatever, and I don't know if that was ever true by the way, but it was certainly an ideal they were striving for, now they don't even try to do that. They're ideological, they are editorial, they clearly have an agenda, and that's what sells, right? Their stock was booming throughout the Trump years and then it crashed as soon as Biden was elected. It's clear what their business is, and this is the general narrative of news these days.
36:54.46 cactus chu Yeah, I don't know, is it that clear though? It's not even clear to me that it's completely a business thing, because I think the New York Times would make...
37:04.62 rooooon M.
37:11.91 cactus chu A lot more money if they had a right-wing slant. You just look at right-wing substackers, you look at the Daily Wire or whatever. Maybe the Daily Wire is not quite right because they're kind of new, so their revenue numbers are actually still relatively low, but...
37:12.62 rooooon Ah, no, no.
37:24.10 rooooon Yeah, well, you have the other thing where... no, go ahead. Okay, I mean, that's news to me, but I'm also not that surprised. But...
37:30.31 cactus chu You look at all of these right-wing sites like Fox or whatever, and they're just doing better, I believe so, yeah.
37:43.96 rooooon Clearly 90% of elite journalism grads are liberal, right? So you have to go to war with the army you have, I guess.
37:53.21 cactus chu Yeah, I think that's closer to right. Let me know what you think of this, because this is kind of a persistent thing. I've mentioned this in like two episodes now, and I don't know why it keeps coming up, but I really see...
37:55.80 rooooon Um, yeah, ah ah, ah.
38:10.96 cactus chu ...really see this new social progressivism as a war by old money against new money. So you have these people who are really inheriting their jobs, like some of these people are literally getting their journalism jobs because of family connections.
38:12.51 rooooon Ah. Yeah.
38:28.38 rooooon Ever.
38:30.70 cactus chu But aside from that, you have more subtle nepotism or subtle corruption, where it's ideological successors, people who are very good at sucking up to a kind of ideology, who are very good at repeating the fashionable thing.
38:34.74 rooooon Um, But. Um.
38:49.76 cactus chu And that sort of stuff, instead of actually being able to get to an accurate point of truth. And the more sheltered and more legacy a company is, like exactly the New York Times, the more this is the inherent order.
38:54.70 rooooon Yeah, yeah.
39:02.72 rooooon Um, yeah.
39:08.12 cactus chu This is the underlying field: what is selected for is conformity. So that would be my explanation for why you have this, and for why they're anti-tech, besides the obvious fact that tech is their competitor.
39:17.50 rooooon And.
39:23.00 cactus chu And even on non-competitor things, right? They're very pessimistic, and they're often just blatantly dishonest as to why. I think it's exactly this: there's this antithesis towards people who make things on their own.
39:38.78 rooooon Yeah, I mean, that is definitely the whole basis of why the wordcel versus shape rotator thing even had any meat to it, right? Why Andreessen really loved it. There clearly is something like that going on. But it's also seemingly always the case that old money are the ones that create and defend cultural institutions, like why San Francisco is kind of a cultural wasteland in comparison to New York, right?
40:16.52 cactus chu Right.
40:17.50 rooooon Because it's all new money here. I mean, it's not strictly true, there's some old money in San Francisco, but the dominant majority of the wealth here is the invention of Stripe or whatever. So...
40:32.90 cactus chu Yeah, it's companies that didn't exist fifty years ago, almost every single one of them, and most are from the last twenty years, right? So...
40:36.71 rooooon Right, right. But by that prism, it's kind of unsurprising that old money dominates through cultural domination rather than financial, technological, whatever. And if you assume that all axes of power are at war against each other somewhat, then yeah, the cultural is going to fight with the financial, etc.
41:11.67 cactus chu It's not cultural versus financial though. I don't think that's true. I think old versus new is the biggest dynamic here, because we have predefined rules for new money competing with each other: it's called competition, or it's called the market.
41:14.80 rooooon Um.
41:23.63 rooooon Um.
41:28.77 cactus chu And we have predefined rules for old money, or old power, let's say, competing with each other: it's called prestige. What we haven't agreed on the rules for is new money versus old money, and that's why it's the biggest dynamic.
41:34.25 rooooon Right.
41:42.37 rooooon Yeah, I mean, I agree, but there definitely does seem to be a cultural versus financial axis to it. You know, I tweeted about this a while back.
41:48.19 cactus chu Yeah, we're agreeing too much. We need to fight on something.
42:01.88 rooooon Journalists resent tech because of how much money tech people make, versus journalists, whose salaries have been decimated over the last however long, and they certainly resent tech for doing some of that to them. And Marc Andreessen certainly somewhat resents the journalists because they have the ear of the people, although it's a bit different these days, he does have millions of Twitter followers. But there are definitely those competing forms of power. Journalists want what Andreessen has, which is fuckloads of money, and the VCs want what journalists have, which is the ear of the masses. There are some people that have both, rare in-betweens like Elon Musk, who gets to be this populist Caesar type guy. But yeah, I'm going to stick with my axis there.
43:06.52 cactus chu I don't know, I think the major challenge here is: do they feel the same way about heirs, right? Just heirs to money, people who just inherit a lot of money.
43:09.20 rooooon Huh? What do you mean? Yeah.
43:25.80 cactus chu I just mean normal inheritors here, not the kind of extended inheritors, just normal inheritors, like the Waltons or whatever, the grandchildren of the Walmart founder. I don't think they care nearly as much. I don't think they care unless it's new money.
43:28.50 rooooon You? Yeah yeah, yeah, um.
43:45.25 rooooon You're saying the Waltons don't care about... oh, right, right.
43:46.18 cactus chu No, that the journalists don't care about people who inherit their wealth. They care about people who make their wealth, and who make their wealth on something new, basically.
43:57.96 rooooon Yeah, I suppose that's true, but you also definitely see hatred of Murdoch and Koch and whatever, Soros...
44:11.65 cactus chu Hold on, I don't think it's the same people who hate Soros. Maybe those people are awful too, but it's a different set of people who hate Soros.
44:20.60 rooooon Of course, of course. But on the fringes it kind of circles to the same people, the populists who seem to hate all rich people.
44:34.89 cactus chu Wait, okay, this is just disagreement, but I don't think it's that productive, because... wait, I also just think this is different people. The populists are different people than the legacy journos. The populists...
44:39.26 rooooon Mhm, ah.
44:52.95 cactus chu ...hate the legacy journos. The legacy journos are their enemy. They actually have no greater enemy; they hate the legacy journos even more than they hate rich people. Yeah, and for good reason, because let's face it here...
44:55.39 rooooon Yeah, that's true. That's true. Yeah yeah, agreed.
45:07.84 rooooon And if.
45:10.46 cactus chu As much as we hate the legacy journos, the populists would have a lot easier time getting power, and probably doing something stupid with the power, if not for them, right? So that's why I think they eat each other. But I think that's different.
45:17.74 rooooon Yeah, definitely. Ah, I can't hear you, I think you cut out for a second. Okay, can you say something? Yeah, we're back.
45:28.20 cactus chu All right, okay, awesome. Yeah, so I was saying, I had a conversation with Richard Hanania, I think it's actually the one right before this one, and Richard and a lot of political scientists...
45:38.52 rooooon And yeah. And.
45:47.66 cactus chu ...point at, just statistically, what is more effective at driving politics, or driving changes in the way that government or other institutions act. Is it the small organized minority, basically the activists?
45:58.31 rooooon The. In.
46:05.31 cactus chu And here activists includes journalists, or professors, or whoever is actively trying to get power. Versus the populists, versus the masses, versus the large group of people who aren't really organized, who aren't really all that interested.
46:09.58 rooooon Yeah.
46:24.50 cactus chu And you just look through history, it's always the activists. The populists, okay, they have a few bright spots, and even then it's because they have a strong activist contingent. So there's a lot of talk about populism...
46:25.36 rooooon Yeah.
46:33.91 rooooon Never.
46:43.75 cactus chu ...nowadays, and I think if they win, it's because there are a lot of activists inspired by populism, basically, and not because of populism itself.
46:52.57 rooooon Okay, yeah, I mean, I don't even know if we're using populism in the same sense. But...
46:59.95 cactus chu Okay, what do you mean by populism.
47:09.36 rooooon Like, Trump is a kind of activist, I guess, right? He has hobbyhorse issues, like protectionism, steel tariffs, whatever. I actually don't think that many people cared about that, but it was very successful. It's a good sell, right? People want to hear about it, they think it's a good narrative for why they're poor, or America is poor, or whatever. So I guess the merging of the elite activists...
47:37.69 cactus chu Yeah, yeah.
47:46.75 rooooon ...with a narrative that has popular appeal is what I mean by populism.
47:50.18 cactus chu Okay, so wait, what narrative is that, sorry?
47:59.82 rooooon In his case it's the protectionist kind of narrative, or in Elon's case I would say it's the free speech narrative. He is an ideologue with an elite-
48:07.68 cactus chu Yes.
48:17.62 rooooon coded sort of idea, free speech always has been, but all of a sudden he has this popular appeal because, I don't know, random internet people don't want to get kicked off the website. So it's marrying elite ideas with a way to sell them to the masses. I guess that's all I meant, really.
48:38.61 cactus chu Okay, that's interesting. So do you think that's the way change happens, that you marry those two things together?
48:47.10 rooooon Yeah, absolutely. But I don't think that's that controversial, right? That's the whole premise of democracy.
48:50.17 cactus chu Okay, that's interesting.
48:58.48 cactus chu I don't think it's controversial, but I might think it's wrong, or at least somewhat wrong, because you just look at a lot of the institutional changes in the last ten years...
49:01.90 rooooon Ah sure, say more.
49:13.90 rooooon No.
49:15.59 cactus chu And they're not popular, right? 76% of the country opposes affirmative action, for example. The kind of mass surveillance Snowden stuff.
49:20.80 rooooon Ah.
49:27.98 rooooon Haven't we banned affirmative action by law in many ways, but it's still de facto alive?
49:31.50 cactus chu No, no, we are in a pretty ubiquitous state of affirmative action in the United States right now. Yeah, I literally talked about this with Hanania, this is literally the last episode.
49:47.74 rooooon Ah.
49:51.19 cactus chu Yeah, so there's this doctrine called disparate impact, which is basically like the correlation versus causation meme. So basically, it's just a true thing that just because two things are related doesn't mean that one causes the other.
49:55.27 rooooon Um, yeah.
50:09.16 cactus chu But disparate impact is basically saying that correlation equals causation, and it's writing that into law, and basically saying that all sorts of normal things that you would want to do are illegal because they have different impacts on different races or whatever. For example, this is why you can't do...
50:22.60 rooooon Okay, but.
50:27.14 cactus chu This jurisprudence is why you can't do IQ tests in hiring.
50:29.71 rooooon Okay, that's fine, but I'm pretty sure in California they passed a law or an amendment or an article or something that was basically like, you cannot consider race in admission to public universities. They still might anyway, but it's de jure illegal but de facto completely allowed. I'm not really sure.
50:53.51 cactus chu Yeah, in California it's actually a bit different, because in California, when the polarity was the exact opposite way, they got this passed. But in the rest of the country, no, right? That's why you have the Harvard affirmative action cases, it's not banned yet.
51:01.49 rooooon Um, yeah, yeah, ah right? yeah.
51:12.70 cactus chu They're trying to get it banned now that there is a conservative Supreme Court, and hopefully they win. But the intensifying neo-racism in a lot of these institutions is discriminating in favor of blacks and Latinos, and it's...
51:22.30 rooooon Moving.
51:32.80 cactus chu It's evidently unpopular, and in fact, in many cases it's unpopular among blacks and Latinos as well, because they don't want to be treated like this either. They want to be treated fair and square, just like everyone else does. But basically every social issue has gone in the way of the organized minority, especially when it's gone against a popular majority. Even abortion, right? It's not even only a left-right thing. Even on abortion, the people who care the most are all right-wing, so it went in a right-wing direction. It doesn't matter what the population thinks.
52:10.84 rooooon Right, yeah, I guess that's fair. I don't know. Honestly, I don't think about politics that often anymore. I just think about technology, mostly.
52:22.99 cactus chu Yeah, I mean, that's a reasonable strategy. This is honestly most of my friends, and the monologue that I talked about at the very beginning is kind of directed towards those types of people, right?
52:28.10 rooooon Yeah. Oh. People. Yeah.
52:42.83 cactus chu And it's true that in the long arc of history, technology matters way more than politics. But I think a lot of the most bright-eyed technology people of the past two decades...
52:51.91 rooooon Yep.
53:01.76 cactus chu ...have slowly realized that there is no force as good at suppressing technology as politics. And I think that's the arc that happened to Andreessen, I think it's the arc that happened to Peter Thiel, I think it's the arc that happened to Balaji, and I think it's going to happen to a lot of people in my generation.
53:06.79 rooooon Correct. And... I don't know if Peter Thiel had an arc. Yeah.
53:20.70 cactus chu I Think a lot of dreams are going to be crushed because of politics and a lot of those dreams are going to be tech dreams.
53:23.81 rooooon I think that Peter Thiel never had an arc. I know it's a minor nitpick, but he was writing about this kind of stuff in college. He was always a college Republican, college libertarian type.
53:30.44 cactus chu O. Um, right.
53:42.75 rooooon It's not like he was some idealist thinker and then he became red-pilled or whatever. He wrote The Diversity Myth in college. So, minor pushback, but the other guys, for sure.
53:56.30 cactus chu Okay, but why was he like that, right? I don't know, when I read Zero to One, the impression I got, at least, is that he kind of had this hope.
54:01.62 rooooon Andreessen.
54:07.30 rooooon Um.
54:12.55 cactus chu At some point, maybe by the time of college it was dashed, but he had this positive-sum hope at some point. And obviously he's still positive-sum, like he did these companies and stuff.
54:20.14 rooooon I think what happened with Thiel is that he no longer believes in democracy, and that's what changed for him. Yeah, I mean, he wrote a whole essay on it, and basically...
54:29.47 cactus chu Wait. What.
54:37.91 rooooon ...then publicly denied he meant any of it. But it is clear he meant it. He was basically like, liberty is no longer compatible with democracy or something, and he wrote at length. And that's clearly his true beliefs, especially considering who he hangs out with.
54:45.46 cactus chu Oh I remember that screenshot. Yeah.
54:57.83 rooooon He cultivates the NRx kind of subset, and he certainly hangs out with Moldbug or whatever, the neo-reactionaries. They're kind of anti-liberal, anti-democratic.
55:02.94 cactus chu For the audience, what's NRx?
55:16.90 rooooon Some of them want to return to dictatorships or monarchies or neo-monarchies or whatever, but in general they do not like liberalism, is what I would say.
55:28.30 cactus chu Yeah. What do you think of them?
55:33.22 rooooon I think that many of them have interesting ideas. I think that several of them are the only people willing to point out obvious hypocrisies that nobody else is willing to. In general I find them kind of bitter and distasteful online, but you kind of have to be in this day and age. The people who are agreeable are not going to be political dissidents; people who are extremely disagreeable are the only ones that are going to find themselves outside the boundaries of whatever is tolerable in society, especially because of how vehemently we protect that nowadays, saying the right things and only the right things and whatever.
56:24.80 cactus chu I don't know, I mean, I agree with the second part of that, but I don't think you have to be bitter about it either. You can be happy, and you can be super enthusiastic about the world and interested in the world, and also be willing to take the fight to them.
56:28.60 rooooon You know.
56:37.96 rooooon Yeah, yeah.
56:43.95 cactus chu If you have to. I think that's the best political personality, and I'm not just saying that because it's mine. I think that is Trump, I think that is Biden too, I think that's kind of Obama too. I don't think Obama is as disagreeable as Trump or Biden, but he is kind of like that too: he is willing to take the fight when it counts.
56:50.50 rooooon Oh.
57:03.40 cactus chu And he's happy about it. And I think a lot of people are like this, actually. I don't think Thiel is like that either; I don't think Thiel is a doomer. Yeah, so I do agree with you that the neo-reactionaries are kind of like that.
57:10.50 rooooon Of course, not no.
57:19.30 rooooon Yeah, yeah, yeah, basically yeah I mean I.
57:21.70 cactus chu They are kind of like that, especially Yarvin, right? Curtis Yarvin is like, give up now, don't be passionate about these things. And I just think that's not... there are so many good things in the world that you can actively do that I don't think that's quite right.
57:35.70 rooooon Yeah, agreed. I think that completely giving up on democracy is stupid and an unworkable philosophy, and it's a wordcel move, you know? It's like planning for eventual...
57:52.95 cactus chu It's a wordcel move. Yeah.
57:55.20 rooooon It's like planning for eventualities that are not going to come in my lifetime, I think. But I don't know, I also don't follow people like Yarvin as closely as I follow someone like Nick Land, because he speaks about technology more specifically.
58:16.70 cactus chu I actually don't know that much about Nick Land, and the audience probably doesn't either. So who is Nick Land, why is he important, what are his ideas?
58:33.90 rooooon Nick Land would self-profess as a schizo philosopher, which I think is in the tradition of some other philosopher, but I don't remember who. And he basically thinks, and he kind of got this from Deleuze and Guattari as well, that techno-capital is like a god or a demon of some kind. It's alive, it reorganizes human society basically at will, and it will ultimately lead to, I don't know, a post-human, anti-human future, sort of. And I haven't fully read anything he's written, honestly, I just get the vibes from what he says, so I might be butchering this. But the overall point is it's a very funny aesthetic, a very fun aesthetic as well. Like, you know, "coldness be my god": I don't care about human warmth anymore, just accelerate into AI oblivion. So yeah, it's a funny aesthetic that I've been pushing on my followers a little bit, where I just quote tweet dystopian-looking things and say "coldness be my god." But yeah, he's definitely the NRx person that I have learned the most about, I think.
01:00:13.39 cactus chu Coldness be my god. So what kind of ethos is that? Basically, is it nihilism? What is it? I don't really get this.
01:00:22.80 rooooon Um, would you say that humanity and technology are opposed to each other or competing in some way.
01:00:37.60 cactus chu Would I say that humanity and technology are opposed to each other? Opposed, no. Competing, maybe. I think there are tradeoffs, but I don't think they're opposed.
01:00:46.11 rooooon Yeah, right, they're not 180° apart, but they might be like 90°, I don't know.
01:00:54.89 cactus chu They're like correlated in some way but not like 100% correlated
01:01:02.48 cactus chu I don't even think it's 90°, I think it's less than that. I think they're positively correlated. Yeah, there's some correlation there.
01:01:06.24 rooooon Right, yeah. I actually think it's more like 10°, but you know, just for the sake of argument. And he's kind of like, okay, abandon humanity. And this is sort of a reaction, I think, against people that want to slow down progress. It's kind of a funny extremism, where it's like, okay, I want all the dystopian outcomes now. If that makes any sense.
01:01:45.51 cactus chu I don't think it makes that much sense. I don't know, I think there's so much loaded into dystopianism. This is a subversion, right? But the problem is I don't get the subversion. I don't know what he is replacing it with. A subversion is you're taking this negative thing and you're replacing it with something else.
01:01:51.74 rooooon Yeah, yeah.
01:02:02.51 rooooon Well, he is replacing it with what I would consider a worship of techno-capital, right?
01:02:04.32 cactus chu But I don't get what he's replacing it with. What is a Nick Landian vision?
01:02:14.40 cactus chu Yeah, but why is techno-capital good?
01:02:19.95 rooooon Well, I have my own reasons, I don't know what Nick Land's reasons are. I mean, for the reasons I said: I think that they're like 10° apart. I think humans and tool use are completely inseparable.
01:02:23.88 cactus chu Oh I Still want to hear yours then.
01:02:38.93 rooooon I think that technology defines humans. We literally call our predecessors Homo habilis, tool users. And our entire evolution for the past hundred thousand years has been shaped by the invention of controlled fire and stuff. Like, we have vestigial organs that came from the time before we had cooked food. We've been co-evolving with technology for hundreds of thousands of years, it's part of us now, and it seems to mostly make our life better. Yeah, there are many externalities, but like I said, it's 10° to me, it's not 90°, it's not 180°. And there are plenty of people on the internet who think it's 180° or whatever, where they want to return to this kind of disaggregated nomadic lifestyle where they take care of a specific part of the earth and make sure it's perfectly sustainable or whatever. Whereas I am making a bet on the future that whatever problems arise today we are probably going to fix in the future with more technology, because that has been the case for the past however many iterations of this happening. You know, like: okay, we live together in cities as a technology, it causes public sanitation issues, and that's a problem. Okay, we invent vaccines, now public sanitation issues are less of a worry, etc. It goes on. Things like that, I guess, if that makes any sense.
01:04:30.12 cactus chu Yeah, I think the main pushback to the techno-optimism crowd is the existential risk argument, right? How familiar are you with that? Yeah, so like...
01:04:35.50 rooooon Yeah, I'm very familiar. I think it goes back to the coldness be my god thing. It's like leaning into the optimism somehow, and being like, fuck it, I don't respect the X-risk, we're going to solve it and we need to accelerate anyway. But anyway, keep going.
01:05:03.50 cactus chu Yeah, so for my audience, what is existential risk? Let's build it up first and then we can tear it down with why it does not matter, the opposite of the question I usually ask.
01:05:09.12 rooooon Existential risk is anything that can cause catastrophic, unrecoverable damage to civilization, or to humanity, or to earth, or whatever. So examples might be a gamma ray burst that hits earth and wipes out all of human civilization. Or global warming is a popular one, where maybe the oceans rise by some percent and cause a complete collapse in civilization. The most popular, sexy one these days is AI X-risk: we invent computer intelligence that ultimately is not aligned with our goals and does something disgusting and horrifying to humanity, or to the world, or to the universe, or whatever. Do you think that's a good explanation?
01:06:10.32 cactus chu Ah, yeah. So a single-sentence summary I would give is basically: things that can end civilization as we know it. I've always thought the sexiest one was pandemic risk.
01:06:17.12 rooooon Easy.
01:06:25.63 cactus chu And this was before covid, and after covid there's no way it's not the sexiest one, right? It's got to be pandemic risk, come on, we literally just did... okay.
01:06:25.67 rooooon Whoa, I don't think so at all, I don't think it's sexy. But at any point, did you feel that your civilization was threatened? Because I certainly did not, right? But we had a pandemic.
01:06:42.13 cactus chu Well no, because COVID had a death rate of like 0.1 percent, right? So like, what if COVID had a death rate of like 50%?
01:06:55.28 rooooon And because we had one, I feel like we know how to handle it somehow. Like if there was no pandemic in my lifetime... when COVID started I was really panicked, I was like, holy fuck, there's going to be state collapse, there's going to be people dying on the street, we're going to run out of ventilators, whatever. Um, and that did not happen. What seemed to happen is that we actually had a ton of issues that we solved very quickly via mostly technological means. Created vaccines in record time and, you know, mostly shut this thing down. So next time a pandemic comes around, I will certainly be less panicked than this time. I mean, I'm going to be depressed, like, fuck, I'm gonna have to sit inside for 2 years, but I would not feel existential dread regardless of the death rate.
01:07:52.68 cactus chu I don't know, I think that there's kind of a huge difference between something with a high death rate and something with a low death rate, like if we eventually learn to live with it.
01:08:01.97 rooooon Yeah, um.
01:08:08.70 cactus chu Because the death rate is so low, right? And obviously because of the vaccines. But I don't know, even if the vaccines had their current efficacy, or like their current relative risk reduction, but the starting risk was like a death rate of 50%...
01:08:10.25 rooooon Then. Um.
01:08:25.47 cactus chu If we started with a death rate of 50%, we could have the vaccines, we could have the death rate reduction, right? Or like the hospitalization reduction. It would still be worse than the COVID that we started with, right? So like.
01:08:34.31 rooooon Yeah, but have you ever seen this, like, Pareto curve of, you know, lethality, mortality, versus like the R naught or whatever, you know, like viruses.
01:08:47.60 cactus chu Yeah, I think I'm familiar with what you mean: if you kill really fast, or like if you kill a very high percent and you kill very fast, it's harder to spread. And I think that there is a tradeoff, I just don't think it's quite that extreme, like even the normal curves.
01:08:56.18 rooooon Yeah, exactly me.
01:09:07.56 cactus chu It's like quadratic, right? It's just like not quite Pareto.
01:09:09.93 rooooon Yeah I mean wait sorry say that again. What do you mean by that I don't know we forgot on. Yeah.
01:09:15.64 cactus chu Yeah, what are we talking about again? I'm pretty sure the tradeoff that I've seen is that it looks quadratic-ish, it doesn't look exponential.
01:09:29.91 rooooon It's possible.
01:09:30.22 cactus chu Yeah, um, anyway. So yeah, the original argument was basically: how big of a problem are X-risks, right? Existential risks. So lay out more of your reasoning as to why they're less of a problem than I think they are.
01:09:38.89 rooooon Um.
01:09:46.56 rooooon Well, I think they are a problem, but the ultimate reality of life on Earth is that we will get wiped out by the sun expanding no matter what, in a finite period of time, a well-defined period of time. So the probability of extinction, the end of our civilization, is certain over a long enough stretch of time, and other shit may happen before that that we don't know about, you know: mass extinction events, asteroid takes out the dinosaurs, whatever. Um, like massive ecological collapse due to some other thing happening, right? Super volcano. Ah, so the risks introduced by technology, in relation, do not seem as wild to me when I know that we're destined for a certain death anyway, right? Um, so yeah, maybe AI is particularly scary, and it is, but there's a number of reasons I don't feel viscerally threatened by AI risk, the number one being that our universe isn't paperclips right now. Um, which is, yeah.
01:11:05.64 cactus chu Okay, so to summarize your position: you're like anti AI-risk, but you're not necessarily anti X-risk, right? There are a lot of other X-risks that are things that we should be paying attention to.
01:11:18.56 rooooon I'm not anti AI-risk, I'm not anti X-risk. I don't think that AI risk is that well understood, so I'm happy when people choose to study it. I also think there's a lot of people navel-gazing about it, for sure, like they just discuss it incessantly without really doing anything of productive value, right? But yeah, yeah, correct. Um, and I respect people that are actively trying to align currently existing models. Um.
01:11:38.37 cactus chu Yeah, it's like the wordcel thing right? You're like manipulating symbols and stuff like that without actually touching the underlying reality.
01:11:55.48 rooooon Ah, like, I think that's a way more fruitful endeavor. Um, but like, by the Copernican principle, right, we should not be the first ones to ever come to AI, like humanity. There should be others in the galaxy that developed it first. I mean, I don't know what you believe; the Copernican principle is that basically nothing is special about this time or place, um, in terms of the grand view of the universe, right?
01:12:15.56 cactus chu Wait Sorry what's the Copernican principle.
01:12:31.20 cactus chu Okay.
01:12:32.41 rooooon People thought that the Earth was at the center of the universe, or center of the solar system; it turned out not to be. We thought that, you know, the Milky Way might be the only galaxy that existed, and it's clearly not. There's been billions of years of history before us and there's going to be billions of years of history after us. That's the Copernican principle. Um, and so if you take the hardline view on that, you should believe that there have been civilizations as technologically advanced as ours before, somewhere out there. And if that's the case, I don't know how much you agree with this, by the way, feel free to call me an idiot, but if that is the case, then they should have come to the same computer technology that we had, they should have come into contact with powerful intelligences that are potentially dangerous. And yet I don't see my galaxy turned into paperclips or tiled into reward functions or whatever that, you know, people dream about. But I don't know, there may be other, more insidious failure modes that I'm not thinking of. Um, what do you think about that?
01:13:54.95 cactus chu Yeah, I think it's more or less right. I mean, it basically just plays into the very big kind of Drake equation argument, right? And for the audience, that's basically a prediction of how many aliens there are, and the original prediction was: if there's a lot of planets that look like ours, and there's a lot of planets in the universe, and there's a lot of ways for life to form — there's all sorts of parameters that you kind of have to guess at — then the conclusion was that there should be a lot of aliens. And of course this is a contradiction, because we haven't seen a lot of aliens. Um.
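The Drake equation cactus is describing is just a product of guessed parameters; a minimal sketch, where every parameter value below is an illustrative guess rather than an established figure:

```python
# Minimal Drake-equation sketch: the expected number of communicating
# civilizations is a product of seven guessed parameters.
# All values below are illustrative placeholders, not established figures.

def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """N = R* * fp * ne * fl * fi * fc * L"""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# One arbitrary set of guesses:
N = drake(
    R_star=1.0,   # star formation rate per year
    f_p=0.5,      # fraction of stars with planets
    n_e=2,        # habitable planets per such star
    f_l=0.1,      # fraction where life arises
    f_i=0.01,     # fraction that develops intelligence
    f_c=0.1,      # fraction that emits detectable signals
    L=10_000,     # years a civilization stays detectable
)
print(N)  # → about 1.0 with these guesses
```

Because several of the fractions are unknown to within many orders of magnitude, nudging any one of them by a factor of a thousand swings N from "many civilizations" to "effectively zero" — which is the log-scale point roon makes below.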
01:14:13.82 rooooon Um.
01:14:29.71 cactus chu And so it just kind of gets into that debate, right? Like how many other kind of worlds are there. Yeah, and to that I just don't really know, I don't really have an opinion on this. I mean, if I had to guess, there is this kind of binary that people usually present, which is basically there has to be like zero...
01:14:32.40 rooooon Right.
01:14:47.49 rooooon Um, yeah.
01:14:49.54 cactus chu Or there has to be like a lot, and I don't think that's actually the binary. I don't think there's a lot, because I think if there were a lot, it would just be very easy to accidentally find one. Um, so I actually would like...
01:15:01.87 rooooon Ah.
01:15:08.13 cactus chu This is also super uninformed, I don't know that much of the alien-life literature, but I would actually be willing to stake out a position. If I had to bet on this, I would actually stake out the rare, like, few position: I think there are a few other civilizations, but not a ton.
01:15:19.29 rooooon You know, I see. I mean, I think that there's nothing wrong with the dichotomy of zero versus many, because we think about these probabilities on a log scale, right? Like, we don't know how rare the abiogenesis of life is, like how does life come from non-life? Nobody fucking knows, and so we have guesses that range across orders of magnitude. And so if you're like, oh yeah, the answer is between 10 to the negative 15 and 10 to the negative 9, then it's clearly better to think about that on a log scale. Um, and.
01:16:06.88 cactus chu Yeah, I would say when I say many, I mean in the context of how far you can travel. I mean in terms of density: so on average, how far do I think civilizations are from each other?
01:16:14.62 rooooon Yeah, right.
01:16:23.65 cactus chu Could they be like ten light years away? I don't think so, because we could have probably communicated with one by now, right? Could they be like a hundred light years away? And I agree that it's also still a kind of log scale, but I think it has to be a pretty large number. Yeah, like.
01:16:24.74 rooooon I mean, right? Yeah, you might be right. Maybe it's like one civilization per galaxy; that rarity of life formation is entirely possible.
01:16:47.48 cactus chu Not even per galaxy, right? Like, galaxies aren't... like, Andromeda is what, four light years? No, that can't be right, that can't be right. No, but I don't think they are so far. Okay, maybe I should just Google this.
01:16:55.50 rooooon No.
01:17:04.49 rooooon You know, it's very far. I think it's on the order of millions of light years from us.
01:17:06.24 cactus chu But I think that was wrong.
01:17:12.57 cactus chu Okay, 2.5 million. Yeah, you're right, you're right.
01:17:20.31 cactus chu Yeah, so on the question of artificial intelligence in general, or machine learning in particular, I think people have a very bad understanding. Or, by people I mean people in the public who don't actually work with machine learning. I think that...
01:17:32.21 rooooon Um, yeah, ah.
01:17:40.00 cactus chu The two things that happen at the same time are that people think tasks are way harder than they are, and people consequently think machine learning is way more complicated than it is. Would you agree with those two things?
01:17:43.77 rooooon Ah.
01:17:50.60 rooooon I guess, but I also think that the tasks are complicated, you know, because I'm the one that has to sit down and face them and see how much some machines struggle, but ah.
01:18:02.95 cactus chu Fair enough fair enough.
01:18:08.76 rooooon Um, I think that it's entirely possible to build something super complicated based on extremely simple principles. So machine learning as a science is super simple; like, the math is kind of laughable to people that do, ah...
01:18:16.98 cactus chu Yes, just.
01:18:28.35 rooooon Anything more hardcore. Even control theory, which I would say is a predecessor to many types of machine learning methods today, is more complicated in a lot of ways than machine learning is, in terms of the math you have to know. Um.
01:18:29.34 cactus chu Yes.
01:18:46.66 rooooon And the numerical optimization and stuff. Like, deep learning by comparison is a lot of vibes and bullshit, honestly, and it's kind of funny, but it's also, I don't know, it's cool to me that you can build such complicated answers out of very simple axioms, right?
01:19:09.90 cactus chu Yeah, so let's actually go a little bit into the details here, because that's one of the things I wanted to do today. And I'm kind of having a little bit of trouble doing this, because I'm kind of coming from the same perspective as you, and I know you kind of...
01:19:12.63 rooooon Um, okay so.
01:19:26.44 cactus chu Come from the same perspective, or you might even have a better view of this. But I kind of struggle. I had this question with Steve Hsu, I talked about this two episodes earlier: basically, how does a person with high math skill think about people who do not have high math skill? And I think that this is just very difficult for me.
01:19:28.10 rooooon Um.
01:19:33.78 rooooon Man.
01:19:40.35 rooooon Ah.
01:19:46.00 rooooon Um, it's a good question. Ah.
01:19:46.10 cactus chu Um, but yeah, basically, okay, so let me just give you room to kind of freestyle on this: what do most people think machine learning is, and what is machine learning actually?
01:20:03.75 rooooon Let's see. I actually have no understanding of what the public thinks machine learning is. I don't even know what they think about computer science. Um, they probably think it's a lot more, like...
01:20:11.90 cactus chu Um, yeah, no.
01:20:21.67 rooooon Rule-based, and, you know, like trying to write lines of code that capture elements of reality. Um, which ironically is something that we're now trying to do a little bit, in terms of capturing inductive biases and stuff to make learning problems easier. But the majority of it, obviously, is framing really good optimization problems and letting the world's best optimizers run on them, and making soups of linear algebra work better. I don't know if that flowery language really helps, but yeah, the short answer is I have no idea what the public thinks about machine learning.
01:21:06.52 cactus chu Yeah, so here's how I would put it, and here's how I usually explain it to non-math people. Let's start with the idea of a linear regression, because this is usually easier for people to understand. And a linear regression is basically saying you have a...
01:21:11.87 rooooon Um.
01:21:17.58 rooooon And.
01:21:25.59 cactus chu You have a sneaking suspicion that there's a certain pattern between some variables, so that if you know, like, three things about someone, or three things about a certain thing that's happening, then you can figure out a fourth thing. And you can basically have widgets...
01:21:30.72 rooooon Um.
01:21:45.28 cactus chu You can have like sliders that you turn in order to give different numbers different weights, ah, to each of those variables that you take in, and then you can give a pretty good prediction of what comes out. And this actually solves a lot of kind of old-school statistical problems. This is how we do, like, public opinion polling, or how we guess who's going to win the presidency, for example. Um, and the way you normally get a regression, the way you get those widgets that you use, is you basically do some statistics and some algebra, and you get the best guess. Now, the problem with this is that it's very bad at having complicated things that all feed into each other. It's very bad at having, and you already know this obviously, kind of multi-layered phenomena. So how do we fix that? We fix that by going to something that looks kind of similar but also very different. You have a lot of widgets now, but instead of all these widgets just kind of multiplying together and adding together in order to get to your answer, you have multiple layers, and the widgets get fed into other widgets that have some necessary minimum requirement, and also have some kind of similar thing, where they take a bunch of widgets, they get a new value, and they pass that value on to the next widget. Um, and in order to get something that solves the problem with those widgets, ah, you use machine learning. And then what the machine learning actually does is it basically guesses a bunch of random widgets, it sees how that does, and basically you can use the thing that you're trying to maximize in order to change those widgets so that it does better. And then you do this over and over again, and you go back through each layer through some calculus, and you figure out how much we need to change, and then eventually you get to your answer. Um, how is that for the explain-it-like-I'm-five kind of machine learning breakdown?
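The "widgets" in this explanation are just weights. A minimal sketch of the gradient-descent idea in plain Python, with a made-up toy dataset (the data, learning rate, and step count here are all arbitrary choices for illustration):

```python
# The "widgets" are weights. A linear regression predicts
# y = w1*x1 + w2*x2 + ... + b, and gradient descent repeatedly nudges
# each weight in the direction that shrinks the squared error.

def predict(w, b, x):
    return sum(wi * xi for wi, xi in zip(w, x)) + b

def train(data, lr=0.01, steps=2000):
    n = len(data[0][0])
    w, b = [0.0] * n, 0.0            # start from random-ish (here: zero) widgets
    for _ in range(steps):
        for x, y in data:
            err = predict(w, b, x) - y   # how wrong is the current guess?
            for i in range(n):           # nudge each "widget"...
                w[i] -= lr * err * x[i]  # ...by the gradient of the squared error
            b -= lr * err
    return w, b

# Toy data generated from the hidden rule y = 2*x1 + 3*x2 + 1:
data = [([1, 0], 3), ([0, 1], 4), ([1, 1], 6), ([2, 1], 8)]
w, b = train(data)
print(w, b)  # ≈ [2.0, 3.0], 1.0 — the hidden pattern is recovered
```

A neural network stacks layers of these weighted sums with a nonlinearity in between, and backpropagation applies the same "nudge each widget" update through every layer via the chain rule — the "back through each layer through some calculus" step above.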
01:24:03.30 rooooon Yeah I think that was pretty good I think you have thought more about explaining it to beginners than I have.
01:24:10.78 cactus chu Yeah, so I had, not a lot of time, a few years, I had a few years teaching basically high school students computer science. Um, these were very good high school students, it was a contest class, so it's basically, like, ah, um...
01:24:18.41 rooooon M.
01:24:29.80 cactus chu Yeah, just a very advanced-level class of high school students. But yeah, I had some experience doing that. But yeah, I basically think what's happening in media, though, is that you have these problems, and they are, like, very...
01:24:34.99 rooooon Nice. That's really cool.
01:24:49.49 cactus chu They are like complicated problems conceptually, right? Or some of them aren't, right? But some of them are very complicated questions conceptually. Like, you can't just tell someone how to play chess, it's not an easy thing to do. Um.
01:24:54.30 rooooon Yeah.
01:25:04.78 rooooon Right.
01:25:08.37 cactus chu But there are certain types of problems where basically there is a lot of hidden information, and you can just grab that, you can kind of grab it for free, if you have, like, very advanced statistical methods. Or, actually, not advanced at all, literally the opposite: basic statistical methods, plus just a lot of patterns, or a lot of ways to get patterns, um, to feed into it. Um, and I think what happens here is that people kind of overestimate themselves, right? This gets back to what you were talking about, about how a lot of people are non-agentic.
01:25:47.66 rooooon Yeah, yeah.
01:25:53.28 rooooon Yeah, well, I mean, yeah, I don't want to dehumanize people and say that AI is better than them or whatever. But for sure, we all know that there are many, many tasks that humans do today that probably are extremely simple. And, you know, there are people out there that copy-paste information from, like, one screen to the other, right? Like from a web application to an Excel spreadsheet or something. Someone in the Philippines is being paid to do that, for sure. And we all know that that's not even the best use of that person's cognition, like, it's not a hard task, we can get machines to do it, of course. But then there's this natural inclination to defend the job, right? Like, oh, but this person, it's their livelihood, they've worked so hard to get good at it, it must not be that easy to solve, because a human does it. But yeah, it's just not how the world works. Like, we don't make humans do the things that are super hard; we do things that are useful to us en masse, right?
01:27:18.60 cactus chu Yeah, I think that's about right. Although, here's the thing: there's kind of a weird horseshoe, or like a weird U curve, of what's optimizable. Or, I guess, the opposite: it's like a reverse U curve, like an N curve or whatever, where there's a bunch of, like...
01:27:22.56 rooooon M.
01:27:38.80 cactus chu There's a bunch of things that you don't expect to be super optimizable, or super automatable, that actually are, right? There's a ton of high-class jobs, upper-class jobs, that are like this.
01:27:40.93 rooooon Yeah, right? Yeah and it's interesting because you know I think in all of those cases. It's because we find that intuition can be. Like an individual human intuition is pretty good but often does not stand up to the test of like massive scale big data on neural nets intuition right? like a good example. The toy example is just go right? The board game. Go.
01:28:17.63 cactus chu Yeah.
01:28:19.36 rooooon Which is computationally intractable. You can't solve it with classical branch-and-bound methods at all, alpha-beta pruning, whatever, it doesn't matter, I don't want to get into that. But, um, even if you talk to Go masters, like the best of the best, they kind of say, you know, this move felt right to me or something, because they've seen enough games that they've trained their internal neural nets to predict the value of a certain move. Like, this move feels aesthetically good to me.
01:28:50.11 cactus chu Yeah.
01:28:54.85 rooooon Why, they don't even know. I don't know, like, I can't tell, obviously, as an amateur, but they can't tell, the commentators can't tell, and then, like, 20 moves later it turns out to be brilliant or something. Um, and we all have those moments in real life, right? Where we do things that feel right and they end up being right, and it's based not on logic or language reasoning or deduction, but simple skills that our ancestors have picked up and then we've built on, and, you know, what evolution has gifted us. Um, and those kinds of things seem fairly easily beatable if you have the massive dataset that you need. Like, a great example would just be, I don't know, pricing problems, right?
01:29:51.64 cactus chu Oh yeah, I'm I'm just expressing interest in the actual in the actual content but sorry go on.
01:29:52.95 rooooon Hotel pricing. Um, yeah, can you hear me? yeah.
01:30:01.30 rooooon Right, right. You know, in the previous day and age there would be people like Don Draper in a boardroom selling ads from a client to a customer, or like an ad buyer to an ad seller. Um, they'd have a marketing campaign and be like, this is how much it costs, and the newspaper guy would be like, okay, this is what it's going to cost to take out, like, an editorial page, or say a classified page or whatever. And it's all their experience that they've used to come to reasonable heuristics about what something should cost. But the algorithms do it much, much better at scale, right? When they're selling, I don't know, like, a trillion ads a year, whatever crazy number it might be, into Facebook newsfeeds. And they get live data and they track all of it, and it turns out that at that scale, um, there's no human intuition that isn't going to be beaten by large heuristic models, right? Or large deep learning models or whatever. Um, and there's many things like that: high-class jobs that rely on the intuition of some person that are going to get replaced.
01:31:26.76 cactus chu Yeah, I think that's right. Actually, here's another kind of long-arc narrative that I have, which I call the twilight of data. Which is: basically, right now we have a lot of jobs where, like you said...
01:31:35.45 rooooon Okay, ah.
01:31:42.68 cactus chu People are manually pasting and doing simple processes on data, and then they're getting paid fairly high wages to actually do this stuff. And here's the problem: these are not just optimizable in the future, they're optimizable in the present.
01:31:44.58 rooooon Yeah.
01:31:51.13 rooooon Right.
01:31:58.64 rooooon Um.
01:32:00.31 cactus chu So if you just look at the curve of people working these menial data jobs, I think what you're seeing is that the historical trend is just going steadily up and up and up, and that's even accelerating a bit, and then you're just going to have a peak, you're going to have a huge crash. And those people are going to be well-connected...
01:32:12.29 rooooon Yeah, yeah, yeah.
01:32:19.93 cactus chu People are going to be kind of a real political constituency, but they're going to find themselves not even just out of a job, not just individually out of a job, but having, like, entire...
01:32:27.70 rooooon Are.
01:32:37.96 cactus chu Entire specializations or entire areas just deleted off of the map, and we're kind of on the precipice of this. Do you think that's... yeah, go ahead. Do I think it's happened several times throughout history?
01:32:40.56 rooooon Ah. Yeah,, don't you think think it's already happened though happened several times or.
01:32:57.63 cactus chu Um, I don't think to that scale. I don't think you've had a situation where, like, a class of... here's the thing, right? You've had situations where people are losing their jobs because of competition from automation.
01:33:14.60 rooooon Um, yeah, yeah.
01:33:15.80 cactus chu I think that's happened plenty of times: Industrial Revolution, so on and so forth. I think that's happened, but I think that there were kind of equivalent jobs in the same class, in the same kind of style, or the same kind of specialization.
01:33:30.36 rooooon Well, I don't know about that. The Industrial Revolution, for example, didn't just destroy one type of job, it destroyed every type of job. It's like, no, you can't sew clothes at home and sell them, the economies of scale just don't work. It put all those people out of business with cheap synthetic materials and giant factories and whatever. And it wasn't just textiles, it was also energy and manufacturing and a hundred other things that were being handled by manual, low-scale labor, and they had to completely change their way of life, probably, to become these industrial urban laborers. So it was a massive shift, it wasn't small. So.
01:34:23.41 cactus chu Yeah, it was a massive shift, okay, I can see this as a reasonable... yeah. The main distinction here for me is that you basically have this group of people who are created out of...
01:34:29.35 rooooon But um.
01:34:41.13 cactus chu Not right away, but over time, kind of created out of the ether, out of the void. This is a very counterintuitive thing, to have this kind of repetitive mental work, and there have been large parts of time where it's just not been a thing. Um.
01:34:41.58 rooooon Um, yeah. Yeah, yeah.
01:34:58.14 cactus chu Or at least been a very, very small minority, nowhere near the extent that it is today. And so we've kind of created this thing out of whole cloth, and we're kind of just going to delete it soon. Um, and you ask yourself, what is something that can be substituted for that, right? Um.
01:35:00.95 rooooon Right.
01:35:08.43 rooooon Yeah, you mean like the ah.
01:35:17.50 cactus chu And I think the answer to that is much less satisfying than an answer to, like, what can be substituted for, kind of like, a craftsman wood carver, right? I think that person can actually find different things, doing, like, construction. I mean, still, very high-end artisans still exist, but let's say you're a mid-level one: you can do construction, there are still things that can be substituted, right?
01:35:43.62 rooooon Well, it's like, it's post hoc reasoning, you know? Like, I'm sure that the artisan at the time did not feel that way. But also, like, I'm trying to push back against your specific argument, but I don't think that that means that...
01:35:48.55 cactus chu Yeah, it's a fair critique.
01:36:01.40 rooooon The advent of AI will come without problems. Okay, so it's true that I think probably everybody that wants to be employed will still be able to get a job in the future, right? Because...
01:36:14.80 cactus chu Mhm.
01:36:16.39 rooooon Like, this is the David Ricardo, the Ricardian trade principle: as long as, you know, there's a comparative advantage in something. Like, if the AI, and the company that owns the AI, does job A a million times better than you, but job B only a thousand times better than you...
01:36:19.65 cactus chu Yes.
01:36:36.33 rooooon Then there's still going to be value in you doing job B, because there's not infinite resources, infinite compute, substrate, whatever. Um, if there is, then I don't know, I'm not sure, but then you're in real post-scarcity and it doesn't matter anymore. But, um.
01:36:38.71 cactus chu Yeah.
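Roon's Ricardian point can be made concrete with a toy calculation (all numbers here are made up for illustration): even when one side is absolutely better at both jobs, total output is higher when each side specializes according to comparative advantage.

```python
# Toy comparative-advantage example (all numbers made up).
# The AI is a million times better at job A but only a thousand times
# better at job B, and each party has one unit of time to allocate.

ai    = {"A": 1_000_000, "B": 1_000}   # output per unit of time
human = {"A": 1,         "B": 1}

# If the AI splits its time across both jobs and the human idles:
ai_alone = 0.5 * ai["A"] + 0.5 * ai["B"]        # 500,500

# If the AI specializes in A (its comparative advantage) and the
# human does B full-time:
specialized = 1.0 * ai["A"] + 1.0 * human["B"]  # 1,000,001

print(ai_alone, specialized)  # → 500500.0 1000001.0
```

The human's contribution is tiny, but it is still positive, and specialization roughly doubles total output here — which is why, absent true post-scarcity, there is still value in the human doing job B.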
01:36:54.73 rooooon So these people will still have reasonably good ways to trade their labor for money, in a way that's productive for them and for the economy. But the question there becomes: do the shareholders of Google become so unimaginably rich that it's like they're not even in the same species anymore? The inequality is at a level where, you know, it's inhuman. People don't want to live in such a world, because we have quite a strong aversion to inequality, I think. So yeah, it's a gut-level reaction, um, where people just do not want to see it. Like, a homeless man in America is living like a king compared to medieval standards, but a shot of a homeless man next to the Salesforce Tower or whatever still stirs the hearts of many people.
01:37:42.80 cactus chu Yes.
01:37:52.20 rooooon Because inequality is the thing we're disgusted by, not poverty, you know. Um, poverty is almost...
01:37:57.95 cactus chu I don't know, though, I actually want to push back on that, because I think this aversion to inequality is kind of like racism, right? I think that we can fight it. I don't think it'll go away.
01:38:08.98 rooooon Me.
01:38:15.90 cactus chu But I think if we have incentive alignment, and we have an understanding, we have a system that's built up so that these emotional impulses are contained and are channeled...
01:38:28.73 rooooon Finished.
01:38:34.15 cactus chu I think that you can overcome it, in the same way you overcome racism, in the same way you overcome other kinds of impulses. And this doesn't mean that it all disappears completely. But I mean, I think the experiment is like, you hop back...
01:38:41.24 rooooon I disagree.
01:38:51.15 rooooon Are.
01:38:52.30 cactus chu In time. I mean, this is the classic libertarian thing: you hop back in time, you ask someone whether they would rather live in the present world as, like, the poorest person, and most of them would say yes, even if they were relatively high status then, right? So...
01:39:05.56 rooooon On.
01:39:11.13 cactus chu The question is whether that system can be created in time, or whether we'll get something like what you described.
01:39:21.59 rooooon Um, I mean, it's a good point, I go back and forth on this for sure. But my strong intuition is that, yeah, tribalism... racism, I would say, is a subset of tribalism. It's a very, very natural inclination.
01:39:36.71 cactus chu Yeah, definitely.
01:39:40.91 rooooon Um, and we're certainly not going to get rid of racism. We might get rid of race, but we're not going to get rid of racism. Um, but you can hack the sense of what a tribe is fairly easily in the modern world. Like, okay, you're black, but you're also a Democrat, so you're my friend, right?
01:39:54.66 cactus chu Yes.
01:40:00.89 rooooon Um, or something like that. A tribe is amorphous, and thank God, because there would be no civilization if it wasn't so easy to hack the meaning of tribe. But the thing about inequality is that it seems less hackable, more of a human fundamental. Um, like, you want to go to war with people more powerful than you. That's the only kind of story that ever sells, right? It's like underdog versus, you know, incumbent. You don't want to hear a story about how, like, fucking Google refactored their ads division or something, it's too boring, because they've already won. You want to hear a story about how Facebook, as an upstart, beat Google and became its own power or something like that, because that's the only kind of story that, I don't know, is entertaining, resonates with people: the change in the balance of power. And I think that's deeply tied to disgust with inequality, you know. Um.
01:41:19.10 cactus chu I think there's kind of 2 things happening here, right? Which is that there's inequality that gets resolved due to competence, that gets resolved due to like 1 person literally being better. Like, Facebook didn't win
01:41:28.62 rooooon Um, yeah.
01:41:36.44 cactus chu Because it did some kind of redistribution, right? Facebook won because it was just better. Um, and it's kind of the same thing with all of the best stories. Like, what are all these sports stories? It's someone who's initially worse, right? Um.
01:41:40.73 rooooon Right.
01:41:52.88 cactus chu It's like Rocky, right? He trains, he goes through the grind, he earns it, he earns it, and by the end he is better and he wins because he is better, right? And I think that's
01:41:56.57 rooooon Um, the.
01:42:04.22 rooooon I Mean it's.
01:42:07.67 cactus chu Starkly different from the impulse to be against inequality in kind of like real life.
01:42:12.50 rooooon Yeah, I guess, but it seems to me like the competence factor is like an afterthought, and in the vacuum of that element of the story, people fill things in. They're like, oh yeah, Elon Musk only succeeded because
01:42:25.27 cactus chu Yes.
01:42:30.20 rooooon His dad had like an emerald mine or something, and, you know, they're willing to backfill that part of the story as long as you have the underdog conflict, like the poor man versus the rich man. Ah, so, I guess I don't have an answer to your question, but I don't think it's that simple.
01:42:54.59 cactus chu So what do we do about it? Do we just say it's inevitable?
01:42:59.10 rooooon Um, I think that the ruling class of whatever it might be at the time, the country, humankind, whatever, is going to have the same kind of disgust inclination about their own inequality that other people do, you know. And that might be weird and controversial, but like, almost a shocking number of billionaires turn to philanthropy, right?
01:43:35.90 cactus chu Yeah I mean what else are you going to do.
01:43:37.57 rooooon Like, what else are you going to do? And I think that the people most likely to develop strong AI today are quite dead set on things like UBI and redistributing the, you know, the spoils of post-scarcity AI. Like, I've talked to Sam Altman personally about this, and he spends a nontrivial amount of time thinking of ways to distribute money to people en masse. Um, and I was really surprised by that. I was like, Sam, surely you don't think that the payment rails are the most important thing to think about right now? But he's actually thinking about methods to pay humanity at large. And so I don't know, I think that redistribution is going to happen at a scale that prevents, I don't know, the worst. Ah.
01:44:45.58 rooooon I'm not entirely sure, but I think that the people in control of these AIs are aligned enough to do things like UBI, if that makes sense.
01:44:56.93 cactus chu Yeah, the problem is, do they really understand human nature, though? Like, are they really going to do this in a way that actually works, that doesn't just piss people off more?
01:45:09.32 rooooon Ah I don't know it's a great question many X factors.
01:45:16.37 cactus chu Because, yeah, this goes back to the point about modeling different people, right? I think that the variance in human psychology is just so great that it's just.
01:45:23.39 rooooon And. Rather.
01:45:31.55 cactus chu I don't know, I wouldn't trust myself to do this, for example; I wouldn't trust myself to be able to design something like that. Um, and there's also this problem of selection bias: all of these smart, independent-minded people are all surrounded by other smart, independent-minded people, or at least friends with those people.
01:45:32.86 rooooon Yeah.
01:45:40.98 rooooon Um, yeah, yeah.
01:45:51.30 cactus chu And so you get a really distorted view of humanity. Like, I remember there's this XKCD comic about everyone on a bus thinking, oh wow, all these other people must be sheep, but I think that's just reflecting the author's bias.
01:46:03.17 rooooon Um.
01:46:09.12 cactus chu Because when you actually interact with people, like, no, that's not what they're thinking about, and they're not thinking anything interesting. When you actually interact with normal people, like, I just grew up in a very average environment, I think, most people are not like that. Most people are not as interesting as the XKCD author.
01:46:20.00 rooooon Ah.
01:46:28.18 cactus chu Right? It is his bias that he actually knows all of these interesting people; most people do not.
01:46:33.51 rooooon Yeah, ah, but I guess, what is your point? You can still enjoy the rewards of post-scarcity without being interesting, right? Like, I have this friend Anton, he had a great quote
01:46:42.90 cactus chu Yeah, but I think the the problem is like.
01:46:51.81 rooooon The other day, a great, great tweet. It was like: I think most people aren't really good at anything and don't want to be, but I also think that nobody should have to work to live. You know, it's kind of a weird mix of right-wing and left-wing thinking that I resonated with, for sure. Like, you shouldn't need to be interesting to get a ton of resources to do whatever the hell you want with.
01:47:18.90 cactus chu Yeah, I mean, I wouldn't even say the first thing is right-wing. It's kind of like saying climate change is left-wing, right? I mean, statistically, in terms of who supports it, yeah, but it's also just true. Um.
01:47:31.36 rooooon Yeah.
01:47:37.38 cactus chu But I don't know, here's the problem, right? And it's this problem of inequality. I'm not sure that people will just be satisfied with having more resources. I think there's a serious problem of people just being against a world where they are basically deprecated. I mean, this is very far out. This isn't even short-term ML gains, right? This is long-run ML gains. But like, they just.
01:47:58.96 rooooon Ah.
01:48:09.56 rooooon But what do you mean, deprecated? I guess, they're no longer useful?
01:48:15.25 cactus chu Like you said, they don't have anything particularly useful to do.
01:48:19.94 rooooon Well, I don't know. I mean, I think even today there's a lot of people that feel like they're not useful, but we seem to be able to invent a lot of work for ourselves. And like I said, if they want to trade with the economy, they always will be able to; there is always going to be something you can do that makes money. And this is also a distributed trait, right? Like, how much people feel the necessity to work, or the need to provide, or something like that. There's a lot of people that just don't care about that at all, you know; they'd rather sit at home and play video games or whatever. It's not universal that someone feels a need to have a job and feels a need to be the best at something, or strive for excellence or whatever. A lot of people are happy going to parties. Anyway, the people that want to get jobs, I think, will still be able to get jobs. That's my point.
01:49:36.12 cactus chu Yeah, I think that's fair, I think that's fair. I don't know, what's your outlook on Gen Z?
01:49:46.24 rooooon I think that they're really weird, in a good way. You know, they grew up with the internet, and the internet is naturally balkanizing, generating subcultures and turning people into strange corners, and I think that's a good thing. Like, I would like to see an explosion in the schools of thought and the weird communities and subcultures and ways of thinking that exist in the world. Um, a lot of people say that the internet is totalizing and shrinks people into certain legible buckets, right, like right-wing, left-wing, everyone getting exposed to a new media environment on the internet. But my experience on the internet, and I don't know if it's unique, has been that I get sucked into these weird little niche subcommunities that have maybe a few thousand people in them, that all deeply care about something or other, or have a social style that's super similar, and it evolves in a way that makes them almost unintelligible to the outside world. Like, you have your niche jargon and your weird in-jokes and your cultural shibboleths and whatever. Um, and I think that's happening to Gen Z at a large scale, and probably I'm excited for them. I mean, I am Gen Z, by the way. So, ah, yeah.
01:51:25.80 cactus chu Yeah, I think this is exactly right. So something I've said a lot is that basically Gen Z is the divergent generation. Any kind of coherent trend that you're trying to
01:51:37.72 rooooon Oh oh.
01:51:44.24 cactus chu Have across the entire cohort, the variance is going to go way up. Just because, I think this is like a Thielism, but freedom, variance, and inequality are like all the same thing.
01:51:47.43 rooooon Um, yeah.
01:52:00.25 rooooon Yeah, agreed.
01:52:03.97 cactus chu The more freedom you have, the more ability you have to get addicted to porn, or the more freedom you have to start an internet company and become absurdly rich. So like.
01:52:09.22 rooooon You could.
01:52:15.12 rooooon Absolutely.
01:52:20.38 cactus chu It all depends on priorities, and it all depends on kind of inherent ability, and so you're going to see the tails of these things just spike all of the way.
01:52:21.86 rooooon Okay.
01:52:27.64 rooooon Yeah, yeah, and I don't know, absolutely: more freedom, more inequality, more variance, more everything. That's what's happening. Like, you can see it in media, for example. I mean, movies.
01:52:44.50 cactus chu Yes.
01:52:46.78 rooooon Um, where the Pareto outcomes have become extremely strong. You know, like 1 media conglomerate controls 90% of the movies played in theaters; it's all Marvel blockbusters or whatever. But on the other hand, there's this explosion of amateur content, where, you know, each TikTok will get viewed by like 10,000 people or something. Um, I mean, not really, but my point is that successful TikTokers will get a decently sized audience, right? Right.
01:53:17.69 cactus chu Yeah, you can go from zero to a hundred incredibly quickly.
01:53:24.70 rooooon Um, so yeah, like, I don't know, isn't this interesting, right? These dominant cultural forces become more dominant due to freedom, but then there's this absolute proliferation of tiny outfits, right?
01:53:40.88 cactus chu Yeah, I think what's happening here is that there was this quote, I forget who it's from, which is basically that all of our intellectual interests are different, but all of our kind of base interests are the same. And
01:53:43.27 rooooon Oh.
01:53:56.81 rooooon Um.
01:54:00.57 cactus chu I think that's actually very false. I think people's base interests are actually super different; it's just hard to inspect them, in the same way that it's hard to inspect your intellectual interests. But with something like social media, you actually can do that, and you can do that basically even faster than you can just do the kind of
01:54:03.21 rooooon Um.
01:54:18.76 rooooon Yeah, yeah.
01:54:19.42 cactus chu Mass-market appeals, and that means that you're going to have, like you said, these communities, these shibboleths.
01:54:25.96 rooooon Um, yeah, for sure. I mean, I think you're right. But can you explain the phenomenon of Marvel? Like, what is it doing? Why is it so successful? Does it have lessons to be learned for the media environment at large?
01:54:50.10 cactus chu See, I don't like Marvel, so I don't know. I think Marvel movies are lame.
01:54:53.16 rooooon You're not a movie guy, I know, but I don't like them either. But we can still analyze them for why... I don't know, I don't think so.
01:55:00.28 cactus chu Maybe it's because we're too much of shape rotators. No, like, really, I think a certain kind of numeracy makes... or like, that's not even true, that's not even true, because there are tons of numerate people who like them. I think it's just a certain type of, like.
01:55:08.68 rooooon I think I'm a wordcel actually.
01:55:13.22 rooooon Ah.
01:55:24.72 rooooon I actually.
01:55:25.28 cactus chu Like, I think there's a correlation. I don't think it's complete, but I do think there's a correlation between liking numbers and stats and stuff like that and not liking these kinds of narrative arcs.
01:55:36.91 rooooon Oh, no way. Um, I disagree, actually. I think that engineer-brain types actually really prefer simpler media. Like, Elon Musk loves The Colbert Report or whatever.
01:55:41.65 cactus chu Interesting.
01:55:51.96 cactus chu Ah, names.
01:55:56.15 rooooon No, it was Jeff Bezos who really loves The Colbert Report or whatever. And whenever you look into what kind of media super-rich people are watching, it's always trash. You know, you're like, really? You like that? Um, and.
01:56:07.50 cactus chu What.
01:56:12.66 rooooon I think that the analytical mindset actually does not... um, I'm not sure where I'm going with this, but I do think that numerate people dislike narrativizing.
01:56:20.29 cactus chu Okay, and I don't think either of us have any clue what we're talking about on this topic.
01:56:31.16 rooooon Like that's something I've noticed for sure. Um, but they also do seem to like fun bullshit like Marvel I don't know.
01:56:38.58 cactus chu Okay, I'm going to ask Steve Hsu if he ever comes back on, um, because this seems like something that he would have much better insight on. But yeah, I just don't know; this seems like a complete question mark to me.
01:56:47.84 rooooon Me.
01:56:55.73 rooooon I've noticed that people that have narrative skill do disproportionately well in Silicon Valley, because, um... yeah, yeah, but like.
01:57:03.65 cactus chu Yes. I mean, what do you mean, disproportionately? Like, it's a skill. What do you expect?
01:57:13.40 rooooon I think I have some narrative skill, but if I went to LA and was like, I'm going to write movies now, I don't think I'd do particularly well. I'm not that good of a writer. Okay, but you get my point. Like, let's say I went to Brooklyn and I'm like, I'm going to become a new media guy.
01:57:14.95 cactus chu Is it.
01:57:22.72 cactus chu Yeah, because the industry is just not doing well.
01:57:32.71 rooooon There's a lot of very skilled writers in that world, right? Um, I would not do well. But that's my point: like, why are they so different?
01:57:34.59 cactus chu Yeah, the markets are just different. Okay, okay, because tech is a lot more profitable than whatever they're doing.
01:57:48.24 rooooon But that doesn't explain it at all. Like, why are people that can write about technology, that can communicate the ethos of Silicon Valley, so rare?
01:58:01.15 cactus chu Because you have to have 2 things at the same time, right? It's just probability; they're just not correlated. So it's like... are they, though?
01:58:05.91 rooooon Um, yeah, yes. But my point is they're actually anti-correlated, I think. So.
01:58:20.15 cactus chu I don't know. Okay, here's the effect that I think is happening: there's kind of an adverse selection, which is that if you have tech skills, it doesn't matter how good your verbal skills are, because you'll just make more money doing tech, right?
01:58:21.23 rooooon Yeah.
01:58:29.91 rooooon Ah, my boy.
01:58:37.40 cactus chu So for example, I think I'm someone who has very above-average wordcel skills for someone who does machine learning, right? And I think you are too, actually. But let's face it, both of us make more money in machine learning. So like.
01:58:40.22 rooooon Um, okay, but I also like to write tweets online, like, half my waking day. I don't know about half; I'd say maybe
01:58:53.13 cactus chu Yeah.
01:58:59.77 rooooon Like three, four hours a day. But my point is, I clearly enjoy it a lot. There's enough time in the day to do it if you really want to. Um, and you clearly have a podcast going and a media circuit or whatever, though you have a day job as well.
01:59:07.50 cactus chu Yeah.
01:59:16.18 rooooon There's not many people that want to do things like that and especially not many that are successful right? like ah like um, do you know? Do you know?
01:59:24.40 cactus chu Well, yeah. Well, you ask how many people doing journalism jobs, especially legacy journalism jobs, are doing it because they really, really care, as opposed to just kind of
01:59:38.36 rooooon Ah.
01:59:42.63 cactus chu Getting put on this path from college or whatever. And I think that number is actually fairly low. I think they'll say they care, but you can run the experiment; you can run a kind of longitudinal study, and I would expect at least
01:59:43.20 rooooon Yeah.
02:00:00.98 cactus chu That you'd find that that's not really the case. But the problem with this is that the people who really care are almost by definition going to be outliers, right? They're going to have a reason they really care, and so they're going to look like outliers as well, right?
02:00:01.48 rooooon Okay, you should run the study.
02:00:12.45 rooooon Um, ah you mean.
02:00:20.62 cactus chu Even you or I, who are not at the level of an AGM, Antonio Garcia Martinez, or like a Balaji Srinivasan, right? Who are super successful in both tech and, um.
02:00:25.18 rooooon Yeah. Move.
02:00:38.13 cactus chu In writing or communication. Even someone like us, I would say, is still relatively rare, and that's just because the base rates of these things are low, like caring a lot. Like, you have to have 3 things at this point: you have to have the math skills,
02:00:38.26 rooooon Mean.
02:00:44.40 rooooon Me.
02:00:56.54 cactus chu You have to have the verbal skills, and you have to care enough to do this basically on the side.
02:00:57.99 rooooon Yeah, yeah. And I don't think that our verbal skills are as insanely rare as theirs. Like, the people that write long posts inside of Facebook or Google or whatever, like product managers who are analyzing some aspect of their business and write a long report about it: that's clearly verbal skill. There are many people, managers and product people and whatever, that are quite good at that. But that's not the same as narrative, or storytelling, I guess. I'm not sure. So, yeah: you definitely have to care a lot, you definitely have to be good enough at, like,
02:01:38.36 cactus chu Yeah.
02:01:49.60 rooooon Technical skills that you are embedded in the tech industry. You have to have an active interest in telling the story, which is rare. Um, and yes, it's a lot of criteria. But I think my point is that the covariance between several of those skills is low; they're actually anti-selected for each other. So I'm not sure.
02:02:18.46 cactus chu I don't know. I'm pretty sure from the intelligence research that verbal and geospatial, or the kind of mathy side of stuff, are positively correlated.
02:02:33.36 rooooon It's not strictly about intelligence, though, you know. Like, there's clearly many people that have high verbal intelligence that could not make a presentation about what they do at work, or something like that. You know, so.
02:02:37.13 cactus chu Yeah, the.
02:02:45.25 cactus chu Right, right. Yeah, it's kind of like extroversion, although I don't think either of us is particularly extroverted. Ah, yeah, it's kind of like the presentation, like the art of this.
02:02:53.50 rooooon No, it's not even that it's something else. Yeah.
02:03:02.90 cactus chu Like it's art, or like the sales, right? It's part of the deal. It's like the Trump thing: it's charisma. Yeah, that's the word. Yeah, I don't know, for learning more about that, there's kind of a podcast, or like a YouTube channel, about charisma, something like that: Charisma on Command.
02:03:02.31 rooooon Um, man maybe.
02:03:19.32 rooooon Yeah, yeah I don't think I'm particularly charismatic maybe on the internet but not in real life.
02:03:21.40 cactus chu I should probably get those guys on. I don't know, they seem to be pretty interesting. Yeah, I don't know, this seems like something you can change, though. I think I went from not being charismatic to at least being fairly charismatic, just from learning to speak English properly, which I did through doing a podcast and going on Clubhouse.
02:03:31.50 rooooon Perhaps.
02:03:46.30 cactus chu Like, I'm not even joking. I mean, I knew how to speak English, but I didn't know how to speak English well, and now I think I'm a lot better. You can just look at episode 1 of my old podcast, Metapolitics, and it's just words; it's shape rotation
02:03:46.90 rooooon Mmm nice.
02:03:57.38 rooooon Yeah.
02:04:04.50 cactus chu But with words. Like, if you took shape rotation, you took the output of that, and you just tried to put it into a sentence.
02:04:13.60 rooooon Yeah, interesting. I personally don't think I will ever be top 10% at speaking like that. I wouldn't even bother trying. I think maybe at writing and like a certain subset of writing. But yeah, anyway, do you want to do this?
02:04:28.31 cactus chu Ah, you underestimate yourself, you underestimate yourself. Like, I'm going to put out the clips from this podcast, right? And I think they're going to do well.
02:04:33.00 rooooon Perhaps, probably, but for the content, not for the delivery, I assume. Anyway, do you want to do the ML lightning round? That sounds fun.
02:04:41.33 cactus chu Oh yeah, that's true. Oh yes, yeah. I just added this because I thought, wait, this is a great idea, I should do this, because I did a lightning round with a previous interviewee as well. So here's how it's going to work.
02:04:56.73 rooooon These are usually fun.
02:05:01.10 cactus chu Yeah, we're gonna finish it off with this, and we can do the explanation after. For each of these different things, I'm going to ask: is it easy, medium, or hard for machine learning to do? So, a 1-word answer: easy, medium, or hard, and then we can explain it after. Are you ready?
02:05:06.93 rooooon Um.
02:05:17.67 rooooon Sure, I'm ready. It's hard. Ah, that's really hard, and it's not an ML problem, I think.
02:05:20.24 cactus chu All right. How hard is driving? Finance trading?
02:05:33.75 cactus chu Coding writing code.
02:05:35.10 rooooon Medium, I would say; I think we're getting pretty good at that. That's actually easy; I think the human face displays a lot of emotion, and it's pretty simple too.
02:05:39.29 cactus chu Understanding emotion.
02:05:53.25 rooooon Gather.
02:05:55.30 cactus chu Treating emotion or changing people's emotion.
02:05:57.19 rooooon Medium. I think that language models, if they're not already able to, will soon be able to change people's moods by something that they say, or move people with a piece of text or a story or something. I know for a fact that something I co-wrote with GPT-3 made me a little bit emotional, so it's already happening.
02:06:23.21 cactus chu Ooh, shoot, that's too many words, man. Um, too much wordcelling. Um, but how about marketing, or like making materials for marketing, making graphics and stuff?
02:06:32.20 rooooon I
02:06:36.30 rooooon Easy.
02:06:39.32 cactus chu Writing math proofs.
02:06:42.32 rooooon Ah, that could range across the entire spectrum, based on the proof.
02:06:48.22 cactus chu Yeah, that's fair. Making novel math proofs, okay. And 1 more, 1 more for the lulz: central planning.
02:06:52.25 rooooon Medium to hard.
02:07:01.52 rooooon Fairly easy actually.
02:07:04.44 cactus chu Okay, those are a lot of answers and we can talk about any of them. Which ones are you excited to talk about? Awesome. Let's talk about central planning.
02:07:11.10 rooooon I want to talk about central planning. Um, I think that several big corporations today are centrally planning the economy in a way that would have made the Soviets salivate. Ah, I think that Amazon is... forget Amazon, it's too sexy. Even Walmart is a logistics delivery behemoth.
02:07:31.60 cactus chu Yes I have exactly the same take awesome. Okay, go on explain your end.
02:07:49.37 rooooon It delivers goods across the world, prices them well, knows exactly how much of each thing to buy, understands each individual item from where they buy it to where it enters a consumer's hands. They have minimal rates of slippage or theft or whatever. So it's clearly somewhat proof that large portions of the economy can be centrally planned, although I don't know if that satisfies the most hardcore definition of central planning. And they don't even need super hardcore AI algorithms, though they've become much better with forecasting and things like that. Um, even really clever... yeah.
02:08:37.28 cactus chu Yeah, I think the novel tech actually makes a lot of difference, especially for Amazon: just-in-time.
02:08:41.92 rooooon Yeah, oh, for sure. But they've been doing that for like 10 years. I would say modern AI really only started around like 2011, 2012.
02:08:53.99 cactus chu Wait, how recently have they been doing just-in-time, though? That's the question. I think the lower the latency, the more you have to be predictive.
02:08:59.21 rooooon Um, I don't know what you mean by just-in-time.
02:09:10.41 cactus chu Basically, there are like 2 components of this. 1 is, um, you have exactly the capacity in your warehouses and stuff like that, or in your transport, in order to fill things up; basically, there's very low slack. And number 2, that you can
02:09:18.62 rooooon Um, yeah.
02:09:27.72 cactus chu Get things right away you can have like same day delivery or whatever.
02:09:28.74 rooooon If you're talking simply about low-slack supply chains, those have been around since like the 70s, with Japanese manufacturing; maybe the 80s, I don't know. But the one-day delivery thing is clearly, yeah, an incredible feat of prediction technology. Yeah.
02:09:49.55 cactus chu Yeah, there's this book by, I think, a pretty far-left author, called The People's Republic of Walmart, that makes exactly this case. I actually talked to someone about this... this is like my most obscure interview ever; I interviewed someone, and like 10 people watched it, for my old podcast.
02:09:57.11 rooooon Earth. Ah.
02:10:06.55 rooooon Oh.
02:10:08.79 cactus chu Um, where this was one of the things we were talking about. And so you ask yourself the question: what is the actual big central planning problem? Um, it's about preferences, right? And this actually gets super dark. One is that you can just ignore people and keep a lot of political power, which is one of the normal critiques of central planning, and I think it's a true critique; I don't really disagree with it. But the other is that people's preferences are super malleable, and this is one of the dark things about it as well, right?
02:10:30.11 rooooon Um. Um, yeah.
02:10:47.32 cactus chu Which is that if you scale this, you can basically move a lot of sales by manipulating the media, and I think that's quite dark.
02:11:02.50 rooooon True. Yeah I don't disagree with any of that.
02:11:04.88 cactus chu Yeah, I was really excited to talk about the understanding-emotion thing, because I think a lot of people have the opposite take. Especially a lot of people who don't know that much about ML are like, oh man, it's going to be super hard for them to understand these things that they consider deeply human. Um.
02:11:16.49 rooooon Month. Yeah.
02:11:24.71 cactus chu So like actually once again like give your reasoning first and then I'll hop in.
02:11:27.89 rooooon Um, okay, so how does a human determine emotion? How do we do social reasoning? Yeah, exactly, it's instinct, right? Like, we take the gestalt of a human standing before us, and we have such crude,
02:11:33.12 cactus chu By Instinct. Basically yeah.
02:11:46.76 rooooon Low-dimensional ways of communicating emotion, right? Like, water will literally start falling out of our eyes; it can't be more obvious for certain things, right? Um, like our face will flush when we're aroused or something, or when you're embarrassed, any of that stuff; like, your eyes twist.
02:11:52.83 cactus chu Yeah, yeah.
02:12:01.76 cactus chu Have you actually ever seen that happen in real life? I've never had that happen in real life to anyone I know.
02:12:09.69 rooooon With really pale people it works better. But yeah, if they're really angry or something, their face will definitely get flushed. Um, and yeah, entirely possible. Um, but it happens even with them; it's just a little bit harder to see.
02:12:15.87 cactus chu Maybe it's because all my friends are East Asian or South Asian.
02:12:26.83 rooooon Um, and so, yeah, you know, it's not some sort of high-order reasoning problem. It's a simple instinctual problem. You can tell from body language, facial expressions, eye tweaks, etc., how a person is feeling. It's not the same as reasoning about moving bodies on the freeway or something; that's actually harder. Um, that's actually harder. Yeah, physics, not people. But, um, so I think
02:12:56.26 cactus chu And by bodies you mean like physics objects. Yeah yeah.
02:13:06.20 rooooon Understanding what people are feeling is probably fairly simple. Like, there's obviously a long tail of really complicated emotions that you wouldn't be able to understand without context. Um, but the simple stuff, for sure. But I don't think it translates to what a human does. Knowing what someone else is feeling is more interesting, right? Like, how does it change the way you approach a conversation? Are you going to be strategic in how you talk to someone because you know they're angry?
02:13:40.34 cactus chu Yeah.
02:13:41.92 rooooon Um, so that social processing is a lot more interesting and complicated than merely recognizing that someone is embarrassed or sad, or some combination thereof.
02:13:54.10 cactus chu Yeah, my take on it is just: ask yourself this question. Take a person who you kind of know, not especially well, or even maybe your wife or something like that, right? If you, the listener, have a wife or a husband. Ah.
02:13:57.91 rooooon Mm.
02:14:12.80 cactus chu And ask yourself: really, how often do you understand them, right? How often do you take in all of the information that you can, follow your instinct or whatever, and actually guess right? And how many times do you misunderstand them, or how many times do you have no idea? And I think that for most people
02:14:17.49 rooooon Yeah, yeah, right.
02:14:29.17 cactus chu the answer to "do I have any clue?" is: a small percentage of the time. And so that's not a high bar. Um, and that's basically my reasoning: we think that humans are good at understanding emotion, but really, they're not.
02:14:31.70 rooooon Right? no.
02:14:43.92 rooooon Yeah, the.
02:14:46.26 cactus chu And that's why I think that comparatively it will do a good job, and I think there are a lot of tells you can pick up on that most people don't.
02:14:50.55 rooooon They're not, yeah. Humans aren't great at empathy, but they're also surprisingly bad at understanding their own emotional valence. Especially, like, I think a certain kind of person...
02:14:59.93 cactus chu Yes, yes.
02:15:05.24 rooooon There are people that are more in touch with themselves or whatever, but I don't think I personally am. Like, I think I will spend entire days in a funk and then I'll take a walk outside and realize I didn't need anything except, like, fucking fresh air. Um, and I guess my point is that even human self-understanding of emotions is an incredibly crude process. I'm trying to remember the anecdote, but I think Tylenol will soothe emotional pain sometimes, and I think, um... no, no, these studies are placebo-controlled. But um, there's another thing where people, um,
02:15:47.80 cactus chu That's like a placebo man. Okay, okay.
02:15:59.73 rooooon really relax the tension in their upper body or something, and they find it's very hard to be angry. Like, the physiological response to anger is to tighten up, and those two things seem to be incredibly highly linked.
02:16:12.40 cactus chu Oh yeah.
02:16:19.19 rooooon So it's just, like, a fucking shit show in your head. None of it is really built logically, right? It's just ad hoc, you know.
02:16:26.15 cactus chu Yeah, so I had Robin Hanson, he was actually my first guest on the show, and he wrote a book called The Elephant in the Brain. The case in that book is basically that it's better for you not to know. There's a reason that these things are being kept from you.
02:16:39.40 rooooon Yeah, yeah, yeah.
02:16:43.51 cactus chu And part of it is to function better socially, and part of it is to, um, just not have to consider and handle as much information. And that basically there's a lot of benefit to this. So maybe this is like Pandora's box that you don't really want to open.
02:16:55.50 rooooon Yeah, that's interesting. I haven't read that book, but I really like Robin Hanson. I think he's a king of the internet autists for sure, I love him. Ah yeah, okay, like the internet...
02:17:00.53 cactus chu Yeah, it's a really cool book, it's a really cool book. How can that not be Elon Musk?
02:17:14.61 rooooon Like autist philosophers, maybe.
02:17:16.35 cactus chu I Still think it's you. Ah.
02:17:20.24 rooooon Nah, Elon is not a philosopher, no matter how you slice it.
02:17:24.70 cactus chu Okay, okay, fine. Um, all right, this is probably the thing that we have the most data on, or maybe that's finance, but: why is driving hard?
02:17:32.13 rooooon Um, driving is hard mostly because vision is hard, and because there's a shit ton of weird things that happen
02:17:45.91 cactus chu Yeah, that's fair.
02:17:51.20 rooooon on roads, you know? And it's such a long tail of things, right? Like, what do you even do? Maybe the plastic bag was not in the dataset, right?
02:17:55.99 cactus chu Yeah, you get a plastic bag flying across the road, like, what the heck do you do?
02:18:06.92 rooooon Now you have to recognize amorphous objects and reason about how they're going to collide with your car, whether it's a danger. You can't be slamming the brakes in the middle of the freeway because there's a plastic bag. Um, there's a visual reasoning aspect. There's this funny example of a truck carrying a bunch of stop signs on the back, like it's just literally transporting stop signs, and you'd think the car is not going to pick up on that. And actually, I think in this case it did, like the car didn't stop or something.
02:18:28.87 cactus chu Oh.
02:18:43.11 rooooon But it clearly recognized them as stop signs, and it was kind of a confusing edge case. Um, and then there's the social reasoning: okay, there's a human walking across a sidewalk, what is our risk tolerance that he's going to just start sprinting, right? There's also who goes first in this intersection without stop signs. It's like this herd behavior.
02:19:13.76 cactus chu Yeah, there's this kind of dynamical systems concept, right? Where the fact that you are a self-driving car changes how drivers might behave. Yeah.
02:19:20.65 rooooon Yeah, exactly. So it's a myriad of individual problems, and at their intersection is an even harder problem: doing all of this at once. And it's surprising how good we are at it, and getting better every day. So while I say it's hard, I don't mean unsolvable. I think we're really close, actually.
02:19:45.80 cactus chu Yeah, all.
02:19:53.36 cactus chu Yeah, I'll actually play the counterpoint, which is that I'm going to say the same thing that I said about the last issue. Look at humans, look at how often they crash, right? A car is never going to be drunk. A car is never going to fall asleep at the wheel.
02:19:57.65 rooooon Yeah, yeah.
02:20:08.61 cactus chu Ah, a car is never going to have a heart attack. A car is never going to do those random human things that are basically uncontrollable or unpredictable. Yeah, that's fair.
02:20:18.96 rooooon Right? But there are some random AI failure modes that are unpredictable in different ways, right? Like the lighting was wrong or something. Of course, that's the open question. I think at the moment the AI one is larger, but I think it's shrinking. It's shrinking as we speak.
02:20:26.95 cactus chu But which one is smaller? Like, really, which one is smaller?
02:20:33.27 cactus chu Okay.
02:20:38.20 rooooon And I do think the convergence is soon. But what Elon has been saying is that it needs to be 10 times safer before the regulators and the public will be comfortable enough to allow it, like, with full access rights.
02:20:53.70 cactus chu Yeah, I think I see that being the way things are going to go, maybe even more than 10 times. Yeah, I think that's very convincing to me, even though I kind of started closer to your position, probably. Um, okay, what about finance?
02:20:59.60 rooooon Um.
02:21:07.71 rooooon Um, yeah.
02:21:13.15 rooooon Um, finance is very rarely about, um... like, there is very, very little signal in stock trading data compared to the noise.
02:21:13.54 cactus chu Why is this super hard?
02:21:32.42 rooooon And the way most people make money in finance is not really this market microstructure stuff where you're analyzing the stock flows or something. It's more like, I don't know, they're betting on macro trends or something like that. And yeah, there are statistical traders for sure, like these prop shops, but the assets under management of those shops are surprisingly small. It's, um, I don't know, maybe a couple billion dollars across all of Jane Street and Jump Trading and all those high-speed shops. Um, so I'm just not sure. Yeah, like, I know that, um, Citadel has... is it Citadel?
02:22:23.32 cactus chu I don't know what you're going to say, so I don't know what the answer is.
02:22:24.88 rooooon I don't know, one of them. CQS or something, Citadel, anyway, one of them has an ML team, and I'm sure they make some money that way. But I think it's probably really specialized and not like a general thing. Um, I think that it's an adversarial system, and probably the money is to be made in reasoning about the future in macro terms and not micro terms. So yeah.
02:22:46.10 cactus chu Or.
02:23:00.10 cactus chu Yeah, I guess we just have different conceptions of finance. Because let me tell you a story-ish thing and get your judgment on it. So, something we learned recently, we kind of guessed that this could happen, but something that we basically learned for sure recently,
02:23:03.65 rooooon A.
02:23:14.46 rooooon Um, ah and.
02:23:16.86 cactus chu is that you can predict pandemic spread by looking at people's Google searches. Like, people will Google search for their symptoms, and you can tell where the pandemic is going to go. Now, maybe there are some smart people who did this, but I didn't know anyone famous who did this. But you can look at this Google search data, it's public, you can look at this Google search data and then you can just make bets
02:23:27.27 rooooon Yeah, you know. Yeah, okay, okay, that's super reasonable. Yeah, I agree with you. There are other cases where, you know, people use ML to do things like pattern recognition of farmland or something to predict crop yields.
02:23:35.93 cactus chu based on it, if you think this pandemic thing is going to happen, and you can reasonably make some bets on it. Yeah.
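The search-data idea being discussed here can be sketched in a few lines. This is a hypothetical illustration with made-up numbers, not anyone's actual signal: flag days where symptom-search volume is an outlier relative to its recent trailing window.

```python
import statistics

def spike_days(search_volume, window=7, z_threshold=3.0):
    """Return indices where volume is a z-score outlier versus the trailing window."""
    flagged = []
    for i in range(window, len(search_volume)):
        trailing = search_volume[i - window:i]
        mean = statistics.mean(trailing)
        stdev = statistics.stdev(trailing)
        # Flag only clear outliers above the trailing baseline.
        if stdev > 0 and (search_volume[i] - mean) / stdev > z_threshold:
            flagged.append(i)
    return flagged

# A flat baseline with one sudden jump: only the jump gets flagged.
volume = [100, 102, 99, 101, 100, 98, 103, 101, 100, 400]
print(spike_days(volume))  # → [9]
```

Detecting the spike is the easy half; as the conversation goes on to note, mapping a flagged spike to a profitable position is the part that requires actual reasoning.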
02:23:54.46 rooooon But I think what I really meant is that the reasoning it took to come up with that conjunction, like: if crop yields are down, we can utilize drones to look at crop yields via machine learning to see if they're down, and then take a bet on these specific futures because of that. That, as a reasoning problem, is what I don't see machine learning doing. I guess that was my point. Um.
02:24:24.15 cactus chu Yeah, but here's the thing, right? The bet is that you have a lot of free data, right? You have a lot of public data, you have a lot of these kinds of correlations, and all you need to know is the correlation, right? And you can
02:24:32.70 rooooon But a.
02:24:40.96 cactus chu set your ML algorithm on it and try to find these correlations. Um, and now I guess the question is, how dense are they, right? How much signal is there in the noise when you take this larger data set, not just the data set of stocks? Um, and, ah,
02:24:42.40 rooooon Sure.
02:24:48.52 rooooon Um, yeah.
02:24:58.54 cactus chu my intuition would have pegged this at medium, but maybe I'm wrong. I don't actually know what the answer to this question is. But here's the thing, right? All you have to do is find the correlations, and I don't think it should be that hard to find the correlations.
02:24:59.95 rooooon Um, okay, interesting. Yeah.
02:25:12.85 rooooon Um, um, the correlations in what exactly.
02:25:16.48 cactus chu Like, for example, in the search data example, it's just: when the search data goes up, these other stocks will go down, right? So you basically have, um, three variables: you have time, search data,
02:25:22.74 rooooon Yeah, yeah.
02:25:36.36 cactus chu um, and stock prices. Well, that's obviously more than three individual variables, but, like, yeah.
02:25:37.12 rooooon Yeah, I don't know about that. Again, I don't know about that, because with the stock price, the signal to noise there is almost nothing. You have to do more reasoning than that. You have to build a model of the finances of that company, and then predict some variable in the finances of that company because of something you gained with your ML data, and then make a stock prediction based on value investing or whatever. I don't think you can just say, here's a time series, optimize it or correlate it,
02:26:12.25 cactus chu Have.
02:26:15.53 rooooon because there's too much shit going on in that data.
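The signal-to-noise point can be made concrete with fully synthetic numbers (made up for this sketch, not market data): even when a series genuinely leads returns, the measured lagged correlation is small, while a naive same-day correlation shows essentially nothing.

```python
import random
import statistics

def pearson(xs, ys):
    """Plain Pearson correlation of two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    denom = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return cov / denom

random.seed(0)
n = 2000
signal = [random.gauss(0, 1) for _ in range(n)]
# Returns respond to *yesterday's* signal, but noise dominates ten to one.
returns = [0.1 * signal[t - 1] + random.gauss(0, 1) for t in range(1, n)]

lagged = pearson(signal[:-1], returns)   # predictor leads by one step
same_day = pearson(signal[1:], returns)  # contemporaneous, no real link
print(lagged, same_day)
```

In this setup the lagged correlation comes out small but reliably positive, while the same-day one is indistinguishable from noise, even though we know the generating process exactly. That is roughly what "almost no signal" looks like in practice.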
02:26:17.17 cactus chu Yeah, that's fair, that's fair. Okay, so here's the thing: we're dealing with so many unknowns here that it could swing either way. Because here's the reason why I think it could swing back to the easier end:
02:26:33.69 rooooon M.
02:26:35.81 cactus chu sometimes you just get big correlations. I think you just get very strong signals, even if there aren't too many of them. Like the pandemic thing, I think, is a strong signal that people didn't really realize, maybe a few people realized, and that I really don't think was priced in well. Like, it evidently was not priced in.
02:26:50.73 rooooon Yeah, but would knowing that particular spread inside, like, America have helped you predict stock prices? Like, which major stock? I see...
02:26:54.79 cactus chu Um.
02:27:02.71 cactus chu It wasn't even just that, it was between countries. So, like, we kind of knew that Italy was going to be next. Um, well, as in, the public didn't really know, but we could have figured that out, right? Now, maybe the alpha is gone from this now, but
02:27:10.49 rooooon Um, um, but yeah.
02:27:22.20 cactus chu maybe there are a lot of other things that are like this, that are just strong signals that you just have to notice. But once again, maybe... I don't know if this is actually true or not. Yeah, yeah. It was really cool talking to you, man.
02:27:23.71 rooooon But Bret.
02:27:30.62 rooooon All right? Yeah, all right. Dude, I am kind of tired, I think we should call it.
02:27:41.74 cactus chu Ah, I think this was probably the most chill podcast. Not a lot of kind of high-octane questions, but definitely a lot of ones that matter. Anyway, though, I think it was really great talking to you, man.
02:27:50.78 rooooon Ah yeah, it was good talking to you. Yeah, I wasn't in the mood today for a screaming match or anything, so this was a good convo. All right, man, anything else?
02:27:59.27 cactus chu Ah, yeah, awesome.