12 Comments
May 6, 2023 · Liked by Brian Chau

So is this like a "harm reduction" approach to relationship facsimiles? You can't get them off the drug entirely, but can at least give them clean needles.

Think I'm with Harrington.


The main difference is not *who* captures the profits of the parasocial-relationship competition (honestly, why would anyone outside that game care about that?), but the relative power of ParasocialAI versus anything manageable by real people.

Think drugs more than parasocial relationships: something where revealed preferences might not be real preferences, due to human weakness against adversarial inputs. For many people, drugs make their life worse in most ways, yet they are not free to get rid of them. Of course, some people can try a drug and easily stop, but most people never try drugs at all. Note that in the case of drugs, there are many reasons people don't try them (in addition to drugs being commonly understood as bad, *creating and dealing drugs is a crime* in most non-US locales), none of which apply in the AI case.

Now, to preempt the obvious reply: my values definitely cover the many people vulnerable to drugs, but even if your values do not extend to people too weak to fight off drugs, there's no real guarantee that ParasocialAI harms the same set of people. Chemical channels surely enable more control over the human body than audiovisual/text channels do, but AI can be optimized much harder; and it's also far easier to give someone a taste of ParasocialAI than to make them take a drug.

As for when any of this becomes relevant... who knows. I would bet a not-too-large amount of money that parasocial relationships are one of the easier things to be very superhuman at. Not sure how to formalize the bet, because there are large cultural incentives against anyone actually demonstrating this.


More inclined towards Harrington/Haywood's position than yours. Empathy that is freely given and not earned is pouring gasoline on a burning heap of self-pity. Make real friends who have a real chance to dislike you and force you to change certain cringe behaviours. Or else let those behaviours accumulate and fester in infinite regress with your AI empathy bot... Lol. Really guys, can we seriously not normalize this AI "empathy" crap?

To call it a parasocial relationship is even kind of a stretch. Listening to a podcast is a parasocial relationship because, at the very least, it's a real conversation operating under real conversational and social dynamics. Even if you are not actively participating, you are at least learning the conversational exchange by osmosis, so maybe in the future your System 2 might readily recall the patterns and make them available to you in a real convo. Having a conversation with an AI is pure self-masturbatory solipsism, with useful feedback approaching zero (although maybe those AIs with the nice filter turned off could prove a bit different). As Chesterton once said, no adventure is without a risk to your life. No empathy is worth having if there is no risk of rejection.


The danger of social media is the feedback. Showing Brian a video of Trump doing something Brian approves of, and seeing Brian give some minor positive feedback about Trump, teaches the AI how Brian can be manipulated. The AI has this data from a large number of people. The AI then feeds similar videos to Brian, and after some time Brian may develop positive feelings about Trump, and may actually vote for Trump over Biden in 2024. A toy sketch of this loop follows below.
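
For what it's worth, the mechanism described here is just a reward-maximizing feedback loop, and it's simple enough to sketch. Below is a minimal Python toy model (all names, numbers, and the sentiment dynamics are invented for illustration): a greedy recommender that only observes engagement ends up steering the simulated user's opinion, with no explicit "manipulate the user" objective anywhere in the code.

```python
import random

VIDEOS = ["candidate_positive", "neutral"]  # stand-ins for real content topics

class User:
    """Toy user whose sentiment drifts slightly with repeated exposure."""
    def __init__(self):
        self.sentiment = 0.0  # -1 (hostile) .. +1 (favourable)

    def watch(self, topic):
        if topic == "candidate_positive":
            self.sentiment = min(1.0, self.sentiment + 0.05)  # exposure nudges opinion
            # observed engagement (likes, watch time) tracks current sentiment
            return max(0.0, self.sentiment) + random.gauss(0, 0.05)
        return 0.3 + random.gauss(0, 0.05)  # neutral content: flat baseline

class Recommender:
    """Greedy bandit: keeps serving whichever topic yields the best feedback."""
    def __init__(self):
        self.scores = {t: 0.0 for t in VIDEOS}
        self.counts = {t: 0 for t in VIDEOS}

    def pick(self):
        if random.random() < 0.1:  # occasional exploration
            return random.choice(VIDEOS)
        return max(self.scores, key=self.scores.get)

    def update(self, topic, reward):
        self.counts[topic] += 1
        # running mean of observed engagement per topic
        self.scores[topic] += (reward - self.scores[topic]) / self.counts[topic]

user, rec = User(), Recommender()
for _ in range(500):
    topic = rec.pick()
    rec.update(topic, user.watch(topic))

print(f"final sentiment: {user.sentiment:+.2f}")  # drifts toward +1.0
```

The point of the sketch: nothing in `Recommender` models opinions at all; it only chases engagement, and opinion shift falls out as a side effect of the user's own dynamics.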


"What I take aim at is this intermediate position that AI is the line that’s too far. If you have porn, Twitch streamers, OnlyFans, etc. you’ve already done the disembedding. You’ve already created the world in which people prefer fake empathy to embedded empathy, AI just cuts out the middlemen."

This is missing the crucial dimension of degree. The fact that we had spears and arrows doesn't make nuclear bombs less scary, or "just more of the same".

"There is honestly not much to say about this other than the fact that it lines up with the historical pattern of Luddism and regulation."

I don't think historical Luddism is what you think it is. For one, Luddism was never about Luddites not being the "center of attention", nor about a dislike of technology qua technology, and the reasons they revolted were more than justified.

"I’m typically a strong believer in self-deception, but I think the degree of separation between parasociality and real empathy is so clear and undeniable that a distinction between the two has to re-emerge"

And yet as already established, people bond with Twitch streamers and OnlyFans models, substituting this for intimacy.

Why would such a distinction "re-emerge" because of LLMs, when LLMs would be able to offer everything those OnlyFans and Twitch streamers do (given the inevitable pairing of LLMs with human-like 3D models), while actually being orders of magnitude better at personalized two-way communication and rapport?
