Discussion about this post

Dain Fitzgerald:

So is this like a "harm reduction" approach to relationship facsimiles? You can't get them off the drug entirely, but can at least give them clean needles.

Think I'm with Harrington.

Daniel Paleka:

The main issue is not *who* captures the profits of the parasocial-relationship competition (honestly, why would anyone outside that game care about that?), but the relative power of ParasocialAI versus anything manageable by real people.

Think drugs more than parasocial relationships: something where revealed preferences might not be real preferences, because humans are weak against adversarial inputs. For many people, drugs make their lives worse in most ways, yet they are not free to get rid of them. Of course, some people can easily try a few drugs and stop, but most people never try drugs at all. Note that in the case of drugs there are many reasons people don't try them (beyond drugs being commonly understood as bad, *producing and dealing drugs is a crime* in most non-US locales), none of which apply in the AI case.

Now to preempt the obvious reply: my values definitely include caring about the many people vulnerable to drugs. But even if your values do not extend to people too weak to fight off drugs, there's no real guarantee that ParasocialAI harms the same set of people. Chemical channels surely give more control over the human body than audiovisual/text channels do, but AI can be optimized far more, and it's vastly easier to give someone a taste of ParasocialAI than to make them take a drug.

As for when any of this becomes relevant... who knows. I would bet a not-too-large amount of money that parasocial relationships are one of the easier things to be very superhuman at. Not sure how to formalize the bet, because there are strong cultural incentives against anyone actually doing this.
