What The Hell Happened To Effective Altruism
How A Measured Movement Fell To The Politics Of Fear
EA used to be great. But there’s no way to make EA great again.
This story starts close to the end. It's September 2022 and EA Global DC is in full swing. A few thousand others and I are at the heart of the blob, the Ronald Reagan Building, where the Centre For Effective Altruism flew in world-class autists to talk to sympathetic staffers and college students.
I wasn’t doing anything interesting this conference. Just watching from the sidelines. At a closed-door event which he’s since been happy to describe, Tyler Cowen told the audience of mostly EAs that this was peak EA. It was the best it was ever going to be. Soon, EA would fall victim to its demographics. The homogeneity of coastal college students would dominate the free-thinking, quantitative spirit that we all thought was the heart and soul of EA.
Turns out that Tyler was right, and the audience was wrong. I specifically was wrong in telling people there was hope for EA. In truth, even Tyler was probably late on the call.
The Strawmen Are Simply Real People
This article is addressed to me circa the start of 2023, right about when I was marking EA Global London on my schedule. The degree of split within the EA sphere was unfathomable to me.
Many older EAs feel strawmanned, especially by recent critics. In many ways those individuals specifically are being misrepresented. They are criticized for oversimplified behaviors and beliefs that they do not hold, not because the critics are acting in bad faith, but because the median self-identified effective altruist simply does not share any of the admirable traits of the old heads.
I’d already met some of the new EAs, who were clearly college students following the trendy new movement. “This can’t be what EA is, not after so much effort has been put into ensuring it specifically isn’t that,” I thought to myself. I had been a lurker on LessWrong and the EA forums for half a decade, and was at that point the Vice-President of Waterloo EA. It is second nature to go to the EA conferences, meet up with the good people you’ve known for years or decades in some cases, and brush aside the less savoury characters.
But the evidence was there. Personal attacks against Nick Bostrom, Robin Hanson, and other leading figures, not for their positions on the most important issues, or even for being wrong, but for not conforming to the socially desirable position on the current thing, which is quite frankly completely irrelevant from the EA perspective. Complaints about meritocracy and the difficulty of providing evidence for your claims. Leaks to legacy press smear merchants. But it is so easy to say “those people might post on the EA forums, go to EA meetings, and work at EA organizations, but they’re not real EAs”.
EA was founded on rejecting a great falsehood: a lie that almost everyone believed, one that is not just false but the opposite of truth. This was the lie that donating based on your feelings was the best way to help people. Every year, there are plenty of news reports of fraudulent charities stealing money, or somehow even making the problem worse. Instead, EA would do its best to use quantitative evidence to make its decisions.
In the end, the problem was hypocrisy. Sam Bankman-Fried didn’t just ignore the evidence of his company’s collapse, but tried to hide it (according to the courts that convicted him). And then we have OpenAI.
OpenAI was set up to be a non-profit, then later a capped-profit structure. “For-profits are bad” was the sentiment. What evidence was this based on? All the evidence was to the contrary, much of which EAs helped to compile. Non-profits were much more likely to be fraudulent, or to simply fail to achieve their goals.
In reality, the cause was social desirability. “Capitalism is evil” is a great falsehood closely connected to the great falsehood that EA was supposed to reject. The tribal instincts which lead people to support Ponzi schemes are identical to those which make people hate the actual functioning institutions we have. These same tribal instincts are what lead to safetyism, a malicious pathology of equality and prevention, of stifling freedom and innovation in the name of protection. EAs understood this in the context of drug development and longevity. They understood that the neurotic regulators who ban all drugs by default have a body count in the millions, up there with the great dictators of history.
If you’re an old head, or simply one of the good EAs, you probably feel very strawmanned right now. Let me repeat: this article is for you to read specifically, to inform you about other people who self-identify with your movement. I don’t want to associate you with the people I’m describing, but in practice they have a greater impact on how ‘EA’ funds are invested and ‘EA’ actions are taken than you do. As Tyler said, demographics are destiny. I wish there were some Robin Hanson-esque hero, immune to social desirability, who is in charge of EA funds and continues to operate by the principles of early EA, but there’s little evidence that’s the case. Maybe the Collisons are the closest, but I’m not sure whether they want to be associated with EA, or even marked as ‘EA-adjacent’ at this point.
No High Ground
The propaganda process is inevitable. It is inevitable that the EA brand will now be associated with irrational feminized college students rather than interesting quantitative thinkers willing to bite socially undesirable bullets. In many ways this association will be correct in practice, because we are outnumbered. Old EA will exist only as a memory. But in public it will be easier to say you were never an EA than to explain what actually happened. I think many in the ‘EA-adjacent’ crowd realized this and didn’t identify with EA for this reason. There are many more who haven’t and would be far less misunderstood if they did.
What I have watched is the total deterioration of that movement into something indistinguishable from the emotional and narcissistic charity blob EA was born to combat. I was in denial of this deterioration for years. It’s an incredibly easy thing to deny. It’s normal to stay away from people at EA conferences who don’t share the original EA values and consequently surround yourself with either literal old heads or old heads in spirit. But every year there’s more of them and fewer of us.
And thus EA went from caring about proving its points with empirical evidence to promoting socially desirable narratives. It went from wanting to repeal the FDA to wanting to make a new one for AI. It went from being curious to being neurotic. It went from unrepentant exploration of every issue in search of the most important thing that could be done to crying for more state intervention without a second thought as to whether the state would increase existential risk rather than reduce it.
In short: I’m sorry for our loss.