When I think of what social conservatism is, the core question that comes to mind is “how much do you leave to tradition and social norms, rather than individuals and experimentation?” I don’t mean the particular culture war fight of the day. I mean things like going to church, not having sex until marriage, marrying early, or abstaining from drugs, alcohol, and pornography. Not only do I believe that practicing social conservatism is good, but I believe that a greater degree of social pressure, along with the acquiescence to that social pressure, is good.
The one premise of this argument is that we have better things to be doing than sex and drugs. Maybe you don’t believe this. In that case, this article isn’t for you, and I honestly don’t think I could say anything that would convince you. But if you think that scientific research, entrepreneurship, philanthropy, or really anything at all is a higher, more important, or more moral thing to be doing, then this argument is for you. In practice, it’s targeted towards the Effective Altruist types in my audience, though I’m sure the conservatives in my audience will appreciate the article for encouraging them to do things they’re already doing (like and share, guys).
I’ve seen the empirical case made several times, including by Tyler Cowen in person at EA Global DC, and the historical case made by Lawrence Newport in response. Religious people are happier and more engaged in their communities. Religion encourages people to donate and volunteer more; it has positive externalities. However, I rarely see anyone argue for the social control imposed by these communities. This is reflected in EA practice: EAs have conferences, group houses, and meetups, and are certainly not lacking in community. But they rarely have any kind of social control, and a subgroup is often far more interested in drugs and polyamory (promiscuity). I argue that the social control aspect is good and that EA would benefit from more of it.
The simplest version of this argument is that you just have better things to be doing. One of the more generalizable premises of EA is that you can evaluate the potential of different categories of philanthropy, or cause areas. You can predict ahead of time that research in some fields, for example machine learning and biotechnology, is likely to make far more of a difference than research in others. The same can be said of thinking about your own beliefs and practices.
The naive model is one of straightforward time investment. What is the relative expected value of having premarital sex compared to buying malaria nets or creating alternative proteins? What is the expected value for yourself compared to conducting research, starting a business, or simply learning more about the world? It isn’t even close.
I mention that this is the naive model because there are several objections to the nature of the tradeoff. One might argue that it takes as much effort to control your desires as it does to indulge them. This is true in some extreme cases, like the Islamic Republic of Iran. I titled the article “10% More Social Conservatism” for a reason. I’m not arguing for Iran, and in almost all cases I don’t think military enforcement is practical or cost-effective. Instead I’m arguing for a shift in informal social norms, which won’t fully prevent anyone from doing the things I oppose, but will hopefully encourage some fraction of people not to.
In the case of social norms, I find it very unlikely that the time investment outweighs the benefits, due to the sheer asymmetry of time spent. For example, it took me an hour or two to write and edit this article, and I’m a rather slow writer. Even the slickest pick-up artist would have to initiate a conversation, travel to a partner, find a private space, do the act, and go home. Even if I’m only convincing one or two people to abstain once in their lives and spend that time doing something worthwhile instead, this would be net positive in expected value. Moreover, this example overstates the time investment needed to enforce a social norm: simply giving your opinion and recommending that a friend abstain takes a minute at most. Although I haven’t seen this formally measured, my anecdotal experience with EA is actually that social norms push people further into sex and drugs, not away from them. If this is the case, the marginal cultural change actually involves not spending time telling others to be more socially liberal, which is itself a time savings.
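To make the asymmetry concrete, here is a rough back-of-envelope sketch in Python. Every number in it (hours to write the article, hours per indulgence avoided, number of readers nudged) is an illustrative assumption of mine, not a measurement; the point is only that the time cost of stating a norm is small relative to the time it can redirect.

```python
# Back-of-envelope sketch of the time asymmetry behind norm-setting.
# All numbers below are illustrative assumptions, not data.

hours_to_write_article = 2        # assumed time spent writing and editing the article
hours_per_indulgence_avoided = 3  # assumed time one indulgence takes (travel, act, etc.)
people_nudged_to_abstain = 2      # assumed number of readers who abstain even once

hours_redirected = people_nudged_to_abstain * hours_per_indulgence_avoided
net_hours_gained = hours_redirected - hours_to_write_article

print(f"Hours redirected to more worthwhile work: {hours_redirected}")
print(f"Net hours gained after the writing cost: {net_hours_gained}")
# Under these assumptions, the norm-setting time is repaid by even one or two marginal abstentions.
```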
It may be the case that for some people, no amount of reasoning or social pressure can convince them to spend their time doing something more worthwhile, even something they themselves would admit is more worthwhile. However, the upside of enforcing voluntary social norms is that there can be vast benefits to the marginal EA-turned-social-conservative with little cost to the edge case. Particularly for a high-IQ community like EA, it’s not too difficult to say that the median EA should be more socially conservative while acknowledging there’s nothing we can really do about the true edge cases.
I’ll take this case a bit further. Not only should the median EA practice a more socially conservative lifestyle, but they should also communicate this lifestyle in a socially conservative way. In other words, unlike what I’m doing now, the primary form of communicating socially conservative ideas should be cultural, not logical. This is because the things that social conservatism opposes (premarital sex, drugs, alcohol, etc.) are addictive in nature. They manipulate evolutionarily adapted instincts that no longer function well in modernity. Even many people who are publicly socially conservative (here I am talking about the general public, not EA) give in to vice despite their beliefs. If we are convinced that 10% more social conservatism is a net good, we should also think about how to attain that 10% most effectively.
Optional Metacommentary
What I said about helping one more EA do something worthwhile is true. However, if I were to be a bit more ambitious, I would say that my goal with this article is to reorient EA towards a more triaged approach, where low-impact decisions are biased by default towards order and self-restraint, while the really time-intensive discussions are reserved for the cause areas EA prioritizes. For example, while I have some disagreements with EA artificial general intelligence timelines, I consider it a very admirable and serious position to say “I don’t care what Nick Bostrom said either way, because AI will change the world and Nick informed me about it well ahead of time.” A term that I particularly like is Patrick Ryan’s “human weather”. A lot of headlines are exactly that: passing storms and sensationalism that take up far more of EAs’ time than they deserve. This isn’t to say there isn’t any value in understanding them. For example, if you want to change US government policy, you should have some understanding of how to use culture war and partisanship to your advantage. The way this relates to social conservatism and norm enforcement is internal discipline.
Richard Hanania writes, with regard to EA and Bostrom:
I’m afraid that without an explicit understanding about how and why mainstream institutions are crazy on race and gender issues, and the downstream effects of identity politics, these kinds of things will keep happening.
There are two ways of evaluating the people who attacked Bostrom. The pure rationalist view is that they were simply wrong: they should be allowed to make their arguments and face no consequences, and if you disagree with them, you should make your own, better arguments. It’s important to note that these people were not so generous to Bostrom. And when it comes to who provides more value to EA, it really isn’t close. The instrumental view, by contrast, is that order and internal stability should be prioritized, particularly on issues that have no chance of becoming a serious cause area. It would be different if Bostrom had conducted gain-of-function research or something equally destructive; then there would be at least some semblance of an argument that he should be stigmatized and punished. But in such a clear-cut case, in which the anti-Bostrom attackers were so clearly incorrect and destructive, norm enforcement should happen against them, since the marginal unit of discussion spent on them is negative.
If your argument against having sex is that people have better things to be doing, shouldn't you be equally against monogamous sex within marriage? Sex doesn't take less time just because you're married (at least, for the sake of married people, I hope not :p)
The issue with this is that there is little overlap between effective altruists, who tend to be secular and libertarian, and people that are socially conservative, like evangelical Christians.