Why Are There So Many Rationalist Cults?


The rationalist community was drawn together by AI researcher Eliezer Yudkowsky’s blog post series The Sequences, a set of essays about how to think more rationally. You would think, then, that they’d be paragons of critical thinking and skepticism — or at least that they wouldn’t wind up summoning demons.

And yet, the rationalist community has hosted perhaps half a dozen small groups with very strange beliefs (including two separate groups that wound up interacting with demons). Some — which I won’t name in this article for privacy reasons — seem to have caused no harm beyond bad takes. But the most famous, a loose group of vegan anarchist transhumanists nicknamed the Zizians, has been linked to six violent deaths. Other groups, while less violent, have left a trail of trauma in their wake. One is Black Lotus, a Burning Man camp led by alleged rapist Brent Dill, which developed a metaphysical system based on the tabletop roleplaying game Mage: the Ascension. Another is Leverage Research, an independent research organization that became sucked into the occult and wound up as Workplace Harassment With New Age Characteristics.

For this article, I spoke to ten people who were associated with various rationalist-adjacent groups, including Black Lotus, Leverage Research, and the Zizians. I also spoke with people who were familiar with the early development of the rationalist community. I myself am a rationalist, and the rationalist community is closely knit; my interviewees included exes, former clients, and the dad of my kid’s best friend. I am close to my subject in a way most journalists aren’t. At the same time, I got an unprecedented level of access and honesty from members of a community that is often hostile to outsiders.

The rationalist community as a whole is remarkably functional. Like any subculture, it is rife with gossip, personality conflicts, and drama that is utterly incomprehensible to outsiders. But overall, the community’s activities are less drinking the Kool-Aid and more mutual support and vegan-inclusive summer barbeques.

Nevertheless, some groups within the community have wound up wildly dysfunctional (a term I’m using to sidestep definitional arguments about what is and isn’t a cult). And some of the blame can be put on the rationalist community’s marketing.

The Sequences make certain implicit promises. There is an art of thinking better, and we’ve figured it out. If you learn it, you can solve all your problems, become brilliant and hardworking and successful and happy, and be one of the small elite shaping not only society but the entire future of humanity. 

This is, not to put too fine a point on it, not true.

Multiple interviewees remarked that the Sequences create the raw material for a cult. To his credit, their author, Eliezer Yudkowsky, shows little interest in running one. He has consistently been distant from and uninvolved in rationalist community-building efforts, from Benton House (the first rationalist group house) to today’s Lightcone Infrastructure (which hosts LessWrong, an online forum, and Lighthaven, a conference center). He surrounds himself with people who disagree with him, discourages social isolation, and rarely directs his fans to do anything other than read his BDSM-themed fanfiction.

But people who are drawn to the rationalist community by the Sequences often want to be in a cult. To be sure, no one wants to be exploited or traumatized. But they want some trustworthy authority to change the way they think until they become perfect, and then to assign them to their role in the grand plan to save humanity. They’re disappointed to discover a community made of mere mortals, with no brain tricks you can’t get from Statistics 101 and a good CBT workbook, whose approach to world problems involves a lot fewer grand plans and a lot more muddling through.

Black Lotus used a number of shared frameworks, including the roleplaying game Mage: the Ascension, that members believed would allow them to cut through social norms and exercise true agency over their lives. Brent supposedly had the deepest insight into the framework, and so had a great deal of control over the members of Black Lotus — control he was unable to use wisely.

Without Brent, however, Black Lotus might have been fine. One interviewee said that, when Brent wasn’t there, Black Lotus led to beautiful peak experiences that he still cherishes: “Brent surrounded himself with people who built the thing he yearned for, missed, and couldn’t have.”

But in other cases — as in Leverage Research — the toxic dynamics emerged from the bottom up. Interviewees with experience at Leverage Research were clear that there was no single wrongdoer. Leverage was fractured into many smaller research groups, which did everything from writing articles about the grand scope of human history to incubating a cryptocurrency. Some research groups stayed basically normal to the end; others spiraled into self-perpetuating cycles of abuse. In those research groups, everyone was a victim and everyone was a perpetrator. The trainer who broke you down in a marathon six-hour debugging session was unable to sleep because of the panic attacks caused by her own debugging sessions.

Worse, the promise of the Sequences is most appealing to people with very serious life problems they desperately need to solve. While some members of dysfunctional rationalist groups are rich, stable, and as neurotypical as rationalists ever get, most are in precarious life positions: mentally ill (sometimes severely), traumatized, survivors of abuse, unemployed, barely able to scrape together enough money to find a place to sleep at night in the notoriously high-rent Bay Area. Members of dysfunctional rationalist groups are particularly likely to be transgender: transgender people are often cut off by their families and may have a difficult time finding friends who accept them as they are. The dysfunctional group can feel like a safe haven from the transphobic world.

People in vulnerable positions are both more likely to wind up mistreated and less likely to be able to leave. Elizabeth Van Nostrand, who knows many members of dysfunctional groups both rationalist and non-rationalist, said, “I know people who’ve had very good experiences in organizations where other people had very bad ones. Sometimes different people come out of the same group with very different experiences, and one of the major differences is whether they feel secure enough to push back or leave if they need to. There isn’t a substitute for a good BATNA.” (BATNA, a negotiation term, is your “best alternative to a negotiated agreement”: how well you would do if you walked away.)

Still, vulnerability alone can’t explain why some members of the rationalist community end up in abusive groups. Mike Blume was a member of Benton House, which was intended to recruit talented young rationalists. He said, “I was totally coming out of a super depressive and dysfunctional phase in my life, and this was a big upswing in my mood and ability to do things. We were doing something really important. In retrospect, I feel like this is the sort of thing you can’t do forever. You burn out on it eventually. But I would wake up in the morning and I’d be a little bit tired and trying to get out of bed and I’d be like, well, you know, the lightcone [rationalist shorthand for the entire future of humanity] depends on me getting out of bed and going to sleep and learning how to program. So I’d better get on that.”
