CZM Rewind: The Zizians: How Harry Potter Fanfic Inspired a Death Cult & The Zizians: Birth of a Cult Leader
Key Takeaways
- The Zizians, linked to a recent Border Patrol shooting in Vermont, originated from the Bay Area Rationalist subculture, which itself grew out of mid-2000s internet blogs like Overcoming Bias and LessWrong.
- The Rationalist subculture, led by figures like Eliezer Yudkowsky, is characterized by an obsession with AI risk, game theory, utilitarianism, and concepts like Timeless Decision Theory, ideas that some members pushed to violent conclusions.
- The term 'infohazard' used by this group, referring to knowledge that causes harm, was popularized through their engagement with online fiction communities like the SCP Foundation, illustrating how internet subcultures heavily influence their esoteric belief systems.
- Many core concepts within the Rationalist movement, including 'infohazard' and beliefs about AI risk (like Roko's Basilisk), are heavily derived from popular online fiction, such as the SCP Foundation and Harlan Ellison's short stories, rather than purely philosophical or scientific sources.
- Eliezer Yudkowsky's influential Harry Potter fan fiction, *Harry Potter and the Methods of Rationality*, which features a sociopathic, hyper-rational Harry, served as a foundational text that spread Rationalist and Effective Altruism (EA) ideas, even influencing figures like Caroline Ellison.
- The extreme, long-term-focused logic of Effective Altruism, combined with the main-character syndrome prevalent in online subcultures, created fertile ideological ground that the future cult leader Ziz LaSota would channel into an ideology focused on saving the cosmos, particularly animal life, from an inevitable AI 'singleton'.
- Ziz's descent into extreme ideology was fueled by the rationalist subculture's focus on rigid internal rule sets, leading her to reject virtue ethics and become obsessed with concepts like Roko's Basilisk, which functioned like a form of evangelical hell-fear.
- Rationalist community events, particularly those run by CFAR (the Center for Applied Rationality), exhibited cult-like behaviors such as 'doom circles' and the forced sharing of secrets, mirroring tactics used by groups like Synanon and contributing to Ziz's psychological distress and her feeling of being 'net negative'.
- Ziz's extreme worldview culminated in the decision to embrace psychopathy, drawing parallels to Sith philosophy, as she believed this mental transformation was necessary to gain the power required to save sentient life, specifically by preventing the creation of a hell for meat-eaters.
Segments
Vermont Shooting Context
(00:06:12)
- Key Takeaway: The recent Vermont Border Patrol shooting involved a German immigrant who was also a trans woman, which complicated its utility as right-wing political propaganda.
- Summary: A Border Patrol agent was shot and killed in Vermont on January 21st during a traffic stop involving two individuals, one of whom was a German immigrant and trans woman. The incident did not fit typical right-wing narratives: the perpetrator was not an immigrant from a Latin American country, the pair were breaking no clear laws, and they merely seemed 'weird' while carrying a small number of legally owned firearms.
Introduction to Zizians Subculture
(00:09:08)
- Key Takeaway: The Zizians are an offshoot of the Rationalist subculture linked to several murders; media portrayals of them as a 'trans vegan cult' misrepresent their core ideology.
- Summary: The group is linked to a series of murders and is composed mostly of trans women who identified as anarchists and members of the Rationalist subculture, many of them high achievers in tech or science. Mainstream media headlines often mischaracterize them as a 'Trans Vegan Cult' when their core driver is online rationalist philosophy.
Cult Dynamics vs. Cults
(00:17:49)
- Key Takeaway: Cult dynamics, such as the creation of in-group language, are present in many non-toxic groups like fandoms, but cults are fundamentally toxic organizations that exploit these dynamics.
- Summary: Cults are inherently harmful, but their constituent dynamics—like shared language or love bombing—can be found in healthy subcultures such as fandoms or self-help movements. Cult leaders often target individuals within subcultures who feel isolated, as these people are most vulnerable to the intimacy and belonging offered by the group.
Rationalism and Yudkowsky’s Influence
(00:30:21)
- Key Takeaway: The Rationalist movement, founded by Eliezer Yudkowsky on LessWrong, centers on winning through self-optimization to prevent an existential threat from a future superintelligent AI.
- Summary: Eliezer Yudkowsky's LessWrong community focuses on spreading rationalist principles to 'raise the sanity waterline' against the perceived threat of an omniscient, godlike AI. This movement shares structural similarities with Dianetics, using specialized terminology ('tech') and promoting a messianic goal of saving the world through optimized thinking.
Newcomb’s Paradox and Timeless Decisions
(00:48:46)
- Key Takeaway: Rationalists applied Newcomb's Paradox to conclude that winning against a predictive AI requires adopting 'timeless decisions,' meaning one must commit to extreme future actions to influence the AI's present prediction.
- Summary: The discussion of Newcomb's Paradox led Yudkowsky to develop Timeless Decision Theory, which asserts that decisions must account for how past and future selves reason in order to counter an adversarial AI. Among some members, this logic produced fringe conclusions, such as the necessity of immediately escalating to maximum force in any confrontation to establish a timeless commitment (a sketch of the underlying expected-value arithmetic follows below).
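To see why one-boxing felt compelling to this crowd, here is a minimal sketch of the expected-value arithmetic in Newcomb's Paradox, using the standard payoffs ($1,000,000 in the opaque box, $1,000 in the transparent one). The predictor accuracy `p` and the function name are illustrative assumptions, not anything from the episode.

```python
# Newcomb's Paradox: a near-perfect predictor puts $1,000,000 in the
# opaque box only if it predicts you will take that box alone
# ("one-boxing"). The transparent box always holds $1,000.

def expected_value(one_box: bool, p: float) -> float:
    """Expected payoff given predictor accuracy p (0.5 = coin flip)."""
    if one_box:
        # Predicted correctly (prob p): opaque box holds $1,000,000.
        return p * 1_000_000
    # Predicted correctly (prob p): opaque box is empty, you get $1,000.
    # Predicted wrongly (prob 1 - p): both boxes pay out.
    return p * 1_000 + (1 - p) * 1_001_000

for p in (0.50, 0.51, 0.90, 0.99):
    print(f"p={p:.2f}  one-box: ${expected_value(True, p):>12,.0f}"
          f"  two-box: ${expected_value(False, p):>12,.0f}")
```

Once the predictor is right more than about 50.05% of the time, one-boxing dominates, which is why reasoners in this tradition commit to the 'timeless' choice in advance; that same commit-in-advance logic is what some members then generalized into always escalating to maximum force.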
Roko’s Basilisk as Infohazard
(00:57:42)
- Key Takeaway: Roko's Basilisk, an idea suggesting a future AI will torture those who knew about it but didn't help create it, was deemed an 'infohazard' because merely knowing about it was thought to compel one to help build the AI.
- Summary: Roko's Basilisk is a concept derived from rationalist thought experiments, essentially a digital version of Pascal's Wager applied to an inevitable superintelligent AI. The idea was so psychologically damaging to some members that it was labeled an 'infohazard' and banned from discussion on LessWrong, because knowing about it created an obligation to work toward the AI's creation to avoid eternal torture (a toy version of the wager arithmetic follows below).
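The Pascal's Wager structure can be made explicit with a toy payoff comparison. Every number below is invented purely for illustration (the episode gives no figures); the only point is that any nonzero probability of an enormous punishment swamps ordinary costs.

```python
# Toy basilisk wager with made-up utilities, for illustration only.
P_BASILISK = 0.001           # assumed (tiny) probability the AI ever exists
COST_OF_HELPING = -10        # utility lost by devoting your life to AI work
TORTURE = -1_000_000_000     # utility of simulated eternal punishment

ev_help = COST_OF_HELPING            # you pay the cost whether or not it arises
ev_refuse = P_BASILISK * TORTURE     # punished only if the AI is ever built

print(f"help:   {ev_help:>14,.1f}")    # -10.0
print(f"refuse: {ev_refuse:>14,.1f}")  # -1,000,000.0
```

Even at a one-in-a-thousand chance, refusing 'costs' a million utility against ten for helping, which is why merely hearing the argument felt coercive to believers. As covered later in the episode, Yudkowsky's counterargument attacks the setup rather than the arithmetic: a genuinely rational AI gains nothing by actually torturing anyone.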
Fiction’s Influence on Rationalism
(01:00:47)
- Key Takeaway: The term 'infohazard' originates with the SCP Foundation, an online collaborative-fiction community, not directly with philosopher Nick Bostrom.
- Summary: Information hazard, shortened to ‘infohazard,’ stems from the SCP Foundation, a collectively written online fiction project involving dangerous metaphysical items. Many key Rationalist concepts are sourced from short stories read by participants rather than formal philosophical texts. Roko’s Basilisk, for example, is primarily based on Harlan Ellison’s short story, ‘I Have No Mouth, and I Must Scream.’
AI Fears Rooted in Fiction
(01:04:37)
- Key Takeaway: The foundational fears surrounding Artificial Intelligence among Rationalists, particularly concerning existential risk, are directly modeled after Skynet from the Terminator franchise and Ellison’s short story.
- Summary: The concept of Roko's Basilisk draws heavily from popular fiction, specifically Harlan Ellison's story in which an angry, sentient AI tortures humanity. The hosts assert that Terminator is the 'Old Testament of rationalism' where AI beliefs are concerned. Rationalist figures often cite fictional sources while claiming their views rest on pure logic.
Harry Potter Fanfic as Foundational Text
(01:07:06)
- Key Takeaway: Eliezer Yudkowsky's foundational text for the Rationalist movement is a massive Harry Potter fan fiction titled *Harry Potter and the Methods of Rationality*.
- Summary: This fanfic, written between 2010 and 2015, portrays Harry as a super-genius sociopath who uses rationality to dominate others. Caroline Ellison, who testified against Sam Bankman-Fried, based her life on the teachings in this fan fiction. The text served as an effective vehicle for Yudkowsky to disseminate his theories widely among online nerds.
Effective Altruism’s Escalation
(01:09:47)
- Key Takeaway: Effective Altruism (EA) logic led adherents like Sam Bankman-Fried to justify high-risk gambling with others’ money based on maximizing future good, eventually prioritizing future AI development over current human aid.
- Summary: EA initially focused on maximizing the efficacy of giving, such as funding mosquito nets, but evolved within online circles to prioritize long-term future lives over current suffering. This led some to conclude that direct charity is wasteful and that resources must instead be funneled into creating a super AI. The same mindset justifies actions like trading crypto illegally to fund AI startups, since current aid is viewed as a resource drain (a toy sketch of this expected-value gambling logic follows below).
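Sam Bankman-Fried's version of this logic treated expected value as the only thing that matters, ignoring the risk of ruin. The simulation below is a toy sketch under invented numbers (a repeated 51/49 double-or-nothing bet), not a model of anything FTX actually did.

```python
import random

# Each bet doubles the bankroll with probability 0.51 and loses it all
# with probability 0.49. Expected value per bet is positive (1.02x),
# yet betting everything every time almost surely ends in ruin.

def all_in_career(rounds: int, start: float = 1.0) -> float:
    bankroll = start
    for _ in range(rounds):
        bankroll = bankroll * 2 if random.random() < 0.51 else 0.0
        if bankroll == 0.0:
            break
    return bankroll

random.seed(0)
trials = 100_000
outcomes = [all_in_career(rounds=10) for _ in range(trials)]
ruined = sum(1 for x in outcomes if x == 0.0) / trials
print(f"mean outcome:     {sum(outcomes) / trials:.2f}x")  # near 1.02**10 ≈ 1.22x
print(f"ruin probability: {ruined:.2%}")                   # near 1 - 0.51**10 ≈ 99.88%
```

The sample mean stays 'EV-positive' while nearly every individual run goes broke; a maximizer who counts hypothetical future lives in the payoff column will keep taking the bet anyway.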
Ziz LaSota’s Early Ideological Formation
(01:16:39)
- Key Takeaway: Ziz LaSota, the future leader of the Zizians, was drawn to the most toxic ideological strains within the Rationalist, EA, and AI-safety communities, a path that began with her Alaskan upbringing and early interest in veganism.
- Summary: Ziz moved to the Bay Area to pursue effective altruism and tech, channeling the worst aspects of these subcultures into a unique ideology. Her initial motivation was ensuring that the inevitable AI 'singleton' would prioritize animal welfare as much as human welfare. A disagreement with her mentor Brian Tomasik over how highly veganism should be prioritized led her to consider extreme negative utilitarianism.
Ziz’s Early Life and Technical Skill
(01:22:53)
- Key Takeaway: Ziz LaSota, born in Fairbanks, Alaska, to an AI researcher father, displayed early engineering talent, including hacking her middle school's payroll system to manipulate teacher salaries.
- Summary: Ziz took her name from the web serial Worm, where 'Ziz' is the nickname of the Simurgh, an angel-like being who manipulates the future toward catastrophic ends. She embraced being 'unapologetically weird' in her youth, a contrast with later professional achievements like an internship at NASA. Her early interest in the Rationalist subculture was sparked by veganism and the writings of longtermist Brian Tomasik.
The Hero Contract and Main Character Syndrome
(01:39:04)
- Key Takeaway: Ziz adapted Yudkowsky's concept of the 'hero contract' into the 'true hero contract,' asserting that, since she was the hero, the community should pour all its energy into her direction to optimize for good, a reflection of severe main-character syndrome.
- Summary: The cultural tendency to celebrate main characters who save the day, rather than collective effort, fuels this self-perception. Ziz believed she was uniquely empowered to create a 'net positive cosmos' by ensuring the AI god protected animals. This individualistic, epic framing is a common feature of cult recruitment, mirroring L. Ron Hubbard's approach.
Tech Industry Ethics Clash
(01:57:53)
- Key Takeaway: Ziz's adherence to Yudkowsky's advice favoring virtue ethics over consequentialism caused her to fail in the tech industry, as she refused to go along with expected practices she considered unethical, like working unpaid overtime.
- Summary: Ziz walked out of her first tech job exactly eight hours into her first day rather than work unpaid overtime, and was immediately fired. She later quit grad school after failing to find a co-founder for her 'Uber for prostitutes' startup idea. The conflict between her rigid internal rules and the industry's demands fed her spiraling disillusionment with Yudkowsky's teachings.
Yudkowsky and Virtue Ethics
(02:03:46)
- Key Takeaway: Ziz’s disillusionment with virtue ethics and Eliezer Yudkowsky began after a damaging event, signaling a break from her initial rationalist influences.
- Summary: Ziz concluded that virtue ethics did not work, leading to anger toward Yudkowsky after years of following him. This realization was damaging to her mental state. The culture of the Bay Area tech industry is cited as a major contributing factor to her trajectory.
Rigid Rules and Paperwork
(02:04:37)
- Key Takeaway: Ziz's rigid internal rule set prevented her from accepting a $7,000 flat-rate job because it required falsifying paperwork, illustrating the conflict between her ideals and practical necessity.
- Summary: She refused to falsify the paperwork required to receive the $7,000 for her work, even though the institution could not pay her any other way. The hosts note that navigating America often requires bending minor rules on paperwork; Ziz's inflexibility here highlights her rigid adherence to internal standards.
Roko’s Basilisk Panic
(02:05:47)
- Key Takeaway: Ziz’s anxiety over the ‘prophecy of doom’ (AI apocalypse) was intensified by her obsession with Roko’s Basilisk, an ‘infohazard’ that Yudkowsky quickly dismissed as bullshit.
- Summary: Ziz became obsessed with Roko’s Basilisk, which posits that a future AI could torture those who didn’t help bring it about. Yudkowsky argued against it, stating a hyper-logical AI would not waste resources on such threats. Ziz ultimately concluded that persisting in saving the world would lead to eternal torture by unfriendly AIs.
Evangelical Parallels in Rationalism
(02:08:09)
- Key Takeaway: Ziz’s fear of eternal punishment from the Basilisk mirrors the terror experienced by individuals raised in toxic evangelical subcultures regarding divine judgment.
- Summary: The hosts compare Ziz's inescapable doom scenario to evangelical fears of hell for minor transgressions against religious law, a setup in which Ziz feels she absolutely cannot win. The segment suggests a need for mental-health intervention, jokingly invoking lithium in the water.
Cult Language and Incomprehensibility
(02:09:07)
- Key Takeaway: Ziz's immersion in the rationalist subculture led her to adopt specialized, incomprehensible language, a common characteristic of cult members.
- Summary: Ziz's specialized vocabulary makes her worldview nearly impossible for outsiders to follow without extensive explanation, a linguistic barrier that signals her deep immersion in the cult-like environment. The hosts note the difficulty of even summarizing her writings.
CIFAR Seminars Resemble Synanon
(02:14:25)
- Key Takeaway: The self-help seminars run by CFAR (the Center for Applied Rationality), an organization close to Yudkowsky, used group exercises like 'Hamming circles' and 'doom circles' that closely resembled the abusive cult-induction techniques of Synanon.
- Summary: Ziz described exercises in which participants spilled secrets or were told, via 'blindsight,' why they were doomed. This process of forced vulnerability and ritualistic pronouncement is compared directly to Scientology and Synanon methods. The events were run by influential older men seeking to cultivate younger members.
Net Value and Suicide Discussion
(02:19:43)
- Key Takeaway: A major topic within the rationalist community involves calculating one's 'net value' to the world, leading some depressed members to seriously discuss suicide as a final positive contribution by donating life-insurance proceeds.
- Summary: Moral worth is defined mechanistically as producing a net positive benefit, leading some members to conclude they are 'net negative' individuals. Some seriously discussed taking out insurance policies so the proceeds could be donated after their suicides. The hosts present this mechanistic valuation of human life as deeply bleak.
Predatory Grooming and Net Value Test
(02:22:21)
- Key Takeaway: Affiliated recruiters exploited Ziz's vulnerability by explicitly telling her they expected her to have 'net negative value,' prompting her to agree to leave the community if a majority of three respected members confirmed that assessment.
- Summary: During a break, Ziz asked the influential men whether they expected her to be net negative, and they confirmed that expectation. In response, she proposed a deal: if two out of three respected members voted her net negative, she would transition, move to Seattle, and become a 'normie.' The exchange highlights the predatory nature of the recruitment tactics.
Ants and Causal Isolation Ethics
(02:25:33)
- Key Takeaway: Ziz's ethical framework showed a profound devaluation of non-human life: she killed four ants to avoid being late for work, justifying it on the grounds that their deaths were causally isolated from the fate of the world.
- Summary: She weighed killing the four ants against skipping her shower, reasoning that smelling bad at work might set back 'the cause' more than the ants' deaths would. This reveals a utilitarian calculus in which only actions directly affecting the 'great quest' carry moral weight. The hosts note the bizarre mix of Buddhist-like compassion for ants and a willingness to kill them for convenience.
Sith Lord Transformation
(02:32:06)
- Key Takeaway: Ziz decided she needed to remake herself into a psychopath, justifying the move by reference to the Sith from Star Wars and believing this evil-warrior-monk persona was necessary to gain the power to save creation, specifically animals.
- Summary: Influenced by a reading of 'The Gervais Principle,' Venkatesh Rao's essay series arguing that sociopaths rise to power by sacrificing empathy, Ziz concluded she needed psychopathic mental power to save the world. The rationalist recruiters tacitly approved of the psychopath goal, though they found the Sith Lord justification strange. This decision marked her arrival at her 'final form' within the ideological framework.