March 18, 2025 65 mins

Robert tells David about Ziz's glorious plan to take to the sea and sever the right and left brains of her followers in order to make them psychopaths. God, that sentence was weird to write. Trust us, the episode is weirder.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Also Media. Welcome to Behind the Bastards, a podcast where I, Robert Evans, am going to war like a one-man... like Rambo, like one of the later Rambo movies. Not the first one, which was actually about the cost of PTSD and imperial war, but like the later ones where he's a one-man army. I'm doing that, and I'm

(00:22):
doing it against Microsoft, because I fucking hate Copilot. With me to talk about how much we hate Microsoft Copilot: my producer, Sophie Lichterman, and our wonderful guest, David Gborie. David, how do you feel about Microsoft Copilot?

Speaker 2 (00:37):
Ah, Rambo three, let's go. Okay, okay, okay. Kill a ton of brown people.

Speaker 1 (00:43):
Well, no, I mean, this one we're just... it's just, like, Microsoft Copilots we're killing. Okay? All of them... they're not really people.

Speaker 2 (00:51):
It's so bad.

Speaker 1 (00:52):
The outlook is terrible. Microsoft has really gone all in on making a lot of products that people hate to use.

Speaker 3 (01:00):
Speaking of... speaking of products people hate to use, David said before we started recording that he was excited to hear this story. I just want you to...

Speaker 4 (01:09):
Know.

Speaker 3 (01:10):
Where we're starting on this story is page twenty-one of the script, and where we end the script is page forty-nine.

Speaker 1 (01:18):
WHOA. I made a mistake in doing this. I'm gonna admit that right now, before we get further. I'm gonna say I erred in this. And, you know, I've made peace with the inevitability of fucking stuff up, especially when, like, every week you're doing a different chunk of history and we're veering from, like... we're talking about

(01:39):
fucking seventeenth and eighteenth century France, and then, like, now we're talking about, like, a fucking genocide in Darfur or whatever, right? These are all important topics, but, like, you simply can't every single week cover the breadth of stuff that we do and not... you're gonna misspeak, you're gonna make errors and stuff. And when it comes to, like... I'm talking about, like... obviously those are important,

(02:03):
but, you know, if I fuck up some fact about, like, early nineteen-hundreds Germany, I'm not going to be, like, too bent out of shape, because, like, you know, there's no perfection in this. But in this case, it's this tiny little community that nearly all of the reporting on has been, like, deeply incomplete, and I feel

(02:24):
like I, like, stress over, like, what do I include in here? And the other problem is that none of these people have editors, and so everybody in this story has a blog, and every blog post is, like, forty thousand words.

Speaker 2 (02:38):
So it's just like... Now, I just gotta ask: what media are you able to get this from? You're getting this all straight from the source, right?

Speaker 1 (02:45):
A lot of it. I mean, I've read most of Ziz's blog entries, and I've at least done, like, little surveys of the blogs of everybody else involved in this.
There were also a couple of very helpful compilations that, like, people... sometimes it's, like, former members of the community, sometimes it's folks who are, like, rationalists that were trying to warn other rationalists

(03:07):
about Zizians. But, like, people in and around the community have put together compilations where they'll, like, clip mixes of news stories and, like, conversations online. And obviously these folks... like, nasty work. Yes, yes. And I'm deeply grateful. We'll have source links and everything in here. I note when I'm kind of, like, pulling something from something directly,

(03:31):
but, like, I'm very grateful to the maniacs who put together these, like, documents that have helped me piece together what's happening. Because really, if you're coming in as an outsider, if you weren't, like, embedded in this community while all this crazy shit was going on, it's a little... it's kind of impossible to, like, get everything you need to get. You have to refer to these interior sources. It's just

(03:54):
the only way to actually understand stuff.

Speaker 5 (03:57):
Oh yeah. I, as an outsider, don't know what's going on.

Speaker 2 (04:02):
I don't know where it's going for sure. I don't
know where it's going.

Speaker 1 (04:07):
We know where it's going; we know where it ends. You know where it ends: a member of Congress shows up at the library in Vermont that the US and Canada share, because a Border Patrol agent was murdered there, and threatens to take over Canada. And that's all... like, there's a degree to which you can kind of tie heightened tensions between the US and Canada to the murder of this Border Patrol agent, which itself is directly tied to the

(04:30):
fact that Eliezer Yudkowsky wrote a piece of Harry Potter fan fiction.

Speaker 2 (04:33):
I love it. It all goes back to that.

Speaker 1 (04:35):
Yes, yes, it all comes back to bad Harry Potter fan fiction. So, part three. We spent last episode talking about Ziz moving to the Bay and her first interactions with the rationalist community. That big CFAR conference

(04:56):
they went to, that was very reminiscent... it had a lot of exercises reminiscent of, like, Synanon shit.

Speaker 5 (05:01):
Right, very, very... a lot of talking murder. Yes, a lot of talk of murder.

Speaker 1 (05:07):
These people love theorizing about when it's okay to kill people. Constant factor in all of this, which...

Speaker 2 (05:15):
Can't be a step in a good direction.

Speaker 1 (05:18):
Yeah, you know, you should... you should be aware of that. Like, if your community is talking about, like, the ethics of escalating to murder in random arguments too much, maybe be a little worried.

Speaker 2 (05:32):
If someone sits down next to you and says, how would you murder me, or whatever... right? You always got to get out of that room.

Speaker 1 (05:39):
Yeah, you want to... you want to leave immediately.

Speaker 2 (05:44):
And even more if they're like, yeah, that's the right way. Even worse sign. And then...

Speaker 1 (05:48):
If they're like, yeah, would you would you perform necrophilia?
In order to, in the past scare people away from
attacking you, like, get out of that room, leap bad.
This is not a crew you want to be a
part of. Yeah, maybe just take a pickleball or pickleball.
People never talk about necrophilia playing pickleball.

Speaker 2 (06:08):
I don't think one time. I don't think one time.

Speaker 1 (06:12):
No, they all talk about how they're getting knee replacements, and that's the beauty of pickleball, exactly. So, in spite of
how obviously bad this community is, Ziz desperately wants to
be in the center of the rationalist subculture, and that
means being in the Bay. Unfortunately, the Bay is a

(06:33):
nearly impossible place to survive in if you don't have
shitloads of money, and one of the only ways to
make it in the Bay if you're not rich is
to wind up in deeply abusive and illegal rental situations.
You know this, David, I'm I'm not spreading any news
to you.

Speaker 2 (06:49):
Shout out to my landlord, mister lou.

Speaker 3 (06:54):
So.

Speaker 1 (06:54):
Ziz winds up in a horrible sublet with a person
she describes as an abusive alcoholic. Now, I don't know if she was the problem in this part; like, obviously I've got one side of this story, but her claim is that it ends in physical violence. Ziz claims he was to blame, but she also describes a situation where, like, after a big argument, they bump into each other and he calls the cops on her for assault.

(07:15):
I wouldn't put it past Ziz to be leaving some
parts out of this. But also I know a bunch
of people who wound up in horrible sublets with abusive
alcoholics who assaulted them in the Bay Area.

Speaker 3 (07:27):
And then LA.

Speaker 2 (07:29):
Craigslist is a crapshoot.

Speaker 1 (07:31):
Craigslist is a crapshoot, yeah. Every time, I always... I feel like I need to, like, qualify with, like, this is just Ziz's account. But also, this sounds like a lot of stories I know people have had.

Speaker 2 (07:41):
Yeah, it's tough to get by there.

Speaker 1 (07:44):
Yeah. So she calls the... or, he calls the cops on her, and then, yeah, they do nothing, and he attacks her in her bedroom that night. He's, like, throwing a chair at her and shit. So she decides: I got to get out of this terrible fucking sublet. And unfortunately, her next best option... a very common thing in the rationalist community is to have

(08:05):
whole houses rented out that you fill with rationalists who don't have a lot of money. Kind of like artist houses, or, like, content-producer houses. It never explodes. People never have horrible times in these. This particular rationalist house is called Liminal, because

(08:25):
you know, Gen Z loves talking about their liminal spaces on the Internet. One resident of the house reacts very negatively when Ziz identifies herself as a non-transitioning trans woman and basically asks, like, when are you going to leave? So, you know, she says that as soon as she arrives, one of the other residents is a transphobe, and she can't stay there very long. Again, it all sounds like a

(08:47):
very familiar Bay Area housing situation story. She bounces around some short-term solutions, Airbnbs, moving constantly while trying to find work. She gets an interview with Google, but the hiring process there is slow. There's a lot of different stages to it, and it doesn't offer immediate relief from her financial issues. Other potential offers fall through as she conflicts with the fundamental snake-oiliness of this era of

(09:09):
Silicon Valley development, which Ziz blames on the fact that she couldn't feign enthusiasm for companies she didn't believe in. Quote: I was inexperienced with convincing body language, inclusive lies like this. I did not have the right false face, but was very quick to think up words to say. So, like, I'm not good enough at lying that I'm excited about working for an app to, you know, help you do your

(09:30):
laundry better. Which is, like, a third of the Bay.

Speaker 5 (09:33):
Yeah, right, yeah. And once again she has, like, flashes of, like: oh wow, you really... you really have strong morals. And all the same... yeah, she has a strong resume, right? Didn't she... she won, like, an award as a NASA intern, right? She's still... yeah, she really is good at a lot of this stuff. And all of these Zizians, as silly as their beliefs about

(09:56):
philosophy and, like, cognitive science are, they're all extremely accomplished in their fields.

Speaker 1 (10:01):
Nearly all. It's... it's good evidence of the fact that, like, it's always a mistake to think of intelligence as, like, an absolute characteristic. Like, I am a genius software engineer, therefore I am smart. It's like: no, no, no, you're dumb at plenty of things, mister software engineer.

Speaker 2 (10:18):
Don't sell yourself short. Sure.

Speaker 1 (10:20):
Yeah. So she does start to transition during this period of time. She goes on finasteride, which helps to avoid male-pattern baldness, and she starts experimenting with estrogen and anti-androgens. She'd wanted to avoid this for... I'm sure she had a variety of reasons. But as soon as she starts taking hormones, they have such a positive effect.

(10:40):
She describes it as a hard-to-describe felt sense of cognitive benefits, and she decides to stay on them. By October, she'd committed to start writing a blog about her own feelings and theories on rationalism, and her model here was Yudkowsky. She names this blog Sinceriously, and it was her attempt to convince other rationalists to adopt her beliefs about, like, veganism and such. Her first articles

(11:03):
are, like, pretty bland. It's scattered concepts and thought experiments, very basic stuff like: can God create a rock so big God couldn't move it? And then, like, throwing a rationalist spin on that. So, you know, a lot of this is like... oh, maybe in an era in which college didn't cost two hundred grand, you could have just gotten a philosophy degree, right, and that would have made you happy. Like, you just wanted to spend

(11:25):
a couple of years talking through silly ideas based on
dead Greek guys.

Speaker 2 (11:30):
Well you know the Bay is the place to do that.

Speaker 1 (11:32):
Yeah. Well, unfortunately... So she starts to really show an interest early on, though, and this is where things get unsettling, in enforcement mechanisms, which are methods by which individuals can, like, blackmail themselves into accomplishing difficult tasks for personal betterment. She writes about an app called Beeminder, which lets you set goals and punish yourself with a financial penalty if you don't make regular progress. And she's really

(11:55):
obsessed with just the concept of using enforcement mechanisms to make people better, writing: often you have to break things to make them better. So, not a great path to be going down here. Is she following this herself? Like... she's trying to use some of these tactics on herself, to deal with, like, what she sees as her flaws that are stopping her from, you know, saving

(12:17):
the cosmos. Great stuff. A lot of pressure to put on yourself.

Speaker 2 (12:22):
Yeah, this poor woman has been under the highest stakes
this whole time.

Speaker 1 (12:27):
Well, and that's... again, that's not Ziz, that's the entire rationalist subculture. The stakes are, immediately: we have to save the world from the evil AI that will create hell to punish everybody who doesn't build it. And, we'll actually talk about this later, that breaks a ton of people in this. She is not the only one kind of fracturing her psyche in this community.

(12:49):
So, right around this time, she's bouncing around short-term rentals and, like, desperately trying to get work. She meets a person named Jasper Wynn, who at that point identified as a trans woman. They now go by Gwynn Danielson and use they/them pronouns, so that's how I'm gonna refer to them; and for clarity's sake, I'm gonna call them Gwynn or Danielson, even though they went by a different name at this time, because that's what they're called now. Gwynn was a fan

(13:12):
of Ziz's blog and had some complex rationalist theories of their own. They came to believe that each person had multiple personalities stored inside their brain, a sort of, like, mutation of the left-brain right-brain hypothesis, and each of these sides of your brain was, like, a whole, like, intact person, right? Like, great. Yeah, no, cool. No,

(13:34):
you guys are gonna be fucking with your heads real hard. Great.

Speaker 5 (13:37):
Oh yeah.

Speaker 1 (13:40):
So Ziz falls in love with Gwynn's ideas, and she starts bringing them up at rationalist events, trying to brute-force them into going mainstream among the community. But people are like, this is a little weird, even for us. And she does not succeed in this, and as a result, she and Danielson and a couple of other friends start, like, talking and theorizing together, separately from the bulk of the community. So now, again, you've got this... they're starting

(14:02):
to calve off from the broader subculture, and they're starting
to like really like dig ruts for themselves in a
specific direction that's leading away from the rest of the rationalists.

Speaker 2 (14:11):
Literally, all that cult stuff.

Speaker 1 (14:13):
Huh? All that cult stuff. All that cult stuff. Now, Gwynn and Ziz largely, like, bonded over their struggle paying Bay Area rents, and together they stumbled upon a solution beloved by generations of punks and artists in Northern California: taking to the sea. Specifically... it's great, it's great. I mean,

(14:36):
I've known, like, three separate people who lived on boats in the Oakland harbor, because it was like, this is the only way I can afford to live in the Bay.

Speaker 5 (14:44):
My little brother went to school right right outside of
San Francisco, and his principal lived on a boat right
just like a mile away from the school, and everybody
loved it.

Speaker 1 (14:54):
Yeah, yeah, everybody loved it. I mean, I gotta say, everyone I know who lived on a boat lived on a shitty boat. But I'm also not convinced there are any boats that stay nice for very long.

Speaker 2 (15:04):
Yeah, it feels like it would be dank, I guess is the word.

Speaker 1 (15:10):
Dank is a good description of boat life, I think.

Speaker 2 (15:13):
Yeah.

Speaker 1 (15:16):
So Gwynn's boat was anchored off the Encinal Basin, and Ziz found this a pretty sweet solution. She goes over to stay one night, and while they're, like, hanging out, staying up, probably taking drugs... they don't, like, usually write about it, but from, like, other community conversations, I think we have to assume an awful lot of the time, when these people are staying up all night and talking,

(15:37):
there's a lot of, like, ketamine and stuff being used that isn't written into the narrative.

Speaker 2 (15:42):
That also goes along with it, also.

Speaker 1 (15:45):
Goes along with the Bay Area. Pills and powders are big, yeah. Quote: they talked about how, when they were a child, their friend, who was a cat, had died, and they had, to use their own retroactive paraphrasing, sworn an oath of vengeance against death. Fucking... these are just people doing great. Very healthy. It's the opposite of what

(16:07):
you want a kid to learn when their pet dies, which is like: yeah, you know, death is inevitable. It happens to everything. You know, it'll happen to you one day, and it's sad, but it's just something we have to accept. No, no, no: war on death.

Speaker 2 (16:18):
No. They were like, no, no, no, I can fix this.

Speaker 1 (16:21):
Okay, I as a parent have failed in this situation.
This was an unsuccessful step in my child's development. Maybe
no more pets for a while, Maybe no more pets.
Gwynn also spent way too much time online, which is
how they wound up reading hundreds of theoretical articles about
how AGI, artificial general intelligence, would destroy the world. And again,

(16:45):
AGI is, like, a mainstream term now, because fucking ChatGPT came out a couple of years ago and everyone started talking about it. At this point, twenty sixteen, twenty seventeen, it's only, like, people who are really into the industry in a nerdy way who are using that frame. Like, regular people on the street don't know what you fucking mean when you're talking about this stuff. But this is a term that is in use among them. And,

(17:05):
like Ziz, Gwynn moved to the Bay Area to get involved in fixing the problem. They were an otherkin. Are you familiar with this online community? Otherkin?

Speaker 2 (17:15):
Otherkin? No, I have no... I've never heard of that.

Speaker 1 (17:18):
It's like a... it's like the Mormonism of furrydom, almost, like...

Speaker 2 (17:22):
That that's that's the same you say.

Speaker 1 (17:25):
I don't mean, like, it's harmless, right? Like, these are people who... there's a mix of beliefs. Some of them, like, literally believe they're, like, fantasy creatures. Some of them just, like... yeah.

Speaker 3 (17:34):
To be like yeah, like half identify as like it
a non human creature.

Speaker 2 (17:41):
Right. Oh, like their furry persona is their true self.

Speaker 1 (17:44):
Yeah, yeah, kind of. That's close enough for government work. And in Gwynn's case, it's even different, where I don't think they believe they are literally a dragon, but they believe that when there's a singularity and the robot god creates heaven, they'll be given the body of a dragon, because the robot god will be able to do that. If it's a good singularity, at least. That's why this is all so important to them: making sure it's, like, a

(18:04):
nice AI, so they'll be able to get their animal friends back and get their dragon body. Tale as old as time. Again, a lot of this could be avoided by just, like, processing death and, uh, stuff like that a little better. But we don't do that very well in our society anyway. We've got a lot of people who are committed to denying it. So

(18:27):
I'm not surprised, like, this happens at, like, the corners, right? Like, this is just a little downstream from that Bryan Johnson guy tracking his erections at night and trying to get the penis of a nineteen-year-old.

Speaker 2 (18:37):
Yeah, like.

Speaker 1 (18:40):
Not a massive sanity gap between these two things.

Speaker 2 (18:43):
It's... I think... I think we're drinking from the same well.

Speaker 1 (18:47):
Yeah, yeah. So, as a result, Ziz commits herself to turning Gwynn to the dark side, which is a term she started to use. Obviously it's a Star Wars term, and it comes out as a result of her obsession with what's called akrasia. Akrasia is an actual Greek term for a lack of willpower that leads someone to act in ways that take them further

(19:07):
from their goals in life. And it's, like... I think akrasia often was, like, an early term for, like, what we call ADHD, right? Like, people who have difficulty, like, focusing on tasks that they need to complete. One of the promises of rationalism was to arm a person with tools to escape this state of being and act more powerfully and effectively in the world. Ziz adds to this some ideas cribbed from Star Wars. She decides that

(19:29):
the quote unquote way of the Jedi, which is like
accepting moral restrictions you know about like not murdering people
and the like, is a prison for someone who's like
truly great and has the opportunity to accomplish important goals.
Right if you're that kind of person. You can't afford
to be limited by moral beliefs. So in order to
achieve the kind of vegan singularity that she thinks is

(19:51):
critical to save the cosmos, she and her fellow rationalists
need to free themselves from the restrictions of the Jedi and become vegan Sith. That's more or less where things are going here. So here I should note that, while Gwynn and Ziz are spinning out on their own, everything that

(20:12):
you're seeing from them, these feelings of grandiosity and cosmic significance, but also paranoid obsession, are the norm in rationalist and effective altruist circles. There's a great article in Bloomberg News by Ellen Huet. It discusses how many in the EA set would suffer paralyzing panic attacks over things like spending money on a nice dinner or buying ice cream, obsessing over how many people they'd killed by not better optimizing

(20:34):
their expenses. And, quote: in extreme pockets of the rationality community, AI researchers believed their apocalypse-related stress was contributing to psychotic breaks. A MIRI employee (and that's one of these organizations created by the people around Yudkowsky), Jessica Taylor, had a job that sometimes involved imagining extreme AI torture scenarios. As

(20:55):
she described it in a post on LessWrong, the worst possible suffering AI might be able to inflict on people. At work, she says, she and a small team of researchers believed we might make God, but we might mess up and destroy everything. In twenty seventeen, she was hospitalized for three weeks with delusions that she was intrinsically evil and had destroyed significant parts of the world with

(21:15):
my demonic powers, she wrote in her post. Although she acknowledged taking psychedelics for therapeutic reasons, she also attributed the delusions to her job's blurring of nightmare scenarios and real life. In an ordinary patient, having fantasies about being the devil is considered megalomania, she wrote. Here, the idea naturally followed from my day-to-day social environment and was central to my psychotic breakdown. Oh man. Just taking ketamine and

(21:41):
convincing yourself you're the devil, normal rationalist stuff.

Speaker 2 (21:44):
Yeah, I mean, hey, we've all been there, right.

Speaker 1 (21:46):
We've been there now.

Speaker 5 (21:49):
In fact, no. This is the least relatable group of people I've ever heard of.

Speaker 1 (21:55):
No, no, exactly. Because it's this, like, grandiosity. It's this absolute need to, whatever else is going on, even if you're, like, the bad guy, feel like what you're doing is, like, of central cosmic significance. It's this fundamental fear that is integral to all of these tech guys. It's at the core of Elon Musk too: that, like, one of these days you're not going to exist, and

(22:17):
very few of the things that you valued in your
life are going to exist, and there's still going to
be a world because that's life.

Speaker 5 (22:24):
That's just... yeah. It's so crazy how it boils down to just, like: yeah, man, well, I don't know what you thought was going to happen.

Speaker 1 (22:31):
Yeah, bro, sorry. Yeah, that's just how it goes. You know, we've got, like, ten thousand years of, like, philosophy and, like, thinking and writing on the subject of dealing with this. But you didn't take any humanities in your STEM classes, so now, you know, you're just trying to bootstrap it.

Speaker 2 (22:48):
Yeah, you just watched Star Wars again and decided you
got to figure it out.

Speaker 1 (22:53):
Yeah, you watched Star Wars one hundred and thirty-seven times and figured that was going to replace reading a little bit of fucking Plato or something. Maybe. It didn't work. Also, again, the ketamine's not helping.

Speaker 2 (23:05):
No, no, no, no, God to be a fly on
that wall.

Speaker 1 (23:10):
Oh god. Yeah, the rationalist therapists are raking it in, oh.

Speaker 2 (23:14):
Man, honestly well deserved.

Speaker 1 (23:16):
But yeah, some talk about info hazards.

Speaker 2 (23:20):
Jesus.

Speaker 1 (23:23):
So I have to emphasize here again... I want to keep going back to the broader rationalist community, because I felt like a risk of this is that I would just be talking about how crazy this one lady and her friends were. And it's like, no, no, no. Everything they're doing, even the stuff that has split off and is different and, like, more extreme than mainstream rationalism, is directly related to shit going on in the mainstream

(23:43):
rationalist community, which is deeply tied into big tech, which is deeply tied into, like, the Peter Thiel circle. A lot of these folks are close to, in, and around the government right now, right? So, like... Ziz is not nearly as much of an outlier as a lot of rationalists want people to think. Right? Yeah. Anyway, at rationalist meetups, Ziz began pushing this whole vegan

(24:04):
Sith thing hard, and again meets with little success. But she and Gwynn gradually start to expand the circle of people around them. Meanwhile, in her professional life, that Google interview process moves forward. Ziz says that she passed every stage of the process, but that it kept getting dragged out, forcing her to ask her parents for more help. In November,
around the time her blog started to get a following,

(24:24):
she says, Google said she'd passed the committee and would be hired once she got picked for a team. Now, I don't know what happens after this. She says Google asks for proof of address, which she doesn't have. She's just turned twenty-six, and she's not on her parents' health insurance either. She spends pages describing what is a very familiar nightmare scenario to me, of, like, trying to get proof of address so you can get a job

(24:45):
and, like, continue getting, like, you know... get on Medi-Cal and stuff. And I do think it's probably worth acknowledging that, like, as her brain is starting to break and she's getting further and further into all these delusional ideas, she's also struggling with being off of her parents' health insurance and, like, trying to find stable housing in the Bay, and, like, that influences the situation.

Speaker 2 (25:07):
And still in the process of transitioning, right.

Speaker 1 (25:09):
Yes, yes, exactly. And still in the process of transitioning, yes. A heavy workload; you're doing too much to your brain, right? Yes. So... and then she makes the worst possible decision, which is to live with her friend Gwynn in their tiny sailboat, which is now anchored by the Berkeley Marina. Again, this is not like a houseboat. This

(25:31):
is like a sailboat with one small room.

Speaker 2 (25:34):
Right, it's, like, yeah, there's, like, a bed, a table and...

Speaker 1 (25:39):
A sink, right, like a little bathroom probably, maybe a kitchenette. But it's not, like, livable for two people.

Speaker 3 (25:46):
Anybody who's, like, ever lived in a small space with a roommate knows, just, like, no matter where you're at, it's horrible. Bad idea.

Speaker 1 (25:56):
And imagine if that shitty tiny apartment that you remember from your past was a boat. Just disastrous. And this is not a good situation. Ziz would later write, quote: I couldn't use my computer as well. I couldn't set up my three monitors; there was no room. Couldn't have

(26:17):
a programming flow state for nine hours. I had trouble sleeping. The slightest noise in my mind kept alerting me to the possibility that someone, like my roommate from several months ago, was going to attack me in my sleep. So this is not a healthy situation. And both Gwynn and Ziz have endured
some specific traumas, and both are also prone to flights
of grandiosity and delusion. And now they are trapped all day,

(26:38):
every day together in a single room where their various
neuroses are clashing with each other and their only relief
is talking for hours about how to save the world.

Speaker 5 (26:47):
Oh my god. This is... it's a real villain story.

Speaker 2 (26:54):
You couldn't get any worse than this.

Speaker 1 (26:56):
It couldn't. And it's like... at this point, I don't think either of them is, like, intentionally doing anything bad. You've just... you've kind of created a cult where, like, you're trading off on being the cult leader and cult member for each other. Like, you've isolated each other away from the world, and you're spending time brainwashing each other together in your little boat. Yeah. How often do you

(27:17):
think they were leaving that boat? Not nearly often enough. And Gwynn is on what Ziz describes as a cocktail of stimulants, and, quote, mapped out the cognitive effects of each hour they were on them. They get very angry if Ziz interrupts their thoughts at the wrong time. And also, like, Ziz isn't really sleeping, so they're just talking for

(27:38):
hours and getting on each other's nerves at the same time.
But also like building these increasingly elaborate fantasies about how
they're going to save the cosmos and it's you know,
it's not great. Through these conversations, they do develop Gwynn's multiple-personalities theory, mixing in some of Ziz's own beliefs about good and evil. And I want to quote another

(27:58):
passage from that Wired article that summarizes what they come to believe about this: a person's core consisted of two hemispheres, each one intrinsically good or non-good. In extremely rare cases, they could be double good, a condition that, as it happened, LaSota identified in herself. And Ziz is consistently going to identify herself as intrinsically good; both sides

(28:20):
of her personality are only good. But most people are at best single good, which means part of them is non-good, or basically evil, and they're at war with this other half of their brain that's a whole person that's evil. Which is why other people can't be trusted to make decisions. You know, like, increasingly, Ziz's attitude is going to be, like, only intrinsically good people can

(28:40):
be trusted to make good decisions. Only the double goods. Only the double goods. That's such, like, a... you know, you're writing your own Orwell speech. This is a bad sign. Yeah. So, Ziz's Google ambitions fall apart at this time. She doesn't really give us a good explanation as to why. I kind of think she started bombarding her contact at Google with, like, requests

(29:04):
about why the process wasn't going faster, and maybe Google was like, ah, maybe we don't need this person. Ziz concludes failing at Google was good, because she'd gotten ten thousand dollars from unemployment at this point. Quote: this means I had some time. If they hired me soon, it would deprive me of at least several months of freedom. And the freedom, of course, is continuing to work out her theories with Gwynn on the sailboat. Also, if that's freedom,

(29:29):
it's really not freedom.

Speaker 2 (29:30):
Maybe... maybe work? I heard the Google campus has a lot of things to do, and...

Speaker 1 (29:36):
It's kind of the what-if. I think maybe at this point she still could have pulled out of this tailspin if she'd gotten a job and worked around other people and socialized not on the sailboat. But also, a real consistent thing with Ziz is, at this point she has no willingness to do the kind of compromise... and I'm not just talking about the moral compromise, but, like,

(29:56):
even going to work a job for a company: you're going to spend a large part of your day doing a thing that, like, you wouldn't be doing otherwise, right? Because that's what a job generally is; that's just work. And
Ziz feels like she can't handle the idea of doing
anything but reading fan fiction and theorizing about how to
give herself superpowers. Right, that's the most important thing in

(30:16):
the world, because the stakes are so high. So she, like... like, ethically, she can't square herself with doing anything she needs to succeed in this industry, where she has the skill to succeed. And this is another trait
she's got in common with the rest of the rationalist EA subculture. That Bloomberg article interviewed a guy named

(30:37):
Qiaochu Yuan, a former rationalist and PhD candidate who dropped out of his PhD program in order to work in AI risk. He stopped saving for retirement and cut off his friends so he could donate all of his money to, you know, EA causes, and because his friends were distracting him from saving the world. And this is all cult stuff, right? Cults want you to

(30:58):
cut off from your friends; they want you to give them all your money. He's doing... but he's doing it, like, independently. Like, there's not, like, a single leader. He's not, like, living on a compound with them. It's just, once you kind of take these beliefs seriously, the things that you will do to yourself are the things people in cults have done to them.

Speaker 2 (31:18):
Right.

Speaker 1 (31:19):
In an interview with Business Insider, Yuan said: you can really manipulate people into doing all kinds of crazy stuff if you can convince them this is how you can prevent the end of the world. Once you get into that frame, it really distorts your ability to care about anything else.

Speaker 2 (31:34):
Man.

Speaker 1 (31:34):
That's... yeah, that's kind of the thing. It's harder to talk about this than, like... people talk about Ziz as, like, oh, she's a cult leader and she had her, you know, vegan trans AI death cult or something. And, you know, I feel like that's not close enough to the truth to get, like... to get how this happened, right? Because what happens with Ziz is very cultish.

(31:58):
But Ziz is one of a number of different people who have calved off of the rationalism community and had disastrous impacts. And it happens constantly with these people, because, like, it's such...

Speaker 2 (32:09):
An engine for it.

Speaker 1 (32:10):
Yes, it's an engine for making cults.

Speaker 2 (32:12):
It's it's this is a cult factory for sure.

Speaker 1 (32:16):
Yeah, you're creating a cult factory.

Speaker 5 (32:18):
Oh no. They give you the base ideas, and then you can just kind of franchise it how you'd like.

Speaker 1 (32:23):
Yeah. And a lot of prominent rationalists who knew Ziz at the time have since gone out of their way to describe her as, like, you know, someone on the fringes. Anna Salamon of CFAR described her as a young person who was hanging around and who I suspect wanted to be important. And I'd claim: is there anyone here who doesn't want that? Within this? No, that's all of them, right?

(32:46):
That's the whole community. And, like, Anna was emailing directly... gave Ziz, like, some of the advice that Ziz considered, like, key to her moving to the Bay Area and stuff. Right? Like, these people, like, the rationalists, really, really want you to think that this was just, like, some fringe person. But she's very much tied in to all of this stuff, right? So,

(33:07):
for her part, Ziz doesn't deny that failing to convince other rationalists was part of why she pulled away from mainstream rationalism. But she's also going to claim that a big reason for her break is sexual abuse among people leading the rationalist community. And there's a specific case that she'll cite later that doesn't happen until twenty eighteen. But this is a problem people were discussing in twenty

(33:27):
seventeen, when she's living on that boat. The representative story is the case of Sonia Joseph, who was the basis of that Bloomberg News piece I've quoted from a couple of times, and it's a bummer of a story. Sonia was fourteen when she first read Yudkowsky's Harry Potter and the Methods of Rationality, which set her on the path that led her to moving to the Bay Area in order to get involved in the rationalist EA set.

(33:50):
And she's focused on the field of AI risk. And I'm going to read a quote.

Speaker 3 (33:54):
This week has been so long that I completely erased
the Harry Potter part of this story from my brain.

Speaker 1 (34:01):
It never drops too far below the surface. I cannot overemphasize how important this Harry Potter fan fiction is to all these murders. Our primary texts are getting abused. Yes, yes, it's a primary text of the movement. Wow. I'm going to read a quote from that Bloomberg article. Sonia was encouraged when she was twenty-two to have dinner with

(34:22):
a fortyish startup founder in the rationalist sphere, because he had a close connection to Peter Thiel. At dinner, the man bragged that Yudkowsky had modeled a core Harry Potter-like professor in that fanfic on him, Joseph says. He also argued that it was normal for a twelve-year-old girl to have sexual relationships with adult men, and that such relationships were a noble way

(34:43):
of transferring knowledge to a younger generation. Then, she says, he followed her home and insisted on staying over. She says he slept on the floor of her living room and that she felt unsafe until he left in the morning. Jesus. So great, you know. Bragging about your Harry Potter... how you helped inspire the Harry Potter fanfic, and then explaining how twelve-year-old girls should have sex with adult men. Good stuff. Very rational.

Speaker 5 (35:06):
I gotta say, that's a crazy brag to get chicks. Yeah,
you know it was.

Speaker 2 (35:12):
You know, one of those characters. I'm the Snape.

Speaker 1 (35:15):
Yeah, I'm the Snape of this. By the way, what do you think about twelve-year-olds? Awesome. I have a close connection to Peter Thiel.

Speaker 2 (35:23):
Yeah.

Speaker 1 (35:26):
Cool. Oh man. As that Bloomberg article makes clear, this is not an isolated issue within rationalism. Quote: sexual harassment and abuse are distressingly common. According to interviews with eight women at all levels of the community, many young, ambitious women described a similar trajectory: they were initially drawn in by the ideas, then became immersed in the

(35:46):
social scene. Often that meant attending parties at EA or rationalist group houses, or getting added to jargon-filled Facebook Messenger chat groups with hundreds of like-minded people. The eight women say casual misogyny threaded through the scene. On the low end, Bryk, the rationalist-adjacent writer, says a prominent rationalist once told her condescendingly that she was a five-year-old in a hot twenty-year-old's body.

(36:08):
Relationships with much older men were common, as was polyamory. Neither was inherently harmful, but several women say those norms became tools to help influential older men get more partners. And this is also... this isn't just rationalism; that is the California ideology. That is the Bay Area tech set, right?

Speaker 2 (36:24):
Yeah, very techy.

Speaker 1 (36:27):
Yes, man, and it's all super fucking gross. The whole you're-a-five-year-old-in-a-hot-twenty-year-old's-body thing. What the fuck, man? How do you say that and not hurl yourself off the San Francisco Bay Bridge?

Speaker 2 (36:46):
Vile?

Speaker 1 (36:47):
That's fucked up, dude. That's bad. Speaking of bad to the bone: our sponsors. Ah, we're back. So, this is important to understand in a series about this very strange person and the strange beliefs that she developed that influenced

(37:08):
several murders. Ziz had many of the traits of a cult leader, but again, she's also a victim, first, of the cult dynamics inherent to rationalism. And what she's doing next is: she breaks away with a small, loyal group of friends, and she does create a physical situation that much more resembles the kind of cults we're used to dealing with, particularly Scientology. Because next she's going to...

(37:29):
Oh wow, me and Gwynn living alone on this boat, we kind of hate each other, and neither of us is sleeping, and our emotional health is terrible. But we've made so much progress on our ideas. Maybe we should... maybe we should make this a bigger thing, right? Maybe we should get a bunch of rationalists all living together on boats.

Speaker 2 (37:50):
She needs a work life balance.

Speaker 1 (37:52):
Yeah, no, no. What she thinks she needs is what she calls the Rationalist Fleet, which is: she wants to get a bunch of community members to buy several boats and live anchored in the bay to avoid high Bay Area rent, so they can spend all their time talking and plotting out ideas for saving the cosmos. Oh man, so great. And...

Speaker 5 (38:11):
I get it right. It's expensive here. I want to
get some boats with my friends. It does sound cool.

Speaker 1 (38:17):
We won't go insane together, obviously, you know. She buys
a twenty four foot boat for six hundred dollars off
of Craigslist. And I don't know much about boats, but
I know you're not getting a good one for just
six hundred dollars.

Speaker 2 (38:32):
No.

Speaker 1 (38:33):
No, like a livable boat, like a full boat, like...

Speaker 2 (38:38):
A foot boat.

Speaker 1 (38:40):
Yes, a full boat.

Speaker 2 (38:41):
Oh man, that had to be a piece of shit. It had to be a shitty, shitty, colossal piece of shit. Yeah.

Speaker 1 (38:48):
She names it the Black Cygnet, and she starts trying to convince some of... a lot of these people who have gathered around her, to get in on the project. Eventually, she, Danielson, and a third person put together the money to buy a boat that's going to be, like, the center of their fleet: a seventy-year-old Navy tugboat named the Caleb, which was anchored in Alaska.

Speaker 2 (39:07):
This is like a.

Speaker 1 (39:08):
Ninety-four-foot boat. It's a sizeable boat, and it is also very old and in terrible shape. That's the... that's the crown jewel of the fleet, right? Right, that's our flagship, man. They buy this thing with this third guy, Dan Powell, who's at least a Navy veteran,

(39:30):
so, like, you know, okay, that at least counts as boat-adjacent. But, and I get the feeling (nobody says this), but Powell says that he put tens of thousands of dollars into buying the Caleb. And I just know, from what Danielson and Ziz wrote about their finances, neither of them had nearly that much money. So I think, by

(39:50):
far he invests the most in this project. And I
don't want to insult the guy, but he says he
did it because he quote considered buying the boat to
be a good investment. Which boats aren't. Boats are never an investment. Comically so. Like, nothing depreciates like...

(40:11):
fucking... raw salmon depreciates slower than a boat. I think his attitude is, I'm going to become, like, the slumlord of, or at least landlord to, a bunch of boat rationalists. But I don't know how you expect this to pay off. A seventy-year-

(40:32):
old tugboat for a bunch of, like, poor rationalist punk kids to live in. How was that ever supposed to work? What's the P and L statement you put together here?

Speaker 5 (40:47):
Oh?

Speaker 2 (40:49):
What was the... what was the timeline on him getting his money back, he thought?

Speaker 1 (40:53):
Oh god, I have no idea. He absolutely takes a bath on this ship, right? Yeah. He claims, and I believe him, that Ziz lied to him about the whole scenario to get his money. I do think this was essentially a con from her. He says, quote: Ziz led me to believe that she had established contacts in the bay and that it would be easy for us to at least get a slip, if not one that

(41:14):
was approved for overnight use. And as it turns out, when we were coming through the Inside Passage from Alaska, it was revealed that we did not have a place to arrive.

Speaker 2 (41:21):
Wait, oh, I didn't realize he sailed it down from Alaska.

Speaker 1 (41:26):
Yeah, they all sail it together, them and a couple other rationalists that they pick up. They make a post on the Internet being like, hey, any rationalists want to sail a boat down from Alaska? Talk about our ideas while we live on a boat?

Speaker 5 (41:41):
Oh man. So these people need space? Yes. Just get a warehouse.

Speaker 1 (41:47):
Yes, yeah. Well, the Ghost Ship fire had happened by that point, so I don't think warehouse space was easy to get. Yeah. But this... I think you're right. In an earlier era, they'd have just wound up living in, like, a warehouse, and maybe all died in a horrible fire, because there were issues with that kind of life too. But it would

(42:08):
have been an option besides the boat thing. Anyway, the
Caleb is not in good shape.

Speaker 2 (42:12):
Again.

Speaker 1 (42:12):
This boat is seventy-plus years old. It is only livable by punk standards, and while it was large enough (it is a ninety-four-foot boat; you can keep some people on there), it's also way too big to anchor in most municipal marinas, especially since the boat has three thousand gallons of incredibly toxic diesel fuel and it's not really seaworthy, which means there's this constant risk of

(42:33):
poisoning the water that it sits in. So the authorities are just going to be consistently like: guys, you can't have this here. Guys, you simply can't have this here. So they just...

Speaker 2 (42:42):
Got to operate out and inter national waters like a
cruise ship.

Speaker 1 (42:46):
No, they're just kind of illegally anchoring places and hoping that it's fine, and periodically getting boarded over it. Another crew member on the ride down from Alaska was just kind of there... they're just there, you know, for the adventure. So they leave and don't come back after they get to the bay. But this person expressed an opinion that Ziz consistently came off as creepy, but not scary.

(43:09):
At one point, he says that she confronted him and
told him he was transgender, and when he's like, no,
I'm really not, she told him he was.

Speaker 2 (43:16):
Yes.

Speaker 1 (43:17):
She does this a lot; tells people: I know that you're this. And it works; like, that's how a number of her followers get to her. But also, it doesn't work a lot of the time. A lot of people are like, no, I'm not, you know, whatever it is you're saying. She does this to Gwynn too, so I don't doubt his story. Like, she just kind of decides things about people and then tries to brute-

(43:37):
force them into accepting that about themselves. And when there are people who are, like, both desperate for, like, approval and affection, and also who are housing-insecure and need the boat or wherever to live... those people, a number of them, feel a significant pull to just kind of accept whatever Ziz is saying about them.

Speaker 2 (43:56):
Yeah. I mean, when you're desperate in that way, you kind of definitely find yourself doing what you have to to have a roof over your head, like...

Speaker 1 (44:03):
Right, Yeah, And it's a very normal cult thing, right,
Like this is an aspect of all of that kind
of behavior. Now, by this point, a few other people
have come to live in the Rationalist fleet. One of
them is Imma Borhanian, a former Google engineer, and Alex Leatham,
a budding mathematician. The flotilla became a sort of marooned
aquatic salon. Wired quotes Zizz as emailing to a friend

(44:25):
at the time: We've been somewhat isolated from the rationalist community for a while, and in the course developed a significant chunk of unique art of rationality and theories of psychology aimed at solving our problems. Excited for this psychology you built on the boat, yeah. Wired continues: as LaSota articulated, their goals had moved beyond real estate into a

(44:45):
more grandiose realm. We are trying to build a cabal, she wrote. The aim was to find abnormally intrinsically good people and turn them all into Gervais sociopaths, creating a fundamentally different type of group than I have heard of existing before. Sociopathy, the idea was, would allow the group members to operate un-pwned by the external world.

Speaker 2 (45:05):
Yeah that is because you had said that before, right, Yeah,
they had been that's sort of what they're looking to be.

Speaker 1 (45:10):
Yeah, they're obsessed with this idea, which is initially, like, kind of a joke about The Office. But they're like, no, no, no, it actually is really good to have this sociopath at the top who, like, moves and manipulates these, like, lesser fools and whatnot, and puts them into positions below them. Like, that's what we need to be in order to gain control of the levers of power. We have to make ourselves into Ricky Gervais sociopaths. Yeah. Great.

(45:37):
what a good ideology.

Speaker 2 (45:39):
I love that they still love pop culture, though, you know.

Speaker 1 (45:42):
They're obsessed with it. And again, this is... you can't talk about this kind of shit if you're regularly having conversations with people outside of your bubble, like...

Speaker 2 (45:50):
Exactly, that's the thing. Yeah, if you have somewhere to go, yeah, if you have anywhere to go...

Speaker 1 (45:55):
Yes, yes. If you've got a friend who's, like, a nurse, or a contractor you have drinks with once a week, and you just talk about your ideas once, they're gonna be like, hey, this is bad.

Speaker 2 (46:04):
You need to stop.

Speaker 1 (46:05):
You're going down a bad road. Do you need to stay with me? Are you okay?

Speaker 2 (46:10):
This is clearly a cult.

Speaker 1 (46:13):
Yes, someone.

Speaker 2 (46:14):
This would be so upsetting for someone to just casually
talk about it like a paint and sip or like Ricky.

Speaker 1 (46:21):
Gervais. Somebody there... So their break with mainstream rationalism had gone terminal. Gwynn criticized the rest of the central rationalist community for, quote, not taking heroic responsibility for the outcome of this world.

(46:42):
In addition to the definitely accurate claims of sexual abuse within rationalism, they alleged organizations like CFAR were actively transphobic. I don't know how true that is. In some of the articles I've read, there's a lot of trans rationalists who will be like, no, there's a very high population of trans people within the rationalist community. So people just disagree about this. It's not my place to come to a conclusion. But

(47:02):
this is one of the things that Ziz says about the central rationalist community. Ziz had concluded that transgender people were the best people to build a cabal around because they, quote from Ziz's blog, had unusually high life force. Ziz believed that the mental powers locked within the small community of simpatico rationalists they'd gathered together were enough to

(47:22):
alter the fate of the cosmos, if everyone could be jailbroken into sociopaths.

Speaker 2 (47:27):
And yeah, these are all double goods as well.

Speaker 1 (47:31):
Well, no, she's the only double good. Actually, she becomes increasingly convinced that they're all just single good, right? And this is like her beliefs about heroism from the last episode: if you've got the community and the hero, the community's job is to support the hero, right? Like, blind support. Right, blind support, no matter what. And a lot of the language Ziz is using here, in addition to being, you know,

(47:54):
rationalist language, this is all, like, Scientology mixed with gaming and fantasy media. She talks about the need to install new mental tech on her and her friends. Tech is, like, a Scientology term, right? Like, that's... that's, like, a big thing that they say. She and her circle start dressing differently. Ziz starts wearing, like, all-black robes and stuff to make her look like

(48:16):
a Sith or some sort of wizard. Her community adopts the name vegan anarcho-transhumanism and starts unironically referring to themselves as vegan Sith.

Speaker 2 (48:26):
In the boat community, when they move in...

Speaker 1 (48:28):
Yeah. Just, like... what is going on? I'm just an alcoholic. What's happening? I just wanted to be like Quint from Jaws. Oh no. Yeah.

Speaker 2 (48:39):
I'm just here because my wife.

Speaker 1 (48:40):
Left, right. I think might a different way than a
grape white attack. Now looking bad?

Speaker 2 (48:47):
Yike? Oh man.

Speaker 1 (48:49):
So around this time, Gwynn claims she came up with a tactic for successfully separating and harnessing the power of different hemispheres of someone's brain. The tactic was unihemispheric sleep, and this is a process by which only one half of your brain sleeps at a time. In a critical write-up, published as a warning before the killings

(49:09):
that are to come, a rationalist named Apollo Mojave writes: normally it is not possible for human beings to sleep with only one hemisphere. However, a weak form of UHS can be achieved by stimulating one half of the body and resting the other, like hypnosis or fasting. This is a vulnerable psychological state for a person; entering UHS requires the sleeper to be exhausted. It also has disorienting effects,

(49:30):
so they are not quite themselves. And I disagree with them there; like, no, they're not actually sleeping with only one hemisphere. And in fact, I think they may have taken this idea from Warhammer forty thousand, because...

Speaker 5 (49:43):
It's... because, yeah, what are you talking about?

Speaker 1 (49:47):
But yeah, that's not a thing. Like, yes, if you don't let yourself sleep for long periods of time and, like, kind of let yourself zone into a meditative state, you'll get a trippy effect. You will become altered; you're altering your state. And you can... this is why cults deprive people of sleep. You can fuck with people's heads

(50:08):
a lot when they're in that space, but this isn't
what's happening.

Speaker 5 (50:13):
I like to think of them on the boat, just
only using one half of their body.

Speaker 1 (50:17):
Right, like, one eye open, watching The Office.

Speaker 2 (50:24):
Furiously taking notes.

Speaker 1 (50:27):
So this is how that write-up describes the process of unihemispheric sleep. One: you need to be tired. Two: you need to be laying down or sitting up. It is important that you stay in a comfortable position that won't require you to move very much.
In either case, you want to close one eye and
keep the other open. Distract the open eye with some
kind of engagement. Eventually you should feel yourself begin to
fall asleep on one side. That side will also become numb.

(50:50):
The degree of numbness is a good way to track
how deep into sleep the side is. Once into UHS,
it is supposed to be possible to infer which aspects
of your personality are associated with which side of the brain.
And the goal of hemispheric sleep is to jailbreak the mind into psychopathy, fully, right? And that's how Ziz describes it: that's the goal. That's the goal, that's

(51:11):
their goal. Got to make ourselves into psychopaths so we can save the world. But it also gets used... you could use it to, like: I have this thing, I don't like that I react this way in this situation, so get me into this sleep pattern, and you, like, talk me through it, and we'll figure out why I'm doing it. They describe it as using tech to upgrade their mental capabilities, right? So they're just kind of brainwashing

(51:35):
each other. They're like fucking around with some pretty
potentially dangerous stuff. And again, drugs are definitely involved in
a lot of aspects of this, which is not
usually written up, but you just have to
infer it. There's some disagreement
around all this, but it seems accurate to say that
Gwen is the one who came up with the hemispheric

(51:57):
sleep idea, but a lot of the language around how
the tactic was used and what it was supposed to
do came from Ziz. And again, the process is just
sleep deprivation, right? This is cult stuff. It's part of
how cults brainwash people. But it also wouldn't have seemed
inherently suspicious to rationalists, because being

(52:18):
part of that subculture and going to those events had
already normalized a slightly less radical version of this behavior,
as this piece in Bloomberg explains, at house parties, rationalists
spent time debugging each other, engaging in a confrontational style
of interrogation that would supposedly yield more rational thoughts. Sometimes,
to probe further, they experimented with psychedelics and tried jail

(52:38):
breaking their minds to crack open their consciousness and make
them more influential or agentic. Several people in Taylor's,
and Taylor is one of the sources, sphere had similar psychotic episodes.
One died by suicide in twenty eighteen and another in
twenty twenty one. So in the mainstream rationalist subculture, they
are also trying to like consciously hack their brains using
a mix of like drugs and meditation and social abuse,

(53:01):
and people kill themselves as a result of like the
outcomes of this. This is already a problem in the
mainstream subculture.

Speaker 2 (53:07):
Yeah, let alone this extremist offshoot, right? Yep.

Speaker 1 (53:13):
In her own writings at the time, Ziz describes hideous
fights with Gwen in which Gwen tries to mentally
dominate and mind control Ziz. They've both become believers
in new theories Ziz has. Basically, she uses the
term mana, which she describes as like your
ability to persuade people. Which is, if you can convince
someone of something, it's evidence that you have an inherent

(53:34):
level of like magical power. And someone with naturally high
mana, like Ziz, can literally mind control people with low mana.
That's what she believes she's doing whenever she tries to
talk someone into something about themselves: she's mind controlling them.
And she and Gwen have mind control battles. At one
point they start having like one of these arguments where
basically Gwen threatens to mind control Ziz and Ziz threatens

(53:58):
Gwen back, and this starts a verbal escalation. And the
way Ziz describes this escalation, which is, again these are
two sleep deprived, traumatized people fucking with each other's heads
on a boat. But the way that Ziz describes the
escalation cycle is going to be important because this has
a lot to do with the logic of the murders
that are to come. I said that if they were

(54:19):
going to defend a right to be attacking me on
some level and treat fighting back as a new aggression
and cause to escalate, I would not at any point
back down. And if our conflicting definitions of the ground
state, where no further retaliation was necessary, meant that we
were consigned to a runaway positive feedback loop of revenge,
so be it. And if that was true, we might
as well try to kill each other right then and there,

(54:39):
in the darkness of the Caleb's bridge at night, where
we were both sitting, lying under things in a cramped space.
I became intensely worried they could stand up faster. Consider
the idea from World War One: mobilization is tantamount to
a declaration of war. I stood up, still silent, waiting.
See, see, first off, there's other people there as well,

(55:00):
it's not just the two of them, yes. And just like the logic of
obviously if you attack me, then I'm going to counterattack you,
and then you're going to counterattack me, which means eventually
we'll kill each other. So we should just kill each
other now, like when you are taking your advice on
how to handle social conflict from the warring European powers
that got into World War One.

Speaker 4 (55:20):
Maybe not a good, like, positive example. It's just, so
like, even in understanding how they got there, it still
is such a stretch.

Speaker 2 (55:36):
Like, even having all this.

Speaker 1 (55:37):
Back, it's still like, yeah, really taking some leaps. It's,
yeah, I mean, just having a fight with your
friend and then opening your locket, which has like Kaiser
Wilhelm and the Tsar in it, and going, what would
you guys do here?

Speaker 2 (55:49):
Yeah, ancestors guide me.

Speaker 1 (55:56):
And again, you know, part of what's going on
here is this timeless decision theory bullshit, right. Ziz believes
that if she makes it clear at this point, when they
start having a conflict, that the stakes will immediately escalate
to life or death, Gwen won't risk fucking with her, right?
But by doing this, she also immediately creates a situation
where she feels unsafe. However, in that conflict, Gwen yields,

(56:17):
and Ziz concludes that the technique works, right? And so, yes, yes,
what she thinks is: her mana is strong and this
is a good idea for handling all conflicts. Right? So,
I'm going to increasingly teach all these people who are
listening to me that this is the escalation loop that
you handle every conflict with, right. Great stuff. One of

(56:39):
the young people who got drawn to Ziz at this
time was Maia Pasek, who blogged under the name Squirrel
in Hell. She wrote about mainstream rationalist stuff, citing Yudkowsky
and Elon Musk, but in her blog there's like a
pattern of depressive thought. In one twenty sixteen post, she
mused about whether or not experiencing joy and awe might
be bad because it biases your perception. So this is

(57:00):
this is a young person who I think is probably
is dealing with a lot of depressive yesses and the
classic stinking thinking, right, and maybe the community is not
super helpful to her. She was working to create a
rationalist community in the Canary Islands. She's kind of trying
to do the same thing Ziz did, But like in
an island where it's cheaper to live, is this something.

Speaker 2 (57:20):
That can exist in a lot of places, like.

Speaker 1 (57:24):
Sure. Yeah, I mean yeah, if you've got cheap rent,
you can get a bunch of like weirdos who work
online to move into a house with you. Right, Yeah,
Like, that's always possible. She found Ziz's blog and
she starts commenting on it. She's particularly drawn to Ziz's
theories on mana and Ziz's and Gwen's theory about
hemispheric personalities. And in one of Ziz's most direct cult leader moments,

(57:47):
Ziz reaches out directly to Maia as she's
like posting on her blog and emails her, saying, I
see you like some of my blog posts. Truly a
sinister opening. Yeah. My true companion Gwen and I are
taking a somewhat different approach than MIRI. That's the
organization, one of the rationalist organizations, and true companion

(58:09):
is what they call each other at this point. We are
taking a somewhat different approach than the MIRI approach to
saving the world, without much specific technical disagreement. We are
running on something pointed to by the approach: as long
as you expect the world to burn, then change course.
Right? So basically: we still
expect the world to burn, so we can't keep doing

(58:30):
what the other rationalists are doing. And she lays out
to this girl she meets through a blog post
her plan to find abnormally, intrinsically good people and jailbreak
them into Gervais sociopaths. She invites Maia to come
out, and I don't think this happened, but they do
start separately journeying into debucketing, and Maia gets really

(58:50):
into this unihemispheric sleep thing, and Ziz is kind
of like coaching her through the process. Because Maia's a
trans woman, Ziz tells her one of your brain hemispheres, each
of which is a separate person, is female, but the
other is male and quote mostly dead, and your suicidal
impulses are caused by both the pain of being trans

(59:13):
and also the fact that there's this dead man living
in your head that's like taking up half of your
brain's space, and so you really need to debucket in
order to have a chance of surviving, right, Okay, So.

Speaker 2 (59:26):
She needs to be jailbroken to be free.

Speaker 1 (59:29):
To be free. And Maia will basically replace her sleep
entirely with this unihemispheric sleep crap. Not sleeping
exacerbates your depressive swings and leads to deeper and deeper
troughs of suicidal ideation. She is believed to have died
by suicide in February of twenty eighteen. She posts
what is essentially a suicide note that is very rationalist

(59:51):
in its verbiage, literally titled Decision Theory and Suicide, and
this is the first death directly related to Ziz and Gwen's ideas.
But I think it's important to note the role
mainstream rationalism plays in all of this. Suicide is
a common topic at CFAR events, and people will argue
constantly about whether, for like a low value individual,

(01:00:14):
it's better for them to kill themselves, right? Is that
like of higher net value to the world? And it
was also used as like a threat to stop women
who were abused by figures in the community from speaking up.
And this is from that Bloomberg article. One woman in
the community, who asked not to be identified for fear
of reprisals, says she was sexually abused by a prominent
AI researcher. After she confronted him, she says she had

(01:00:35):
job offers rescinded and conference speaking gigs canceled, and was
disinvited from AI events. She said others in the community
told her allegations of misconduct harmed the advancement of AI safety,
and one person suggested an agentic option would be to
kill herself. So there is just within rationalism this discussion
of like, it can be agentic, as in, like you
are taking high agency, to kill yourself if

(01:00:58):
you're going to be a net harm
to the cause of AI safety, which you will be
by reporting this AI researcher who molested you, right, and.

Speaker 5 (01:01:07):
Yeah, because you're taking their mana. Yeah.

Speaker 1 (01:01:13):
Shit. This whole community is playing with a
lot of deeply dangerous stuff, and a bunch of people
are going to either kill themselves or suffer severe
trauma as a result of all of this. Yeah.

Speaker 2 (01:01:29):
Escaping this, even putting yourself back together after living
in this way, seems like it would be such a task.

Speaker 1 (01:01:35):
Again, like in any cult, part of the difficulty is like
teaching yourself how to speak normally again, how to not
talk about all this stuff.

Speaker 2 (01:01:42):
Right. Yeah, not identifying as a vegan Sith.

Speaker 1 (01:01:46):
Right, right. Because, like, I gotta say,
people who are really in the community will note like
a dozen different other concepts and terms, in addition to
like vegan Sith and Gervais sociopaths, and shit that
I'm not talking about that are important to this ideology.
But like, you just can't. Like, I had to basically
learn like a different language to do these episodes,

(01:02:08):
and I'm not fluent in it, right? Like, you have
to triage like what shit do you need to know?

Speaker 2 (01:02:13):
You know? Yeah, it's so deep. It's, like, so deep.

Speaker 1 (01:02:18):
Deep and silly. Let's do an ad break and then
we'll be done. And we're back. So I'm just going
to conclude this little story and then we'll end the
episode for the day. So this person, Maia, has likely
killed themselves at the start of twenty eighteen, and Ziz

(01:02:39):
reacts to the suicide in her usual manner. She blogs
about it. She took from what had happened, not that
like debucketing might be dangerous and unihemispheric sleep might
be dangerous, but that explaining hemispheric consciousness to people was
an info hazard. She believed that people who were single
good, like Maia, were at elevated risk because learning that

(01:03:00):
one of the whole persons inside them was evil or
mostly dead could create an irreconcilable conflict, leading to depression
and suicide. And she comes up with a name
for this. She calls this Pasek's Doom. That's what she
like names the info hazard that kills her friend whose
head she is like fucking with. So that's nice. Yeah,

(01:03:26):
as nice as anything else in this story is. I
think you might have been the doom here.

Speaker 5 (01:03:30):
Yeah, it's you, you're the whole problem. But now,
but now it's an info hazard to explain. Yes, a person's.

Speaker 1 (01:03:42):
Like to explain your theories.

Speaker 2 (01:03:44):
Yeah, yeah to a person who can't handle it.

Speaker 1 (01:03:47):
Yeah. And she comes to the conclusion it's
a particular danger to explain it, like, to single good
trans women, who are the primary group of people that
she is going after in terms of trying to recruit folks.
So she like admits her belief is that this
thought thing I've come up with is particularly dangerous to
the community I'm recruiting from. But it's the only... it's essential.

(01:04:08):
This information is absolutely essential to saving the world. So
you just have to roll the dice.

Speaker 6 (01:04:13):
Yeah.

Speaker 2 (01:04:13):
It isolates her within her own group that she's created.

Speaker 1 (01:04:16):
Well, yes. And also, she is then consciously making
the choice: I know this is likely to kill or
destroy a lot of the people I reach out to,
but I think it's so important that it's like worth
taking that risk with their lives. Yep, good stuff.

Speaker 2 (01:04:33):
Yeah.

Speaker 1 (01:04:34):
Anyway, how are you feeling?

Speaker 2 (01:04:35):
Got plug?

Speaker 5 (01:04:37):
I am... I am okay. You know
what, I'm deeply sad for these people who are
so lost, and I'm also pretty interested, because this is crazy.
But I'm okay.

Speaker 1 (01:04:54):
I'll be great, happy to see that. Well, everybody,
this has been Behind the Bastards, a podcast about things
that you maybe didn't think you needed to know,
about how the Internet breaks people's brains. But also
a lot of people surprisingly close

(01:05:15):
to this community are running the government now, so maybe
you do need to know about it. Sorry about that
info hazard.

Speaker 6 (01:05:24):
Behind the Bastards is a production of cool Zone Media.
For more from cool Zone Media, visit our website Coolzonemedia
dot com, or check us out on the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts. Behind the
Bastards is now available on YouTube, new episodes every Wednesday
and Friday. Subscribe to our channel YouTube dot com slash

(01:05:45):
at Behind the Bastards
