Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
From UFOs to psychic powers and government conspiracies. History is
riddled with unexplained events. You can turn back now or
learn this stuff they don't want you to know. A
production of iHeartRadio.
Speaker 2 (00:24):
Welcome back to the show. My name is Matt, my
name is Noah.
Speaker 3 (00:27):
They call me Ben. We are joined as always with
our super producer Andrew Triforce Howard. Most importantly, you are you.
You are here. That makes this the stuff they don't
want you to know. Big shout out to Triforce. By
the way, you guys, he just gave us the coolest
what's the name of that surfer hand you do with
(00:47):
a pinky? Chaka? Yeah, okay, that's different.
Speaker 4 (00:51):
The chakra and the shaka are different.
Speaker 3 (00:55):
Well, we have a big thank you to Triforce. We
have a big thank you to all of you, our
fellow conspiracy realist tuning in. As a matter of fact,
one of you, anonymous, hipped us to an idea that
launched a conversation so good from our listener mail program
that we turned it into an episode. We're talking about
(01:18):
this a little bit off air. Do you guys remember
the film What Dreams May Come? I know you do,
because we just talked about it.
Speaker 4 (01:24):
Yeah, I saw it in the theaters. I think that's
the one time I saw it, which was, what, like ninety nine?
Maybe. It seems like a ninety nine er. I remember
it being visually stunning and moving, but thinking back on it,
maybe a little emotionally manipulative.
Speaker 2 (01:40):
Yeah, in nineteen ninety eight, it matched up perfectly with
the way I had been taught Heaven and Hell work,
at least pretty closely with some of the stuff, where
you know, once you're in Heaven, it's kind of just
all the good things, but if you somehow get to Hell,
it's basically a torturous place of your own creation.
Speaker 3 (02:01):
Yeah, like Heaven is universally good, Hell is specifically the
worst thing for you. Yeah. And that's why Dante was
oh so famous for the infernal regions, which is
really a political allegory. We're mentioning What Dreams May Come
because it is a visually stunning, if imperfect exploration of
(02:26):
the afterlife. As we're saying wherein no spoilers, folks, one
guy played by the legendary Robin Williams fights to make
things right with his family and his loved ones after
he's passed from what we might recognize as the mortal plane.
We're probably not going to dwell too much on this.
We just thought of this story at the beginning of
(02:48):
tonight's exploration because humans love stories, and in there, if
you think about it, the whole thing is about how
the protagonist learns ideas and thoughts have consequences and gravity,
like the conceptual metaphorical weight of what you believe in
your own mind has genuine effects on your sanity and
(03:11):
upon your soul. And in a recent listener mail program,
we also introduced an idea that's been long familiar to philosophers,
psychologists, and the mad brilliant writers of Secure, Contain,
Protect, SCP, shout out: the idea of the info hazard.
Speaker 4 (03:32):
I mean, you know, the first thing that came to
my mind when I was thinking about this, or I
guess when I have been thinking about it in recent
months is kind of like a right wing talking point
that comes out a lot, the idea of the woke
mind virus, without being political at all. I just think
it's interesting it's been kind of politicized. It's this exact
thing we're talking about politicized in a way that's like
(03:54):
talking about how ideas can be infectious and can be like,
you know, a plague almost. It's a very interesting way
of looking at it. But it is also something that
we can kind of understand a little bit. So I
guess I get why it's become such a popular thing
to talk about.
Speaker 3 (04:11):
Would you say a meme? Would you say something in
the zeitgeist?
Speaker 4 (04:16):
I would?
Speaker 3 (04:18):
We dare? Let us cross this rubicon together. The info
hazard the idea of information that can cause harm simply
by knowing the information? What is this cognitive conspiracy exactly?
And could it be real? Here are the facts?
Speaker 4 (04:43):
All right?
Speaker 3 (04:43):
The toxic thought? What is an info hazard? We talked
about this a bit in a strange news program we had,
oh gosh, fairly recently, via an anonymous conspiracy realist. Thank
you, anonymous conspiracy realist. We learned that the best current
(05:04):
definition of info hazard comes to us in twenty eleven
from a philosopher named Nick Bostrom.
Speaker 4 (05:12):
Yeah, Matt, I know you're a big fan of Nick Bostrom.
What's what's his deal?
Speaker 2 (05:16):
His deal is that he says an information hazard is
quote a risk that arises from the dissemination of true
information that may cause harm or enable some agent to
cause harm. Now it sounds a little bit weird there,
a little bit just because it's Nick Bostrom talking, and
that dude is smart. Ultimately, it's information that just by
(05:39):
thinking about it and knowing this piece of information, it
could be dangerous to you and or others.
Speaker 4 (05:45):
And it's good because you can kind of slice it
a couple of ways, like, on the one hand, could
be conceptual information that just f's up your whole worldview,
or it could be actionable information that by knowing something
about something else that you shouldn't know about, it causes
some sort of moral quandary of like how do I act?
I sure wish I didn't know that thing?
Speaker 3 (06:04):
You guys have my non physical tentacles quivering here to
acquire the information. Yeah, doubtlessly, it's a concept that,
as we said, sounds maybe a little out there, little abstract,
but it's doubtlessly familiar to anybody who's read science fiction
or to certain cough cough, real world intelligence industries cough cough.
(06:25):
Bostrom's paper, you can read it freely online,
is called Information Hazards: A Typology of Potential Harms
from Knowledge. It was first published in Review of Contemporary Philosophy.
He knew it was going to make some serious waves,
just like that fake whale we talked about earlier, And
(06:47):
why wouldn't it. I mean, it's twenty eleven, right, Cast
your memory back, folks. Civilization is already in the information age.
Most people, groups, and institutions are generally on the side of,
you know, knowing stuff. People like knowing stuff. A bunch
of curious cats the humans.
Speaker 2 (07:08):
That's like the Discovery heyday, right in the mid two
thousands to like the early twenty teens. That's when at
least Discovery Channel and all of those programs that were
out there were specifically aimed at harnessing the curiosity
of humanity.
Speaker 4 (07:25):
It was a golden time for knowing things the twenty tens.
You're totally right, though, it was cool. It was like
in the zeitgeist. Yeah, you know, one hundred percent, we know
we were there. We were part of Discovery, the whole
edutainment wave it was, we were part of it.
Speaker 2 (07:40):
The philosophy was let's learn as much as we can.
I think at least that's that's what was sold to me,
and I bought it hook, line, and sinker.
Speaker 3 (07:47):
Yeah, the halcyon era, right, no stone left unturned. The
FOIA acts are like, now, a philosopher's stone. Or, speaking
of stones, the FOIA acts are a sorcerer's stone toward some alchemical
pursuit of knowledge. You could argue that most governments, corporations,
(08:08):
and individual bad faith actors thrive on this idea of
knowing things. But also they thrive on knowing things while
at the same time denying access to information to other parties.
Give me your stuff, you can't see mine. This is
not only the concept of a lot of poker games,
(08:30):
but it's also the concept of information symmetry. That's the
fancy term. I don't know if it applies too much
outside of the world of intel. We know that the
public for a long time has always supported the idea
of free propagation of knowledge, what we call freedom of information.
(08:50):
FOIA is just the acronym for the Freedom of Information Act, right.
Speaker 2 (08:56):
Well, we also know information is kept from us.
Speaker 4 (08:59):
As the public name of show.
Speaker 3 (09:03):
Yeah, yeah, that will show up at the end, because
that is, yeah. Uh, Bostrom, I can't wait to talk
to the guy.
Speaker 4 (09:11):
Uh yeah.
Speaker 3 (09:12):
Bostrom understands that too.
Speaker 4 (09:13):
I think Bostrom gets it.
Speaker 3 (09:15):
He defines two rough main categories of info hazards in
this paper. First, the adversarial hazard that occurs when some
information obtained somehow can be used by a bad faith
actor for some sort of harm. Again, these are very vague,
let's consider them umbrella categories. The other category is kind
(09:39):
of like manslaughter, Like you didn't mean to hurt people.
It's an unintended consequence when you learn the information you
you might accidentally spread information like a virus. Like let's
think of a pathogen. You know, if you were infected
with a disease or some sort of communicable condition,
(10:02):
you might not always know that you are transmitting that
dangerous disease. Oh I just got back from somewhere. Oh
I just have a cough. I'm not infecting other people.
This is just you know, travel crud, or for the
nerds in the audience, con crud. There's no malevolence involved.
Speaker 2 (10:22):
I would I would lump most TV catchphrases and other
things like that almost into this category. They're just almost
annoying things that exist in your mind still from whatever
shows you watched or books you read when you were
a little kid, that still pop in your mind all
the time.
Speaker 3 (10:40):
Earworms, memetic earworms.
Speaker 4 (10:42):
It's the tenet of advertising. I mean, it's the central,
like unifying principle. It's like, let's chuck these ideas out
that are catchy and like, hopefully enough people will remember
our little jingle or our little tagline. Yeah, and it'll
force them to do a thing.
Speaker 2 (10:57):
Well in a weird way. It makes me just think
about my love of the show The Office when I
was a bit younger, and every time anyone says anything
that's even the slightest bit of a double entendre, my
head says, that's what she said exactly.
Speaker 3 (11:13):
Another example. And they can be very short too. They
actually function better with brevity.
Speaker 2 (11:20):
My wife, oh there you go.
Speaker 3 (11:22):
You know, that's a, that's a memetic earworm, that's a horpact.
Speaker 4 (11:27):
An older one was like a Rodney Dangerfield thing. It'd be
like, take my wife, you know. Like little jokey
things that become associated with certain comedians, or even get
so popular that people don't even think about who
originally said it. It just infects the culture, and people
start repeating it because they've heard it. They don't
even know where it came from anymore.
Speaker 3 (11:43):
After a certain point. And Bostrom, in this
paper, again, in this single paper which you can and
should read, has proposed several subsets beneath these broader categories.
One would be data hazards, or dah-ta, your mileage may vary.
That's a piece of information that can be used to
(12:06):
cause harm. That's something like, a, like, if you know
the DNA sequence of a virulent pathogen and you have also done
a little bit of homework on gain of function, then
you can make some, you can make some nasty special effects,
maybe.
Speaker 2 (12:24):
So the idea of just having that, having that piece
of information in existence at your disposal is dangerous.
Speaker 3 (12:33):
Yeah, that would be under adversarial hazard, because someone who
wants to, for instance, weaponize something like smallpox could do
so by possessing that information.
Speaker 2 (12:45):
So it's an adversarial data hazard.
Speaker 3 (12:48):
Yeah, yeah, yeah. We're, we're doing taxonomy here, right.
And then we have another one that's very close on
a Venn diagram here: idea hazard. An idea hazard means something
like when you know how to create a thing, if
you have the right resources, you can create the thing
(13:10):
and cause harm. So, for instance, a data hazard, as
we were talking about off air, is simply the idea that
knowing a thing, the possibility that you know a thing
can endanger other people and potentially yourself. The idea hazard
is like the gun with some modifications.
Speaker 2 (13:34):
So to me, it's like if you're thinking about somebody who's
a terrorist slash freedom fighter. The idea that is scary there,
the hazard, is: we need to do something about
this opposing force, this government. We need to do something
about these people that oppose us. The idea is we
(13:55):
need to attack them.
Speaker 4 (13:58):
Could it even also be like a religious ideology could
be like weaponized in a way, like by having your
mind turned to a particular view, radicalized, let's just say.
Whatever causes that, is that a data hazard or an
idea hazard?
Speaker 3 (14:15):
Yeah, And as you can see, folks, we are still
wrestling with some of the nuances involved here. We do
know that the third subset is the knowing too much hazard, which,
as you know, is anathema to this show. Information that
(14:36):
if known, can in some way cause danger to the
person who knows that information. It reminds me of some
episodes we've done in the past on our peer show,
our sister show, Ridiculous History. People who made scientific or
medical breakthroughs that threatened the social structure or religious thought
(14:57):
of the day. Shout out to the National Invention Secrecy
Act from the fifties. There are a lot of people
who have been straight up murdered by the power structures
of their day just because they said very simple stuff like, hey,
maybe the earth orbits the sun, or hey, maybe we
should wash our hands if we're surgeons before we put
(15:21):
them inside living human bodies. I'm noting, yeah, two very
specific people who got punished for saying those things.
Speaker 4 (15:31):
Yeah, like the astronomer Giordano Bruno, who was burned at
the stake for his beliefs and his views on the
nature of the universe and existence.
Speaker 3 (15:40):
Following Copernicanism, or heliocentrism, right.
Speaker 2 (15:45):
Yeah, It's interesting how these knowing too much hazards do
pose a real threat, especially to philosophical frameworks and theosophical frameworks.
So like, in the individual, depending on what your
beliefs are, right, if you know too much about certain things,
it can shatter your worldview essentially. And I you know,
(16:08):
I imagine somebody who's a member of the Church of
Latter day Saints learning a little too much about the
golden tablets, right, or Scientology, yeah yeah, or really
like a devout Scientologist learning just a little too much
about L. Ron Hubbard's past and his background.
Speaker 4 (16:24):
Or about Xenu and all of the stuff they won't tell
you until you pay your way down the bridge, and
then all of a sudden you're like, sorry, the what?
Speaker 3 (16:31):
And then you have sunk cost fallacy. Yeah yeah.
Jeepers, consider Scientology tonight.
Speaker 2 (16:36):
Well, especially because it, it is dangerous, to you, your
worldview, and, like, family members, the people that
surround you. It is, like, actually dangerous that you might
learn too much about something. And even like the thoughts
of Tansey Bage that we talked to, this just the
concept to the idea that maybe Jesus's bloodline continued on
(16:58):
because maybe he had a line of children or something, right,
which would then kind of shatter the basis of a
lot of your at least individually held beliefs and potentially
your ability to believe in a church that's now in
your mind based on potentially false information.
Speaker 3 (17:15):
One of the most shameful examples of the knowing too
much info hazard would be people in disenfranchised or marginalized
communities in the past who perhaps knew midwifery, who perhaps
knew how to help address infant mortality, like keep kids alive.
(17:36):
A lot of them got burned as witches back in
the day. And this is the fascinating stuff, because what
we're finding here is that, while Nick Bostrom gave us
the rules of the road as well as the nomenclature
to frame this, the idea of thought hazards is
pretty freaking ancient. When you think about it, people
(17:59):
have been killing each other over information since before the
moon rose on recorded history. I mean, think about how
often secret societies straight up murdered people. You can't know
that handshake. Think about the mystery school religions. How did
you know about the Eleusinian mysteries when you didn't have
prior approval? That's a pop for you real quick. They didn't
(18:21):
have guns, so I don't know what that pop would be.
But you know, we're doing our own sound effects.
Speaker 4 (18:26):
The sound of their worldview shattering.
Speaker 3 (18:28):
There it is, there it is. And one thought experiment
we mentioned previously we wanted to shout out: Roko's basilisk.
It originated in twenty ten on the LessWrong forum, before
Nick Bostrom's paper was published in twenty eleven. And Roko's basilisk,
which we mentioned thanks to our anonymous conspiracy realist in
(18:50):
a listener mail program. Roko's basilisk is a thought experiment
named after the user Roko, who posited the concept.
Speaker 4 (19:01):
Yeah, it was a really interesting conversation when we talked
about it on listener mail, and in fact, a friend
of ours, Carly Ben who you know, hit me up
after that episode and said, you talked about the thing.
You've ruined us all, you've destroyed.
Speaker 3 (19:16):
Us all, don't think about the game.
Speaker 4 (19:18):
Yeah, exactly, we did do. To be fair, we did
do an existential dread trigger warning I think before that conversation.
But sorry if we wrecked anybody's you know, minds with
that one. But you know we were wrecked too when
we found out about it, so we're all on the
same page. What is Roko's basilisk, by way of review?
Speaker 3 (19:38):
Well, spoiler warning. If you're worried about info hazards, as
we always say, you can turn back now three two one.
In a nutshell, the theory argues that a sufficiently powerful
God, I hate this term, AI agent, would have an
incentive to torture anyone who imagined the existence of
(20:03):
the agent, yet did not work ardently to bring that
agent into existence. And the argument is called a basilisk
because it's named after folklore, the legendary reptilian thing that
can cause death with a single glance. So the idea
is merely hearing, merely knowing about this will supposedly put
(20:25):
you at risk of torture from this thing, which reminds
us of, well, we talked about that poor
missionary who went to North Sentinel Island to save people
by converting them to Christianity. Again, spoiler: it
didn't work out.
Speaker 4 (20:44):
Doesn't the AI in this Roko's basilisk scenario seem intensely
petty to you guys? It's the idea: oh, you didn't
do everything you could to make sure of that existence. Now
I'm gonna punish you forever by like putting you in
a dream loop where you die or suffocate or drown
over and over and over again. Talk about a hell
of your own making, right, I mean, really petty. But
(21:07):
the funny thing is it's not that different than how
some folks describe the idea of a vengeful god, which I
find very interesting.
Speaker 2 (21:16):
You know, if you learn the truth and then live
without accepting and living in that truth, then you must
be punished.
Speaker 3 (21:25):
Yeah, I said it in an earlier episode, I think.
Uh, there was, there was this great conversation with an
indigenous person and one of those European Christian missionaries where
they said, wait, so if you didn't tell me, I'd
be fine. They said, yeah, right, and he said, well,
why why did you tell me that? That's a that's
(21:48):
an old school info hazard.
Speaker 4 (21:49):
And that's the question that many of you are probably
screaming at your podcast device right now. But you know,
Ben did trigger warn you, so, we did. You did
it to yourself.
Speaker 3 (21:57):
We gave you the countdown anyway. Roko's basilisk is a
terrifying love letter to the concept of what we call
decision theory. And the interesting thing, I don't know, I
don't recall us getting to it in our listener mail program,
But for this episode, what you need to know is
Roko's basilisk, the popularity of it, the contagious nature of it,
(22:21):
is primarily due to another Internet thing called the Streisand effect.
Because here's the deal. The founder of LessWrong, the
tech forum in which this thought experiment first appeared, is
a person named Eliezer Yudkowsky, and Yudkowsky banned discussion of
(22:43):
Roko's basilisk on the blog for several years because they said, hey,
just to be safe, let's ban any potential info hazards.
This had the opposite of the intended effect. Everybody got
super excited. Even the folks at RationalWiki started assuming
that everybody on LessWrong thought this was true. They
(23:08):
did not. They largely rejected the argument without getting into
the weeds. The reason it got public attention and became
an acknowledged info hazard is just because someone tried to
stop it from getting out. Do we want to define
the Streisand effect real quick?
Speaker 4 (23:26):
Yeah, it's just this notion that if you try to
suppress information, it just makes it more attractive to those
people who would seek to find it. The idea, I
think it was, Streisand, like, wanted a picture of her
house removed from the Internet or something along those lines,
and then by making a big fuss about it, she
drew attention to the fact that it existed in the
first place.
Speaker 3 (23:47):
Yeah, and we also know in the world of info hazards,
our good friends over at Secure, Contain, Protect, the
inspiration for an amazing game that, Matt, you recommended to
me, called Control. They love a good info hazard. If
you ever want to have a trippy night, load up
on your favorite vices I like coffee, and then poke
(24:09):
around their website and prepare to lose your evening. With
all that background, we have to ask how real are
info hazards? Do they go past thought experiments? Perhaps more disturbingly,
how real could info hazards be in the future. Here's
(24:32):
where it gets crazy. The answer to our first question
is a resounding yes, and we are sorry to report this.
Info hazards operationally are real. They're just not quite as
extreme as what you read in fiction.
Speaker 4 (24:48):
But I mean, you can put this together on your
own just by knowing about the power of belief. People
can make themselves ill by believing certain things, and it
certainly can manifest itself psychologically at the very least.
So I mean that alone makes this stuff very real
to many people.
Speaker 2 (25:06):
Yeah, guys, I'm going to give you an example of
what I think is one of the most prominent info
hazards that has had the most effect on humanity as
it exists right now. It was something that was created
by three young guys. Honestly, you remember, September eleventh
theories arose on the internet like right after the attacks,
(25:28):
but they were mostly in dark corners of the Internet.
A few people tried to make videos back before there
was even YouTube. People were attempting to share their beliefs
on that kind of thing. People were asking questions, right,
how did those guys take over the plane so easily?
Why didn't we get better footage of the Pentagon? Why
did the towers collapse like that?
Speaker 3 (25:48):
Good question?
Speaker 2 (25:49):
Why was KuwAm, the Kuwaiti-American Corporation, so heavily invested
in a few security companies that ran security for both
the World Trade Center and for Dulles Airport.
Speaker 3 (26:01):
What happened to the money in the basement? How did
only one certain nationality cough cough KSA get past the
grounding of all flights right right?
Speaker 2 (26:13):
And why was Marvin P. Bush, George W. Bush's younger
brother by ten years, on the board of directors of Stratesec,
which was heavily invested in by that Kuwaiti company. And
why was the Saudi embassy in the same building as
that company's operations anyway? Blah blah blah blah blah. There
were a ton of questions that existed out there. Then
(26:35):
in two thousand and five, this small group of people
Jason Burmas, Corey Row, and Dylan Avery, they put all
of these questions together into a documentary and released it
on the internet again early, this is two thousand and five. Yeah,
and they called it Loose Change. And then that collection
of ideas, I would argue, guys, set off a cascade
of individuals who had some of these same,
like, individual questions, and basically created shattered worldviews.
Speaker 3 (27:07):
I think it's snowballed.
Speaker 2 (27:09):
Yeah, but for hundreds of thousands, if not millions of people,
because millions of people watched Loose Change. I think the
original was just called Loose Change.
Speaker 4 (27:17):
Nine to eleven.
Speaker 2 (27:18):
Those were all thoughts, right? It wasn't full
on evidence. It was asking questions about specific things and
then connecting potential answers to all those questions in one
person's brain.
Speaker 4 (27:29):
And more recently, maybe even more absurdly, things like QAnon.
You know that got people really really worked up when
they were already in a state of duress, kind of
closed in, shut in during COVID. All that stuff is
just a collection of ideas of varying degrees of you know,
probability that people just totally jumped on and ran with
(27:50):
and then it spread like a like a virus.
Speaker 2 (27:52):
Well, because the idea is, if any of this is true,
this, like, you know, this path I'm traveling down through
these ideas, they are all if-thens, right? Because
even if you watch those things,
some of the arguments are way more convincing than others. But
if you get caught by even one of them, then potentially,
(28:14):
potentially I'm using that word carefully, the government was somehow
involved in nine to eleven. Potentially I can't trust anything
the government says anymore. Potentially the media companies are working for,
you know, the people that did this thing, and it
basically just goes down, down, down.
Speaker 3 (28:30):
The US media does work for the government, though. Shout
out to the Washington Post and their humor columnist.
Speaker 2 (28:36):
Well, it's true. I guess what I mean is, if
you hadn't had those ideas prior, right,
and you didn't have that feeling inside already, this one
documentary lasts like an hour and forty minutes or whatever,
and now you're set down a completely different thought path.
Speaker 3 (28:53):
It hooks you in because the issue is the advantage
for the virus is always an advantage of offense. It
only needs to get in once, right, and then, ta-da.
So the thing is, these real life info hazards like
(29:13):
we just described are often considered matters of national or
international security. And I do not apologize for noting that
the former stalwarts of US media have been compromised by
factions of the US government. I also, further do not
(29:34):
apologize for saying that factions of the US government were
at the very least opportunist about nine to eleven. But hey,
maybe Loose Change got to me, right. Sometimes it's your
own, folks. I think of, like, here's an example
that is not maybe as emotionally close for a lot
of our Western listeners, for a lot of us who
(29:57):
have never traveled to Tehran. If you want to see
a real example of info hazards, just think of the
scientists who have been murdered in recent decades entirely for
their knowledge of nuclear technology, entirely because of the, beat
me here, Triforce, the shit in their heads. Yeah, right,
(30:22):
between twenty ten and twenty twenty, not counting the fascinating
story of the guy who got gassed in two thousand and seven.
Between twenty ten and twenty twenty, five separate Iranian nuclear
scientists were killed, likely by foreign assets. You can guess
who if you wish. They weren't murdered because they were
(30:44):
committing criminal acts. They weren't murdered because they were Wilson
Fisk kingpins. In fact, they were all professors and physicists.
A lot of espionage that doesn't make the headlines is
entirely about technology.
Speaker 4 (31:00):
Sorry.
Speaker 5 (31:00):
Masoud Alimohammadi, January twelfth, twenty ten, a professor of
quantum field theory and elementary particle physics, who was killed
by a remote control bomb
Speaker 4 (31:10):
Attached to a motorcycle. We should round robin these. There's
a whole laundry list here. Yeah.
Speaker 2 (31:16):
Oh, and we've talked about all of these except for
one back in twenty fourteen. Guys, that's crazy to think about.
You can go back and listen to our episode early
early on in the feed, called Are People Really Murdering Scientists?
Speaker 3 (31:29):
I just want to predict one good thing. I'm so
tired of us predicting terrible things.
Speaker 2 (31:36):
Well, in this case, it was yeah, I guess we
predicted the last guy's death, but we were just going
out there going, these guys are all getting murdered, I
don't know if y'all noticed. Another scientist that was murdered,
his name was Majid Shahriari. This person was killed
(31:57):
on November twenty ninth, twenty ten. This was a nuclear
engineer specializing in neutron transport. And Majid was killed
by a bomb attached to his car from a motorcycle.
So imagine or something.
Speaker 4 (32:15):
They just slapped it on there and then rolled off.
That's crazy. Yeah.
Speaker 3 (32:19):
And on that same day, another professor, Fereydoon Abbasi,
survived a one-to-one identical attack in Iran, or
in Tehran. They meant to get them both to send
a message. Cough cough whomever cough cough. So number three
(32:39):
is Darioush Rezaeinejad, July twenty third, two thousand and eleven,
a physicist, another expert in neutron transport, fatally shot by
a gunman riding by on a motorcycle. And just to
pause here, you're gonna hear a lot of motorcycle stuff
(33:00):
when you hear about shenanigans in Tehran. Bad faith actors
in Tehran love motorcycles the way the Russians love third
story windows.
Speaker 4 (33:10):
Well, it's also a zippy way to evade the authorities.
We have them here in Atlanta all the time, these
kind of like gangs of like dirt bike riders and
ATV riders that just are able to wreak havoc and
then duck through neighborhoods and stuff. It's a super efficient
way of evading capture. We've got Mostafa Ahmadi Roshan, January eleventh,
(33:30):
twenty twelve, the professor researching the making of polymeric membranes
for gaseous diffusion, which is part of a process for
enriching uranium to make ding ding ding nuclear weapons, killed
by a bomb also attached to his car from a motorcycle.
Speaker 3 (33:46):
It's not the DoorDash you want. Jesus Christ.
Speaker 2 (33:51):
Yeah, And rounding out with another one here from November
twenty seventh, twenty twenty, another scientist was killed. He was
a nuclear physicist, head of Iran's nuclear program. Uh oh,
target on your head. Don't be the head of Iran's
nuclear program.
Speaker 4 (34:07):
I guess.
Speaker 2 (34:08):
He was fatally shot by a remote control machine gun. Yep, yep,
a remote control machine gun that is motion tracking. I
can't say his name. It's Mohsen F. You can look
Speaker 3 (34:20):
Him up. Okay, yeah, well, pardon our pronunciation. We are
not native Farsi speakers. These are just a few tragic
examples of people who were murdered for the thoughts inside
of their head, not necessarily for any criminal action they took.
And while the thoughts these info hazards did not themselves,
(34:42):
by nature of their existence, harm the people who held
these thoughts, other external forces knew what these guys knew
or suspected these guys knew, and that triggered bloody brutal
action as a result. It is very dangerous to be
the wrong kind of professor, right, and you have to
(35:04):
be aware of that, folks. Again, a lot of the skulduggery you read about that gets the headlines or whatever, the James Bond stuff, is not real. What is true
is the immense scrutiny paid to certain types of research
all around the world. And now we know that people
(35:27):
possessing sensitive information can truly fall victim to real life
harm from external factors. But we want to take a
moment here to note that internal factors can have deleterious effects.
You don't need some kind of spoopy spy agency getting
(35:48):
mad at you or hiring a guy on a motorbike.
I think we're all pretty taken with the work of Fletcher Wortmann, writing for Psychology Today, diving into the concept of internal memes, street name: intrusive thoughts.
Speaker 4 (36:06):
Yeah. That includes stuff like an inescapable traumatic memory that
rears its ugly head and causes you to react emotionally
or even physically. Also more innocuous stuff like earworms, like a song that won't get out of your head.
Speaker 3 (36:23):
Yeah, this idea, this idea is strange because we've set
it up as some sort of sci fi thing or
some sort of intelligence, you know, Spy Who Came in from the Cold type stuff. But the memes themselves, I
love what you're pointing out there, and the memes themselves,
(36:44):
those recurring inescapable thoughts can be damning on their own
in the real world. The word meme, we talk about this a little bit in an episode on thought forms, or tulpas, comes from a guy named Richard Dawkins, famous, somewhat problematic evolutionary biologist. He wrote this book
(37:04):
in nineteen seventy six called The Selfish Gene, and he took the word meme from the roots of memo, memory, mimic. And Wortmann has this great quote summing it up, which, I don't know, like, we're going straight to our pal Wortmann here, because he sums it up the best
(37:26):
I think.
Speaker 4 (37:27):
Uh.
Speaker 2 (37:27):
The quote is basically, a meme is an idea, the
kind of idea that endures over time, like a memory,
which can be copied or mimicked and shared like a memo.
You know what, I guess a picture of, uh, some kind of oligarch falling out of a window, or an Iranian scientist getting shot by somebody that's definitely not
(37:49):
Mossad, cough cough.
Speaker 4 (37:52):
Yeah, but I mean it's funny too. I mean, I'm
sure Dawkins had this in mind. But the concept of memetics,
you know, as a theory of social evolution, essentially based
on Darwinian principles, complex systems, you know, all of this
kind of stuff. I guess maybe that whole field came from Dawkins. It just seems like it already was
a thing, But I don't know. Everything that I'm reading
(38:13):
about it does harken back to The Selfish Gene and Dawkins's universal Darwinism.
Speaker 3 (38:18):
I think that's where it got the name. Again, language, we were talking about this previously. Language, communication, one of the earliest human technologies, right? So I think maybe what we're
seeing there is the idea of having the right language
to hang our rough non verbal concepts upon, you know,
(38:42):
like the way you put a coat on a coat rack.
The concept here essentially is can an idea be dangerous?
We believe so. Every idea that you have ever had, folks, human and non-human alike, is simply this: it's your own secret recipe of arrangement, curation, and analysis of fact
(39:04):
mixed with some degree of fantasy and speculation. Again, like
check out our episode on thought forms and tulpas, wherein
some of us cough cough argue that thoughts themselves can
be considered living things even though they are intangible. Like
two people in love, is that experience, that thought of
(39:27):
loving each other? Is that a living thing?
Speaker 4 (39:31):
It's a good question. It always makes me think of
this fantastic sketch on Monty Python's Flying Circus. I'm sure
I've mentioned this before, where there's this like joke that
exists that is so funny that anyone that hears it
or reads it will die laughing. And in the sketch,
like the government, the military gets a hold of it,
but the generals have to have it in individual pieces
(39:52):
and no one sees it all at once, and it's used, and of course, you know, shenanigans ensue. It's Monty Python; people accidentally read it and then drop dead
and all this stuff. But it's really funny and like
they clearly are commenting on this exact kind of thing.
Speaker 3 (40:05):
Yeah, they're getting some stuff from The Yellow King, right? Shout out, oh gosh, Chambers. Right, isn't it Robert W.
Speaker 4 (40:12):
Chambers? The King in Yellow, famously sort of memed in its own way in Season one of True Detective.
Speaker 3 (40:19):
Yeah, and in The Yellow King, the entire universe of that, which is very Lovecraftian, is orbiting around, and that's a nice Carcosa reference, the idea that there is some sort of play, and if you read the play, by the very act of experiencing it, you will go mad.
Speaker 4 (40:40):
It's in a lot of modern horror too, things like It Follows, where, you know, there's this idea of a curse that can be transmitted. In that one, it is like a sexually transmitted kind of curse; maybe that's a little different. But The Ring, the deadly videotape that once you see it, you die in however many days, it's a big thing.
Speaker 3 (40:59):
And like the Japanese Ju-On, I guess, no doubt. There's another article: our pal Wortmann has Info Hazard Warning: How Internal Memes Infect Your Brain. And
think this stood out to several of us because it
examines how and why some thoughts can become hazardous to
(41:22):
an individual human's health, even if there are no immediate
external consequences.
Speaker 2 (41:30):
Yeah, that makes sense to me. That gets more into
the psychology of this whole thing. I was trying to come up with a good example. I think the way this kind of thing can become dangerous is in interpersonal relationships, and it is associated with trauma a lot of times, like we were talking about earlier. Let's say
(41:52):
you're in a relationship with somebody and you've been lied to before, so you expect a lie to come from, like, another relationship or something. So you're almost reading too much into other people's actions, because you expect it, you're prepared...
Speaker 4 (42:08):
You're prepared to be let down, and you live your whole life in expectation of a thing that may or may not ever happen.
Speaker 2 (42:15):
Yeah, that trauma response often comes into play, I think
with this stuff because you've almost memed, you've meme-ified, some specific action that somebody takes or some specific phrase that somebody says. It's especially dangerous, I think, when it comes to romantic relationships, because there are such similarities
(42:35):
between each romantic relationship that if you've had one bad interaction,
then the next one is gonna have a lot of
the same things happening. So then your brain sees bad
stuff where it's actually really good stuff.
Speaker 4 (42:49):
I don't know. That's just me, at least, right?
Speaker 3 (42:50):
Well, it's perfect. Two things on that. First, the old
idiom, not necessarily an idiom, the old figure of speech, someone help me with the correct term for that in English, conspiracy at iheartradio dot com: a bad apple spoils the bunch, is actually true because of the chemicals and
(43:10):
gases the bad apples emit, they do spoil the bunch. Anyway, sorry.
The second thing, the more important thing for our conversation
this evening. We had an earlier exploration wherein we had
this beautiful moment that still sticks with me, which is
this often due to those previous experiences, the insatiable drive
(43:34):
to be pattern recognizers and story constructors. We talked about
how humans are rarely speaking with one another. They're speaking
with their ideas of a person. They're speaking with their
assumptions about a person's motivations, and I think that's a
very disturbing and powerful thing all to say, Absolutely agree
(43:56):
with this conversation, absolutely agree with Wortmann. That idea of a narrative, you know, humans love stories, maybe to a fault. Humans have always only been the stories they tell about themselves and one another, which means, by this definition, a
communicable thought, a meme with a bad or a damaging
(44:16):
narrative can function as an info hazard. And this is
we could argue the formative root of a lot of
mental struggles.
Speaker 4 (44:23):
Communicable thought, Ben, like communicable disease. It's like, it's all right there.
Speaker 3 (44:30):
Yeah, I mean, you don't have to have some kind
of old school fairy tale possession. You don't have to have intel the external world considers sensitive or dangerous. You
just need a thing or an idea that will not
leave you be and over time to our earlier points,
(44:52):
it can guide your actions, it can influence you toward
increasingly irrational behavior. And we've all unfortunately experienced something like this.
What we're saying is science shows us each and every
human mind is an ocean. It is an open ocean.
For anybody frightened of open oceans, be scared, because the
(45:14):
real one's in your head. And sometimes sinister things are
swimming in the deep.
Speaker 4 (45:18):
Yeah, like a giant whale. Let's talk about my nightmare, where
I'm just on the open ocean and I know there's
this massive creature lurking beneath the surface, even if it
is a gentle giant, scares the hell out of me.
Speaker 2 (45:29):
You just got to learn how to ride a whale, buddy.
Speaker 4 (45:31):
Nope, good to go, man, call me Pinocchio dog.
Speaker 3 (45:37):
Oh well, I remember that. Hey, we just did a
meme that's like the whale.
Speaker 4 (45:41):
Maybe, by the way, you didn't like the whale. The
whale ate him up?
Speaker 3 (45:45):
Well, Pinocchio is kind of a terrible lesson for children,
but kind.
Speaker 4 (45:48):
Of a Jonah type story, now that I think about it.
It never really occurred to me, but Jonah in the whale,
that's sort of what happens to Pinocchio is definitely some biblical.
Speaker 3 (45:57):
Pinocchio misbehaves all the time and then gets saved.
Speaker 4 (46:02):
Just like Jonah, that's the whole story. Jonah was sort of a dick, and he got eaten by the whale, and then the whale spit him up, and it made him, like, see the error of his ways and, like, worship God.
Speaker 3 (46:13):
But does Pinocchio learn the error of his ways?
Speaker 4 (46:16):
He becomes a real boy?
Speaker 3 (46:18):
Is that an improvement?
Speaker 4 (46:19):
It's a good question. He does become a real boy,
and I think he's only gifted that by the Blue
Fairy because he shows penitence.
Speaker 3 (46:27):
I can't wait for anybody who has read the graphic
novel series Fables to get back to us on Geppetto.
Have you guys read Fables.
Speaker 4 (46:36):
I'm familiar with it and I've meant to start it
for years and I have not gotten to it. You're
reminding me to put it on my list, all right?
Speaker 3 (46:41):
Hashtag spoiler, three, two, one: Geppetto is a huge villain.
Speaker 4 (46:47):
Oh well, he is an odd cat. Guys, let's talk, just, I mean, real quick, Geppetto just as a character in fiction. Like, I guess he's sad, he needs a son, he's sad and lost, and maybe he's not. It is weird to play god in that way, though, you know.
But I guess he didn't make a deal with the
devil to make Pinocchio come alive. Didn't the Blue Fairy
just kind of come along and do it for him.
(47:08):
He didn't like, do some weird incantation. He just made
a puppet.
Speaker 3 (47:11):
I'm not at liberty to disclose specifics. Matt, what were
you going to say?
Speaker 2 (47:16):
Oh no, I was just thinking about another comic series where you can really see everything we've been talking about in action. It's a comic series that you can get in these little graphic novel collection things, called The Department of Truth, and, just without spoiling it too much, it's based
on a thing called the Department of Truth, where it's
(47:37):
basically a secret government society that attempts to make sure
conspiracy theories aren't believed by enough people, Like specifics about
conspiracy theory aren't believed by enough people because it does
become a tulpa thing that we're talking about, and it
manifests in the real world. Let's say, a golem. Yeah, like
(47:59):
if there are enough people believing in flat Earth, that's the example that the comic gives, then you could theoretically travel out and meet the ice wall, because now it exists, because enough people believe.
Speaker 3 (48:13):
That belief shapes reality, right, is the primary proposition. Department
of Truth is a banger, folks. And I think we were talking about that in a group chat with one of our friends earlier.
Speaker 2 (48:26):
Yeah, this one comes out in February.
Speaker 3 (48:29):
One and five, buddy. So what we are also saying
is that, given Bostrom's earlier definitions, we could rightly deem
these intrusive thoughts to be internal conspiracies, to be organic
info hazards of their own.
Speaker 5 (48:47):
Ah.
Speaker 3 (48:47):
But then we get to the phrase, huh, da da da, organic. And this leads us to a second, more disturbing question, the one we asked right before things got crazy. What will info hazards look like in the future? Gentlemen,
I suggest we take a word from our sponsor, and I'll take a big swig of coffee. And we've returned.
(49:18):
What will info hazards look like in the future?
Speaker 4 (49:21):
Guys?
Speaker 3 (49:21):
How close will we get to the horror story of the Yellow King, where simply reading or viewing a thing will drive the audience mad, possibly to their graves? Okay, this sounds silly and dumb, and laugh at
me if you want. But we've argued in the past
with great validity that new breakthroughs in technology are pushing
(49:44):
civilization closer and closer and closer to real life analogs
of things that were once considered purely supernatural. Precognition is on the way, telepathy is here, doppelgangers, right? Oh, cloning, we're supposed to call it. It's happening. Info hazards are part of this trend as well. We've got to draw people's attention to a game called Cyberpunk 2077, mainly because, cough
(50:08):
Cough one of us cough cough just lost eight plus
hours playing for the first time.
Speaker 4 (50:12):
I need to get back to it with all the updates that kind of fixed the game. It is fantastic.
I played like maybe the first hour and a half,
two hours of it. It is really really good.
Speaker 2 (50:21):
Yeah, any excuse to hang out with Keanu, I'm into it.
Speaker 3 (50:26):
Yeah, Johnny Silverhand, something like that. Yeah, Keanu from earlier, Johnny Silverhand. Yeah, Silverhand.
Speaker 4 (50:35):
Silver Two Hands, Johnny Silverhands.
Speaker 3 (50:38):
In the world of cyberpunk, cybernetic augmentation is widespread and normalized. Like,
imagine a world where not having some sort of electronic
implant in your body is as weird as not having
a smartphone.
Speaker 4 (50:53):
Today. Ben, have you seen the Netflix cartoon Cyberpunk: Edgerunners? Yes, it's fricking phenomenal. I told you, I reminded myself, you can go back and watch it. But it addresses the politics of all of this kind of stuff from the level of, like, a school kid, and, like, the rich kids that have the implants that he couldn't afford, and having to get, like, counterfeit implants and all. It
(51:14):
is epic, Akira-level good anime. It's so freaking wild.
Speaker 3 (51:20):
And I imagine here in the real world a lot
of us know someone who doesn't have a smartphone for
one reason or another. But it is eccentric, right? Like, you would say most people you guys know have a smartphone, right?
Speaker 4 (51:37):
Mm. Yeah, it's certainly the norm. But then you have, like, folks like, what's that actor's name? Oh, jeez, he's got the, he's kind of a very intense face. No, he's got, you know, he was in Boardwalk Empire. He played the, Murphy? No, he played the prohibition agent. Frederick? No, stop it. It
(52:01):
doesn't matter his name, guys. The point is he actively eschews all that kind of technology, and it is a
personal choice for his life and his mental health that
he has discussed in interviews.
Speaker 3 (52:12):
That's good. Yeah, And I know a lot of us
listening this evening have made a conscious decision to unplug
in one way or another for, as you were pointing out, mental health, you know. If that is your mission, please do so while you still can. Remember, just like Sesame Credit, this stuff is opt-in until it's not.
(52:35):
That's where we see this cyberpunk thing, which
I do think is worth mentioning, even though it's quote
unquote just a video game. As a result, in the
cyberpunk world, pretty much anybody can access a second universe
through one way or another. Right? You plug a thing in,
you jack in, and you can experience all sorts of
(52:57):
vicarious things. You can get any question answered so long
as you have the right access. But just like the
old gifts from supernatural entities in fairy tales, there is
a cost. This doesn't come for free. The access to
this information and this power is a double edged sword
(53:19):
as a result. In this game, people can transmit viruses
directly to your mind.
Speaker 4 (53:26):
But I mean, you know, it reminds me too. I
think a lot of influence is taken from the Kathryn Bigelow film Strange Days, which, I believe, they call it jacking
in or something along those lines, where you've got this
really intense VR playback of sex, of violence, all the things,
whatever you could possibly imagine, and it becomes an obsession
and an addiction and it's treated like drugs, which is
(53:49):
kind of how it's treated in cyberpunk in a lot
of ways. I think it's a very interesting parallel.
Speaker 2 (53:53):
What's the name of, there's a phrase for the sex thing in that world?
Speaker 3 (53:58):
Oh, braindance, braindance. So what happens here is
that we realize in this fictional universe, and I love
that you mentioned the braindance there, Matt, in this
fictional universe where viruses can be transmitted directly to another
(54:22):
person's mind, you can deploy info hazards that could seriously
influence people and change their behavior or injure people, up
to and including killing them. And if we pull back
zoom zoom zoom, and we examine the real world, we
look at things like Neuralink. Neuralink is proposing to give
(54:44):
you this straight mind-to-net access while removing the awkward, fumbling fingers, the middlemen of the flesh. Your kids might believe you, but your grandkids are not gonna believe you when you say you used to have to type stuff with your fingers.
Speaker 4 (55:03):
You know, and it's emergent technology. Some of the tests have been successful, some of them have been not so successful, but it's there, and it's coming, to your previous point, Ben. I mean, you know, we're at the, like, very early stages of a lot of these kinds of future-level things, you know, precognition, what have you. And we're all on that path, and there's no putting
(55:24):
the genie back in the bottle, right?
Speaker 3 (55:27):
How long will it be? How long will it be, folks, until someone figures out, like, until this stuff gets normalized, right, and spread or deployed on a mass basis? How long will it be until someone discovers the best way to deploy a virus through these systems? I know I
(55:47):
used the word deployed twice, but it's only because it's
the best word. Further, will the creators of these systems
be capable of predicting and preventing vulnerabilities before they are
out in the world. Historically speaking, the answer is.
Speaker 4 (56:07):
No, yes, no, no.
Speaker 3 (56:10):
The answer is, yeah, you nailed it. Because, as we were saying earlier, the advantage is on
the offense here, because if you're defending against
this kind of info hazard, then you need to plug
the gaps in one hundred possible places. But if you're
(56:32):
the offense, you just need to get in through one.
Speaker 2 (56:35):
Oh dude, Okay, So I'm really glad you said that, Ben,
So I think another version of this, another thing that
exists right now that is on the verge of potentially
being even more dangerous and scary, is just the concept
of OSINT, or open source intelligence, which is that thing
we've discussed many a time on this show. But the
(56:58):
various apps and websites and places on the internet you can go to to get an individual human being's address, phone numbers, email addresses, all social and dating accounts, all family members, all associates, like that. Basically the stuff that a landlord
would use to assess you as a potential person who
is going to rent from them. Right, that stuff exists,
(57:20):
and I think until you have that moment of like,
I'll just give you my example, in my case, looking
up the phone number of somebody we wanted to interview
for a true crime show. The first time you delve
into that thing and you realize that all of that
information exists, just easily accessible to anybody else out there
(57:41):
on the planet, it becomes a scary thing for you
because not only do you think, oh wow, I could track down anybody I wanted to and learn
anything I wanted to about them, if like in my
case again for an interview, but anybody could do that
for me, anybody.
Speaker 4 (57:59):
Yeah. Not to mention that all of these data points can be, you know, collated instantaneously and overlaid on living
human beings out in public through the use of VR
goggles or you know, augmented reality and stuff like that.
Speaker 3 (58:13):
Shout out to the homebrew solution to Meta glasses. The technology
is already there.
Speaker 4 (58:18):
Well, it's like a heads-up display, it's like Terminator vision.
Speaker 3 (58:21):
But the real question is how to get the larger
public to accept it. We are talking to you at the chasm of the erosion of privacy.
Speaker 2 (58:32):
Oh yeah. But the whole point of me saying this, guys, is that it's been meme-ified now on social media, and this is only the stuff I've encountered. So tell me if I'm completely
off base here. It's usually a woman who appears very
attractive in the video she's making and joking about how
she already has like literally every piece of information about
(58:54):
somebody before she ever goes on a first date with them.
Speaker 3 (58:59):
Well, I mean, yeah, it checks out.
Speaker 2 (59:01):
It's not a bad idea. I'm just saying, like, this concept, trying to put some of this stuff together: it is dangerous that that information is out there. It could be potentially dangerous for you to use that information. Again, for me it was one time, and now I think about it all the time. I think, oh,
I could find out anything I wanted to on that person.
And it's not necessarily even about using it. It's about
(59:23):
knowing that it's there.
Speaker 3 (59:24):
That's an info hazard. There we go. Now we've brought
it back. Yeah, this is, uh, it's an easy dragon to chase, but you have to remember, again, it's an ouroboros.
Every time you feel like you're chasing the tail of
the dragon, the head of that dragon is right on
your ass as well.
Speaker 2 (59:42):
I hope it likes the taste of ass, and you
know it does.
Speaker 3 (59:46):
You know it does. Classic dragon. Yeah, we're read up on dragon folklore. You guys catch up; this is day one stuff. Humans are not the first of the curious creatures, and you all shall not be the last. Information is power, our thoughts are weapons, and a great deal of conspiracy
is found within those simple statements. The legendary info hazards
(01:00:07):
do not quite yet exist, but we can argue humanity
is slouching ever closer to the Bethlehem of singularity. We're
going to see this folklore become reality. And you know, I love what was pointed out earlier: as astute listeners, you hang with us, you kick it with us. You
recognize our greater mission. We only hang out to explore
(01:00:31):
and understand the stuff they don't want you to know.
We are also very aware that there's a greater issue
at hand in this episode. Specifically, there is a valid
argument that without proper preparation, there is stuff you should
not know unless you are girded against the consequences of
learning these things.
Speaker 4 (01:00:52):
Gird thy loins.
Speaker 2 (01:00:54):
Conspiracy realists guild thy loins.
Speaker 3 (01:00:57):
Indeed, guild, gird, whatever. You know what? It's your Australia. You run it how you see fit. Just don't hurt anybody.
Here endeth the sermon, and here begins your journey. We
cannot wait to hear from you in our emails. We've
been writing back to a lot of people in our
phone calls. We also try to be easy to find online,
(01:01:20):
but not in a creepy way. Please don't creep on us.
We can't stop you, but we hope you will do
the right thing.
Speaker 4 (01:01:25):
You can do the right thing by reaching out to us at the handle Conspiracy Stuff, where we exist on Facebook with our Facebook group Here's Where It Gets Crazy. Join the shenanigans and the meme-ery aplenty that goes on there in that community. You can also find us at Conspiracy Stuff on YouTube, as well as X, FKA Twitter.
(01:01:46):
On Instagram and TikTok, however, we're Conspiracy Stuff Show.
Speaker 2 (01:01:48):
Hey guys. Before we tell everyone how to write
to us, should we be worried about this one emailer that sends us, like, three and five emails a day? There's a couple, dude. You know, if you are
going to send us an email, please, just, I don't know, be deliberate with it. Don't automate some system to send us
(01:02:11):
twenty five emails a day.
Speaker 3 (01:02:12):
All right, it's interesting.
Speaker 2 (01:02:14):
Yeah, I just don't like it in my old inbox.
So hey, put maybe twenty five of them together and
send one email.
Speaker 4 (01:02:20):
That would be awesome.
Speaker 2 (01:02:21):
Okay. Hey, if you want to call us, do call the number one eight three three STDWYTK. When you
call in, give yourself a cool nickname and let us
know if we can use your name and message on
the air. If you've got more to say than can fit in a three minute voicemail, why not instead send us a good old fashioned email?
Speaker 3 (01:02:38):
We are the entities that read every piece of correspondence
we receive. Be well aware, yet unafraid. Sometimes the void writes back. Now, it sounds like we might not be a united front here, but I believe that all are welcome, to quote the guy from Poltergeist. So walk out with
us here in the dark. We'll see you there. Conspiracy
(01:02:59):
at iheartradio dot
Speaker 2 (01:03:00):
Com stuff they don't want you to Know is a
(01:03:21):
production of iHeartRadio. For more podcasts from iHeartRadio, visit the
iHeartRadio app, Apple Podcasts, or wherever you listen to your
favorite shows.