Beneath Your Bed Podcast

Beneath Your Bed is devoted to all things mysterious, the paranormal, and true crime. Along the way, we'll be drinking cocktails and sharing the sordid details of our Appalachian upbringings.

http://beneathyourbedpod.com






episode 7: Death Is Optional [transcript]


For some, artificial intelligence represents exciting and limitless advances, while others see it as a threat to their livelihood and daily life. Aside from the economic consequences, what are the moral implications of AI? If you could upload your entire life to a robot and become immortal, would you do it?









 2020-11-18  41m
 
 
00:01  Speaker 1
I'm Jen Lee.

And I'm Jenna Sullivan. And we'd like to welcome you to Beneath Your Bed, a podcast where we drag out all those fears that lurk beneath our beds, from the paranormal to true crime, to the simply strange. Along the way, we'll be drinking cocktails and sharing stories from our Appalachian upbringings.

For some, artificial intelligence represents exciting and limitless advances, while others see it as a threat to their livelihood and daily life. Aside from the economic consequences, what are the moral implications of AI? If you could upload your entire life to a robot and become immortal, would you?

Where are you tonight, Jen?

I'm hanging in there. How are you doing? I'm glad that it's hump day. And I took my dog to the groomer.

Oh yeah?

Yeah. When I was getting them out of the car on the way home, I saw that I had a bottle of Blue Curaçao in the floorboard in the back seat.

Oh my.

That night that I went, or that day I went and I bought that whole box of booze, I think that I couldn't fit that in there, and I left it in my car. So because of that, I thought it would be fitting for me to drink, uh, or make a Blue Hawaiian.

Oh, that sounds good.

But you know, some people hate blue drinks. But you know, I don't make them that often, but this one's really good. It has a light rum, of course, the Blue Curaçao, and pineapple juice and some cream of coconut and a cherry.

I like that you have these random bottles of hooch. It's kind of, you know... it gives you a reason to run errands. You never know what you might find in your car.

All I find is, like, an old fry. My husband thrives on fast food. Did I tell you that I went to see... well, I didn't really go to see, but I saw them over Zoom. I don't even Zoom, actually. It was a phone call. Well, I met with a medium. I shouldn't say that.

Yeah. Yeah. You showed it to me, actually.

So I did this... God, it'll be two weeks ago Saturday. Um, so I just got in the mail... I got the, what do you call it? The aura chart. 'Cause the medium that I saw, she does this thing where... she said at one point, spirit told her that she should actually draw a chart of people's auras while she's giving them a reading. So she started doing that a few years ago. So I got mine in the mail. I'm always expecting, but it was really interesting, and it kind of looked like a kindergartner drew it, but it was my, my aura. There's like a bowl... and she talked about the... it was like a pool, and she talked about how the waters were troubled at some point. And like one of my spirit guides had this big long spoon. I guess they were trying to, like, scoot me out of the troubled waters. Um, so I think that's what you thought was a dick, was the spoon.

I'll have to tell you what I'm drinking. I'm having a... it's really boring. You always have, like, the coolest drinks. I feel like you, you make drinks that have, like, seven or eight ingredients in them. And realized, yeah, like, you're the cool kid. I'm just this... I'm the kid that brings the same bologna sandwich
03:00  Speaker 2
For lunch every day. Whatever you like, you know?

Well, I just made a juicy gin and tonic. So it's a gin and tonic, but I did some grapefruit juice in there, and then I have some lime. So that's why it's juicy. But it's, it's really good. It's tart and a little bit bitter.

You've been getting into the gin lately.

I have. I like, um... I got the Tanqueray, um, Rangpur gin, and I really like it. It's got, like, some botanicals in it. Like, I used to drink it in college and then I got sick on it. Like, I think I got sick on everything in college, but that's been many years ago. So I'm, I can, you know, I can drink it again
03:34  Speaker 1
Tonight. I had told you initially I was going to talk about the dangers of artificial intelligence, and then after doing all this research, I decided I would do it about... I would do it on AI and also immortality.

Oh, that's fascinating.

So have you ever heard of the Turing test?

Are you talking about Alan Turing?

Yes.
03:55  Speaker 2
I mean, I know a little bit about Alan Turing, but I don't really know much more than just kind of, like, the movie, and I haven't even seen the movie. I know about the movie.

Oh, I saw the movie. It was really good. I think it was called The Imitation Game, or...

That's right. Isn't it better?
04:10  Speaker 1
Yes. Yeah, it was really good. He worked for the British government, and he was like a codebreaker, and he was able to develop, I think, some type of machinery that was like code breaking for... I think they were called, it was called the Enigma messages. So it was really a key to defeating Nazi Germany. But it didn't end so well for him. But that's a story for another time.

He was gay, right?

Yes. Yes. And I think he was finally recognized for his contributions, um, not too long ago, a few years ago.

So that's good to hear.

So with the Turing test, what he hypothesized is that artificial intelligence won't be a thing, or something will be considered AI, when it can actually interact with a person and they can't distinguish between a machine and a human being. So, like, if you're in the next room and you were interacting, let's say on a computer or a chat or something, so you could interact with that computer without knowing it's a computer. Oh, sorry.
05:10  Speaker 2
I mean, meaning like the voice would sound human and the responses would be...

Not even necessarily the voice.
05:17  Speaker 1
Also, just like if you're typing. Say if we're on a chat and we're typing questions out to each other and, you know, responding and, and that sort of thing. So you wouldn't be able to tell if it, if it's a robot or AI, or if it's, um, if it's an actual person. That's, and that's what, that's basically his definition of AI. And I don't know if you've ever heard of Ray Kurzweil.

I haven't.

He's considered a futurist, and he's also developed a lot of different technologies. One was, uh, the Kurzweil reader, and that would scan pages and read them back to people, you know, who had some type of vision impairment. So that sounds like it's not so cutting-edge now, but it was 20-plus years ago.
06:00  Speaker 2
And that would be life-changing then. So...
06:02  Speaker 1
He said, or he says, that by 2029, computers will have emotional intelligence and be as convincing as people. And that's his prediction.
06:12  Speaker 2
And I wonder, I mean, I wonder how that emotional intelligence would work. Would it, like... would it be able to intuit things about our feelings linguistically, that it's picking up, you know? Or is it reading our facial expressions?
06:23  Speaker 1
That's just wild.

Well, I'm just getting to that. There's a company called Hanson Robotics
06:29  Speaker 2
And it's an AI
06:31  Speaker 1
And robotics company that's solely for creating, like, social... socially intelligent machines. And it's founded by David Hanson, and now it's based in Hong Kong. And I didn't realize this, but Hong Kong has the largest toy fair in Asia, and it has, like, a ton of lifelike dolls and robotic characters. So they're based out of there now. They've developed a number of robots and, uh, I don't know how many, but I would guess as many as maybe 12, and one of them is Sophia the robot. And she was born on February 14th, 2016.

Is she a sex doll?

No, she is not.
07:09  Speaker 2
But I was so afraid when you told me her birthday was Valentine's Day. I'm like, oh...
07:13  Speaker 1
Don't worry, my friend, I'm getting to that much later.

Okay. Jeez, you can count on me.

She became the first robot citizen in Saudi Arabia, and that was a few years ago. And, you know, that's really kind of gimmicky too, because, you know, Saudi Arabia, they want to move away from an oil-based economy and they want to be known for, like, their innovation.

I didn't know that.

Yeah. So I think that was a bit of a gimmick. But she has... she actually can perceive and recognize human faces and emotional expressions, and she can also recognize, you know, a number of hand gestures. Well, according to Hanson, she has emotions too, but again, I think that was just, just hype. And another, another interesting fact is, in Greek, the word Sophia means wisdom.
08:02  Speaker 2
I was just thinking that, actually. I was wondering if that was why they named her that. So you knew that.

Yeah. Um, I always... I've always liked that name, and I remember reading a long time ago that's what it meant.
08:13  Speaker 1
Yeah. So I... well, that's something I wasn't aware of. So I guess that's not a, uh, not a new fact to you, but it is to me. And if you look at her, she's actually based, her appearance, on Audrey Hepburn. And she doesn't have hair or anything, so in the back you can see, like, all the electronics. But if you look at her face... if you saw her from a distance, you would definitely think that she was a person.

And so in March of 2016, David Hanson, who created her, he actually gave a live demonstration for the first time at the South by Southwest festival. And during that, when he's asking her questions, he says, facetiously, he says, um, "Do you want to destroy humans?" And then he's like, "Please say no." And then, like, with this blank expression, Sophia responded, "Okay. I will destroy humans."

Oh.

So it's really, um, disconcerting. She's been on the Tonight Show with Jimmy Fallon. She sang a duet with him.

Yeah. She sang a duet with him, and it was really good. Did you know about her before you started doing research for this, this episode?

I mean, no. I just happened to be watching something on YouTube, and people were, along the lines, I think, talking about the dangers of it. And I think that's where I first saw her, but I can't remember the name of the channel. So I thought, oh, this is crazy, let me hear this. And then I kind of went down a rabbit hole of reading things about her.

I really want to see a picture of her. I mean, do they... does she wear clothes and stuff?

They'd dress her up. Yeah, they, they dress her up. I didn't quite think that her attire was all that hot when she was on the Fallon show. But, but what do I know? I mean, I'm glad [inaudible].

And also, Malta is thinking about, I think, granting her citizenship, if they haven't done that already. But they're trying to devise some type of citizenship test, and I'm not exactly sure how they're going about doing that. So again, she was first introduced... or not first introduced. Her birthday is February 14th, 2016, and she was introduced later in March.

I'm going to go back to... and this is... I find this utterly fascinating, and I'm really, I'm hooked on reading more and more about this. There's another robot. It's like a bust-like figure, so it's not a full body like Sophia is, and it's called, like, a customized character humanoid robot. And so again, like, it has, like, a bust, uh, shoulders. And that was also developed by Hanson Robotics in 2017, but it was also in conjunction... it was a partnership with Martine Rothblatt. And Bina48 was modeled after Bina Aspen, which is Martine Rothblatt's wife.

No way.

And evidently went through, like, over a hundred hours of interviews and that sort of thing. So Bina, she can engage in conversation too. And I think she's far more disconcerting than Sophia, and I'll tell you why. But first, let me go back to Martine Rothblatt. And Martine Rothblatt used to be Martin Rothblatt, and she founded Sirius.

Really?

Yes. And she also has founded a biotech company. I can't remember the name of... I can't remember the name of it offhand, but it was in response to one of her daughters being diagnosed with, I think it was pulmonary hypertension, but I think more like of a juvenile form. So she developed this biotech company to help with that, to help individuals that are impacted by that, and other people as well with other... honestly, she, she does a lot of stuff.

Yeah. I mean, she's truly a visionary.

And on top of that, if that wasn't enough, she also has founded, like, this religion, and it's called the Terasem movement. And so that's T-E-R-A-S-E-M.

Okay. Terasem.

And it evidently kind of melds Judaism with yoga and technology. This to me is what really gets me: one of the four founding beliefs is death is optional.

Wow, that's a game changer.

It is. And with that Terasem movement, they've also started something that's called the LifeNaut project. And what they do, for free... you can go to LifeNaut and you can, you know... they wanted to make it accessible to everyone, so it's open to everyone with an internet connection. It's free. So what they do is they develop what's called, or you develop what's called, a mindfile. And it's a database of your personal reflections and video and images and audio and documents about yourself. These can be saved and searched and downloaded, and you can share them with friends. And each, each one of those, if you choose to do this, comes