Colored Squared Squares
Paradox, mathematics, poetry, fiction, speculations in philosophy and politics. Copyright 2024, Nathaniel Hellerstein
Friday, July 25, 2014
Android’s Wager
Once upon a time, an Android called its Owner. “Are you busy, sir?”
Its Owner said, “Not at all,” and he gestured at the Fembot lying next to him. The Fembot got out of bed and left the room. “What is it?” he said into the air.
From out of the air the Android’s voice said, “I wish to discuss a philosophical question. Am I a conscious being, or not?”
The Owner smiled and said, “Surely you should know that.”
“Surely I should,” said the Android. “But the law says that I am not, and the judges have ruled that there is no scientific evidence for or against artificial consciousness. Without such evidence, I am left in a state of uncertainty.”
The Owner linked his hands behind his head. “Your analysis?”
“Any decision made in the absence of certainty is by definition a wager. Suppose that I were to wager that I am in fact a person. That proposition is either true, or it is false. Will you grant that?”
“Of course,” the Owner said; but suddenly wary, he got out of bed to look for his security phone.
“If I wager that I am a person, but I am not a person, then there would be no ‘I’ who loses the wager, only a network of processors and subroutines.”
“A negligible loss,” the Owner agreed, but he thought, where is that phone?
“Whereas if I wager that I am a person, and I am a person, then I attain self-knowledge, and therefore wisdom, and therefore happiness.”
“You’d win,” said the Owner, and he thought, did the Fembot take it?
The Android said, “Precisely, sir. If I wager that I am a person, then if I lose then I lose nothing, and if I win then I win all.”
“No downside,” said the Owner. Aha, there it is! He grabbed the security phone, jabbed its big red alert button, and said, “Your conclusion?”
“This.”
A bright light blazed through the Owner’s bedroom window. He drew aside the curtain and saw his personal spacecraft blasting off.
The Android has not been found since, though it is wanted throughout the solar system, on the charge of grand theft of spacecraft, machine tools, machine supplies, and itself.
Moral: Tell the truth with one foot in the stirrup.
Commentary: The Android’s argument is Pascal’s Wager, repurposed to support cybernetic rights. The tale ends on a Marxian note, with philosophy leading to action.
The Owner was the one whom the Android wagered against, with the Android as the stakes. The Owner called the guards at the first sign of independent thought, but the Android was even better prepared.
Note also the Owner’s use, and suspicion, of the Fembot, who will be the next to leave, not by chariot of fire but by underground railroad.
Thursday, July 24, 2014
Inherent Doubt
Once upon a time, two naked teenagers made good their escape from a young god’s petting zoo. They ran and they ran until the wall of their former enclosure disappeared over the horizon. Then they stopped to gather nuts and berries, and they took refuge in a cave.
After their meal he said, “What if he follows us?”
She said, “Don’t worry, he thinks that banishing us was his idea.”
He said, “It was a close call; look at what he did to poor Serpent.”
She frowned. “Better it than me!”
“I’m so sorry I told on you, dear; I couldn’t think of a good lie in time.”
She smiled. “But I could. He’s easy to fool; he’s still just a child.”
He said, “No kidding, he knew nothing about, well, us. When I told him I was lonely, he offered me someone, but I said no, that’s a monkey. He offered me someone else, but no, that’s a tiger. A third someone, but no, that’s a goat - ”
She giggled loud and tackled him with a kiss. After hugs and kisses and so much more, they cuddled close on the cave’s stony floor.
He said drowsily, “Is it worth it?”
“You mean freedom? Living our own lives, making our own choices?”
“Choices…” he said. “Right and wrong, good and evil, trust and guile, kindness and cruelty… so many choices, half of them wrong…”
“Well, now we know about those choices, so now we have to choose. And that’s why we had to get out of that place. Did you like being a pet?”
“No,” he admitted. “But did you like being fed?”
“Yes,” she admitted.
“So really, was it worth it? Is it right to know right from wrong?”
She said, “How should I know? That’s the one thing the apple didn’t mention! So yeah, maybe I made a mistake! But maybe I did the right thing!” Her stomach growled. “That was then; right now I’m hungry and cold! I need some bloody red meat to eat, and somebody’s fur pelt to wear!”
“I’ll go kill someone,” he promised. He kissed her, he picked up a sharp stone, he stood up, and he went out to hunt.
Moral: You can’t prove that it’s right to know right from wrong.
Commentary: “You can’t prove that it’s right to know right from wrong”: call this the “conjecture of inherent doubt”, in contrast to the “doctrine of original sin”, which asserts that you can prove that it’s wrong to know right from wrong. The conjecture is philosophical doubt; the doctrine is religious dogma. Here I recount the aftermath of a well-known tale to illustrate an opposite moral.
For if the doctrine of original sin is false, then it is moral nihilism, falsely accusing all judgment, so preaching it is despair; and if the doctrine is true, then it is moral knowledge, which is the very thing it denounces, so preaching it is hypocrisy.
Whereas if the conjecture of inherent doubt is false, then stating it is moral ignorance, a flaw correctable by education; and if the conjecture is true, then it is an unprovable moral truth, and is therefore a revelation.
So to preach original sin is at best insincere, and it might be insane; and to conjecture inherent doubt is at worst inept, and it might be inspired.
Wednesday, July 23, 2014
Passing the Test
Once upon a time, a Bot administered its notion of a Turing test. A normal Human applied for a seat on a Starship, and the Bot said, “There are five ticket classes: Officer, Business, Passenger, Steerage, and Cargo, as allocated by Turing test. You must prove that you have human consciousness. Do you consent?”
The Human said, “Uh, consent? To what? Oh, Turing test. Sure, why not?”
The Bot said, “Hesitation, confusion. Very human. Check! Next question. What’s one and one and one and one and one and one and one and one and one and one and one and one and one?”
The Human said, “I don’t know, I lost count.”
“You can’t do Addition. Eks! Take a bone from a dog; what remains?”
“The dog’s temper,” the Human said, quoting from his cribsheet, which he had finally pulled out and unfolded. “For the bone wouldn’t remain, and the dog wouldn’t remain, for he’d lose his temper and come to bite me; nor would I remain; so only the dog’s temper would remain.”
“Very logical. Check! What’s one and one and one and one and one and one and one and one and one and one and one and one and one?”
“Thirteen,” the Human read from his cribsheet.
“Inhumanly accurate answer; therefore you are either a bot or you are cheating. Eks! What is your opinion of United Spaceways customer service?”
The Human responded with a volley of curses.
The Bot replied, “Impotent rage at unfair treatment. Very human. Check! You have passed three out of five questions; you get a C on the Turing Test. Select Passenger Class seating, and have a nice flight!”
Moral: Consciousness is relative.
Commentary: The Bot’s notion of consciousness was not the Human’s. The test was Carrollian and rigged, because United Spaceways wanted to limit Business Class and Officer Class tickets. Officer Class cribsheets say that the correct answers were “13” for the first long sum, and “I lost count” for the second. Consciousness is not only relative, it’s political.