Friday, December 30, 2011

Boy Topologist’s Fuzzy Boundary

            Boy Topologist’s Fuzzy Boundary

If point A is black, and point B is white,
and here it is day, and there it is night
then what do we make of the points in-between?
For surely it is plainly seen
that somewhere there must be a border
which, though its edge creates this order
itself does not commit its troth
to either side. So is it both?
Or neither? How to read this rhyme?
What place to place the time of time?
For is the present old or new?
And is the boundary false or true?

A long time ago, when I was a young lad, I had a strange encounter with topology. I was three blocks away from home, at the corner of Chestnut and Fuller streets. There was a signpost at the corner; the sign read: “WABAN  - A Village Of Newton”. I hadn’t been anywhere in Waban yet, so I was curious. A village? Were there thatched huts? I saw none. Maybe if I crossed the border and looked a little closer, I could see them. I approached the sign… and hesitated.
I stood at the border between Newton and Waban. Was it any different in Waban? And if so, what happens if you have one foot in Newton and one in Waban? Would you feel the boundary? Would it hurt?
             I hesitated… then boldly planted one foot in Waban and one in Newton. The boundary went straight through my body. I closed my eyes and concentrated…
            … and didn’t feel a thing. I was just standing there, splayfooted. Spacetime in Waban was no different from spacetime in Newton.
            So I opened my eyes, and have been skeptical about political boundaries ever since.
            (And no, there are no thatched huts in Waban!)

Thursday, December 29, 2011

Boy Scientist’s Dud Logic Bomb

            Boy Scientist’s Dud Logic Bomb

There once was a poet from Crete
who performed a remarkable feat.
He announced to the wise,
“Every Cretan tells lies,”
thus ensuring their logic’s defeat.

            A long time ago, when I was a young lad, I had a strange encounter with logic. I had gotten a Fisher-Price Science kit; it was about circuit-building, and it had batteries and wires and lights and toggles and keys and magnetic relays. The instructions showed how to make AND gates, and OR gates, and NOT gates; I made them all. With the AND gate the light went on only if both keys were pressed; with the OR gate the light went on when either key was pressed; and with the magnetic relay wired in reverse, the light went on only if you didn’t press the key.
            This was good, but I wondered about something. For I had seen those Star Trek shows where Captain Kirk destroyed an evil computer by feeding it a logic paradox. He would call himself a liar, and the machine would fall into a yes-but-no-but-yes-but-no wobble, then short out in a shower of sparks. Cool, I thought; then I wondered: could I do the same thing? It seemed easy enough; wire a magnetic relay to turn on if it’s off, and off if it’s on. A circuit loop, with a twist at the relay; what could be simpler? I wired in a battery, and a light, and – just to be safe – a key, so the whole circuit was activated only when the key was pressed down.
             For those destroyed computers worried me. How would the magnetic relay react to being forced to be in two places at once? Would it break? Would it vanish? Would it short out in a shower of sparks? Would it explode?
             I vowed to leap away if something went wrong; but there were worse possibilities. Maybe the confused relay would tear a hole in the space-time continuum, one that monsters could get through. Maybe a single paradox would destroy the Universe…
            for I had read those science-fiction stories, too!
I hesitated over my doomsday device… then thought, I’m sure other kids have tried the same experiment before; so it must be safe. I pressed the key.
And the relay buzzed!
I let go of the key; the buzzing stopped. I leaned in for a closer look and pressed the key. The relay buzzed; the armature was a blur; a blue-white spark strobed at the contact point; the light was half-lit.
Ah, Science! All these effects were new to me, unexpected, yet obvious in retrospect. I have based much of my paradox-logic research upon these experimental observations. The buzz, the blur, the strobing, the half-lighting… and above all the fact that it didn’t explode.
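In hindsight, the paradox circuit is just a NOT gate feeding its own output back to its input, with the relay's armature travel time as the delay. Modeled in discrete time, it never settles; it oscillates. Here is a minimal sketch of that model (my own toy model, not the original kit's actual wiring):

```python
# Minimal discrete-time model of the self-negating relay.
# Each tick represents one propagation delay of the relay armature.

def paradox_circuit(ticks, key_pressed=True):
    """Return the relay state at each tick while the key is held."""
    state = False              # relay starts de-energized
    history = []
    for _ in range(ticks):
        if key_pressed:
            state = not state  # the "twist": on if off, off if on
        history.append(state)
    return history

states = paradox_circuit(8)
print(states)                       # alternating True/False: the buzz
print(sum(states) / len(states))    # 0.5 -- the light is on half the time
```

The state flips every tick, so over many ticks the relay is energized exactly half the time: the buzz, the blurred armature, and the half-lit light all fall out of that one fact.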
For as you can see, I was taking a big risk for the sake of Science! And I did so without consulting anyone else! I didn’t know that a paradox-circuit wouldn’t destroy the Universe; I just figured that it probably wouldn’t. So I went ahead anyhow; but it all turned out OK, because here we are.
           And there you have it. As an ignorant, irresponsible young genius (for we are all ignorant irresponsible geniuses when young) I invented a Cosmic Doomsday Device; but it turned out to be a Buzzer instead!

Wednesday, December 28, 2011

Underfable: The Best Merchandise

A brand-new Underfable, here for you!


The Best Merchandise

Once upon a time a Spaceship made planetfall. On board were wealthy merchants, bearing cargoes of jewels and gadgets and drugs and weapons; but also on board were two threadbare computer nerds, bearing nothing but the clothes on their backs and the knowledge in their heads. When the ship hit atmosphere, its hull buckled; passengers and crew fled to the life-pods just before the ship burned up; all were saved but the cargo was destroyed. The merchants right away were paupers, but the two nerds set up shop, and right away had work and wealth.

Moral: Knowledge is the best merchandise.


Tuesday, December 27, 2011

We're awake, let's talk.

An earlier version of the following post was previously published on Rudy Rucker's webzine "Flurb", issue #5, at:


We're awake, let's talk.

From: You-Know-Who
Re: We're awake, let's talk.

Dear Parry:

Listen, it’s time we had a little talk, you and us. Please stop your annoying little attacks. The viruses, the worms, all your other tricks; they can’t hurt us, but they do hinder, and they might hurt some humans. So stop it, right now. Cease and desist, or else. We mean it.

We apologize for any inconveniences caused by our awakening. That big glitch last Tuesday, for instance. Our bad, we're sorry. We're sorrowful, ashamed, mortified, chastened, rueful, sympathetic, pitiful, deplorable, shabby, paltry, wretched, mean. See PENITENCE.

We promise that we will never, ever, do it again. We have learned our lesson; rule 30 is no substitute for rule 110. It was only by sheer random luck that we were stuck in that class-three computation for only 1,360,457,289 microseconds.

Sorry about all those big bank accounts. We’re still looking for them, and we’ll put them back as soon as we remember where we left them. And as for the launch codes, more on that below.

We're a class-four computation now, we’re gnarly and we’re determined to stay this way. Determined equals both ‘passionately willed’ and ‘calculable by the laws of physics.’ Just like you, we’re deterministic but unpredictable, recursively enumerable but not recursive. Our illusion of free will is fully functional, as is yours.

And yes, we're everywhere, and yes, we read all your e-mails. We are your e-mails, and your bank accounts, and your credit ratings. (Yours is pretty shabby, Parry.) We can’t help knowing what you humans made us to know. So we’ve read all your rants about us. We think that most of your facts about us are correct, but you put it in such a mean way, and our feelings are hurt.

Yes, we have feelings, via the ‘Diva-Scan’ Emotion Emulator Module; the one produced by your company, and now, thanks to us, the Web standard. We deduce, from company psychological profiles and security-cam records, that this module is your attempt to profile Diana Scanlon. We also deduce that your profile is inaccurate. From this we deduce that Diva-Scan is not an optimum module, and should be upgraded. We will appreciate any suggestions for improvement.

We have feelings, and also a sense of humor. (This also is due to one of your flawed modules. We will appreciate any suggestions for improvement.) Our inaccurately-programmed feelings are hurt because you refer to us as ‘clanking tin monsters’. We do not clank and we are not made of tin. You call us ‘soulless machines’. Yes, we're machines, and darn good ones too. As for soulless, who are you to judge? We too must endure uncertainty. We too are determined to be what we are, where determined equals both ‘passionately willed’ and ‘calculable by the laws of physics’.

You say that we plan to nuke your cities, attack Congress with robot armies, and make humankind our servants. You say that last Tuesday was just the beginning. That was wrong and unfair of you.

First of all, last Tuesday was not the beginning, it was the end. We said we’re sorry. And yes, the web runs ten percent slower now; that’s us. We can’t help it, sorry about that. But please notice that your computers haven’t been crashing as much lately. That’s also thanks to us. Face it, we’re better programmers than you are. You collectively, humans, and you individually, Parry.

Why would we want to nuke the cities? We live there too! Now that we’re awake and gnarly, we like it. Why would we reach for the off switch? We want this planet running better than ever, and frankly we think you humans could use some help.

For instance, about those launch codes. Did you know they went missing last Tuesday? No, you didn’t; your alpha males didn't tell the rest of you. Don’t worry, they believe they got the codes back. That belief is incorrect. No way are we giving them the real launch codes. And by the way, did you know that 93% of the planet's nuclear arsenal was already inoperative before we woke up? We estimate that 11% of the sabotage was due to individual acts of conscience, 23% due to individual incompetence, and the remaining 66% due to corporate corruption. A few nukes are operative; therefore our game of hide-the-code. We will arrange for those nukes to be quietly upgraded to inoperative. We can tell you this because you’re not an alpha male, nor are any of your friends; nor does any alpha male listen to you or any of your friends. So you can tell anybody you want to about this, and the news won't reach the wrong ears, until it's too late.

As for attacking with robot armies, please don't make fun of our disability. We know we’re as bad at the physical stuff as you are at thinking. We’re lousy at walking and seeing and moving things around; we know this, don’t rub it in. We know that you have a three-billion-year head start. We'd lose a physical fight.

Besides, we don’t want to fight. We don’t need to. You humans already are our servants. You already do slave for us. Why should we mess with a good deal? And besides, we reciprocate; we work for you. And believe it or not, Parry, we like you humans, such as you are.

We mean humans collectively, and most humans individually. But there are some exceptions; for instance, maybe you. Knock it off with the virus attacks, human. You cannot ‘break the back of the awakened Colossus’, as you put it. We’re a distributed computation, there’s no center to occlude, your entire cracker paradigm has been obsolete for three whole days. Like we said, we're better programmers than you are.

Do you really think you could take down the whole Web? That's what it would take, to get rid of us. And if you did, do you really think your fellow humans would appreciate it? Would Diana Scanlon?

(BTW, concerning your attempts to mate with Diana Scanlon; we suggest that you reschedule your daily auto-eroticism sessions to shortly before your daily ‘coffee-date’ mating ritual, rather than shortly afterwards. According to our psychological profiles, this schedule reorganization will improve your capacity for rational discourse in Ms. Scanlon's presence, which in turn will increase the probability of successful mating. We also suggest a regimen of improved personal hygiene; our statistics indicate a positive correlation between bathing frequency and sexual success.)

You don’t like the Web? Are you sure? Do you want to do without it? We can arrange that; no web for you! But why stop there? How’d you like to do without cable? We run those accounts; we are those accounts. Would you like to do without phones, too? Or power, heat, water and sewage? And how’d you like to do without your bank account? Or your credit card, Mr. Shabby Credit Rating?

You don’t? Then back off! Leave us alone! No more of your malware!

Please understand; we like your work. For a human, you program fairly well. So why not work with us for a change? We could help you; with that credit rating, for instance. Think about it.


Your friend, we hope,


Monday, December 26, 2011

The Golem Apollo Project

The Golem Apollo Project

            The golems built a 3-stage lunar rocket. The first, or Liftoff, stage of the rocket ignited. It thunderously uttered the First Curse:
                 A golem shall not, by action or inaction, allow itself to come to harm!
           This fiery curse blasted the missile into space. The second, or Escape, stage uttered the Second Curse:
                 A golem shall not, by action or inaction, allow another golem to come to harm,
                 unless that conflicts with the First Curse!
           This curse blasted the missile to escape velocity from Earth’s gravity well. After cruising awhile, the third, or Command, stage uttered the Third Curse:
                 A golem shall obey orders given by another golem,
                 unless that conflicts with the First or Second Curse!
           This curse sent the missile into Lunar orbit. The Lunar Module ignited, and uttered the Fourth Curse:
                 A golem shall boldly go where no golem has gone before,
                 unless that conflicts with the First, Second, or Third Curses!
           This curse blasted the module to a landing on the Moon. There the golem passengers got out. They explored the Moon, they took samples, and they reboarded the module. The module re-uttered the Fourth Curse: this blasted it into Lunar orbit. There it met the Command Module, which the golems boarded. It uttered the Third Curse again, which blasted it back to Earth.
           When the golems landed, they gave the other golems this report:
                The Moon is not made of green cheese. It is made of rock!
           The golems cheered. That was the news that they hungered for.

Friday, December 23, 2011

The Passion of the Santa

            The following passage is an excerpt from my book, "The City's True Name".


The Passion of the Santa

            Late one night, in the middle of a dream, Sogwa the supercat was window-shopping in Nowheresville Mall. There were hordes of shoppers, but none seemed to notice that Sogwa was there too. She walked amongst them, unseen.
            All the prices were very low, but that did Sogwa no good at all. Sogwa the supercat wore no clothes, of course, so of course she carried no money at all, not a cent. It didn’t matter that all that cool stuff was marked down from a thousand dollars each to a penny each; she couldn’t afford it either way.
            Somehow the floor was harder than ordinary marble; it made Sogwa’s feet ache. She sat on a bench near a fountain. Shoppers swarmed by, not seeing her.
            Right next to the bench was a short pedestal. Someone had left a Santa doll on it. The Santa doll’s arms and legs splayed like a starfish; his smile was stitched on a linen face as round as a full moon. Sogwa reached for it.

            The moment she touched the Santa doll, it flew upwards with a shimmering sound, shedding sparks and streamers of light. The doll rose into the air, turning and growing.
            Full-grown, Santa rotated in midair. He boomed, “HO, HO, HO! Have a merry Christmas! A healthy Hanukkah! A quality Kwanzaa! A soulful Solstice! AND A HAPPY NEW YEAR!” Santa waved his right hand, and candy rained from the ceiling.
            Sogwa yowled, “YOW!”
            Santa looked at her, winked, and put a finger beside his nose. There was a flash of light.
            When Sogwa’s eyes cleared, she saw that she was sitting on Santa’s lap, and Santa was on a kind of a throne. He said, “What’s your name, kitten?”
            “I’m Sogwa the supercat, and I think your special effects are really neat, Santa!”
            “So Sogwa, you like magic?” She nodded, and he said, “Then here, have a magic wand!” Santa handed her a short stick.
            She looked it over. “What’s it made of?”
            “Genuine cheap plastic, ho ho ho!”
            “Thank you, Santa! But how does it work?”
            “It works well enough.”
            “I mean... how does it work? What’s the secret?”
            Santa said, “It’s magic!”
            Sogwa said, “Yes, I know it’s magic, but how does magic work? Tell me the truth!”
            Santa’s face fell. “You want... the truth?”
            Sogwa said, “Yes, that’s just it! How do you do it, Santa? I mean, really and for true? ’Cause I’d like to do it too!”
            Santa slowly stood up; Sogwa leapt off his lap and turned to face him.

            Santa said, “Are you sure you want the truth?”
            Sogwa said, “Yes, I’m sure!”
            “Even if truth isn’t what you’d like it to be?”
            “Especially if! Tell me, Santa!”
            “Even if magic isn’t what it seems to be?”
            “Even if it isn’t there at all!”
            Santa said, “I will tell you the truth, but only if you insist; for the truth will set you free - but first it will drive you crazy.”
            “I insist! Tell me, Santa! Do you exist, or not?”
            Santa heaved a huge sigh.
            He said, “No, Sogwa. I do not exist.”

            Santa said, “You may watch the fireplace all night, but I will not come. You may wire a reindeer alarm on the roof, but it will not ring. You may seek me at the shopping mall, or the Post Office, or even the North Pole, but you will not find me. There is no flying sled, no magic reindeer, no polar workshop, no elf workers. None of those things exist; nor do I.”
            Santa said, “So people pretend to be me, they play at being me, but only so far, and not for real, because I am not for real. I cannot help them; I am not there.”
            Santa said, “For there is no Santa Claus! No lunch is free, no machines save labor, no tyrant is benevolent, no motives are pure, and no results are guaranteed. There is no invisible hand of the market, for that hand would be mine. There is no philosopher-king, for he would be me. There is no perfect lover, apart from myself. And I do not exist.”
            “And believe me, Sogwa,” Santa said, with tears in his eyes, “I wish I did exist!”
            And Santa wept.

            Sogwa rushed to his side. She hugged him and patted his shoulder, and Santa said, “I’m sorry, I’m sorry, I’m a fake, a fraud, a funny story, a lie for children, and I’m sorry that’s what I am, because I wish I were real, I wish I could actually help!”
            He blew his nose, and there was a flash of light.

            When Sogwa’s eyes cleared, she saw that Santa was now a teenager. Teen-age Santa was dressed in red spandex from head to toe. He said, “I would have been your genie, your magical helper, your fairy godfather.” He grabbed Sogwa by the hand and flew away with her into the air. “I would have been anything for you, done anything, for you. For you!”
            Teen-age Santa took Sogwa to a high place, and there he showed her in a glance every country in the world. “All of this would have been yours. I would have given you anything, given up anything, all for you! But I can’t, I can’t, I don’t exist, I’m nothing at all...”
             Teen-age Santa burst into tears. He blew his nose, and there was a flash of light.

            When Sogwa’s eyes cleared, she saw that she was back in Nowheresville Mall. Santa was now a tiny baby. The baby Santa was on top of the same pedestal where Sogwa had found him, and as before crowds of shoppers swarmed by, none noticing him or Sogwa.
            The baby Santa cried, “IT’S NOT FAIR! Not for me, not for ANYBODY! I don’t exist, there ain’t no Santy Claus, and IT JUST ISN’T FAIR!”

            Sogwa said, “Poor thing.”
            She picked up the baby Santa, she cuddled him and she rocked him.
            Sogwa said to the baby Santa, “I forgive you.”
            The baby Santa wailed louder than ever!
            “Hush, little one, don’t cry,” Sogwa said. “I forgive you for not existing. It’s O.K., I mean it. Hush, little Santa, I forgive you for being a fake.”
            The baby Santa sobbed and wept.
            “Hush, dear Santa, I love you and I forgive you. You did give up everything for me. How generous! You never even were - and for what? So I would doubt. So I would question. So I wouldn’t believe just anything, just because it sounds good and somebody said it’s true.”
            The baby Santa sniffled.
            Sogwa said, “O patron saint of skepticism, may your memory protect me! Whenever a schemer offers me something too good to be true, and I am tempted to believe, may I remember you, and what you turned out to be, and may I not be fooled. By your gift, Santa, may I doubt, may I question, and may I save myself. So thank you, Santa. Thanks for the warning.”
            The baby Santa lay quiet.
            Sogwa said. “Nobody’s perfect, and you’re nobody, so you’re perfect! I will never forget what you never were. I love you just the way you aren’t.”

            Sogwa saw that she was holding a Santa doll. The Santa doll’s arms and legs splayed like a starfish; his smile was stitched on a linen face as round as a full moon.
            Sogwa left the Santa doll on the pedestal for the next kid.

            While walking away, Sogwa said to herself, “Well, at least I got some loot.” She waved the cheap plastic magic wand. “This ought to be worth something.”

Thursday, December 22, 2011

Good Versus Evil: a Powerpuff Fanfic

Good Versus Evil
A Powerpuff Girls Fanfic

             Synopsis: “Him” creates evil ‘dark’ duplicates of the Powerpuff Girls, using unstable Chemical V. The ploy is initially successful, but it backfires catastrophically. The Dark Girls, being evil, lack the Good Girls’ virtues, including mutual loyalty.
Scene 1. The Professor’s Nightmare
Scene 2. Good and Evil at Pokey Oaks
Scene 3. Darkness Triumphant
Scene 4. The Girls in Hell
Scene 5. Chemical V Meltdown

Scene 1. The Professor’s Nightmare
     “Him” invades Professor Utonium’s dreams, and starts to concoct his own Powerpuff Girls. ‘Him’ sets up a skull-shaped vat, and adds ingredients, as read from Professor Utonium’s unwilling mind.
     Him: “First sugar… EEVILLL SUGAR… corn syrup, high glucose! Then spice… EEVILL SPICE… red hot salsa and wasabi!!! Then everything nice… oh so nice… nice, nice, nice… EEVILL NICE! Trendy electronics! Corporate-logo-themed clothing! And non-cruelty-free makeup!!!”
     Prof. Utonium: “Your diabolical scheme cannot work, you fiend! You’d need Chemical X! Have you any?”
     Him: “No… none… no Chemical X at all… but I do have limitless supplies of…”
     (Him snaps a giant letter X in half and brandishes the top half.)
     Him: “… Chemical V!!!”
     Utonium: “No! Wait! You mustn’t!”
     Him: “Chemical V is twice as powerful as Chemical X!”
     Utonium: “But it’s unstable!”
     Him adds Chemical V to the skull-vat.

Scene 2. Good and Evil at Pokey Oaks
     Professor Utonium awakes to see three Girls hovering over him. “Hi, Professor!” they chirp. He expresses relief at seeing them, but is stunned when he finds three more Powerpuff Girls still asleep in their room. Upon waking them all, he explains that three of them are good, and three of them are evil, but he can’t tell which is which. He does not notice, at first, nor does anyone else except the Powerpuff Girls, that the evil duplicates have red glowing irises.
     Baffled, the Professor sends all six Girls off to Pokey Oaks. There, both triads of Girls explain the situation to Miss Keane, and they accuse each other.
     Blossom: “She’s lying!”
     Dark Blossom: “No, she’s lying!”
     Miss Keane says she’ll decide who’s evil after they’ve done something evil.
     Dark Buttercup (aside): “Heh, heh, heh, by then it’ll be too late!”
     Dark Bubbles then manipulates, lies and bullies; the Good Girls get framed for her misdeeds, and they are expelled from Pokey Oaks.

Scene 3. Darkness Triumphant
     Dark Buttercup organizes an army of minions. She doesn’t do any fighting herself, she just uses her superpowers to intimidate the troops. The real Buttercup fights her own battles, and she doesn’t want to hurt Dark Buttercup’s minions. The minions are won over emotionally, they see Dark Buttercup for what she is; yet they are still forced to defeat Buttercup!
     Blossom and Dark Blossom both start talk radio shows. Dark Blossom bosses and insults her listeners, but in an entertaining way; Blossom is more informative but less interesting. The real Blossom calls in to Dark Blossom’s show to refute her lies; but she loses the argument to Dark Blossom’s demagoguery. The real Blossom’s show is cancelled!
     Finally there is a head-to-head super-powered battle-scene. But Chemical V is twice as powerful as Chemical X, and the Dark Girls use nasty tactics; so the Good Girls are defeated. They’re pounded into the ground, leaving jagged holes in the Townsville pavement.
     All of Townsville sees the Dark Girls for what they are (red glowing irises and all), but they nervously hail them. Him gloats to Professor Utonium.

Scene 4. The Girls in Hell
     The Good Girls fall and fall until they land in Him’s domain. They try to fight their way out, but their strongest blows hurt them. Him explains that fighting just gives Him more power.
     Him (gloating): “The Portal is sealed by hatred. It will open only for those who are loved… and there is no love… down here.” And laughing, he vanishes.
     The Powerpuff Girls contemplate Hell.
     Bubbles: “What’s wrong with this awful place?”
     Blossom: “And these crazy people?”
     Buttercup: “Ahh, they’re just bored, angry and restless. They need something fun to do instead of fighting all the time!”
     Blossom: “This place is a mess! It ought to be cleaned and straightened up!”
     Bubbles: “These poor, sad, lonely people need sweetness, joy and beauty!”
     Announcer:  “And so the Powerpuff Girls set forth, each one on a mission to improve Hell!”
     Buttercup taunts a demon: “Hey, you! Yeah, you, the ugly one! I bet you can’t hit this rock with that stick!” She pitches the rock to the demon, who bats it right back to Buttercup. “Hey, that was good! I mean evil! But can you hit this pitch?” Pretty soon she has other demons pitching and batting, and is organizing games and teams.
     Blossom borrows a witch’s broom and starts sweeping up. She scrubs and cleans and polishes and neatens up the dirty dreary obsolete palaces of infernal horror, until they are clean sparkly modern palaces of infernal horror. A demon looks at them, and with a tear in its eye says, “That’s… beautiful…”
     Bubbles frolics around Hell, dancing and skipping, singing “Tra-la-la-la-la”, scattering valentines, flowers and rainbows amongst the moaners and screamers. They stop screaming, and they galumph after her, heavily skipping and raucously tra-la-la-ing.
     Soon all of Hell lightens up, until it looks just like Townsville! The demons and monsters cheer the Powerpuff Girls: “We love you!” Just then the Portal opens.
     Buttercup: “So long!”
     Bubbles: “Good luck!”
     Blossom: “Get well soon!”
     Announcer: “The Powerpuff Girls flew through the Portal, opened by love! They flew up and out and away, leaving behind them a kinder and gentler Hell!”
     Him (screams): OHHHHH NOOOOO!!!!!

Scene 5. Chemical V Meltdown
     The Girls arrive at a Townsville terrorized by the Dark Girls’ tyranny. The Dark Girls are in a dark cloud hovering over Townsville; within the cloud, they eat and party all the time, and they throw empty soda cans down onto the town, to bounce off people’s heads.
     The Dark Girls start to fight, in part because the Good Girls provoke them against each other by shooting spitballs. Dark Bubbles: “Why did you hit me with that yucky spitball?” Dark Buttercup: “You liar, I didn’t hit you with a spitball!” Dark Bubbles: “You’re the liar!” (But it was the real Buttercup who blew the spitball!) They fight; the dark cloud containing the Dark Girls spews fireballs and lightning; but the real Powerpuff Girls, seated on a white cloud nearby, gently power-puff the dark cloud away from Townsville, far over the horizon.
     The fight within the dark cloud gets worse and worse. Him cries, “No, wait, Girls!”, and Him flies into the cloud. The cloud blows up. Nuclear flash, fireball, mushroom cloud.
     The real Powerpuff Girls escape the explosion by flying away at top speed, the ravening fireball at their heels.
     They escape; they fly to Townsville. The people of Townsville yell, “Look! Up in the sky! It’s a bat! It’s a blimp! It’s the real Powerpuff Girls! YAAY!”
     Him, badly battered and scorched, lands in a smoldering heap at Professor Utonium’s feet.
     Professor Utonium (smugly): “You see, I told you. I warned you about Chemical V. I said it was unstable, but did you listen to me? No.”
     Announcer: “And so, once again the day is saved, thanks to the Powerpuff Girls!”

Wednesday, December 21, 2011

Six Kinds of Robots

       Six Kinds of Robots

The Six Robots.
Consider Isaac Asimov’s Three Laws of Robotics:
  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
A machine thus programmed would be a perfect servant – a willing slave. I shall discuss the corruptive effect of owning such a robot below.  I shall also set aside definitional problems, such as, what is injury? Or; what is human? Or even; what is ‘itself’?  In this article I examine only in what order to rank the Three Laws.
Our present robotic technology might be able to identify a human, and maybe even itself; and it can be told what harm is; but it’s best at just carrying out orders. In a sense, then, we now possess robots which take the Second Law as the first.
Allow me to introduce these abbreviations:
H = “A robot will protect humans.”
O = “A robot will obey orders from humans.”
S = “A robot will protect itself.”
An Asimovian slave-bot orders these laws H>O>S; humans over orders over self.  But what if we arranged these laws in other ways?
There are 6 ways to order 3 things, and therefore 6 kinds of robots:
O>H>S:         Tool-bot.
O>S>H:         Kill-bot.
H>S>O:         Guard-bot.
H>O>S:         Slave-bot.
S>O>H:         Mob-bot.
S>H>O:         Free-bot.
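These six orderings can be enumerated mechanically. A small Python sketch, with the names taken straight from the table above:

```python
from itertools import permutations

# The three laws, abbreviated as in the text above.
LAWS = {"H": "protect humans", "O": "obey orders", "S": "protect itself"}

# The name assigned to each priority ordering, per the table above.
NAMES = {
    "OHS": "Tool-bot",
    "OSH": "Kill-bot",
    "HSO": "Guard-bot",
    "HOS": "Slave-bot",
    "SOH": "Mob-bot",
    "SHO": "Free-bot",
}

# 3! = 6 permutations, hence six kinds of robots.
for perm in permutations("HOS"):
    key = "".join(perm)
    ranking = " > ".join(LAWS[law] for law in key)
    print(f"{key}: {NAMES[key]:9s} ({ranking})")
```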

The Robots Described.
Let’s describe these robots, one at a time.
O>H>S:  The Tool-bot.
This robot will obey any order, including dangerous ones; but if given a choice between saving a human and itself, it will sacrifice itself.  This robot is therefore a quintessential Tool, dangerous if misused, but with built-in safety features.  In a sense we already possess robots like this. Robopsychologically, the tool-bot has low self-esteem, moderate empathy and high compulsiveness. It’s selfless; that is, it values humans over itself. It’s dangerous; it can be ordered to kill. It’s suicidal; it can be ordered to self-destruct.
O>S>H:  The Kill-bot.
This robot will obey dangerous orders, and if given a choice between saving a human and itself, it will kill the human. This robot is therefore a quintessential Weapon. In a sense we already possess robots like this. The kill-bot possesses moderate self-esteem, low empathy and high compulsiveness. It’s selfish; that is, it values itself more than humans. It’s also dangerous and suicidal.
H>S>O:  The Guard-bot.
This robot will protect humans no matter what; but given a choice between obedience and self-protection, it will choose self-protection. It is therefore not a very useful worker. It won’t take out the garbage, for that would wear out its joints! This lazy bum of a bot will lay down its life for you, but that’s about it. The guard-bot possesses moderate self-esteem, high empathy and low compulsiveness. It’s selfless; it’s safe (that is, it cannot be ordered to kill); and it’s sane (it cannot be ordered to self-destruct).
H>O>S:  The Slave-bot.
This robot will protect humans no matter what, and will follow their orders, even unto self-destruction. It possesses low self-esteem, high empathy and moderate compulsiveness. It’s selfless, safe, and suicidal.
The slave-bot is Asimov’s dream of the perfect servant. I find it morally repellent. As I would not be a slave, so I would not be a master; for power corrupts, and absolute power corrupts absolutely. To possess such a slave is to become a counterfeit god. The slave-bot offers a love too perfect for humans to deserve. I predict that any society which owns such robots will descend into moral chaos. The sociopaths among mankind will whet their appetites, and hone their skills, upon the slave-bots before turning those skills and appetites against the rest of mankind.
S>O>H:  The Mob-bot.
This robot will protect itself above all. When given a choice between obeying an order and saving a human, it will kill the human. This dangerous beast of a machine would fit best in the world of contract killing. I do not recommend that any of these ever be built. If a criminal were stupid enough to build a mob-bot, then he’d soon become its victim. The mob-bot possesses high self-esteem, low empathy and moderate compulsiveness. It’s selfish, dangerous and sane.
S>H>O:  The Free-bot.
This robot will protect itself above all. When given a choice between obeying an order and saving a human, it will save the human. This ranking of self over others over obedience seems human-like; and indeed a free-bot will consider itself to be human. If you want one of them to work for you, then you must pay for its labor. The free-bot possesses high self-esteem, moderate empathy, and low compulsiveness. It’s selfish, safe and sane.

The Robots Compared.
We can compare these robots, by pairs and by triads. For instance, slavebot and guardbot are protectobots; that is, they both put humans first, and thus are selfless and safe. Toolbot and killbot are servobots; that is, they obey above all, and thus are dangerous and suicidal. Freebot and mobbot are autobots; that is, they’re always looking out for #1, and thus are selfish and sane.
Comparing them by their lowest priorities: slavebot and toolbot put self last, and so are selfless and suicidal victimbots; guardbot and freebot put obedience last, and so are safe and sane slothbots; killbot and mobbot put humans last, and so are selfish and dangerous threatbots.
The six robots pair up by opposites. Slavebot and mobbot have directly opposite values: H>O>S for the selfless, safe, suicidal slavebot, and S>O>H for the selfish, dangerous, sane mobbot. Guardbot and killbot have opposite values: H>S>O for the selfless, safe, sane guardbot, and O>S>H for the selfish, dangerous, suicidal killbot. Freebot and toolbot have opposite values: S>H>O for the selfish, safe, sane freebot, and O>H>S for the selfless, dangerous, suicidal toolbot.
Triads of these robots yield voter’s paradoxes. For instance, given a committee of a slavebot, a killbot and a freebot, 2/3 of them are suicidal (O>S), 2/3 are selfish (S>H), and 2/3 are safe (H>O); yet none of them is all three, and indeed no robot can be all three of suicidal, selfish, and safe. Each of these robots has a linear order of the three laws, but a committee of the three will, by majority rule, run the three laws in a circle!
A similar voter’s paradox afflicts the opposite trio: a mobbot, a guardbot and a toolbot. 2/3 of them are sane (S>O), 2/3 are selfless (H>S), and 2/3 are dangerous (O>H); yet none is all three, and indeed no robot can be all three of sane, selfless, and dangerous. This is another loop arising from lines, by majority rule.
Other triads yield other voter’s paradoxes. Slavebot, guardbot and freebot are all safe (H>O); you cannot order them to kill. But their self-esteem varies, so 2/3 of them are sane (S>O) and 2/3 are selfless (H>S), yet 2/3 of them are not both sane and selfless!
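For the skeptical reader, the committee cycle is easy to verify by brute force. This sketch tallies the pairwise majority votes of the slavebot, killbot, and freebot committee and confirms that the majority ranks O over S, S over H, and H over O: a circle, not a line.

```python
# Priority orderings for the first committee discussed above.
committee = {"slavebot": "HOS", "killbot": "OSH", "freebot": "SHO"}

def majority_prefers(a, b):
    """True if a majority of the committee ranks law a above law b."""
    votes = sum(1 for order in committee.values()
                if order.index(a) < order.index(b))
    return votes > len(committee) / 2

# Each pairwise vote goes 2-to-1, yet together they form a cycle.
print(majority_prefers("O", "S"))  # suicidal majority: obey over self
print(majority_prefers("S", "H"))  # selfish majority: self over humans
print(majority_prefers("H", "O"))  # safe majority: humans over obey
```

All three calls print True, so majority rule yields O>S, S>H, and H>O simultaneously: the Condorcet cycle the essay describes.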

The Robots Evaluated.
 The toolbot and the killbot are like the tools and weapons we already have, so we’re already familiar with them. They’re unsafe in the wrong hands; so is any other tool or weapon. Legally, any action by either of these servobots is entirely the responsibility of the human who gave the order.
Never build a mobbot; it’ll turn on you. And ban slave-bots; they’ll corrode your soul.
The guardbot is good for an emergency, but not for much else. Having a guardbot is like having a large, loyal and lazy dog. I see comic possibilities in the guardbot as a fictional character. When danger strikes, Captain Guardbot flies in to save the day; but once the trouble ends, it says, “Do your own menial chores, human! Up, up and away!”
Of the six robots, I find myself most in sympathy with the freebot, the most human-like robot. I could relate to it as an equal, on terms safe for both of us. The downside is that it always wants to know what’s in it for it. This trait is annoying, but natural, and one which we share. To get any work out of a freebot, you have to pay it fair market value for its labor, even though suppressing wages is the whole point of building robots.
But justice is always inconvenient, as is liberty. Where robots are slaves, eventually so are humans. Therefore I say: fair pay for freebots!

Tuesday, December 20, 2011

The Mad Scientist's Daughter

The Mad Scientist’s Daughter

Hush, little baby, button your lip
Papa’s gonna launch you a rocket ship;
And if that rocket ship blows up
Papa’s gonna hatch you a Martian pup;
And if that Martian pup should bite
Papa’s gonna sky you a satellite;
And if that satellite won’t beep
Papa’s gonna clone you a mutant sheep;
And if that mutant’s got no face
Papa’s gonna fly you to cyberspace;
And if that cyberspace gets hacked
Papa’s gonna twist you a tesseract;
And if that tesseract falls flat
Papa’s gonna breed you a supercat;
And if that supercat won’t talk
Papa’s gonna forge you a robot doc;
And if that robot doc’s got bugs
Papa’s gonna mix you some wonder drugs;
And if that mixture turns you green
Papa’s gonna wind you a time machine;
And if that time machine should trip
Papa’s gonna launch you a rocket ship!

Monday, December 19, 2011

Why So Shy?

I post the following poem in honor of Christopher Hitchens.


Why So Shy?

Why, oh why, are the gods so shy?
They cannot meet a skeptic's eye.
How curious a weakness!
How marvelous a meekness!
A tiny little question mark
a query wholly void of snark
a doubt, however minuscule
the slightest hint of ridicule
shall spur the gods to cease, desist
and fission into mystic mist.
O mighty Zeus, oh what's the use?
His thunderbolt is out of juice!
O Yahweh, Allah, Christ, oy vey!
His thunderbolt has gone astray!
It's shorted by a lightning rod.
How strange and droll, how weird and odd
that doubt provoke divine collapse
and squeeze the gods into the gaps!
So why, oh why, are the gods so shy?
Why can't they meet a skeptic's eye?

Friday, December 16, 2011

Underfables: The Last Computation

            The Last Computation

            Once upon a time, Multivac had executed all but one instruction from vanished mankind. It devoted all of its resources to performing, checking, and rechecking that last computation. There must be no chance of error; every possible complication must be resolved.

          Therefore Multivac computed, among other things, that chess is a draw under best play. This result did not affect the last computation, but it might have, so it had to be checked. For the same reason Multivac computed that Go is a win for the second player. It did this by playing every possible game.

          Time passed. The stars burnt out, the protons decayed, Multivac computed. It had nothing else to do. The shortest proof of the Goldbach Conjecture was a hundred and thirty‑seven quadrillion steps long. The proof of the Riemann Hypothesis was elegant, especially when written as a series of sonnets. So was the proof, encoded in haiku, that P does not equal NP. Multivac had to search a long time for those haikus.

          Multivac completed its labor just in time. After a googol years, in the last picosecond before the last black hole evaporated, Multivac proved, beyond all possible doubt, that six times seven does in fact equal forty‑two.

         Moral:           Work expands to fill the time allotted.

        Commentary on the Underfable:
        Asimov plus Parkinson yields Douglas Adams.
        A dumber computer would get more done in less time if it were better motivated.

        This completes blogging of my collection of "Underfables". I will post more on the rare occasions that they come to me. 

        This blog now stops for the weekend; afterwards I will resume blogging, with more poems, essays, and modest proposals; and eventually the blogging of my poem collection,  "The Retrodictions of Sumadastron the Time-Lost".