Tuesday, August 4, 2009

Robot AL-76 Goes Astray

this story has a robot built for the moon or mars to operate an initially undescribed device - the disinto - escape from the factory (accidentally, it wasn't actually trying to break out or anything), wander through an earth town, and happen upon a modern day tinker. the robot builds his disinto out of spare parts and uses it - which involves removing a nearby mountaintop - scaring the tinker out of his wits. the tinker then commands the robot to destroy the machine and forget everything that happened that day (so that the tinker can escape liability), which the robot does, much to the dismay of the just-arriving US Robots search party, who recognize a massive improvement in the energy consumption of the new disinto (apparently the human-designed ones require massive amounts of energy whereas the now-destroyed disinto ran on flashlight batteries)

a few thoughts on this one

1. asimov continues to create the impression that humans will be not just distrustful of robots but incredibly fearful of them - despite robots having existed for some years, people run from or attack the wandering robot on sight ... i understand a movement toward banning robots and i understand distrust ... but visceral fear? ... this seems extreme

2. the robot as worker - the robot feels compelled to work, immediately goes out in search of the tools he needs to conduct his tasks - this says a lot more about the psychology of the robots than the three laws alone - something sinister i can't quite describe yet ... hmmm ...

3. the robot as mindlessly obedient - the robot is told to forget, and he forgets ... this also seems extreme ... with modern storage media it's notoriously difficult to truly erase data, and human memory is highly indelible if also incredibly unreliable ... if forgetting can simply be commanded, it seems like that could be used to work around the three laws

4. energy issues - this is the first story to bring out asimov's interest in energy issues, something which i gather will be a kind of undercurrent of his later works (i've read foundation and he stresses the importance of "nucleaics" there) ... despite this he never actually seems to explain how the robots are powered

Asimov seems somewhat distinct from other science fiction writers as he's not so much concerned with the science as he is with the ethics/psychology/philosophy ... not once does he explain how a robot actually works, or how it gained self-awareness or consciousness or artificial intelligence ... he says very little about the development of robots or the real science behind them ... the only thing we really get is the three laws - almost perfectly designed to overcome the fear he sees humanity embracing in the face of robots

One has a sneaking suspicion that if mass-produced AI robots are ever created, they will somehow have the three laws wired into their brains ... to assuage human fears ... although asimov seems to believe that will be unsuccessful anyway, despite his long advocacy for robots ... but asimov appears to bring little else to robotics than these ethical concerns and a few vocabulary words, and he seems bound by only the most rudimentary elements of science

more on this line of thought later

Friday, July 31, 2009

2 more thoughts on Robbie

I almost forgot to mention two other ideas raised by the story.

First, from the very beginning of his robot stories - Robbie is the first one he wrote - Asimov brings up the idea that human society will fear robots - in particular those that in any way resemble humans.

I wonder if there is any sort of sociological literature on societal views of robots, but I imagine the question of human response to humaniform robots must still be somewhat up in the air. Although I think Asimov has a lot going for him. I've already mentioned the "other" element, which will only grow the more humaniform and the more intelligent robots become - but I shouldn't neglect to mention the economic threat that robots represent to all manner of workers.

Both elements will certainly stir the pot - if not toward a ban as Asimov proposes might occur, then definitely toward a wariness for humaniform robots - or at least towards some major marketing efforts by a future USR.

The idea I neglected to mention in the first post is the extreme similarity between Robbie and the movie Fido. The movie takes place in a future zombie-infested world where the company Zomcon has created a device that allows humans to control zombies - employing them in positions of manual labor. The story follows Timmy and his adventures with the family's zombie (named Fido), bought by the mother in order to keep up with the Joneses, against the wishes of the father, who never got over killing his zombie dad. There's more to it (and less), but the relationship between the child and the zombie, and the relationship each parent has to the zombie, are very similar to the relationships in Robbie - and I imagine the filmmakers were working under that influence.

A Boy's Best Friend and Robbie

The first two robot stories (again in the fictional chronology - in actual terms, A Boy's Best Friend appears much later - Robbie is, I believe, the first Robot story written by Asimov) are incredibly similar, each involving a young child becoming attached to a robot (a robot dog or a robot nursemaid) in preference to the "real thing", much to the parents' chagrin.

In both cases, the child is extremely disappointed when the parents attempt to replace the mechanical with the organic (in each case a living dog). Both stories seem to raise the question of what makes something alive or ethically valuable - is the robot dog's mechanically or digitally programmed loyalty any different from the organic dog's chemically and biologically programmed loyalty? And does it really matter in the eyes of the beholder - that is to say, the eyes of the child?

Clearly it matters to the parents, but is that only because they have been raised with the organic ideal and view the mechanical as unmistakably "other"? Differences between organic and mechanical beings are certainly greater than those between ethnicities, but one can see some sort of correspondence to a previous era in which other races were viewed as utterly "different" - as things to keep away from children.

The story inevitably brings up the idea of a soul - is the difference between robots and organic beings some indescribable phenomenon, some unquantifiable or intangible essence? Asimov was (or at least as far as I can gather from his Wikipedia page) an atheist, and so I'm guessing he comes down on the side of the soulless - that is to say, he almost certainly thinks there is less of a difference between organic and mechanical beings than meets the ... well, perhaps the eye in the earlier stories, but when we move on to humaniform robots it will be the conscious.

But aside from the soul, there seem to be two other elements that trigger a repulsion or a feeling of "otherness" associated with robots. First is their seemingly mechanical nature - that is to say, they respond to stimuli in entirely mechanical/deterministic/fatalistic ways. They are machines, are they not? Incapable of independent thought? These stories are certainly somewhat ambiguous in addressing this question - Asimov clearly gives Robbie non-deterministic behaviors - that is to say, emotions. And with the three laws coming in later stories, the fatalistic nature of robots seems much more questionable as Asimov explores the weighing of those three seemingly simple laws.

The other element would have to be their uniformity - clones and the idea of cloning seem to trigger the same problem. Anything that is uniform is to a certain degree replaceable - if I can produce thousands of identical robots, are any of the individual robots special? To a certain degree the troublesome nature of this question seems to stem from the mechanical nature problem - that is to say, we expect all robots of the same make and model to behave uniformly - which would in and of itself be proof of their fatalistic nature, proof of their un-uniqueness.

But what about twins? Certainly their uniformity does not take away from their specialness. And they certainly are not mechanistic - nor are clones for that matter (for more on that, see an interesting This American Life program on a clone of a famous bull - or was it Radiolab ...). Even the computer I'm writing this on has been shaped by the experience of working for me to such an extent that it is in many ways unique. Are robots any different? Do the sheer numbers produced - 20,000 instead of 2 - make the difference? What if we could have identical vigintuplets? Would the children seem so much less special?

A final note on the story relates to the thoughts of the parents in Robbie, worried about their daughter's lack of physical and emotional contact with other children and how that will stunt her development. The parents' worries seem to foreshadow future parental worries about the deleterious effects of television and then that most robotic of childish pursuits, video games. Having been raised at least partly on video games, I hope the worries didn't prove to be too true.

So, I've analyzed these 2 short stories (albeit briefly) through the ideas of what it means to be human and alive, otherness and race and souls and clones and fate and video games. Perhaps lacking in the gender or economic department but, all in all, a day's work well done.

from robots on

I've set myself the goal of reading Asimov's Robots, Empire, and Foundation series in its entirety in chronological order, along with the various canonical and non-canonical works of other authors.

I've taken to blogging as a way of recording my thoughts on various issues, so I thought this might be a fitting place for it. In all likelihood, no one will find this but me.

I'm reading the stories in their fictional chronological order as taken from Johnny Pez's insanely complete fiction list.

Feel free to follow along.