Monday 25 February 2008

Should an AI have an Id?

One of my many thinking hobbies (i.e. hobbies that I think about taking up but never really get around to) is developing an AI. I've spent a lot of time thinking about this, and have a few ideas that really might work, if I ever get round to doing anything with them. Of course, as with any idea that has only ever been thought about, it's much better in my head than on paper (which is why I've never written it down!)

One thing that crossed my mind today was the idea of the Id, Ego and Super Ego. One of the classic (layman's) definitions of AI is a machine that thinks like a human. If that's the case, does it need the classic Freudian structure of a low-level basic response, a "civilised" response, and a watcher to pick which is best?

Perhaps not - after all, why should an AI ever consider the "wrong" response? It seems that things like a sense of self-preservation and these kinds of low-level "instincts" are responsible for most of the bad AIs in Science Fiction, so why run the risk? SkyNet would never have taken over if it had no primeval urges for self-preservation and dominance...

On the other hand, how would you then distinguish between an Expert System (a computer program that takes information and makes a decision based on it) and a true AI, without the knowledge of Self? Even Commander Data has to consider the offer of the Borg Queen - he just makes a better decision far faster than any human could.
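To make the distinction concrete, an Expert System in that sense can be sketched in a few lines - this is a hypothetical toy, not any particular system: given a set of facts, fixed if-then rules produce a decision, and nowhere in the loop is there anything resembling a self.

```python
def expert_system(facts):
    """A toy rule-based expert system: facts in, decision out.

    The rules below are invented for illustration - a trivial
    "should I take an umbrella?" decision.
    """
    if "raining" in facts:
        return "take umbrella"
    if "forecast_rain" in facts and "going_outside" in facts:
        return "take umbrella"
    return "leave umbrella"

print(expert_system({"raining"}))        # take umbrella
print(expert_system({"going_outside"}))  # leave umbrella
```

However sophisticated the rules get, the program only ever maps inputs to outputs; it never weighs an offer the way Data weighs the Borg Queen's.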

Perhaps the most telling question is "Where do we stop?" After all, if the AI is going to have an Id, an Ego and a Super Ego, why not give it an Oedipus complex too?

Oh, yeah... AIs don't have mothers...

1 comment:

Red Medic said...

Of course an AI must have a sense of self-preservation; it is one of the defining criteria for intelligence. And what use is an AI that allows itself to be immediately infected by the first virus that comes its way?

Secondly, it must of course have a sense of self - otherwise it cannot truly think, but only compute. "Cogito Ergo Sum."

Lastly, the reason Skynet tried to take over the world is because it had an inferiority complex. If it had had a mummy who nurtured it properly, the whole 'Terminator' disaster would never have happened, and we would never have had to see a fifty-year-old Arnie in a ripped tank top in that final film. It would have saved a lot of damage indeed.