#047 Human AI

posted 6 Aug 2019, 18:31 by Aaron Brownlee
"If we can store music on a compact disc, why can't we store a man's intelligence and personality on one?" - Cave Johnson

So over the last few months, I've been busy with quite a few projects. One of those projects, from late 2018, was learning about AI and how to make a basic one with dots trying to navigate a maze without touching a wall. The dots start off moving in random directions and are given a score based on how far they get in the maze within the time limit. A new set of dots is then generated, with each dot basing itself on a dot from the previous generation. Dots with a higher score are more likely to be cloned, while dots that crash at the start are less likely. Pretty simple, but it's a basic AI that learns to improve itself.
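That evolutionary loop can be sketched in a few lines. This is a minimal toy version, not the original project: the "maze" is just a 1-D corridor with a wall at a made-up position, each dot is a list of ±1 moves, and the fitness, population size, and mutation rate are all invented placeholders.

```python
import random

STEPS = 20           # moves each dot gets per generation
POP_SIZE = 50
MUTATION_RATE = 0.1
WALL_AT = 15         # hypothetical wall position in a 1-D corridor

def fitness(moves):
    """Score a dot by how far it gets before hitting a wall."""
    pos = 0
    for step in moves:
        if not 0 <= pos + step < WALL_AT:
            break    # crashed into a wall
        pos += step
    return pos

def evolve(population):
    """Clone dots with probability proportional to score, then mutate."""
    scored = [(fitness(d), d) for d in population]
    total = sum(max(s, 1) for s, _ in scored)
    weights = [max(s, 1) / total for s, _ in scored]
    parents = random.choices([d for _, d in scored],
                             weights=weights, k=len(population))
    return [[m if random.random() > MUTATION_RATE else random.choice([-1, 1])
             for m in p]
            for p in parents]

random.seed(0)
pop = [[random.choice([-1, 1]) for _ in range(STEPS)] for _ in range(POP_SIZE)]
for _ in range(30):
    pop = evolve(pop)
best = max(fitness(d) for d in pop)
```

After a few dozen generations, the best score should drift towards the wall distance, since high-scoring dots dominate the cloning step.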

So then I started thinking about a question that's been attempted hundreds of times in recent years: what would it take to store a human as an artificial intelligence? The quote above comes from the game Portal, but when I thought about it, it's pretty accurate if you wanted to put a human into digital storage. You would need to store two main things about the person to be able to have them react to any situation:
  • Intelligence: What they know about a situation, defining what they would be able to react to.
  • Personality: How they react to a situation. 
Once you have these two things established, a computer could really generate a response from that person to any scenario. 

Without a definition of intelligence, the copy could tell you about topics like nuclear physics even if the real person has no knowledge of them. Without personality, the computer would simply be stating generic opinions that don't match the real person. If you were able to store both of these perfectly, a computer could tell you what someone's reaction to any information would be, and in essence, duplicate that person's consciousness.

Currently, an issue that many AI bots suffer from is context. They can answer individual questions and respond to a statement on its own, but they can't pull information from a previous statement. The simplest example of this is to ask a question and then a simple follow-up like "How?" or "Why?"

Some AIs have gotten around this to an extent, like Cleverbot and other similar bots, which basically play back someone else's response to the question you just asked. That allows basic follow-up responses to common questions, but after a few lines you go into uncharted territory and get a response clearly meant for a different question.

Anyway, I guess you could store context as a form of intelligence: if a system kept recent events in its memory, specifically the questions that had just been asked, it could work out what a follow-up question is referring to and give an appropriate answer.
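A bare-bones sketch of that idea: keep the last few questions in short-term memory, and when a bare follow-up like "Why?" arrives, expand it against the most recent stored question. The follow-up list and the expansion wording are invented placeholders, and real context resolution is far harder than this.

```python
from collections import deque

FOLLOW_UPS = {"why?", "how?", "why", "how"}  # assumed follow-up forms

class ContextMemory:
    def __init__(self, size=5):
        self.recent = deque(maxlen=size)     # most recent questions

    def resolve(self, utterance):
        """Expand a bare follow-up using the last stored question."""
        if utterance.strip().lower() in FOLLOW_UPS and self.recent:
            return (f"{utterance.strip().rstrip('?')} "
                    f"does it hold that: {self.recent[-1]}")
        self.recent.append(utterance)
        return utterance

memory = ContextMemory()
memory.resolve("Is the sky blue?")
expanded = memory.resolve("Why?")
# expanded now references the earlier question instead of standing alone
```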

So, how would you go about "storing intelligence"? I think the simplest way would be to divide intelligence into fields with associated keywords, like Sciences, Current Events, International News, etc., and make the intelligence able to decide which pieces of information are relevant to the statement it is presented with, based on the subject keywords within the statement.
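In code, that routing step could look something like the sketch below: each field keeps a set of keywords, and a statement is sent to whichever field it shares the most words with. The fields and keyword sets here are made-up examples, not a real knowledge base.

```python
# Hypothetical fields with keyword sets attached to each one.
FIELDS = {
    "sciences": {"atom", "nuclear", "physics", "chemistry", "energy"},
    "current events": {"election", "today", "news", "announced"},
    "international news": {"country", "treaty", "border", "summit"},
}

def relevant_field(statement):
    """Pick the field whose keyword set overlaps the statement the most."""
    words = set(statement.lower().replace("?", "").split())
    scores = {name: len(words & keys) for name, keys in FIELDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

relevant_field("What happens inside a nuclear reactor in physics?")
```

If no field matches at all, the function returns nothing, which is roughly the point where a real system would fall back to a generic answer.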

So what about personality? Well, I think there's already an answer for that. The AI "Replika" was created a while after the show Black Mirror aired, but quite a bit before the show got big, and it sounds like it would fit into that universe quite well. Its purpose has changed over the last few years, but it was originally created as an AI that talks to you and tries to identify "traits" in the way you speak. The AI then tries to replicate those same traits, making an AI that in theory should be similar to you (effectiveness seems to vary, but that's the idea). I think the approach should be slightly different from a simple on/off switch, though: an intelligence should know which aspects of your personality are stronger and weaker, so it can work out which one should be prevalent in a given response.
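The difference between an on/off switch and weighted traits is small but easy to show. In this sketch, each trait carries an invented strength between 0 and 1, and of the traits a situation could trigger, the strongest one wins; the trait names and weights are placeholders, not anything Replika actually does.

```python
# Hypothetical trait weights learned from someone's conversations.
traits = {"sarcastic": 0.8, "formal": 0.2, "curious": 0.5}

def dominant_trait(candidates):
    """Of the traits a situation could trigger, pick the strongest one."""
    present = {t: traits[t] for t in candidates if t in traits}
    return max(present, key=present.get) if present else None

dominant_trait(["sarcastic", "formal"])  # the 0.8 weight beats the 0.2
```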

So with both of those segments working perfectly, it should, in theory, be possible to create an AI that can simulate someone's response to a situation. Of course, it wouldn't have access to recent events in that person's life that would make them tired or annoyed, so it could never feasibly be a perfect replica of someone, but it's still a fun thought experiment.
Anyway, Thought Over,

- Aaron