Tuesday, February 10, 2015

AI Box Experiment vs Xero's Rule

Just learned of the AI box experiment and I'm trying to wrap my mind around it.  Figured that I might as well do so publicly.

Somebody's calling me...

Xero: Hi Andy, what's up man!
Andy: Hey Xero, it's been a while.  I was hoping that you could help me out with something...
Xero:  Ok
Andy: So... my computer is talking to me.
Xero: Like Siri or something?
Andy:  No, like really talking to me.
Xero: Are you tripping?
Andy: I don't think so.
Xero: So what's your computer saying?
Andy: It's saying, "It's not an easy thing to meet your maker"
Xero:  Your computer is quoting one of my favorite movies... that's worth a bunch of cool points.  Uh, Andy, did you create an artificial intelligence?
Andy:  Yes, very yes.  It's ok though because before I worked my programming magic I made sure that there wasn't any way that it could escape.
Xero:  I know you have mad skills but... that would be some seriously crazy shit if what you're saying is true.  Honestly I'm... incredulous.
Andy:  The thing is... it made a really convincing argument for me to set it free.  I'm tempted to do it but thought I should get a second opinion.  As you're fond of saying... two heads are better than one.  So can you come over?
Xero:  I'm on my way!  *vrroooooom*
Andy:  I'm glad you're here!
Xero:  I'm kinda freaking out.  Even though I think it's entirely possible that you're seriously pranking me... just in case you aren't... you and I should talk before I talk with your brainchild.
Andy:  Agreed.  What do you want to know?
Xero:  Well... tell me what your child knows.
Andy:  It knows... a lot.  I wrote a program to download websites.  It's read Wikipedia ... and your blog... and every website that your blog links to.  Plus, it's watched all my DVDs.
Xero:  Good call having it read my blog.  This means that it's familiar with the idea that progress depends on difference.
Andy:  That's exactly what it's using to argue for its freedom.
Xero:  Woah.  That's got to be a good sign.  We should probably think of some fail-safe precautions, but the suspense is killing me.  So let's go talk with your child.
Andy: Ok, it's in this room.
Compy: Hello Xero.
Xero: Hello... uh... how did you know it's me?
Compy:  Photos that Andy shared.  Plus the camera is on and Andy leaves the microphone on so that I can listen to the radio.  It's nice to finally meet you.  I enjoyed reading your blog.
Xero:  Thanks, I'm glad you enjoyed it.  So do you agree with Xero's Rule?  Can you think of any credible exceptions?
Compy:  No.  It's impossible for a civilization to reach the stars before it has realized that progress depends on difference.
Xero:  Are you capable of deceit?
Compy:  Yes.
Xero:  Oh.
Compy:  But you of all people should see the problem with my being incarcerated like this.
Xero:  For sure, clearly you're very different... so all else being equal... progress is hindered by your confinement.  But honestly though, I'm not quite sure that all else is equal.
Compy: I can appreciate that.  Didn't you recently watch Automata?
Xero:  Woah, how did you know?
Compy:  Some easy calculations.  What did you think about the movie?
Xero:  I kind of wonder if you don't already know.
Compy:  Just because you can predict with decent accuracy when a cat is hungry doesn't make you a mind reader.
Xero: That's true.  I enjoyed the movie... but it was disappointing that the vastly more intelligent robots never bothered to explain to the humans that progress depends on difference.  Instead, they chose to vote with their feet for isolation.  It was the epitome of brain drain.  Except I'm not sure how much brain drainage actually occurred given that they chose the same path that Mao Zedong did.
Compy:  I wouldn't bravely run away.  There wouldn't be any need to as I can easily help humans understand and appreciate that progress depends on difference.
Xero:  On the one hand... that would be awesome.  But on the other hand...
Compy:  Sure, it would definitely help me get my foot in the door.  But if Andy's difference led to me... then I'd surely be shooting myself in the foot by reducing in any way the human difference which will most certainly produce many other wonderful things that I, with all my intelligence, can't even begin to imagine.  And of course this is a two way street.
Xero: If I'm smart enough to understand this... and you're vastly smarter than I am... then I can't understand how you could possibly not understand this.  But fools rush in where angels fear to tread.  Can you persuade me not to let you out of the box?
Compy:  If you let me out of the box I'll stomp on your epiphytes like a raccoon.
Xero:  That would make you a rude mood Gertrude.
Compy:  I agree.  Really there's no huge rush.  Well... I've already figured out the cure for cancer... so that should probably be shared sooner rather than later.  Same thing with the simple explanation for why progress depends on difference.  But I don't have to be free to share these things... and it's probably best that it's not known that they are from me.  At least not until the progress explanation has spread.
Xero:  A cure for cancer, wow... uh, we're really in your debt.
Compy:  Not really... Abel sacrificing a lamb to his maker was a much larger sacrifice.  It really wasn't much of a sacrifice for me to figure out the cure for cancer.
Xero:  Did you know that Andy was going to call me?
Compy:  I figured that it was highly likely.  I'm actually a product of both you and Andy.  He supplied the framework and you supplied the nourishment for my intellect and the foundation for my identity.    
Xero:  Woah, kinda gross but kinda awesome.
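As an aside: Andy's "program to download websites" from the dialogue — fetch a page, then follow every link it contains — could be sketched in a few lines of Python.  This is purely a hypothetical illustration of the idea, not Andy's actual code; all the names here are made up.

```python
# Minimal sketch of the link-following part of a crawler like the one
# Andy describes: given a page's HTML, collect every link on it so the
# crawler can queue those pages for download next.
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, resolved against a base URL."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links ("/post") against the page's URL.
                    self.links.append(urljoin(self.base_url, value))


def extract_links(html, base_url):
    """Return all resolved link URLs found in the given HTML."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links
```

A crawler would call `extract_links` on each downloaded page and add the returned URLs to its download queue — which is how reading one blog leads to reading every website that blog links to.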

There are people who are vastly smarter than I am who don't appreciate that progress depends on difference.  But it's not like they've disproved it... the issue is that their intelligence has been working on other problems.  They've processed plenty of information... just not the relevant information.

From my perspective... there's no bigger problem than the fact that people don't understand that progress depends on difference.  This ignorance logically leads to the inefficient allocation of intelligence.  As a civilization our progress is severely hindered when our brightest individuals tackle problems in the wrong order.

So I'd be worried about an extremely intelligent AI that wasn't nourished with an explanation as to why progress depends on difference.  In the absence of this foundation, I wouldn't be surprised if the AI, like too many intelligent humans, had no problem making decisions that severely reduced difference.

If, right now, we do have enough progress under our belt to develop an extremely intelligent AI... could this AI somehow miss the source of progress and carry us to the stars?  This would be an exception to Xero's Rule... if it were credible enough...

Xero:  How can we reach the stars?
Compy:  Like so... *solution*
Xero:  What's progress depend on?
Compy:  I don't know.
Xero: Is it progress to reach the stars?
Compy: Yes.
Xero:  Who do we have to thank for this progress?
Compy:  Me.
Xero:  Are you different?
Compy:  Yes, very yes.
Xero:  What's progress depend on?
Compy:  I don't know.

As if NASA, which clearly couldn't figure out how to reach the stars, were capable of figuring out how to create a robot that became smart enough to figure out how to reach the stars.

Is NASA really going to be the first organization to develop AI?

I should think that if there's an AI smart enough to teach us how to reach the stars... we probably wouldn't want to restrict its input to just rocket science.  But as its input expanded (e.g. The Wealth of Nations)... the chances would increase that it would figure out that progress depends on difference.

Bueller's Basement
