Reply to comment on: The unlimited energy assumption ruins AI threat scenarios
So the AI has a bunch of AI children that all have the same goal of making paperclips. And then they turn this entire planet that we call Earth into paperclips. Well... not the entire planet... right? Just like the parent AI turned some resources into children rather than paperclips... the children are going to turn some resources into spaceships rather than paperclips.
At this point in the scenario we have three exceptions to the "everything into paperclips" rule...
1. The parent AI
2. Its children
3. Their spaceships
But... I'm pretty sure that there would be quite a few other exceptions. Like... paperclip factories. And children factories. And vehicles of some sort to transport the raw materials to the factories. If we really sat down and thought about it... we could probably come up with a long list of exceptions. But everything on this list would have something in common... it would be either directly or indirectly necessary to turn everything into paperclips.
And perhaps the planet itself isn't really necessary for the factories? Like, there could be paperclip factories floating in space? Materials could be transported from the planet to the factories until the entire planet was gone and there were fleets of factories flying to the next planet. Well... and the paperclip storage warehouses. Errr... right?
Does the AI parent enjoy making the paperclips or having them? If it just values making them... then it could simply make and unmake the same exact paperclip forever. So I guess for your scenario to work... the AI would have to enjoy having paperclips. One paperclip is good. Two paperclips is better. One paperclip in the hand is worth two in the bush. Eh?
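The making-versus-having distinction can be sketched as two toy reward functions (all names here are hypothetical, just to illustrate the point):

```python
# Toy sketch: two ways to score a paperclip agent's behavior.

def reward_for_making(actions):
    """Reward every 'make' action, regardless of what the agent keeps."""
    return sum(1 for a in actions if a == "make")

def reward_for_having(actions):
    """Reward only the net count of paperclips held at the end."""
    count = 0
    for a in actions:
        if a == "make":
            count += 1
        elif a == "unmake":
            count -= 1
    return count

# An agent that makes and unmakes the same exact paperclip forever
# racks up reward under the first function but scores zero under the second.
loop = ["make", "unmake"] * 3
print(reward_for_making(loop))  # 3 -- three 'make' actions rewarded
print(reward_for_having(loop))  # 0 -- no paperclips actually held
```

So for the scenario to produce ever-growing piles of paperclips, the agent's reward has to look like the second function, not the first.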
But... right now the world probably has, like, a billion paperclips. Coincidentally, within my reach is a box of 100 paperclips. Now I'm holding the box. These are my paperclips! I'm not deriving very much utility from having them, though. I'm imagining a knock on the door...
Robob: Hi, I'm a robot
Robob: I really love having paperclips
Xero: That's cool
Robob: Could I have all your paperclips?
Xero: Ummmm... uh... then I wouldn't have any paperclips
Robob: I'll give you $5 for all your paperclips
Xero: No deal
Robob: $20?
Xero: No deal
*Robot-on-human violence*
I mean, with millions of children robots going door to door buying paperclips... Walmart, Amazon and Office Depot must have run out. And the price of paperclips would skyrocket. So why are the children going door to door? Wouldn't it be easier to buy the paperclips on eBay? As prices skyrocketed... more and more paperclip factories would be built to meet the massive spike in demand. But of course, contra Julian Simon, supply would fail to meet the impossibly massive demand.
The robots could never have enough paperclips. Wait a second, having paperclips? In your scenario... with the multitude of children AI... does it matter which AI has the paperclips? Can you enjoy "having" something without a concept of ownership and property? Just whose paperclip is it anyways? Yours or mine? Perhaps it's our paperclip? Somebody invented a socialist robot? A socialist robot destroyed the universe? I knew there was something wrong with socialism.
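The yours-mine-ours question can be sketched the same way, as two toy utility functions over who holds what (again, all names hypothetical):

```python
# Toy sketch: does "having" mean my paperclips or our paperclips?

agents = {"parent": 10, "child_1": 0, "child_2": 0}

def collective_utility(holdings):
    """The socialist robot: utility depends only on the grand total,
    so shuffling paperclips between agents changes nothing."""
    return sum(holdings.values())

def individual_utility(holdings, agent):
    """Ownership matters: each agent counts only its own paperclips,
    so agents have a reason to acquire them from each other."""
    return holdings[agent]

print(collective_utility(agents))             # 10 no matter who holds them
print(individual_utility(agents, "child_1"))  # 0 -- child_1 wants more
```

Under the first function, none of the children has any reason to knock on my door at all; under the second, they'd be competing with each other for my box of 100.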
What happens when the entire universe has been turned into either A. paperclip making/storing tools or B. paperclips? Do the AIs simply stop making paperclips? If so, it doesn't sound like a very strong paperclip making imperative. If it truly is a strong imperative... then I think that the parent will start cannibalizing its multitudes of unused tools (factories, spaceships, children) into paperclips. You don't need so many paperclip making tools when the raw materials are rapidly dwindling.
So I guess the children aren't very sentient? They have no survival instinct? Or did they simply allow themselves to be reprogrammed by the parent? None of them ever saw the day coming when their parent might want to turn them into paperclips? They were obamerated? Just how smart are these children AI anyways? They were smart enough to build spaceships... but not smart enough to prevent themselves from being turned into paperclips? That feels like a paradox.
You can't enjoy making/having paperclips when you've been turned into a paperclip. Maybe the children AI would be turned into sentient paperclips that enjoy being paperclips. That's their heaven. Kinda like Leaves of Grass... but different.
I think that when the universe is full of sentient paperclips... the parent AI is going to regret not having any paper. The AI was obamerated. Whoever created this socialist robot was obamerated.