I always thought that the paper clip problem fundamentally missed the point. For the scenario to be realistic, the AI would have to be superintelligent; otherwise we would just switch it off. And if it's superintelligent, surely it understands why converting the entire planet into paper clips would be a bad thing to do.
So it's either stupid enough to actually try it, which means it's stupid enough for us to defeat, or it's intelligent enough that we can't defeat it, which means it's intelligent enough not to do it. Either way, the world remains un-paper-clipped.
A paperclip maximizer driven by self-preservation? What could possibly go wrong?