A Paperclip Maximiser Is A Von Neumann Machine Programmed By Moloch
Having multiple conflicting values keeps us grounded
Moloch. The Devourer of Value. The One to whom children are burnt for victory, only for that victory to ring hollow when achieved, because everything that was fought for has been sacrificed to achieve it. The god of all that has been abandoned or consumed in the pursuit of one singular goal.
Sounds a lot like a paperclip maximiser. Given one task, it takes apart everything else to achieve it, whether that’s producing paperclips or crushing its enemies. A paperclip maximiser is a Von Neumann machine programmed by Moloch.
Fortunately for us, humans are (most of the time) not programmed by Moloch, and it is a terrifying sight to behold when they are. This drives the utilitarians crazy. Why would humans not settle on one value and do everything they can to optimise that value? Can’t we all agree that suffering is bad? Why must people have so many goals, so many terminal values, that so often conflict with each other and lead to cognitive dissonance?
This is not a bug. This is a feature. The constant tension between these different values keeps us grounded, keeps us anchored. We value x, but we also value y, even though y may conflict with x. When this tension is broken and humans pursue a single terminal value, be it pleasure or racial survival, you get drug addiction at best and genocide at worst. Dissonance stops us from becoming monsters. Pak Protectors have no such dissonance, which is why they have no problem killing off a competing species on the off chance that it will cause trouble for their descendants. Much of the regulation that restricts corporations exists as an acknowledgement that an unrestricted money maximiser is a very dangerous thing. The triple bottom line is another proposed restraint: it has the company pursue three different values that are often in opposition to each other. Whether groups of humans can be constrained so easily remains to be seen.
The problem of creating an AI that won't accidentally wipe us out in pursuit of some goal it has been given is nothing new. It is a problem as old as humanity: how to bind Moloch, and how to stop some fool from unbinding him.