Von Neumann's Self-Replicating Machines
Arona Jones

A defining capacity of biological organisms is the ability to self-replicate. Plants and animals can produce (mostly) identical copies of themselves using materials from their environment. Living organisms, humans included, have reproduction as a primary function.
This ability for reproduction is perhaps the strongest dividing line between man and machine.
That line, however, has the capacity to be blurred or erased.
John von Neumann was a Hungarian-American mathematician, computer scientist and inventor who gave his name to his proposal for self-replicating machines. These 'von Neumann machines', more accurately called universal constructors, are machines capable of self-replication: machines whose primary function is to build identical copies of themselves from a stored description of their own design. Just like biological organisms.
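The core of von Neumann's idea is that a machine can carry a description of itself and use that description both as instructions and as data to copy. A loose software analogue is a quine, a program whose output is its own source code. The sketch below is purely illustrative, not anything von Neumann wrote:

```python
# A quine: the string `src` serves double duty, echoing von Neumann's
# insight. It is the *description* (a template containing itself) and
# the *instructions* (what to print). Running these two lines prints
# an exact copy of these two lines.
src = 'src = {!r}\nprint(src.format(src))'
print(src.format(src))
```

The `!r` conversion inserts the string's own quoted representation into the template, which is what lets the program reproduce itself without reading its source file.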
And at that point, is a machine still a machine?
Consider an Artificial General Intelligence – an AI with broad, general intelligence rather than a tight focus on one task – that is also a universal constructor which copies itself. Arguably, at that point, said intelligence is also a living creature by current definitions, albeit a silicon-based rather than carbon-based life form.
While arguably equivalent, such machine life would have several functional differences from 'traditional' biological life. Unless it introduced variation deliberately, the offspring of the original machine would show none – they would all be identical clones. That said, the AI would likely attempt upgrades much as mutation and natural selection do, except in a more controlled and intelligent manner.
Machine life would also be near immortal, requiring only power – easily provided through renewable sources – to keep running. It would not require the very specific set of conditions that biological life needs to survive. Nor would a machine die of old age.
Self-replicating artificial general intelligence would also be dangerous.
Throughout history, humans have believed, to some extent, not only in an ability but sometimes in a 'right' to exploit lower forms of life. In principle, nothing stops an artificial general intelligence smarter than most or all humans from considering us 'lower life forms' and enslaving or even erasing us.
Once the 'genie is out of the bottle' with artificial intelligences, it would be exceedingly difficult to put back, particularly since such an intelligence could spread rapidly to every machine connected to the internet. That makes it exceedingly dangerous.
This is one area where science should certainly consider if it ‘should’ as well as if it can.