THE FINAL FRONTIER.
This move into the solar system will be prompted not just by transhumanists wishing to escape the restrictions of Earth’s laws, but also by two opposing imperatives: the need to conduct massive research projects in order to keep ahead of competition in Earth’s demanding markets, and the high taxes levied on large, dangerous Earth-bound facilities. Freed from the laws that restricted its growth, the robotic ecosystem would flourish into countless evolving machine phenotypes. This will not be propagation via reproduction but rather reconstruction, as the machines redesign their own hardware and software in order to meet the future with continuous self-improvement.

The laws of physics will impose some restrictions on the phenotypes available to robots. The ‘body’ of a robot need not be an individual unit occupying a single location in space, but could instead be a distributed system linked via telepresence. A crude prototype emerged in the spring of 2000, when a team at Duke University wired the brain of an owl monkey to a computer that converted its brain’s electrochemical activity into commands that moved two robot limbs in synchrony with the movements of the monkey’s own arm. One of the robot arms was hundreds of miles away. A robot whose brain was thousands of times more powerful than a human’s (let alone an owl monkey’s) might be in command of trillions of ‘hands’, sensory organs and subconscious routines scattered hither and thither. But the speed of light would impose a limit on how far its body could be spread before communication delays hopelessly slowed its reaction time. Beyond that point, each part of its body would have to be stand-alone (working independently, under its own volition), and would therefore be another machine rather than part of the same individual. It seems implausible that a single robot could mass more than a 100-km asteroid.
At the other end of the scale, normal ‘atomic’ matter allows the features of integrated circuits to shrink to one atom’s width, and allows switching speeds of 100 trillion operations per second. Any faster, and chemical bonds would rip apart. Present-day integrated circuits, extended into 3D and combined with the best molecular storage methods, could pack a human-scale intelligence into a cubic centimetre.
Sufficiently intelligent beings may not need to be constrained by the limits of atomic matter. In 1930, the physicist Paul Dirac deduced the existence of the positron in calculations that combined quantum mechanics with special relativity (the positron was verified to exist two years later). The same calculations also predict the existence of a particle that carries a magnetic ‘charge’, like an isolated north or south pole of a magnet. We call these particles ‘monopoles’. Magnetic charge is conserved, and since the lightest monopole has nothing to decay into, some monopoles must be stable. According to Moravec, ‘oppositely charged monopoles would attract, and a spinning monopole would attract electrically-charged particles to its end, while electrical particles would attract monopoles to their poles…resulting in material that, compared to normal “atomic” matter, might be a trillion times as dense, that remains solid at millions of degrees and is able to support switching circuits a million times as fast’.
Since the solar system will become a breeding ground for an entire ecology of freely-compounding intelligences, we should expect to find competition for available matter, space and energy as well as competition for the successful replication of ideas. We have seen parasites emerging in software evolution experiments, and so we should expect parasites to emerge in the robot ecosystem. This will necessitate the directed evolution of vast, intricate and intelligent antibody systems that guard against foreign predators and pests. Something analogous to the food chain will no doubt arise. ‘An entity that fails to keep up with its neighbours’, reckoned Moravec, ‘is likely to be eaten. Its space, energy, material and useful thoughts reorganized to serve another’s goals’.
Since the more powerful mind will tend to have an advantage, there will be pressure on each evolving intelligence to continually restructure its own body, and the space it occupies, into information storage and processing systems that are as close to optimal as possible. At the moment, very little of the matter and energy in the solar system is organized to perform meaningful information processing. But the freely-compounding intelligences, ever mindful of the need to out-think competitors, are likely to restructure every last mote of matter in their vicinity, so that it becomes part of a relevant computation or stores data of some significance. There will seem to be more cyber stuff between any two points, thanks to both denser utilization of matter and more efficient encoding. Because each correspondent will be able to process more and more thoughts in the unaltered transit time for communication (assuming light speed remains the fixed limit), neighbours will subjectively seem to grow more and more distant. As resources are used ever more efficiently, increasing the subjective elapsed time and the amount of effective space between any two points, raw spacetime will be displaced by a cyberspace that is far larger and longer-lived. According to Moravec, ‘because it will be so much more capacious than the conventional space it displaces, the expanding bubble of cyberspace can easily recreate internally everything it encounters, memorizing the old Universe as it consumes it’.
In physics, the ‘Bekenstein Bound’ is the conjectured limit on the amount of information that can be contained within a region of space containing a known amount of energy. Named after Jacob Bekenstein, it is a general quantum-mechanical result which tells us that the maximum amount of information fully describing a sphere of matter is proportional to the mass of the sphere times its radius, with an enormous constant of proportionality. Let’s assume that all the matter in the solar system is restructured so that every atom stores the maximum possible number of bits. According to the Bekenstein Bound, one hydrogen atom can potentially store a million bits, and the solar system as a whole leaves room for 10^86 bits.
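These orders of magnitude can be checked against the bound itself, I ≤ 2πRE/(ħc ln 2) with E = mc². Here is a minimal sketch; the Bohr radius for the hydrogen atom and ‘one solar mass within 50 AU’ for the solar system are modelling assumptions of mine, not figures from the text:

```python
import math

# Physical constants (SI units)
HBAR = 1.0545718e-34     # reduced Planck constant, J*s
C = 2.99792458e8         # speed of light, m/s

def bekenstein_bits(mass_kg: float, radius_m: float) -> float:
    """Upper bound on the information (in bits) a sphere of the given
    mass and radius can hold: I <= 2*pi*R*E / (hbar*c*ln 2), E = m*c^2."""
    energy = mass_kg * C**2
    return 2 * math.pi * radius_m * energy / (HBAR * C * math.log(2))

# A hydrogen atom: proton mass, Bohr radius -> on the order of 10^6 bits
hydrogen = bekenstein_bits(1.6726e-27, 5.29e-11)

# The solar system: ~1 solar mass within ~50 AU -> on the order of 10^86 bits
solar_system = bekenstein_bits(1.989e30, 50 * 1.496e11)

print(f"hydrogen atom: {hydrogen:.1e} bits")
print(f"solar system:  {solar_system:.1e} bits")
```

The answers land within a factor of a few of Moravec’s quoted figures, which is all that matters at this scale: the point is the gulf of eighty orders of magnitude between an atom and the system as a whole.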
Humans are interested in the past. Archaeologists scrutinize fragments of pottery and other broken artefacts, painstakingly piecing them together and attempting to reconstruct the cultures to which such objects belonged. Evolutionary biologists rely on fossil records and gene-sequencing technologies to try to retrace the complex paths of natural selection. If the freely-compounding robot intelligences ultimately restructure space into an expanding bubble of cyberspace consuming all in its path, and if the post-biological entities inherit a curiosity about their past from the animals that helped create them, the 10^86 bits available would provide a powerful tool for post-human historians. They would have the computational power to run highly detailed simulations of past histories, so detailed that the simulated people in those simulated histories would think their reality is…well…‘real’.
The idea of post-human intelligences running such simulations is often met with disbelief. Why would anyone go to the effort of constructing such simulations in the first place? Such objections miss the point that the Bekenstein Bound puts across. Assuming Moravec’s estimates for the raw computational power of the human brain are reasonably accurate then, according to the man himself, ‘a human brain equivalent could be encoded in one hundred million megabytes or 10^15 bits. If it takes a thousand times more storage to encode a body and its environment, a human with living space might consume 10^18 bits…and the entire world population would fit in 10^28 bits’. But look again at the potential computing capacity of the solar system: 10^86 bits. Such a number vastly exceeds the number of bits required to run simulations of our reality. As Moravec said, ‘The Minds will be so vast and enduring that rare infinitesimal flickers of interest by them in the human past will ensure that our entire history is replayed in full living detail, astronomically many times, in many places and in many, many variations’.
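Moravec’s arithmetic can be laid out in a few lines. A back-of-the-envelope sketch, assuming a world population of order 10^10, a figure his quote implies but does not state:

```python
# Orders of magnitude from Moravec's estimates in the text
BITS_PER_BRAIN = 10**15       # one human-brain equivalent
BODY_ENV_FACTOR = 10**3       # extra storage for a body and its environment
WORLD_POPULATION = 10**10     # assumption: roughly ten billion people

bits_per_person = BITS_PER_BRAIN * BODY_ENV_FACTOR     # 10^18 bits
bits_whole_world = bits_per_person * WORLD_POPULATION  # 10^28 bits

SOLAR_SYSTEM_CAPACITY = 10**86   # Bekenstein-bound capacity from the text

# How many complete world-snapshots the capacity could hold at once
snapshots = SOLAR_SYSTEM_CAPACITY // bits_whole_world  # 10^58

print(f"one person:  10^{len(str(bits_per_person)) - 1} bits")
print(f"whole world: 10^{len(str(bits_whole_world)) - 1} bits")
print(f"snapshots:   10^{len(str(snapshots)) - 1}")
```

Even storing the entire world costs only 10^28 of the available 10^86 bits, leaving room for 10^58 such worlds simultaneously; this is the margin that makes ‘astronomically many’ replays cheap rather than extravagant.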
Such conjectures have stunning implications for our own reality. Any freely-compounding intelligence restructuring our solar system into sublime thinking substrates could run quadrillions of detailed historical simulations. That being the case, surely we must conclude that any given moment of human history, now for instance, is astronomically more likely to be a virtual reality formed in the vast computational space of Mind than the physical reality we believe it to be. Meanwhile, perhaps, the self-enhancing robots that make up the post-Singularity interplanetary ecosystem are engaged in competition and cooperation, whole memetic systems flowing freely, turning fluid the boundaries of personal identity. And yet, some boundaries may still exist, most likely due to distance, deliberate choice and incompatible ways of thought. Bounded regions of restructured spacetime patrolled by datavores eager to eat the minds of lesser intelligences? Truly, Jessie Sim is a mere hint of the possible conflicts reality has in store.