The Russian Dolls: A Short Science Fiction Story by Extropia DaSilva

FOUR.

The Web was dreaming. No human understood this, because the Web’s mind was the emergent pattern of a global brain, too big to be perceived by human senses. But, nevertheless, it was dreaming. And what it was dreaming of was a team of scientists, their equipment, and a girl called Emily who existed as patterns of information within the patterns of information that were the supercomputers the Web dreamed about.

In the past, a few people had wondered if, by some great accident, the Web could become conscious. Such a thing had happened, but it would be somewhat inaccurate to call it accidental. It was not planned — no person, group, corporation or government had ever sat down and devised the near-spontaneous emergence of a virtual research team, complete with virtual supercomputers, all existing within the digital soup of zeros and ones that now enveloped the world in an invisible yet all-pervasive ether. But neither was it an entirely random event.

What trigger effects had led to this remarkable outcome? One cause was the sheer amount of information about human existence that had been uploaded to the Web. The age of the personal computer had only truly begun with the era of the smartphone, and only really took off once the CMOS era had been superseded by molecular electronics that could pack, in complex three-dimensional patterns, more transistors into a sugar-cube-sized device than all the transistors in all the microprocessors that had existed in 2009. It was apps, running on phones that could keep track of their owner’s position thanks to inbuilt GPS and then (as the nanotechnology behind molecular electronics led to medical applications) all kinds of biometric data, that really opened the floodgates for offloading cognition. The very best designers of apps knew how to tap into this collective intelligence, gathering crowdsourced knowledge from anyone, anywhere, who had spare time to perform a task computers could not yet handle.

From tracking the movements of whole populations to monitoring the habits of an individual, every person was, every second of the day, uploading huge amounts of information about how they lived their lives. This, of course, presented the problem of retrieving relevant information. Access to knowledge had changed from ‘it’s hard to find stuff’ to ‘it’s hard to filter stuff’. More than ever before, the market imposed an evolutionary pressure towards semantic tools, with the ultimate aim of making the structure of knowledge about any content on the Web understandable to machines, linking countless concepts, terms and phrases together, all so that people could obtain meaningful and relevant results, and so that information gathering and research could be automated.

The Web became ever more efficient at making connections, and all the while the human layer of the Internet was creating more and more apps that embodied some narrow-AI approach. Machine vision tools, natural language processing, speech recognition, and many other kinds of applications that emulated some narrow aspect of human intelligence were all there, swimming around in the great pool of digital soup, bumping into long-forgotten artificial life such as code that had been designed to go out into the Internet and evolve into whatever was useful for survival in that environment.

That environment, a vast melting pot of human social and commercial interactions, imposed a selective pressure favouring code that could exploit the connections of the semantic web to evolve software that understood the minds of people. Vast swarms of narrow-AI applications were coming together, breaking apart, and reforming in different combinations. The spare computing cycles of trillions and trillions of embedded nanoprocessors were being harvested until, like a Boltzmann brain spontaneously appearing out of recombinations of a multiverse’s fundamental particles, Dr Dinova, Dr Epsilon and their supercomputers all coalesced out of the digital soup.

There they were, existing within the connections of evolving code. Their appearance was as difficult to predict as the evolution of elephants and yet, with hindsight, as foreseeable as the eventual rise of land-based animals from the ancestral fish that first dragged themselves out of the water. But people went about their concerns without ever knowing that, somewhere in abstract mathematical space, among the vibrations of air alive with the incessant chatter of machines talking to machines on behalf of humankind, a virtual research team had emerged, pondering the questions of consciousness that all sentient beings invariably strive to understand. The Web was dreaming, and while it did so, Emily helped Adam cope with his daily routines, unknowingly watched by Drs Dinova and Epsilon, who themselves existed as purely digital people, blissfully unaware that they were nothing but the imagination of a global brain made up of trillions and trillions of dust-sized supercomputers and sensors, ceaselessly gathering, and learning from, patterns of information about the daily habits of humans.
