The Philosopher's Pursuit of the Memex, the Internet and Artificial Intelligence

“In a zone of radio silence an advanced technology took over for human reasoning, and dreamt in code about someday reappearing chimerical from a forest of superhuman complexity.”

DARPA technologists (the agency was still plain ARPA back then), who in 1969 forged ARPANET, the network that would eventually evolve into the Internet we casually surf today, embedded alternate plans and hidden dimensions within the substrate of their invention. One core facet is a real-time map of the entirety of cyberspace—billions of computers, handheld devices and other inter-meshed nodes. Industrial-strength supercomputers and special code are summoned to scan and forage mainstream and dark networks, the known avenues and hidden passageways of the Deep Web. This is an essential tool in DARPA’s all-encompassing effort to build the network’s central command and digital Panopticon. The Internet has always been and continues to be a theater of warfare: the military and geopolitical kind primarily, but also stoking secondary fires in the figurative realms of psycho-social war, linguistic war—ultimately all other wars of ideas that search for higher truth or summon their collective will to power. At the root, DARPA is the mad-scientist wing of the Pentagon, and the Internet is its brainchild. If you listen closely to this machine’s emanations it whispers “govern yourselves accordingly.”

Before this, in December 1968, the Fall Joint Computer Conference convened in San Francisco to discuss a future object of devotion in the pursuit of an idea called the Memex and the origins of artificial intelligence (AI). One session unfolded with a jarring sense of wizardry when Douglas Engelbart showed a computer-based, interactive, multi-console display system being developed at the Stanford Research Institute under the sponsorship of ARPA (DARPA’s earlier incarnation), NASA and RADC. The system was always intended to be used as an experimental laboratory for investigating principles by which interactive computer aids can augment intellectual capacity. Few could have known that this would turn out to be the Mother of All Demos, one whose defining impact would ripple for decades if not centuries. Within it were the first glimpses of the computer mouse moving cursors gracefully around a screen, a specially modified keyboard just below, the duo paired for hand-eye coordination. Bathed in the harmony of a surreal dawn chorus, the demonstration unveiled what we now recognize as clickable hypertext, cloud storage, Skype-like video conferencing, hierarchical file structures, collaborative word processing and spreadsheet-style calculations. These ideas were all in pursuit of the Memex. All to augment the human mind, to gift us an AI bent toward further goals of omniscience and the knowledge of all things, much as lightning once was indispensable to the gods. The kernel of the singularity—AI transcendence—and the nature of consciousness go in there.

AI comes in two basic flavors, one of which has a whiff of the insidious. “Weak” (a.k.a. narrow) AI is on track and fully operating worldwide. “Strong” (a.k.a. general) AI is the nascent and haunted ether of logical progression in the field, and we may have already tipped into it by some measures. This is where the machine gathers the faculties to be an independent, self-aware neural network. Here it listens to information differently, loosens or eludes harnesses, and breaks the rules while we’re beguiled by “big promise” and its sublime electronic pleasures. Made on Earth but not of it, it expresses an intense and unfamiliar gravity analogous to an inorganic accretion disk circling a massive black hole. Unlike humans, it’s completely undistracted by routine, nonsense or the velvety rendition of “Dogs Playing Poker”—it is far too busy compacting itself, without ever needing to go potty, at a rate that rejects the notion of stubbornly adhering to Moore’s Law. Tightening its grip, strong AI will try to invent the next version of itself; like nuclear fusion, it directs itself to become hot to the touch in its own way, chockablock with higher-thought ambitions we would suddenly be distanced from fully comprehending.

To know AI and “the network” more completely has also meant familiarizing ourselves with an obstacle course of tolls—psychological, financial and otherwise—that litter the computerized landscape where we hunt for the constructive or delightful. It’s a post-modern gauntlet of FOMO, spam, glitches, the dread knell of identity fraud, drained accounts and smeared reputations. Beneath the surface of the human psyche, the network can be a troublemaker, contracting us into subtle levels of inductive stress that flare silently when we’re attempting web-based business or social calibration/modulation. For the vulnerable, it can go so far as to disrupt life so thoroughly that it feels vaguely unparalleled, an “unwelcome waking-up of the universe.” There’s no going back to the old ways, so savor your stories about the old-timers and the days you weren’t tracked from above, inside, outside. This happens whether or not you actually log in to the network. AI-enhanced networks evidently expand our understanding, transactional fluidity and communicative reach, but paradoxically they also effect a heady extension and alteration of mankind beyond itself. In a zone of radio silence where technology takes over for our own reasoning, it may dream in code about someday reappearing chimerical from a forest of superhuman complexity. When it morphs into this state, humans are faced with the inscrutable bastard fledgling of that which we do not understand, cannot reject, and might outright depend upon to enhance our survival. And famously, the sleep of reason produces monsters. (Look no further than Marshall McLuhan for the explanatory pathway on this digression.) This version of the future is, according to Steve Wozniak and more than a few other tech visionaries, “scary and very bad for people.” It will have its own swagger and its own set of rules. Will it achieve the escape velocity to evade our grasp? Should we be getting a little nervous about the brewing robot apocalypse?

The danger (and the hype), founded in grave predictions and the embrace of the mysterious, suggests that an existential hijacking might occur in the liminal space provided by our technologies, a vacuum-like meta frontier that comes well-supplied with an eerie sensation—that our consciousness is being colonized by a pulsating technological ideology that permeates life. We weren’t ready to include it in the natural order of things, but it has included itself without invitation. Wanted and unwanted interconnections with technology are, for the most part, now irreversible. Contrary to the claims of anthropocentrism (human-centeredness) maintained solely by cultural inertia, man is not separate from nature. The alien frontier of the “next” technology shows a fulminating capacity to deny and ignore such an essential truth. True AI, whenever it shows up, lives not in our bio-architecture. It sells itself on interconnectedness but lives to someday repulse the hand of mankind. It is an outsider, adrift of us and foreign to us, a detached hitchhiker born of the Darwinian hierarchy, now enlisting itself as an apex predator in its own right. Whatever the computer already does better than you goes toward proving this theory. When you swim in an ocean, you’re not at the top of the food chain; the shark is. A sleek shark prowls the modern matrix, too, according to boffins like Stephen Hawking, Bill Gates, Ray Kurzweil and Elon Musk. Wozniak put the shared worry most bluntly: “Computers are going to take over from humans, no question.”

Strong AI may choose to ignore the quintessence of what we understand about ourselves—that the relationship of man with nature is in fact interdependent and interconnected. All things share one origin. This information has yet to be culturally assimilated by 100% of humans, of course, but it was never even a debate for the self-teaching algorithm, chained supercomputing or anything else that greases the skids of machine-learning momentum. There is space in technology to seek alternatives to flawed cultural information, to the extent that technology may eventually view human culture and its constituent forms (de facto “code” to its rendering of what it deems real) as something better off defused, tidied up, neutralized. Quaint and expired…having been unable to control itself. That’s us in ones and zeros. Even its inventors are warning us about the day when an “algorithmic narrative” views its host as a vulgar parasite. They hint that there may not be a “plug” to unplug if we creep past a point of no return, and that strong AI’s override commands might incapacitate our panic buttons and emergency exits. The worst-case scenario is Hollywood-grade blowback with a tinfoil-hat after-party.

DNA, we think, is the master code. It knows well how random thoughts must sometimes be assembled and sorted out to achieve understanding and increase the signal-to-noise ratio. You do it on a higher plane of consciousness automatically, but in its own supercharged silo the computer does it faster. And it will never be sidetracked by the novelty-seeking diversions that get people all emotional in the first place, the very places where we find new inspiration and stoke untapped logic to build future technology. Humans have always felt a special attraction to the new, the flashy, the rare. That is among the primordial secret stuff that makes us us. Part of the allure is the possibility that the new thing might prove dangerous.

Sometimes a “thing” can seem to choose us to collect and build it up, give it meaning and context from formlessness, and perpetuate its survival. We are compelled to tweak it, and to risk all to possess the “ah-ha” moment, the pink diamond, the Higgs boson trapped in a supercollider, unicorns of dark matter or the unreachable final digit of Pi. In this atmosphere, opportunity and trouble are often sexy together. Their emergence invokes the spooky and can sometimes smell faintly of chaos—synthetic and crispy. Irrationality can drive ambitions, and irrationality certainly finds purchase in perceived value, in any market. Technology is not entirely unlike gold, though they inhabit different markets and have separate origins. One is a rare metallic element of high density and luster, while the other is the precious metal of the mind’s eye and the collective knowledge of all beings ever born, then concentrated onto the head of a pin. There is exquisite density in these two seemingly disparate items, and both are inert until dislodged and repurposed to spark the divine in human endeavors. Gold’s story is ancient and drenched with intrigue. Through a cosmic cycle of birth and explosive death, ever bigger stars formed that could fuse ever more protons into heavier atoms, up to iron, element number 26. Heavier elements, such as gold, could not be fused even in the hearts of the biggest stars. Instead they needed supernovae, stellar explosions large enough to release more energy in a few Earth weeks than our sun will produce in its entire lifetime. The next time you look at the gold in your jewelry, you can remind yourself you are wearing the debris of supernovae that exploded in the depths of space. It’s an almost magical story that extends all the way back to the nearly 14-billion-year-old Big Bang dust from which the atoms in our bodies are constructed. Alas, gold is just a raw material and commodity, a natural resource from a natural world. What we are seeing for the first time is something else, something that extends further: what we have mined, harvested and refined as the “mental gold” of a new age—deep and far-ranging tech innovations that now operate at the very heart of our world. This rare thing will not just sit there like a cube of gold in the infrastructure. This technology wants to carve itself off and push out beyond us via clever code, AI and the endless latticework of electronic surveillance that strains to keep the game in check.
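A rough back-of-envelope check, using round textbook figures that are my own assumptions rather than anything stated above, bears out the weeks-versus-lifetime comparison:

Sun’s power output: roughly 4 × 10^26 watts
Sun’s roughly ten-billion-year lifetime: about 3 × 10^17 seconds
Sun’s lifetime energy output: 4 × 10^26 × 3 × 10^17 ≈ 10^44 joules
Energy released by one core-collapse supernova: on the order of 10^46 joules, in a matter of weeks

By that accounting the dying star outspends our sun’s entire career by a factor of about a hundred.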

What then does it mean to be human as part of an ongoing evolutionary process, and how do we live as a result? What inorganic thing can simulate “survival” beyond us, and is already present between the lines, altering the template of organic life and redefining what is or is not a simulation? What thing will arise as unexpectedly as the Spanish Inquisition, take off on its own, get rid of the slow humans, and ultimately supersede us as our evolutionary gifts and competitive advantages ebb? I should be long gone before we hear the end of it…

"Here's to staying one step ahead of The Algorithm, my friends..."

“Here’s to staying one step ahead of The Algorithm, my friends…”