Several years ago, when blogs were young and the “blogosphere” was just beginning to get crowded, I came up with a neologism to describe it, the “clogosphere.” And what was the first thing I did with this burst of creativity? Turn to Google of course, to see if I was truly original. Unsurprisingly (in retrospect), there were already ten results for my clever new term. I cursed to myself, and began declaring to anyone who would listen that “If you think you’ve had an original thought, just google it and be humbled.”
So, you can imagine my surprise when one of my favorite web comics made the very same point about Google deflating your sense of creativity. My insight into creativity applied to itself, in a very “meta” sort of way.
This is certainly “a condition of modern life,” but as a historian of technology I know how often the simultaneous emergence of an idea or invention has cropped up in the past. If you live in the United States you were taught that Thomas Edison invented the electric light, but at least half a dozen other people around the world are held in similar regard, including Joseph Swan in England and James Lindsay in Scotland.
History is littered with similar episodes. Isaac Newton and Gottfried Leibniz independently inventing calculus. Alfred Wallace writing to Charles Darwin for advice on his theories about natural selection, not knowing that Darwin had been laboring over the same idea for decades. Elisha Gray and Alexander Graham Bell independently submitting patents for the telephone on the same day. What is going on here?
Notions like the “adjacent possible” and the combinatorial method of technological innovation (which I’ll explore in future posts) go a long way toward explaining this. After all, you can’t invent a light bulb until someone has invented the supporting technologies. The Romans had the necessary glass blowing expertise, but not the necessary metallurgical techniques, much less an understanding of electricity. Once all those pieces were in place, taking the next step was not just possible but probable. Part of Edison’s genius was developing not just the light bulb, but an entire electrical system to support it, from dynamos to sockets.
At a deeper level, however, I wonder if this phenomenon of simultaneous invention is directly related to the operation of a distributed intelligence based on a network of information technology. Sometimes derided as the cliche of a “world brain,” it is in fact the natural outgrowth of a living technology that has access to language. The outlines of this intelligence are indistinct, but this should come as no surprise. The workings of our own brains, after all, remain quite mysterious. What we are learning about human intelligence, however, points in an intriguing direction.
Dr. Giulio Tononi’s work on human consciousness stands out in particular. He has been examining the activity of neurons in the human brain as a network and attempting to measure the amount of information that such a network might contain. “To do so, [he is] adapting information theory, a branch of science originally applied to computers and telecommunications.” If indeed human consciousness and technological intelligence are fundamentally related, this is just the sort of development we might expect.
The most interesting result of Tononi’s work to date involves firing a magnetic pulse at people’s brains. The pulse causes neurons to fire in a cascade, and in conscious subjects the signal echoes back and forth across the brain for almost a quarter second, like a ringing bell. In unconscious people, the pulse peters out in less than half the time, stimulating only a small region of the brain. As the subject returns to consciousness, the signals travel farther and farther through their neural network. These results support Tononi’s contention that consciousness is nothing more than integrated information.
As Steven Johnson points out in Where Good Ideas Come From, an idea is a network of neurons in our brain firing. To attribute the idea to a single neuron would be as silly as, well, attributing an invention or discovery to a single person. Just as our consciousness emerges from a network of neurons, our inventions and discoveries emerge from an interconnected network of minds, drawn together and exchanging information through information technology.
This communication network began slowly, as information traveled physically around the world. In Darwin’s day it took months for a letter to get from Wallace in Indonesia to Darwin in England. Today it takes mere moments to hear about the latest advance in biomimicry or discovery on a windswept glacier. Each individual can benefit from the knowledge of others, and with advances in information technology this can happen more and more quickly. As the network gets larger and more complex, its “phi,” Tononi’s measure of integrated information, increases exponentially. As it runs faster and faster, we can begin to observe the technological system thinking in a state that approximates our own lived experience of consciousness.
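To make the idea of “integrated information” a little more concrete: a simple, toy proxy for it is the total correlation of a network, the sum of each node’s individual entropy minus the entropy of the whole system. This is not Tononi’s actual phi (which involves finding the minimum-information partition of the network and is far more involved), just an illustrative sketch of the underlying intuition — a system of independent parts scores zero, while a tightly coupled system scores high.

```python
import math

def entropy(dist):
    """Shannon entropy (in bits) of a probability distribution,
    given as a mapping from outcomes to probabilities."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def total_correlation(joint, n):
    """Toy integration measure: sum of the n parts' entropies minus
    the whole-system entropy. Zero when the nodes are statistically
    independent; grows as the network becomes more integrated."""
    marginals = []
    for i in range(n):
        m = {}
        for state, p in joint.items():
            m[state[i]] = m.get(state[i], 0.0) + p
        marginals.append(m)
    return sum(entropy(m) for m in marginals) - entropy(joint)

# Two 2-node toy systems over binary states:
# independent coin flips vs. perfectly correlated nodes.
independent = {(a, b): 0.25 for a in (0, 1) for b in (0, 1)}
correlated = {(0, 0): 0.5, (1, 1): 0.5}

print(total_correlation(independent, 2))  # 0.0 bits — no integration
print(total_correlation(correlated, 2))   # 1.0 bits — fully integrated
```

The point of the sketch is only the direction of the effect: as more nodes become more tightly coupled, the whole carries information that no accounting of the isolated parts can capture.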
If you are troubled by the notion that our consciousness is an emergent network effect, one that we share in common with information technology, then you might have some idea how Darwin’s contemporaries felt about learning they shared a common lineage with monkeys and chimpanzees. But it doesn’t make the insight any less true, and in fact opens up avenues of wonder. After all, thanks to the theory of evolution, we understand that we are deeply related, not just to chimps and apes, but to every living thing on this planet. Realizing that the catalog of living things includes technology, and that it is conscious of us as we are of it, is truly a wonder to behold.
Great post, Nick. I like the way you connect the neural and the planetary.
What does education look like, after 20 more years of this growth?
Great question! My first inclination was to address institutions of learning in response to your question – public schools, colleges and universities, etc. But looking at education more broadly, it’s not at all clear what the future holds, precisely because the evolving global intelligence is so disruptive. We’ve already seen what happened in the 20th century as technology made physical labor into a commodity. Now we’re seeing mental labor being commoditized as well. (Talk to a travel agent lately?)

The last professions standing are likely to be the highly creative and the highly parasitic. In a world where a good idea (Instagram) or a clever deal (Bain Capital) is worth a billion dollars, there is tremendous pressure to make it into the shrinking elite. Education has traditionally been that pathway, though the bar continues to rise. A high school education was once enough to guarantee a comfortable lifestyle, then a college education, and now a master’s degree is the new bachelor’s and a second master’s is the new master’s.

Education for the sake of knowledge itself will continue to have a cherished place, and education as a form of socialization is essential. But I suspect that education for financial advancement, in the face of technological competition, has a much less rosy future.