I believe it was Warren McCulloch, an early researcher in neurophysiology, who first proposed that there was an analogy between the way a neuron fires and digital computation. A neuron exists in one of two states: at rest, or, if it receives enough electrical input from its sources, firing, or bursting. I think McCulloch was the first to draw an analogy between that on-and-off state of the neuron and the zero-one digital functioning of early computers, and this analogy set off a metaphorical revolution in which the working of the brain, and by extension the mind, was increasingly considered to be not just analogous to but identical with computation.
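The McCulloch-Pitts picture of the neuron can be sketched in a few lines of code. This is a minimal illustration, not anything from the talk itself: the particular weights, threshold, and inputs below are invented for the example.

```python
def mcculloch_pitts_neuron(inputs, weights, threshold):
    """A binary threshold unit: fire (1) if the weighted input sum
    reaches the threshold, otherwise stay at rest (0)."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Two excitatory inputs with a threshold of 2: the unit fires only
# when both inputs are active, behaving like a logical AND gate.
print(mcculloch_pitts_neuron([1, 1], [1, 1], 2))  # fires: 1
print(mcculloch_pitts_neuron([1, 0], [1, 1], 2))  # rests: 0
```

The point of the analogy is visible in the code: the unit's output is strictly zero or one, just like a bit in a digital machine.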
McCulloch was part of an early interdisciplinary group that included Walter Pitts and Gregory Bateson and others who attended the first conferences on cybernetics, the Macy Conferences, which were happening around 1950. Bateson came from anthropology. His primary interest was in feedback systems, recursive feedback loops, and how systems regulate themselves or go out of control. His basic metaphor for the functioning of the mind was the thermostat. The thermostat measures the temperature: when it's too cold it turns the heat on, and when it gets too hot it turns the heat off, so it establishes a relationship with the environment that maintains something like homeostasis. It was thought that this was an important way to understand the mind-body interaction with the environment.
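The thermostat's negative-feedback loop can be simulated in a toy model. Every number here (the set point, the heating and cooling rates, the tolerance band) is invented for illustration; the point is only the shape of the loop.

```python
def simulate_thermostat(temp, set_point=20.0, steps=50):
    """Negative feedback: the controller acts to oppose deviation
    from the set point, so the temperature settles near it."""
    heater_on = False
    for _ in range(steps):
        if temp < set_point - 1:    # too cold: turn the heat on
            heater_on = True
        elif temp > set_point + 1:  # too hot: turn the heat off
            heater_on = False
        temp += 0.5 if heater_on else -0.5  # heating vs. ambient cooling
    return temp

final = simulate_thermostat(10.0)
# Starting from a cold room, the temperature climbs and then
# oscillates in a narrow band around the set point: homeostasis.
```

The system never sits exactly at the set point; it hunts in a small band around it, which is close to Bateson's sense of homeostasis as an ongoing relationship rather than a fixed state.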
He also liked to use the analogy of two people with electric blankets to show how feedback systems can run amok. He said: picture a couple lying in bed, husband and wife, each with a separate electric blanket, but in the middle of the night they get the switches mixed up, so that he holds the switch to her blanket and she holds the switch to his. When he's cold, he turns his switch up, so she gets hotter; when she gets hotter, she turns her switch down, so he gets colder. He gets colder, so he turns his up; she gets warmer, so she turns hers down. It's the opposite of homeostasis.
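The crossed-blanket story is a positive-feedback loop, and a toy model makes the runaway visible. Again, the numbers (comfort temperature, gain, starting chill) are invented, and the model ignores physical limits; it exists only to show the gap widening instead of closing.

```python
def crossed_blankets(his_temp=20.0, her_temp=20.0, comfort=20.0,
                     gain=0.5, steps=10):
    """Each person's dial controls the OTHER person's blanket, so every
    correction pushes the couple further apart: positive feedback."""
    for _ in range(steps):
        # He is cold, so he turns his dial up -- but it heats HER blanket.
        her_temp += gain * (comfort - his_temp)
        # She is hot, so she turns her dial down -- but it cools HIS blanket.
        his_temp += gain * (comfort - her_temp)
    return his_temp, her_temp

his, hers = crossed_blankets(his_temp=19.0)  # a small initial chill
# Instead of converging, he gets ever colder and she ever hotter:
# the opposite of the thermostat's homeostasis.
```

Note that the only structural difference from the thermostat is where the feedback lands: the correction is applied to the wrong variable, so each adjustment amplifies the deviation rather than opposing it.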
We might see this picture of mutually interacting feedback as a precursor to some of the things we’re talking about in our discussions about intersubjectivity later on, because what we’re looking at are mutually interacting systems, mutually creating systems. The dilemma that arose with the computer was that the mind basically got out of the systems picture and created this isolated computational mind.
That idea has also been taken in different directions. One of the strangest came up recently in the work of Ray Kurzweil, a self-styled futurist and AI developer who has invented a number of quite remarkable intelligent machines and products: one of the first optical scanners that could read printed text, pattern-recognition systems, and, I think, the first music synthesizer that could digitally recreate the sounds of acoustic instruments.
But by taking this picture of brain-equals-mind-equals-computer to its extreme conclusion, he has gone on to say that he anticipates the brain will be completely reverse-engineered by the end of the 2020s. That means everything the brain can do will be reverse-engineered computationally into something a computer can do. Now, that claim makes sense only if you think of what a mind or a brain does purely instrumentally. What would it mean to reverse-engineer subjectivity? To reverse-engineer sadness? Or joy? Or zazen? Are these brain capabilities, or human qualities? What does it mean to think we can reverse-engineer them?
He goes even further and has predicted that in 2045 an event he calls the Singularity will occur: the ultimate interface between humans and machines, in which the biological and the computational will completely merge. He calls it a singularity because the event arises out of an intelligence explosion beyond which we simply cannot imagine what would happen, an event that will be utterly transformative of what it means to be human.
When I listen to his talks, there seems to be a bifurcation in what happens when man merges with machine. On the one hand, he's a big advocate of nanotechnology, and so he thinks that we will inject ourselves with billions of micro-machines that course through our bodies and constantly repair everything breaking down inside, so that we would never get old, we would never get sick, the body would become perfectly efficient. There you get a kind of machine-assisted biological immortality.
The other branch of the bifurcation is the idea that the mind will become totally downloadable, or uploadable to the cloud: the mind is treated as reducible to a collection of programs or algorithms, all of which could somehow be transferred up and out into a kind of immortal virtual reality.
In reviews of Kurzweil's work by critics like Janet Maslin, one of the things they tend to point out is how much the Singularity seems to have in common with ideas like the Rapture: a kind of end-of-history moment of total transcendence, where once and for all we will transcend our mortality, our physical bodies, and somehow be translated into a completely different realm.
I would say that this is the ultimate curative fantasy. Kurzweil himself has talked about part of the drive behind this thinking and this pursuit: a fear of death, and the sense that man must become immortal. So there is a way in which this whole technologically-driven program remarkably overlaps with a whole set of religious ideas about eternal life and a soul that will somehow be independent of and transcend the body. For Kurzweil, it's the mind that will be separable from the body and uploaded into the cloud. In religious terms, it's the soul that will be separable from the body after death and have its kind of immortality.
I think these ideas bear more than a little family resemblance to each other. Many of us who engaged in this practice early on tended to treat the idea of enlightenment as our own personal singularity. There was a kind of equivalent fantasy of some event that would be utterly transformative, so that life would have this kind of before and after quality. Many of the early accounts of kensho had this kind of born-again conversion experience quality. Having had this moment, everything was completely changed once and for all.
Lots of those moments, of course, do occur, but fortunately they don’t all play out the way we anticipate, and as Jack Kornfield’s book title puts it, “After the Ecstasy, the Laundry.” We always seem to come back down to earth. But I think once you get a sense for the form of these curative fantasies, you start seeing them everywhere and in all different shapes and sizes, and you can see in this kind of Kurzweil project something that seems like yet another attempt to transcend what it is to be mortal, what it is to be human, to think that the project of this life is not to become more fully human but to somehow escape the confines of the human condition.
While I do not by any means pretend to be an expert on what’s technologically possible in the future, I do think I’m an expert on what’s crazy.