The Singularity is a term you’ll find in both science and science fiction. Its technological sense is usually traced to the mathematician John von Neumann, and today it commonly refers to a theoretical moment when the artificial intelligence of computers surpasses the capacity of the human brain. The word itself is borrowed from mathematics and physics, where a gravitational singularity appears in the study of black holes. These events are all considered singular because we cannot predict what happens next; the degree of disruptive change involved is simply too great for our current body of knowledge.

While we are still far from achieving true artificial intelligence, there was a brief flurry of excitement recently when a computer program was said to have passed the Turing Test, to mixed reviews. In this post, we’ll talk about the Turing Test, how computers are already augmenting human cognition, and what it may mean for the learning profession.

The Turing Test and the Definition of Artificial Intelligence

Alan Turing was a code-breaker in World War Two and a pioneer of digital computing. He posited that it would one day be possible to build a computer that behaves much like a human: specifically, one that could learn and apply that learning to solve problems beyond its original programming. He suggested that the best way to recognize success, what some people now call the Singularity, was to put the computer to a test: have human judges hold text conversations with the computer without knowing whether they are talking to a machine or a person. Turing predicted that a capable machine would convince about 30% of judges, after roughly five minutes of conversation, that they were communicating with a “real person,” and that figure has since been treated as the passing score. While some have suggested that it is time to update the Turing Test, it still excites us when a computer comes close to passing. Want to see how one person interacted with this “intelligent” computer program? Read this interesting transcript and decide for yourself.
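To make that passing criterion concrete, here is a toy sketch in Python of the arithmetic involved: a panel of judges each chat with the program, and it “passes” if the share of judges fooled reaches the 30% benchmark. The judge verdicts below are invented purely for illustration.

```python
def passes_turing_benchmark(judge_verdicts, threshold=0.30):
    """Return True if the share of judges fooled meets the 30% benchmark."""
    fooled = sum(1 for believed_human in judge_verdicts if believed_human)
    return fooled / len(judge_verdicts) >= threshold

# Example: 10 judges each chat with the program and record whether they
# believed they were talking to a real person (values are made up).
verdicts = [True, False, False, True, False, False, True, False, False, False]
print(passes_turing_benchmark(verdicts))  # 3 of 10 fooled -> True
```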

Augmented Cognition – The Flip Side of Artificial Intelligence

While computer scientists will continue to pursue true artificial intelligence, another area of exploration is yielding more immediate returns. Augmented Cognition is the use of neuroscience to determine a subject’s cognitive state in order to enhance it, usually with computers. To me, this is the flip side of Artificial Intelligence. Instead of trying to make a computer act like the human brain, we try to make our brains a bit more like computers.

The U.S. Defense Advanced Research Projects Agency (DARPA) has been interested in this technology for years. Samsung is developing a device that lets people operate a computer through brain signals. Honeywell has built a prototype helmet that monitors brain states associated with distraction and information overload, producing a visual readout to help commanders understand the cognitive patterns of individual soldiers. Researchers at the University of California, San Francisco and UC San Diego are watching a volunteer’s brain in real time as she opens and closes her eyes and hands, hoping to understand how her brain transmits those commands. On my desk right now is a headset called MindWave, which I use to monitor my own brainwaves and perhaps, eventually, control them. Teachers are starting to use similar technology to study how students learn. With such devices, we might be able to identify the state Mihaly Csikszentmihalyi called “flow,” often described as a feeling of hyper-learning and well-being.
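As a rough illustration of how headset data like this might be used to spot a flow-like state, here is a small Python sketch. The one-reading-per-second attention stream, the threshold, and the window length are assumptions made up for the example; a real device such as the MindWave exposes its readings through its own SDK, which is not shown here.

```python
from collections import deque

def flow_windows(attention_stream, level=70, seconds=30):
    """Yield indices where attention has stayed at or above `level`
    for `seconds` consecutive one-second readings (assumed sampling rate)."""
    window = deque(maxlen=seconds)
    for i, attention in enumerate(attention_stream):
        window.append(attention)
        if len(window) == seconds and min(window) >= level:
            yield i  # sustained high attention: a candidate flow episode

# Example with fabricated readings: a quiet stretch, then sustained focus.
readings = [40] * 20 + [80] * 45 + [35] * 10
print(list(flow_windows(readings, level=70, seconds=30))[:3])
```

The design choice here is simply to require the attention value to stay high for an uninterrupted stretch, which filters out momentary spikes; any real study of flow would need far richer signals than a single attention number.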

In other words, by marrying our brains to computers, human beings may become the Singularity.

Where Do We Go From Here?

It is hard to say when the Singularity will occur, or whether we will even recognize it when it happens. Our convergence with computers may be so gradual that we never see a sharp line, only a slow blending, like one color shading into another. When does blue become blue-green? When does the brain become a biological computer?

As learning professionals, we need to think about how we can use these new technologies to help people learn faster, perform better, retain memories longer, and, we hope, become more human in the process. If you want to learn more about this exciting frontier, join the Augmented Cognition conference at HCI International in Greece later this year, or discuss the issue here in the Science of Learning Community of Practice.

The Internet of Everything Is Us

It has been said that we are living in the era of the Internet of Everything, meaning that everything will become smarter through connection to the Internet. I’m not sure the people who coined that term realized they were not just talking about toasters and automobiles.

They were talking about themselves.