On the Shoulders of Giants: The People of the Cognitive Revolution

Jan 14, 2022

The Brain-Based methodology is the culmination of decades of science and research, distilled into a simple method. These practices have evolved over time from many interweaving fields of research and literature.

In this way, the end product stems from a breadth and depth of innovative scientific endeavor. Without the major contributions of researchers and innovators in many boundary-crossing disciplines, the Brain-Based methodology and its connection with the modern understanding of how the learning brain works would be nowhere close to where they are today.

In this blog, we’ll look at the historical figures and events that have helped shape this new way of thinking about learning. 

The Shift Away from Behaviorism

Since the 1950s, many scientists in fields as disparate as psychology, philosophy, linguistics, anthropology, neuroscience, and computer science have been aware that the preeminent philosophy underpinning learning and learning systems — behaviorism with a capital B — was not living up to its far-reaching claims and, in effect, was an outdated modality for engaging the brain and human cognition.

This sparked a cognitive revolution, which consumed the cerebral energies of some of the foremost thinkers and educational theorists of a post-war America that was already in methodological flux. 

The emerging cognitive revolution promised to open up vast new fields of research in areas that connected the learning sciences with technology, neuroscience, anthropology, and other related sciences. The research was conducted in pursuit of answers to age-old questions like ‘Who are we?’ and ‘How do we learn?’

In that postwar era, with a cold-war space race in full flight (Sputnik dominated scientific headlines just one year after the cognitive revolution began), much of the focus was riveted on technology, engineering, and the cognitive sciences. This put America at the forefront of scientific research.

For instance, the knowledge that a Soviet satellite was crossing over the United States every 96.2 minutes spurred the US Government into creating the Advanced Research Projects Agency (ARPA, later DARPA) to compete internationally in scientific research and military capability. Sputnik, like no other force, inspired a new generation of engineers and scientists, and did so in a very visceral way.

Somewhere in the space-race, cold-war fallout, the cognitive revolution became associated (wittingly or not) with the rapidly ascending field of artificial intelligence and its connective tissue in fields that included information theory, signal-detection theory, and computer theory. In truth, the cognitive revolution contributed handsomely to some of these fields and advanced theoretical understanding in areas like theory of mind, linguistics with its emphasis on syntactic structures, and computer simulation using neural networks.

But, unfortunately for businesses involved with information management, education, training, or talent development, the cognitive revolution failed to show up on their corporate radar screens. 

Apparently out of reach, it found a home in loftier rafters, nestled with AI, neuroscience, computer science, and other computational fields. It acquired a rarefied intellectual mystique that cemented its failure to percolate down into teacher preparation programs and the practical day-to-day operations of businesses.

More than 60 years later, the majority of corporate educational systems in areas of instructional design, training, and development are still predicated on pre-modern, Skinner-derived models and behaviorist programs that function with rewards and punishments in worlds of extrinsic stimuli and operant conditioning. 

As a result, the wisdom and engagement that a cognitive approach engenders are missing in most of these operations where reactive avoidance techniques dominate corporate learning systems. 

The notion that a behaviorist worker might think… “I will work hard so that my boss won’t discover that I am inefficient and fire me” comes directly out of the playbook of a behaviorist classroom where the student thinks… “I will study for this test because I don’t want my dad (or teacher) to punish me for getting a bad grade.”

Meanwhile, post-war scientific and educational developments that occurred in the US were felt in countries around the globe. In fact, for the first time and in spite of the divide over governmental ideologies, the world seemed to grow infinitely smaller and more connected. For example, the Jodrell Bank observatory in Cheshire (UK), with its impressive Mark 1 Radio Telescope, turned out to be one of the few instruments able to track Sputnik’s launch rocket as it flew over the US mainland. By sharing information with its close ally, the UK provided much-needed telemetry and data for scientific exploration in both countries. Long shadows were being cast.

As scientists and laymen alike listened to those faint beeps from a tiny extraterrestrial metal sphere, it was clear that we had reached a new era. The cognitive revolution got swept up in this new era. The panic of falling behind drove the search for solutions toward military and space science. The first artificial Earth satellite was also one of the first scientific artifacts to unite human thinking from beyond the planet.

It was into this broadening scientific community that a few years later (1978), in a telephone booth-sized windowless carrel at the back of the Science Library at the National University of Ireland, another shadow would be cast. 

At that time and in that innocuous space, as a graduate student in the College of Education, I began a research career in relation to long-standing questions in equity, equality of educational opportunity, comparative systems, and modeling methodologies. I was not unaware that it was in this same science library, more than a century earlier, that George Boole, as the first professor of Mathematics, accomplished groundbreaking work in differential equations. It would be many years before this Boolean shadow would cross my path at Microsoft in Seattle.

It makes little sense, except to view it as another edge of a long shadow, to try to explain how a Bill Gates telephony project that would use digital logic and algorithms to connect wireless devices to the Internet could also spill over into learning sciences and equity. But shadows are not grounded in logic. The players who connected AI, learning sciences, neural networks, and ultimately Brain-centric Design are at once connected and distributed.

Some of these shadows that reach forward from past centuries are ubiquitous; they are not only visible, they exert a tangible impact across space and time. Boole’s derivations in symbolic logic and algebraic expression, both computational first principles, provided the theoretical grounding for today’s Information Age. It is no surprise that in our own time—just as this book goes to press—Geoffrey Everest Hinton (computer scientist in artificial neural networks, University of Toronto) was awarded the coveted Turing Award for his work in artificial intelligence. Hinton, great-great-grandson of the said George Boole, epitomizes the marriage of neuroscience with learning sciences and the mathematical computations that excel in machine learning. In one of his acceptance speeches, he outlined the mathematical schema that activates neural circuitry to achieve (virtually flawless) speech recognition using algorithms that are both Boolean and neural. Discussing neural-net machine learning, Hinton was explicit:

“The [human] brain works… It certainly doesn’t work by people writing programs and sticking them in your head. So instead of programming it to do a particular task, you program it to be a general purpose learning machine – a neural net, and then to solve any particular problem like recognizing speech, for example, you show it examples of sound waves and examples of the correct transcriptions of the sound waves, and after a while, it just learns.” (Hinton, NPR, 2019)

The cognitive revolution had come full circle. Singularity. In other words, Hinton’s groundbreaking discovery was that programming (read: the traditional teaching method) does not work. It is excruciating and inefficient. This is how he described the achievement of speech recognition software that emulates human neural networks (machine learning):

“…all we need to do is figure out how to adapt the connections – because these networks can do anything. It’s just a question of changing the connections.” (Hinton, NPR, 2019)

Hinton had stepped into the educational side of the cognitive revolution. Though not a learning scientist or an educator in the typical sense (in frontline classrooms every day), he was finally in Brain-Based territory.

This is the essence of Brain-Based neural nets—we ‘look for’ and ‘adapt’ neural connections in a very human way. By iterating through the model, the facilitator engages and grows neural networks, carefully building and changing circuits.
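
To make “adapting the connections” concrete, here is a minimal sketch in Python (an illustrative toy, not Hinton’s actual system): a tiny network is never programmed with the rule it must learn (here, the XOR function). It is simply shown examples and nudges its connection weights until the right behavior emerges.

```python
# A minimal "learning by adapting connections" sketch (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

# Training examples: inputs and their "correct transcriptions" (here, XOR).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Random initial connections -- no task-specific program is installed.
W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))  # input -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))  # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(20000):
    # Forward pass: activity flows through the current connections.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: adapt every connection to reduce the error a little.
    delta_out = (out - y) * out * (1 - out)
    delta_h = (delta_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ delta_out
    b2 -= 0.5 * delta_out.sum(axis=0, keepdims=True)
    W1 -= 0.5 * X.T @ delta_h
    b1 -= 0.5 * delta_h.sum(axis=0, keepdims=True)

print(np.round(out, 2))  # close to [[0], [1], [1], [0]]: it "just learns"
```

Nothing in the loop encodes XOR itself; only the connection strengths change, which is precisely the point Hinton is making.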

Cast a glance back to Pestalozzi’s shadow from two hundred years earlier. At Yverdun, he ‘looked for’ and ‘adapted’ children’s physical connections to the teacher, the locale, and the learning. Hinton goes one step farther and asks questions that Brain-Based methodology has already answered; he struggles to understand why the human brain is not as efficient as his exquisite, impeccable, all-potential learning machine.

“We were able to accomplish amazing things – the neural net that we create can do anything – with a few billion connections we accomplished speech recognition in any language in the world. Yet the human brain has trillions upon trillions of connections – so either they are using the wrong algorithm or they are highly inefficient.” (Hinton, NPR, 2019)

The Effects of Behaviorism

We are indeed inefficient (teachers and learners alike) because we are entrenched in a reward/punishment, social/emotional quagmire, with a fixed mindset, amygdala hijack, RAS (reticular activating system) reinforcing beliefs, and highly suspect mental models that fail to connect with how the brain works. Hinton is correct.

If only we understood the organ we use for learning… we would indeed be trillions of times more efficient. Hinton is the very apotheosis of a cognitive revolution for education that promised so much but delivered so little.

In the same way that many scientists perceived (in the mid-1950s) that the prevailing psychological models were not delivering, many individuals (in industrial settings as well as in academia) had also arrived at similar conclusions about methods and approaches that were overtly behaviorist and locked into an outdated modality.

Skinnerian rewards and punishments were not working in Human Resources; nor were they working in corporate management offices where bullying and ego drove up levels of stress and drove down productive outcomes for employees. Neither were they working in the home or the classrooms, where bullying, socio-emotional stressors, and ego were limiting the horizons of countless children and pushing countless teachers out of education. 

Some people began to question whether the behaviorist method had ever worked. But there is always an element of maybe.

For some individuals in the workplace, for some kids at home, and for some children in schools, it simply doesn’t matter whether there are rewards or punishments in the mix. Some individuals are simply compliant. They are resilient to the degree that no matter what the world throws at them, they make it through unblemished and appear to weather each storm. But for others, neither rewards nor threats work. Their resilience levels are so intensely connected to the ramifications of both rewards and punishments that it costs HR departments a small fortune to manage their expectations, and costs schools the majority of their expense and talent to manage their deficiencies.

While teaching and learning were intrinsic survival mechanisms for the ‘sapiens’ species from the beginning, it wasn’t until the historical consequences that followed Napoleon’s revolutionary victories through Prussia that modern national systems of education emerged. The defeat at Jena (1806) left a nation on its knees, many of its men maimed or killed, and ignited a turning point. In a moment of enlightened triumph, Fichte (advisor to Frederick William III) focused the new generation on a clean break from the Ancien Régime’s use of the Quadrivium and the Trivium in favor of vernacular schooling through the adoption of Pestalozzi’s innovative methods. Books like Rousseau’s Social Contract screamed lines that were to change fundamental thinking systems and social practice, in the French Revolution first and then in the US.

“Man is born free, but everywhere he is in chains.” (Rousseau, 1762)

Rousseau was familiar with Descartes who, as a rationalist, outlined much of the structure of Cartesian mathematics and the philosophical underpinnings that connect with innateness and universality in today’s pedagogies.

Shadows continue to stretch forward into modern times from past centuries, from previous discoveries and scientific breakthroughs. Sometimes they are accompanied by paradigm shifts that bring solid advances to stagnant thinking and end up changing the way we come to understand the world. Scientific endeavor advances in Kuhnian paradigm leaps. Take, for instance, geocentric theory: for centuries, prevailing best thinking and accepted truisms placed the Earth at the center, with the other planets circling poetically around us.

Yet when physical and scientific observations contravene these notions, and expected outcomes are proven false, scientists do what they do best. They question reality! They theorize new eventualities! They conceptualize different results that lead to a new way of identifying and evaluating the situation. Having tried new approaches, they make newer, more appropriate assessments and judgments. The results are often as spectacular and resonant as the shift from geocentricity to heliocentricity. Yet timing is always critical. For Copernicus, it was more prudent to wait until he was on his deathbed before publishing his controversial, paradigm-shifting discoveries. Nor are revolutions always the same.

The cognitive revolution of 1956 was quiet and inconsequential for the vast majority of people everywhere. Nevertheless, the impact was massive, and the shadow that was cast reached as far backward as it did forward. Kuhn and Copernicus were among the paradigm shifters who occupied the bookshelves of that tiny carrel in the library where Boole had taught. Although imperceptible at the time, many of these colossuses of science cast long shadows that affected the outcomes of learning and science for generations. It is easy to embrace the impactful shadow of Boole over Hinton—a blood connection in science and math where an expression stretching from TRUE/FALSE statements in 1849 was central to igniting speech recognition software one hundred seventy years later in 2019. Similar shadows reach several individuals who were present (and presented papers) at a famous symposium organized by the Special Interest Group in Information Theory at the Massachusetts Institute of Technology on September 11, 1956.

Several of the scientists who were present, and who took part in the fateful meetings and presentations, had already made tenuous, shadowed connections back in time to earlier scientists who had pioneered the beginnings of research in their specialties. On that day, the great cognitive scientist Noam Chomsky presented a paper on theoretical linguistics. He was the first linguist to follow through on systematic theories that language acquisition, with all the precision of mathematics, was an innate human capacity.

His 1956 paper contained the ideas that he expanded a year later in his monograph, Syntactic Structures, which initiated his own rarefied cognitive revolution in theoretical linguistics. Chomsky operated under the elongated shadow of Descartes,[2] who, as rationalist and philosopher, reached forward to like-minded rational thinkers with his conceptualizations regarding mind and matter. Descartes was one of the first people to conceptualize the notion of innateness in language acquisition.

Chomsky took this to a new level with his theories on syntactic structures and the linguistic constructs of universality and internalism. Many other scientists rallied around Chomsky, who was one of the most vocal scholars against extant behaviorism and who decried Skinner’s much-touted views about tabula rasa, ‘free will,’ and externalism as amounting to nonsense.

It was at this same conference that George Miller presented his remarkable paper, The Magical Number Seven, Plus or Minus Two, which was intended as a polemical nugget on how humans could avoid the bottleneck created by a limited short-term memory. Miller was undoubtedly operating under a faint but connective shadow cast by the great Renaissance anatomist Vesalius and his predecessor, the Roman-era physician Galen, who were among the first to theorize that cognition and memory were functions of the brain. Miller, Vesalius, and Galen sat on that bookshelf together with Copernicus, Pavlov, and Skinner. Functional localization and Hebbian theory are critical mainstays of Brain-Based methodology.

IBM and neuroscience might have seemed like strange bedfellows back in 1956, but Donald Hebb’s postulate, relating to his work on the neuropsychological theory of cell assemblies, demanded a computational memory capacity worthy of the best computer[4] in the field at the time. Once more, Hebb was influenced by the foundational work of a shadowed historical figure, Franz Joseph Gall, whose controversial work on phrenology didn’t stick, but whose brilliant observations on functional localization remain surprisingly accurate to this day.[5]
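
Hebb’s postulate is often summarized as “cells that fire together wire together,” and it is simple enough to sketch in a few lines of Python. The toy below is an assumption-laden illustration (the learning rate, pattern, and noise level are invented for the example, not taken from Hebb): repeatedly co-activating three neurons strengthens the connections among them until a small cell assembly emerges.

```python
# Minimal sketch of Hebb's postulate (illustrative parameters only).
import numpy as np

rng = np.random.default_rng(2)
n = 5                       # a tiny population of 5 neurons
w = np.zeros((n, n))        # connection strengths, initially unwired
eta = 0.1                   # learning rate (an assumed value)

# Repeatedly present a pattern in which neurons 0, 1, and 2 fire together.
pattern = np.array([1.0, 1.0, 1.0, 0.0, 0.0])
for _ in range(20):
    activity = pattern + rng.normal(0, 0.05, n)   # noisy co-activation
    w += eta * np.outer(activity, activity)       # Hebbian update
np.fill_diagonal(w, 0.0)    # no self-connections

print(np.round(w, 1))       # strong weights among 0, 1, 2: an assembly formed
```

The weight matrix ends up with strong links only among the co-active neurons, which is the computational heart of the cell-assembly idea.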

The cognitive revolution was substantial in effecting change in several interrelated fields of science and artificial intelligence, and it eventually made its way into teaching, learning, and human cognition. The scholars who were present at its conception, and who ignited a generation of learning in their respective fields, contributed to technological advances and to the modern theoretical and conceptual spectrum of social progress; yet the areas that stood to benefit the most had to wait the longest. John Bransford, a research scientist who did fabulous work in learning sciences at Vanderbilt and later at the University of Washington’s LIFE Center (Learning in Informal and Formal Environments), contributed more than any other individual to this evolving situation.

Bransford walked into the University around the time O’Mahony was planning on walking out. O’Mahony had been studying learning systems and cognitive processes for over a year, with frustration compounding for every month he was there. Despite everything he had learned about how information is processed in the prefrontal cortex, despite all of the published research on how stress affects the amygdala and on the roles of cortisol and dopamine in the learning process, and despite the underwhelming results educators continued to see in the classroom, most professors still used the behaviorist method to teach.

Luckily, Patricia Wasley, Dean of the College of Education, was more flexible and tolerant than he. When he told her he wanted to drop out of the program, she said, “Wait. First, you have to meet Dr. Bransford.”

John’s long career in cognitive psychology and learning sciences spanned groundbreaking events at many institutions. In the years leading up to the turn of the century, he pioneered learning sciences work with a program called Jasper, which brought together experimental psychology, technology, learning sciences, and cognitive studies. The field expanded in several facets of learning sciences and emerging technologies but settled on a cognitive construct that became known as anchored instruction. Out of that research came the genesis of the challenge model, which, in turn, morphed into the theoretical underpinning for Brain-centric Design.

In the early 2000s, the LIFE team began a research project to investigate the impact of an emergent cognitive model compared with a much-used traditional method of teaching in adult learning settings. In an informal experiment, they tested both models at an engineering complex where modern manufacturing techniques were being pioneered. They used the same content but presented it in two different ways. The results were illuminating, and the cognitive revolution had finally arrived in the world of adult learning.

As the work continues in areas of learning and teaching, in HR departments and talent development arenas, neuroscience and learning sciences will bridge the gap between educators and employees, ensuring that industries do not remain siloed in their thinking. Knowledge is only powerful when shared. In turn, we hope that future generations of educators and scientists will stand on the shoulders of this work and, in doing so, further perfect this approach to learning and the brain.

Workers deserve to live in a world where the greatest pleasure in life—learning—is easy, effective, impactful, and enjoyable. Brain-centric Design makes this joy a reality for today’s learners and for the generations of new employees who are beginning to enter the market today. One thing we know for sure: when learners are given the choice between an engaging brain-centric method and a traditional memory-centric approach, they immediately perceive the difference and are happy to abandon the old way. We are reminded of the old dean who remarked of the cognitive students, “There is so much chaos in your class that the students couldn’t possibly be learning anything.” To equate compliant ‘quiet’ with learning is an easy mistake to make. The decibels of learning are contagious; cognitive contagion is real.

Educators are particularly mindful of the 3Rs—Reading, Writing, and Arithmetic—a pronounced, tongue-in-cheek play on the sound of learning. Notwithstanding the overriding focus on promoting proficiency in these three critical areas of academic achievement, all relevant agency data (NAEP, the US Department of Education, TIMSS, and more) inform us that proficiency has sat at a dismally underperforming standard for the past 50 years, with no discernible improvement on the horizon.

This fact alone always bothered me. How could it be that in the most advanced country in the world, and after a dozen years of schooling, young people could emerge with such dismal results? It doesn’t add up! Asking the question was revealing. Finding the answer was mind-blowing. It wasn’t the brains… it was a deep-rooted commitment to an outmoded methodology that basically consigned learners to the reactive freeze-flight-fight learning zone. As reported earlier, the behaviorist reward/punishment model works okay for some of us (if we have sufficient resiliency and are able to withstand the drudgery). But for the rest of us, it is a death sentence.

However, the sound of learning need not be a play on the 3Rs. Learning occurs as neurotransmitters released by a presynaptic neuron influence the electrical activity of a postsynaptic neuron, and an action potential propagates the signal to activate a particular circuit. The sound of learning changes from Repetition-Regurgitate to Action-Potential-Propagate. When a learner listens to the synaptic symphony that is going on inside the head all the time, it changes the way we think about learning.
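
For readers who want to see that mechanism in miniature, here is a hedged sketch of a leaky integrate-and-fire neuron, a standard textbook simplification from computational neuroscience (the parameter values are illustrative assumptions, not taken from this methodology): synaptic input nudges the membrane potential upward until it crosses a threshold and an action potential fires.

```python
# Minimal leaky integrate-and-fire neuron (illustrative model and values).
import numpy as np

dt = 1.0          # time step, ms
tau = 10.0        # membrane time constant, ms
v_rest = -70.0    # resting potential, mV
v_thresh = -55.0  # spike threshold, mV
v_reset = -75.0   # post-spike reset, mV

v = v_rest
spikes = []
rng = np.random.default_rng(1)

for t in range(200):
    # Presynaptic drive: neurotransmitter release modeled as noisy input.
    i_syn = rng.normal(1.8, 0.5)
    # Leaky integration: potential decays toward rest, pushed up by input.
    v += dt / tau * (v_rest - v) + i_syn
    if v >= v_thresh:          # threshold crossed: action potential
        spikes.append(t)       # the spike propagates downstream
        v = v_reset            # membrane resets after firing

print(f"{len(spikes)} action potentials in 200 ms")
```

Every printed spike is one beat of that synaptic symphony: integration, threshold, propagation, reset.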

The Future of Cognitive, Brain-Based Learning is Bright

With Brain-centric Design, the new 3Rs—Reflect, Revised Thinking, and Report Out—define processes and practices that change that original, outdated paradigm. Paradoxical and delightful outcomes emerge. Even though the focus is on neither reading, writing, nor arithmetic, these same proficiencies improve, along with a learner’s capacity to grow intellectually, socially, and emotionally. The simple measure of connecting neuroscience with teaching and learning invokes a paradigm shift in attitude, mental models, and intention for both the teacher and the learner.

When the brain is in its element, it knows what to do. Learn.