Gripping the controls of a $50 million air-to-air combat simulator on a U.S. Air Force base back in 1992 gave cognitive neuroscientist Itiel Dror, then a Harvard graduate student on a summer internship, a rare feel for the fighter pilot’s world. It was a wild ride, but not the source of Dror’s insights into some of the unique challenges facing those pilots.

Those were rooted in his expertise: a deep understanding of the underlying thought processes that affect human performance, learning, judgment and decision-making.

Dror’s challenge was to improve pilot recognition of aircraft silhouettes to keep them from attacking friendly planes. Pondering cognitive techniques and technologies that could help them distinguish aircraft under pressure, he had a brainwave.

“The innovation was to take the phenomena in psychology called the ‘caricature advantage,’ which has no practical implication, it’s purely scientific, and translate it into a tool to enhance learning,” he recalled. The benefit of the “caricature advantage” — long familiar to editorial cartoonists — is that exaggerating distinctive facial features speeds facial recognition and identification. Why not, he asked himself, subtly exaggerate different aircraft’s most pronounced features with computer morphing, helping pilots focus on them in training and ultimately make faster identifications in the skies?

He had computer software compare all aircraft models’ images and identify unique or distinctive characteristics — a nose, a wing, a cockpit shape. The brain picks up these key elements anyway, but exaggerating them slightly during training helps the cognitive system learn them faster and more efficiently.

“It’s very subtle,” Dror explained, “but it really helps the brain to know what to pay attention to during training. Then when pilots go out in the field and they see the actual aircraft, they do much better at it.”

Since then, he has continued to look at how to improve training — to turn it from a punishment into a reward, as he says — but he also studies how cognitive traps can lead the smartest people in the room to make glaringly bad decisions.

“In some domains,” Dror said, “a mistake is not a big deal. If an expert wine taster doesn’t give the correct conclusion, then it’s not so bad. But if you’re talking about the military, the police or the financial domain, it’s a different story. You have to remember that experts, like smart people, can make mistakes. And smart people do very stupid things, many times.”

Dror now splits his time between conducting pure scientific research and studying and improving “technology-enhanced learning” and e-learning. Just like air-to-air combat simulators, if they’re well designed, technology-enhanced learning and e-learning can improve the way we receive, store and use knowledge. If they’re not, they can overwhelm the human cognitive system. So Dror focuses on developing helpful, real-world cognitive learning and performance enhancement applications.

“The mind is not a camera, we don’t record information passively, we are active and dynamic,” he said. And understanding the mysterious ways in which our brains are simultaneously dazzlingly smart, yet capable of tripping us up, has great relevance to experts and those who train them. That’s especially true when, like fighter pilots and police officers, those experts must make life-and-death decisions.

“There is a mismatch in technology,” Dror said. “We want technology to work efficiently. But it needs to fit human nature and the human brain like a hand in a glove, and many times they fight.”

Harnessing Shortcuts

A resident of the United Kingdom, Dror is affiliated with the University of London and has a consultancy and research company, Cognitive Consultants International. Dror’s international array of clients includes the Israeli National Police, the Netherlands National Police, Deutsche Bank, IBM and the AO Foundation, a global nonprofit that produces multimedia training for surgeons and operating room professionals. And he still works with the USAF.

For technology-enhanced learning and e-learning clients, Dror provides a vital missing link in training with his understanding of the science of the brain. Unlike some academics, he enjoys connecting the dots for people in the real world. Often, he makes clients see that while their enormously costly IT systems may look great on paper, they are hard to operate, user-unfriendly and simply don’t fit the way people perceive and absorb information.

An executive with a top international financial company, who asked not to be identified, concurred. “Itiel’s evidence-based approach showed us how cognitive overload is caused by presenting too many images and words on a screen. Furthermore, he showed us how to overcome cognitive overload — not by reducing the number of images and words, but by constructing them to better fit how the brain and cognitive system process information.”

In that early work with pilots, Dror quickly realized the caricature advantage’s wide and practical potential. Focus can be directed by making critically important images more dominant or prominent, or by using different colors to make things pop out. While a generic grey keyboard doesn’t help computer students learn, visually emphasizing the most crucial keys, which vary by software or application, does.

“In the aircraft, it’s certain features of the aircraft; in the keyboard, it’s certain keys,” he said. The advantages only become more dramatic with complex, information-heavy tasks like learning an airliner’s cockpit controls. It helps pilots both during training and actual flying if critical controls are more dominant and visible.

Left to their own devices, our brains handle the bottleneck of information constantly rushing in by being naturally adept at prioritizing, zeroing in on what’s important and developing shortcuts.

“Shortcuts,” Dror said, “enable the brain to deal with the mismatch between the capacity, the computational power, of the brain, and the need to process some of the information. The brain taking shortcuts is one of the cornerstones of human intelligence.

“The fact that we have a smaller brain, that we can’t process all the information, has, from an evolutionary perspective, forced the brain to be more sophisticated, to decide what’s important and what’s not important, to have a plan. ‘What am I going to do? What am I going to need for the future?’ This is the core of intelligence: thinking what information is important, not important and so on.”

Evidence of brain shortcuts is everywhere, once you know where to look. Take new drivers. They’re bombarded with information and pressure to make snap decisions and judgment calls, yet look how fast it all becomes second nature. Driving is a perfect example of our brains quickly learning to pick up vital information and screen out the rest.

Shortcuts help us differentiate between identical twins. Initially, they’re impossible to tell apart. Yet our brains unconsciously compare them, identify and latch on to the subtle differences in characteristics that will tell us who is who — a larger left nostril, a slightly higher right ear. Soon, once-subtle differences become so obvious that we can’t understand why the twins ever confused us at all.

“You consciously didn’t think about it,” Dror said, “but the brain, after seeing both of them so many times, figured it out.”

Shortcuts also factor into our perception of the facial features of people of races less familiar to us. It’s not racism at work but rather the scientifically established “other race effect” that can initially make it harder for someone who has grown up primarily exposed to one race to distinguish faces in other races as easily as they can their own. Once again, however, the brain soon adjusts with increased exposure and creates another shortcut.

The Too-Well Trodden Path

While our brains all constantly develop shortcuts, experts develop and rely on them more than most, Dror said, and are adept at selectively applying their attention more efficiently. “They turn into experts because they know what’s relevant or not. They have more past experience to rely on, more wisdom. They know how to look at the important information, what to disregard and how to filter better.”

Sometimes, however, these very things are what lead experts to make errors. Dror is currently writing a paper on the paradoxical brains of human experts. “The paradox being that as we enhance performance, as we tailor it more and more, we also find an increase in potential error and bias,” he said.

While such revelations are sometimes unwelcome, Dror noted, most professionals want to squeeze out bias. Their mistakes are in full view, and they want solutions.

“In most cases it’s like giving food to somebody who is hungry,” he said. “They want it, they love it, they know they’re missing it and they just can’t get enough. In the military domain, they know they shot their friend. In the medical domain, they have a dead body in front of them. In the financial domain, they have lost money.”

Law enforcement is sometimes the odd man out. “In the forensic domain, they don’t see their mistakes because people get convicted, they’re sent away, end of story.

“And I am the child who’s saying, ‘The emperor has no clothes on!’ And they want to slap the child in the face and say, ‘Be quiet!’ In the other domains, they know that the emperor has no clothes on. I’m offering real clothes and they want them because they see the dead body, they see the losses, they know they have a problem and they want to get better.”

Pascal Schmidt, head of e-learning at the Swiss AO Foundation’s AO Education, acknowledges that most further education is ineffective and says that working with the brain’s rules helps enormously. “Itiel has a fresh, deep way of looking at things. His enormous expertise enables him to see very fast if a method or a teaching concept has potential to work or not. He immediately asks the crucial questions.”

Donald H. Taylor is chairman of the U.K.-based Learning and Skills Group, which has more than 1,000 workplace learning and development professionals as members. He was keen to “replace some of the lightweight myths of learning with solid, scientifically based research.”

He credits Dror with helping group members better understand the brain’s role in learning and ways that e-learning can be optimized. “In particular, he has helped the group grasp the importance of the concept of cognitive load,” he says. “How the design of e-learning can reduce or increase this, and the knock-on effects for learning.”

The Inertia of Expertise

As might be expected, Itiel Dror is a sought-after speaker and has been known to show up at a lecture carrying a brain in a jar. But brain buddy or not, he sets the stage by starting his talks with a discussion of the human mind and the architecture of cognition. His extensive traveling library of photographs and illustrations helps him convey messages about complex scientific phenomena while warming up skeptical or hostile audiences and making experts who are convinced they are objective finally sit up and pay attention.

Sometimes he shows a very blurry image containing nothing decipherable, then a slightly enhanced version, just clear enough for some audience members to recognize something. Lastly, he shows a really clear version suddenly revealing a person, animal or object that was there all along but that was, until that moment in that final image, impossible to see. The real grabber is that if people then go back to the first blurry image, they’re likely to still be able to see what they previously could only see in the very clearest image. You can’t un-see something once you have seen it.

Dror’s point is unmistakable. “Once you go back, you see things differently, and you can’t turn it off, that’s it,” he says. This demonstration is reason enough for forensic experts to always do their first examination of evidence completely free of context. But the concept heightens the awareness of the potential for bias in all manner of experts.

Experts can seem infallible and objective even when they are not. And because brain shortcuts become more dominant and relied upon over time, Dror says that they can actually reduce objectivity and accuracy in highly experienced workers’ judgments and decisions. And those most sure of their own objectivity may be even more susceptible to error.

Specialized knowledge can also restrict creativity or create inflexible mindsets. Children are unfettered, thus very creative. By contrast, gaining expertise can come at the cost of boundless creativity and even the simple ability to see things from a fresh perspective.

“Many times we have to bring people from the outside to look at something we’re doing because we’re so stuck into it,” Dror said of experts. “We’re kind of locked in a state of mind or in a certain solution and we are limited. And this is exactly the paradox. It’s a cognitive tradeoff.”

Awareness of the pitfalls in complex decision-making processes can give us a new understanding of mystifying events like the global financial crisis. On this, Dror takes a more benevolent view than many. He sees honest mistakes rather than evil, subterfuge or skullduggery, simply because of his understanding of how human beings think and affect one another. And he knows that experts, far from being immune from making mistakes, are sometimes more susceptible.

“What makes them experts means they take shortcuts, they’re automatic, they take things for granted and they rely on a huge amount of experience and knowledge,” he said. “And it’s helpful in most cases. But in some cases the knowledge and experience, if they use their intuition, may take them the wrong way.”

The power of our underlying thought processes is the common thread linking expert errors in fields like forensics, finance, medicine and the military. Dror sees its impact written all over the key (or most infamous) players in the financial meltdown. While we wonder how all the top minds and brilliant financial thinkers could get everything so horribly wrong, Dror believes they were far less stupid, greedy and corrupt than many imagine. He sees smart, skilled people who lacked objectivity and awareness of issues perhaps affecting their decision-making.

“Perception and judgment in decision-making are not objective,” he explained, “and people who think they are fall into a trap. They’re vulnerable.”

Dror believes that financial industry players fell prey en masse to thought processes like wishful thinking, overconfidence in their own decisions, an escalating commitment to ideas and beliefs and plans, and an increasing unwillingness to cut their losses and change directions. “It’s not that they made one mistake,” he said. “They made mistake after mistake for many years because they escalated their commitment in the wrong places.”

He believes that conformity bias and “group think” ran rampant in entire financial communities. Everyone’s perception was off, so they chose and absorbed information selectively, were biased in their thinking and compounded all this in one another.

“All these psychological and cognitive issues in decision-making and judgment are undoubtedly the cause of the current economic crisis,” Dror asserted.

Love and Money

What his banking and financial industry clients have in common with pilots and police is risk taking and making decisions under time pressure. Fighter pilots must take risks to survive. But in banking and finance, you want people motivated and excited, not reckless, and you want risk taking tempered, not escalated.

“In the financial domain, we all know what happened when they took too much risk,” Dror said. “We’re all paying for it now.”

It’s not only experts who can benefit from being aware of, and understanding, human cognitive processes. Perception is far from perfection, Dror likes to say. In everyday life, he noted, what we see is based not only on what is really out there, but also, to a large extent, on who and what we are, what we expect, and what we hope for.

“Take the saying, ‘love is blind,’” he offered. “What does that mean? When you’re in love, things look different because our emotional state affects how we see things. That’s why, when a woman is pregnant, she or her spouse suddenly feel as if the world is full of pregnant women on every street. There are no more pregnant women than usual, but suddenly the brain is picking up that information.”

After recently tackling a do-it-yourself project at home, he noticed that everywhere he went he was suddenly seeing doors improperly aligned. “Because I’m doing it at home puts me in a mental state, so the brain filters out information differently,” he said.

It pays to know that how we see things is not exactly how they are, and to take a fresh look at what is usually taken for granted.

“Take what you believe is an absolute truth with a grain of salt,” Dror suggested. “Question yourself, and understand that we’re all locked in our own brain, in our own perceptions, with our own experiences that paint the world. We may have a better understanding of the world if we know that what we see is not 100 percent the world itself, it is us interacting with the world around us.”

People without this awareness tend to attribute too much of what is happening to them to the world, and not enough to themselves, and then take actions based on that skewed perception. With greater awareness, Dror said, “They can question and think about things and see them in a better light and hopefully improve their perceptions and decision-making, and the quality of their life.”

Pacific Standard magazine, 2009

Sue Russell