Eye Movement Research: Core NLP Hypothesis Vindicated

(c) Richard Bolstad

Finally, in 2020, an ancient mystery was solved and a cornerstone of the NLP model gained a foothold in science … but very few people noticed, even in the NLP community itself.

The History of the NLP Theory on Sensory Systems and Eye Movements

For 1200 years, people have noticed that the movement of the eyes up, down and to the sides is correlated with thinking. In a treatise called On the difference between spirit and soul, Qusta ibn Luqa (864-923) combined Nemesius’ ventricular localization doctrine with the Roman physician Galen’s account of a worm-like part of the brain that controls the flow of animal spirit between the middle and posterior ventricles. Ibn Luqa wrote that people who want to remember look upwards because this raises the worm-like part, opens a passage, and enables the retrieval of memories from the posterior ventricle of the brain. People who want to think, on the other hand, look down because this lowers the part, closes the passage, and protects the spirit in the middle ventricle from being disturbed by memories stored in the posterior ventricle. Qusta ibn Luqa al-Ba’albakki, i.e. from Baalbek or Heliopolis, in Lebanon, a Melkite Christian of Greek origin, lived in Baghdad. He was a philosopher, physician, mathematician and astronomer. His work was recorded by the Muslim scholar Ibn al-Nadim (Ibn al-Nadim 1871, page 234).

It was a millennium later that the same phenomenon was noticed by the developers of psychology in the west. In 1890, psychologist William James had already defined four key types of “thought”. He said (1950, Volume 2, p 58) “In some individuals the habitual “thought stuff”, if one may so call it, is visual; in others it is auditory, articulatory [verbal], or motor [kinesthetic]; in most, perhaps, it is evenly mixed.” James touches repeatedly on the issue of the eye movements which accompany and can be used as cues for sensory accessing. At one stage he quotes (Volume 2, p 50) Fechner’s “Psychophysique”, 1860, Chapter XLIV: “In imagining, the attention feels as if drawn backwards towards the brain.” Describing what happens when he himself visualises, James adds (James, Volume 2, p 65): “All these images seem at first as if purely retinal. I think, however, that rapid eye movements accompany them, though these latter give rise to such slight feelings that they are almost impossible of detection.”

Research identifying the neurological bases for these different types of “thought” began to emerge in the mid twentieth century. Much of it was based on the discovery that damage to specific areas of the brain caused specific sensory problems. A. Luria (1966) identified separate areas associated with vision, hearing, sensory-motor activity, and speech (the latter isolated on the dominant hemisphere of the brain). Evidence that eye movements were correlated with the use of different areas of the brain emerged in the 1960s (amongst the earliest being the study by M. Day, 1964).

In their 1980 presentation of NLP, Dilts, Grinder, Bandler and DeLozier (1980, p 17) suggest that all human experience can be coded as a combination of internal and external vision, audition, kinesthesis and olfaction/gustation. The combination of these senses at any time (VAKO/G) is called a 4-tuple. Kinesthetic external is referred to as tactile (somatosensory touch sensations) and kinesthetic internal as visceral (emotional and proprioceptive). The developers of NLP then suggested (1980, p 75) that auditory representation can be usefully divided into digital (verbal) and tonal. Auditory digital is described as a class of secondary experience, run from the dominant hemisphere of the brain and metacommenting on the other types of experience.

The developers of NLP proposed that each sensory system was run by a separate area of the brain, and that cues to its access were given by a particular eye movement pattern (1980, p 81). A person’s representation of their experience in a particular sensory system can be identified by the words (predicates) they use to describe their subject. For example, someone might say “I see what you mean.” visually, “I’ve tuned in to you.” auditorily, or “Now I grasp that.” kinesthetically. The standard NLP diagram of accessing cues places auditory digital on the left side (suggesting that all the accessing cues on that side may correspond to the dominant hemisphere, where verbal abilities are known to be processed). The NLP eye cues model places auditory digital down, opposite kinesthetic, rather than level, opposite auditory tonal. Howard Gardner argues that verbal thought should not be called “auditory”, as it is an entirely separate form of intelligence. He points out that the same area of the brain used by hearing people to generate verbal language is used by deaf people to generate sign language (Gardner, 1993, p 52).

In the field of NLP the ability to distinguish left from right has been of great interest because it is central to the condition “dyslexia”. The NLP solution to dyslexia has been twofold (Blackerby, 1996, p 153-155). Firstly to retrain the person’s recognition of left and right. Secondly to teach the person to use a Visual Recall spelling strategy (looking up left and picturing the word as it has been seen before).

Psychologists try to disprove the theory

In 2012, psychology researcher Richard Wiseman published research on the NLP eye movement hypothesis. Unfortunately, the hypothesis he chose to study is not the one that NLP proposed, nor is it the one referenced in the NLP sources his team cites. It is their own theory, stated at the start of their abstract:

“Proponents of Neuro-Linguistic Programming (NLP) claim that certain eye-movements are reliable indicators of lying. According to this notion, a person looking up to their right suggests a lie whereas looking up to their left is indicative of truth telling. Despite widespread belief in this claim, no previous research has examined its validity. In Study 1 the eye movements of participants who were lying or telling the truth were coded, but did not match the NLP patterning. In Study 2 one group of participants were told about the NLP eye-movement hypothesis whilst a second control group were not. Both groups then undertook a lie detection test. No significant differences emerged between the two groups. Study 3 involved coding the eye movements of both liars and truth tellers taking part in high profile press conferences. Once again, no significant differences were discovered. Taken together the results of the three studies fail to support the claims of NLP. The theoretical and practical implications of these findings are discussed.”

Wiseman et alia found that this hypothesis, misnamed by them as a claim by proponents of NLP, is incorrect. That’s very good news for NLP, which as a field has always cautioned that eye movements tell us which area of the brain is being accessed, and tell us nothing about whether the access is true to facts or not true to facts. Below their attention-grabbing title and abstract, Wiseman et alia do actually say “Although the originators of NLP didn’t view ‘constructed’ thoughts as lies, this notion has become commonplace, leading many NLP practitioners to claim that it is possible to gain a useful insight into whether someone is lying from their eye-movements.” To the extent that any did, they can now stop misrepresenting NLP. However, the only place Wiseman et alia actually found their hypothesis advocated was on two videos on YouTube. They claim that “two well known YouTube videos encouraging lie detectors to adopt this approach have received 30,000 and 60,000 views respectively.”

Richard Gray, an NLP Trainer, says in his article, referenced by the current Wiseman et alia research: “For most right-handed people eye movement up and to the left is a signal that they are attempting to access a visual memory. Movement up and to the right usually signals that the client is constructing a visual image. Auditory patterns follow the same left-right pattern, left for remembered, right for constructed (Grinder & Bandler, p. 80 ff., Bandler & Grinder, 1979). When a client is asked a concrete question, “Where were you last night?”, eye movement up or over to the right might suggest that he or she is constructing a response, not recalling one. This in itself may indicate valuable lines for further investigation.” This hardly constitutes a claim that the person is lying. Vrij and Lochun, also referenced in this study, emphasise, in complete contradiction to the research presuppositions, that: “It is important to note that NLP-theorists never mention the possibility of detecting lies by observing eye movements, nevertheless some police officers believe that it is possible to do so.”

Earlier Research On The Real NLP Eye Movement Phenomenon

Wiseman et alia say with some correctness that “Throughout the 1980s researchers examined many of the claims made by NLP practitioners. Much of this work assessed the alleged relationship between eye-movement and modality of thought, and involved recording participants’ eye-movements whilst asking them questions that encouraged [them] to recall visual and auditory memories (e.g., ‘What colour is the front door of your house?’, ‘Can you describe the sound of your mother’s voice?’). This work consistently failed to support the claims of NLP.” Noting that this failure to gather support was a result both of NLP’s original lack of clarity and of untrained observers doing the research, Eric Einspruch and Bruce Forman said in their 1985 review of research on NLP: “Many skilled NLP Practitioners have a wealth of clinical data indicating that this model is highly effective. Clearly these Practitioners would provide a service to the field by presenting their data in the literature so they may be critically evaluated.”

However, it is also true that research supporting this particular small part of the NLP model does exist, and did even at the time of Wiseman’s study. Some researchers have managed to combine research conditions with the requirements for training identified by Einspruch and Forman. For example, Dr Susan Nate (Nate 2004) completed a research study of 50 children aged between 8 and 12 years old, both boys and girls, of White American, Hispanic American, Native American, Black American and Asian American ethnic identity. Two NLP trained examiners asked the children a series of 23 questions and wrote down their observed eye movements, while a video camera recorded the results. A chi-square data analysis of the results was done, but the conclusions were also immediately obvious and confirmed the NLP eye accessing cue model. There was no difference in results by age, sex, ethnic group or handedness.

Susan Nate’s study consistently confirmed the NLP model in all ways except one. While a small percentage of children had the eye cues reversed from left to right, this did not seem to be predicted by their handedness. Three children (6%) had only the visual eye cues reversed, and two children (4%) had all cues reversed. The two children with all cues reversed were both right handed. Nine children (18%) looked straight ahead rather than up on some of the visual recall questions, which has been hypothesized to be a result of the information being more easily available. Some children’s eye movements were far more dramatic, taking more time and swinging further in the required direction. More compact movements were associated with faster retrieval times. Some children showed clear “lead” systems, to use the NLP term (an initial meta-access before moving to the direction that answers the question; one of the key elements requiring training to detect). One child, for example, had to repeat every single question to himself in auditory digital (with the relevant eye movement) before finding the answer in the usual place.
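The kind of chi-square analysis Nate reports can be illustrated with a short sketch. The counts below are invented purely for illustration (her paper’s actual data are not reproduced here); the test asks whether the direction of a child’s eye movement is independent of the type of question asked, which is exactly what a Pearson chi-square test of a contingency table measures.

```python
# Hypothetical counts (invented for illustration, not Nate's data):
# rows are question type, columns are whether the observed eye movement
# matched the direction the NLP accessing-cue model predicts.
observed = [
    [84, 16],   # visual-recall questions: matched / did not match
    [22, 78],   # control questions:       matched / did not match
]

def chi_square(table):
    """Pearson chi-square statistic for a contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (obs - expected) ** 2 / expected
    return stat

stat = chi_square(observed)
# For a 2x2 table, df = 1; the critical value at alpha = 0.05 is 3.841
print(f"chi-square = {stat:.1f}, significant: {stat > 3.841}")
```

With these made-up counts the statistic is far above the 3.841 critical value, so the null hypothesis of independence would be rejected; a real analysis would of course use the observed frequencies from the study itself.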

Other research solves the problem of observer bias versus untrained observational failure in novel ways. Because of the systemic nature of the brain, the developers of NLP proposed that if thinking visually causes your eyes to be drawn up more, then placing the eyes up more will help you to visualise. Specifically, looking up to the left (for most people) will help them recall images they have seen before. Dr F. Loiselle at the University of Moncton in New Brunswick, Canada (1985) tested this. He selected 44 average spellers, as determined by their pretest on memorising nonsense words. Instructions in the experiment, where the 44 were required to memorise another set of nonsense words, were given on a computer screen. The 44 were divided into four subgroups for the experiment.

  • Group One were told to visualise each word in the test, while looking up to the left.
  • Group Two were told to visualise each word while looking down to the right.
  • Group Three were told to visualise each word (no reference to eye position).
  • Group Four were simply told to study the word in order to learn it.

The results on testing immediately afterwards were that Group One (who did actually look up left more than the others, but took the same amount of time) increased their success in spelling by 25%, Group Two worsened their spelling by 15%, Group Three increased their success by 10%, and Group Four scored the same as previously. This strongly suggests that looking up left (Visual Recall in NLP terms) enhances the recall of words for spelling, and is twice as effective as simply teaching students to picture the words. Furthermore, looking down right (Kinesthetic in NLP terms) damages the ability to visualise the words. Interestingly, in a final test some time later (testing retention), the scores of Group One remained constant, while the scores of the control group, Group Four, plummeted a further 15%, a drop which was consistent with standard learning studies. The resultant difference in memory of the words for these two groups was 61%.

Thomas Malloy at the University of Utah Department of Psychology completed a study with three groups of spellers, again pretested to find average spellers. One group were taught the NLP “spelling strategy” of looking up and to the left into Visual Recall, one group were taught a strategy of sounding out by phonetics and auditory rules, and one were given no new information. In this study the tests involved actual words. Again, the visual recall spellers improved 25%, and had near 100% retention one week later. The group taught the auditory strategies improved 15% but this score dropped 5% in the following week. The control group showed no improvement. (Dilts and Epstein, 1995).

Another intriguing way to test the hypothesis is to check whether people look to the left when referring to the past and to the right when referring to the future, whether they gesture in those directions, and even whether they find it easier to respond to questions about the past or future from each direction. Boroditsky (2000) tested the relationship between time and space by giving questionnaires to Stanford University undergraduates, and found an obvious relationship between spatial schemas and the perception of time. A reaction time method was then adopted by Santiago and his colleagues (2007), who tested the left/right spatial relation in people’s cognitive conception of time. They found that reaction times were faster when past words were mapped onto the left key and, similarly, future words onto the right key. Abdul Rahman (2011) confirmed the relationship in another cultural setting.
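The Santiago-style reaction time comparison boils down to simple arithmetic: average the response times for the congruent mapping (past words on the left key, future words on the right) and for the reversed, incongruent mapping, and look at the difference. The millisecond values below are invented for illustration, not taken from the 2007 study.

```python
# Hypothetical reaction times in ms (invented for illustration) for a
# Santiago-style task: participants press a left or right key to
# classify words as past- or future-related. "Congruent" trials map
# past words to the left key and future words to the right key;
# "incongruent" trials reverse the mapping.
congruent   = [512, 498, 530, 505, 521, 490]
incongruent = [548, 561, 540, 555, 533, 570]

def mean(values):
    """Arithmetic mean of a list of reaction times."""
    return sum(values) / len(values)

# A positive advantage means responses were faster under the
# past-left / future-right mapping, as the study reported.
advantage = mean(incongruent) - mean(congruent)
print(f"congruency advantage: {advantage:.1f} ms")
```

A real analysis would also test whether the advantage is statistically reliable across participants (Santiago and colleagues used within-subject comparisons), but the direction of the difference is the point of interest here.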

Eye Accessing Cues Research 2015

In 2015, scientists finally detected the exact neurological response that generates what NLP calls “eye accessing cues”. The research did not show that the movements to different places indicate activation of different areas of the brain (that emerged in 2020: see below), but it did show that each eye movement occurs just before activation of a specific brain circuit holding a specific memory or idea. This research emerged out of studies of REM (Rapid Eye Movement) sleep, the phase when dreaming happens. It had always been known that eye movements to the sides occur during this sleep, and it was hypothesized that sleepers might be scanning things in their dream images. However, even people blind from birth have these movements. The next part of the puzzle was that researchers noticed that these eye movements during sleep are similar to those that happen when awake people imagine a new image.

Finally, scanning people’s brains while asleep, researchers from Tel Aviv University found that there was a burst in the activity of neurons that occurred just after the person’s eyes flickered. This activity reflected a change of concept or scene (not image processing) during sleep. The scientists demonstrated that this was the same brain activity that occurred when awake patients were shown pictures, especially those related to their memories. “About 0.3 seconds after the picture appears, these neurons burst — they become vigorously active,” Dr. Yuval Nir, who co-authored the study published in Nature Communications, explained to BBC News. “This also happens when people just close their eyes and imagine these pictures, or these concepts.”

The new research was conducted over a period of four years, using data collected from 39 individuals who suffer from epilepsy. The patients already had electrodes implanted in their brains to try and help manage their seizures, and this allowed Dr. Nir the perfect chance to measure the activity of around 40 individual neurons — mainly within the medial temporal lobe located towards the bottom of the brain — while the volunteers slept. Dr. Nir told New Scientist magazine “Every time you move your eyes, a new image forms in the mind’s eye.” (Andrillon et alia, 2015). And this is exactly what NLP has been saying.

Reaccessing the REM state?

Before we move on to the final part of this story, let’s step to the side and notice another important implication of this line of research. Can we artificially reactivate the REM state where multiple memories are being reprocessed? Tantalizing hints emerge from processes such as the NLP Eye Movement Integration process, used by Steve Andreas. In this process, while people place their attention on a distressing memory, they move their eyes relatively quickly from side to side and corner to corner. Afterwards, they do indeed report that the memory seems to have been “reprocessed” and they feel better when they think of it. While we do not have clinical research on this exact protocol for the Eye Movement process, we do have some dramatic research on EMDR, based on a very similar protocol. A 2012 study of 22 people found that EMDR therapy helped 77 percent of the individuals with psychotic disorder and PTSD. It found that their hallucinations, delusions, anxiety, and depression symptoms were significantly improved after treatment. The study also found that symptoms were not exacerbated during treatment. Only five of the twenty-two completers (22.7%) still met criteria for PTSD after treatment (van den Berg and van der Gaag, 2012).

2020: Vindication?

But back to the original NLP Eye Movement Hypothesis. 2020 was the first year when precisely the same data emerged from a non-NLP psychology research process. Research by Christophe Carlei & Dirk Kerzel at the Université de Genève in 2020 initially studied people looking in different directions as they viewed French words, whose grammatical gender they had to identify. The researchers called this study “Looking up improves performance in verbal tasks” but in NLP terms this is a visual task. They acknowledge: “In contrast to verbal processing, there is evidence that spontaneous eye movements are directed to the upper-left visual field when questions are asked that imply visuo-spatial processing (Ehrlichman, 1977; Ehrlichman, Weiner, & Baker, 1974; Galin & Ornstein, 1974; Kinsbourne, 1972). Interestingly, this finding is consistent with assumptions in Neuro-Linguistic Programming (NLP, Ahmad, 2013) stating that visuo-spatial processing is better when participants look up. We recently found experimental support for this hypothesis using the unilateral gaze method (Carlei & Kerzel, 2014, 2015). More precisely, performance on visuo-spatial tasks was facilitated when observers looked to the upper left corner of the screen.” (Carlei and Kerzel, 2020). Qusta ibn Luqa, you were right, 1200 years ago!

In 2023, further support for this came from a study at Tohoku University in Japan (Matsumiya and Furukawa, 2023). “This suggests that saccades are useful for making inferences about covert perceptual decisions, even when the actions are not tied to decision making.”

Bibliography/Sources

  • Abdul Rahman, N. Conceptualisation of Past and Future as Moving from Left to Right http://ainirahman.wordpress.com/2011/01/20/conceptualisation-of-past-and-future-as-moving-from-right-to-left/
  • Afsane, M., Lotfollah, K., Seyyed, D.A. (2016) “The Impact of Neuro Linguistic Programming (NLP) on EFL Learners’ Vocabulary Achievement”, IOSR Journal Of Humanities And Social Science (IOSR-JHSS), Volume 21, Issue 11, Ver. 11 (Nov. 2016), pp. 27-37, doi: 10.9790/0837-2111112737
  • Ahmad, K. Z. (2013). Lying eyes: The truth about NLP eye patterns and their relationship with academic performance in business and management studies (MBA). International Journal of Business and Management, 8(23), 67–75. doi:10.5539/ijbm.v8n23p67
  • Andrillon, T., Nir, Y., Cirelli, C., Tononi, G., and Fried, I. “Single-neuron activity and eye movements during human REM sleep and awake vision”, Nature Communications, Volume: 6, Article number: 7884, doi:10.1038/ncomms8884, 11 August 2015
  • Armstrong, T. 7 Kinds Of Smart, Plume/Penguin, New York, 1993
  • Bandler R, Grinder J (1975) The Structure of Magic I. Palo Alto, CA: Science and Behavior Books.
  • Bandler, R., & Grinder, J. (1979). Frogs into Princes. Moab, Utah: Real People Press.
  • Beck CE, Beck EA (1984) Test of the Eye-Movement Hypothesis of Neurolinguistic Programming: A rebuttal of conclusions. Percept Mot Skills 58: 175–176.
  • Blackerby, D.A. Rediscover The Joy Of Learning, Success Skills, Oklahoma, 1996
  • Bolstad, R. & Hamblett, M., Transforming Communication, Longman, Auckland, 1998
  • Boroditsky, L. (2000). Metaphoric structuring: understanding time through spatial metaphors. Cognition, 75 (1), 1-28.
  • Carlei, C., & Kerzel, D. (2014). Gaze direction affects visuo-spatial short-term memory. Brain and Cognition, 90, 63–68. doi:10.1016/j.bandc.2014.06.007
  • Carlei, C., & Kerzel, D. (2015). The effect of gaze direction on the different components of visuo-spatial short-term memory. Laterality: Asymmetries of Body, Brain and Cognition, 20(6), 738–754. doi:10.1080/1357650X.2015.1047380
  • Carlei, C. & Kerzel, D. (2020) Looking up improves performance in verbal tasks, Laterality, 25:2, 198-214, DOI: 10.1080/1357650X.2019.1646755
  • Chapman LJ, Chapman JP (1967) Genesis of popular but erroneous psychodiagnostic observations. J Abnorm Psychol 72: 193–204.
  • Day, M. “An Eye Movement Phenomenon Relating to Attention, Thoughts, and Anxiety” in Perceptual Motor Skills, 1964
  • Dilts, R. Roots Of Neuro-Linguistic Programming, Meta Publications, Cupertino, California, 1983
  • Dilts, R., Grinder, J., Bandler, R. and DeLozier, J. Neuro-Linguistic Programming: Volume 1 The Study of the Structure of Subjective Experience, Meta Publications, Cupertino, California, 1980
  • Dilts, R.B. and Epstein, T.A. Dynamic Learning, Meta Publications, Capitola, 1995
  • Dilts, R.B. Strategies of Genius, Volume I, II, and III, Meta Publications, Capitola, 1994-5
  • Einspruch, E.L., and Forman, B.D., “Observations Concerning Research Literature on Neuro-Linguistic Programming”, in Journal of Counselling Psychology, Vol 32, 4, p589-596, 1985
  • Ehrlichman, H. (1977). Field-dependence-independence and lateral eye-movements following verbal and spatial questions. Perceptual and Motor Skills, 44(3), 1229–1230. doi:10.2466/pms.1977.44.3c.1229
  • Ehrlichman, H., & Weinberger, A. (1978). Lateral eye movements and hemispheric asymmetry: A critical review. Psychological Bulletin, 85(5), 1080–1101. doi:10.1037/0033-2909.85.5.1080
  • Ehrlichman, H., Weiner, S. L., & Baker, A. H. (1974). Effects of verbal and spatial questions on initial gaze shifts. Neuropsychologia, 12(2), 265–277. doi:10.1016/0028-3932(74)90012-8
  • Ekman P (2001) Telling lies. Clues to deceit in the marketplace, politics, and marriage. New York: W. W. Norton & Company.
  • Elich M, Thompson RW, Miller L (1985) Mental imagery as revealed by eye movements and spoken predicates: A test of neurolinguistic programming. J Couns Psychol 32: 622–625.
  • Galin, D., & Ornstein, R. (1974). Individual differences in cognitive style-1. Reflective eye movements. Neuropsychologia, 12(3), 367–376. doi:10.1016/0028-3932(74)90052-9
  • Gardner, H. Frames Of Mind: The Theory Of Multiple Intelligences, BasicBooks, New York, 1993
  • Gray R (1991) Tools for the trade: Neuro-linguistic programming and the art of communication. Fed Probat 55: 11–16.
  • Heap M (2008) The validity of some early claims of neuro-linguistic programming. Skeptical Intelligencer 11: 6–13.
  • Ibn al-Nadim, Kitab al-Fihrist, edited with annotations by Gustav Flügel, 2 vols., Leipzig, 1871
  • James, W. The Principles Of Psychology (Volume 1 and 2), Dover, New York, 1950.
  • Kinsbourne, M. (1972). Eye and head turning indicates cerebral lateralization. Science, 176(4034), 539–541. doi:10.1126/science.176.4034.539
  • Luria, A.R. Higher Cortical Functions In Man, Basic Books, New York, 1966
  • Matsumiya, K., Furukawa, S. Perceptual decisions interfere more with eye movements than with reach movements. Commun Biol 6, 882 (2023). https://doi.org/10.1038/s42003-023-05249-4
  • Nate, S. “Eye Accessing Cues: A Study in Storage and Retrieving Information” p 35-50 in Anchor Point Volume 18, No 4, June 2004
  • Norton, S. “Excellence in Education: NLP and Multiple Intelligences” in Anchor Point, June 1994, p 50-53
  • Oldfield RC (1971) The assessment and analysis of handedness: The Edinburgh Inventory. Neuropsychologia 9: 97–113.
  • Porter S, ten Brinke L (2010) The truth about lies: What works in detecting high-stakes deception? Legal and Criminological Psychology 15: 57–75.
  • Redelmeier DA Tversky A (1996) On the belief that arthritis pain is related to the weather. Proc Natl Acad Sci USA 93: 2895–6.
  • Rhoads SA, Solomon R (1987) Subconscious rapport building: Another approach to interviewing. The Police Chief 4: 39–41.
  • Santiago, J., Lupianez, J., Perez, E., & Funes, M. (2007). Time (also) flies from left to right. Psychonomic Bulletin and Review, 14 (3), 512-516.
  • Sharpley CF (1984) Predicate matching in NLP: A review of research on the preferred representational system. J Couns Psychol 31: 238–248.
  • Sharpley CF (1987) Research findings on neurolinguistic programming: Nonsupportive data or an untestable theory? J Couns Psychol 34: 103–107.
  • ten Brinke L, Porter S (in press) Cry me a river: Identifying the behavioural consequences of extremely high-stakes interpersonal deception. Law Hum Behav. Available at: https://people.ok.ubc.ca/stporter/Publications_files/Cry%20Me%20a%20River%20-%20in%20press.pdf.
  • Thomason TC, Arbuckle T, Cady D (1980) Test of the Eye Movement Hypothesis of Neurolinguistic Programming. Percept Mot Skills 51: 230.
  • Van den Berg, D and van der Gaag, E., (2012) “Treating trauma in psychosis with EMDR: A pilot study” Journal of Behavior Therapy and Experimental Psychiatry, Volume 43, Issue 1, March 2012, Pages 664-671
  • Vrij A (2008) Detecting lies and deceit: Pitfalls and opportunities. Chichester: Wiley.
  • Vrij A, Lochun SK (1997) Neuro-linguistic programming and the police: Worthwhile or not? Journal of Police and Criminal Psychology 12: 25–31.
  • Wilcox, J., The Transmission and Influence of Qusta ibn Luqa’s “On the Difference between Spirit and the Soul”, PhD thesis, City University of New York, 1985
  • Wiseman R, Watt C, ten Brinke L, Porter S, Couper S-L, et al. (2012) The Eyes Don’t Have It: Lie Detection and Neuro-Linguistic Programming. PLoS ONE 7(7): e40259. doi:10.1371/journal.pone.0040259