Monday, August 4, 2014

NLP: How to Read People (Part 2)

Continued from NLP: How to Read People

In order to read people and understand what they ‘do’ inside themselves with different types of experience, we must know how our brains ‘work’. For this reason, we shall draw on Dr Richard Bolstad’s brilliant article “Putting The “Neuro” Back Into NLP” (Bolstad, 2003).

Everything we experience of the world comes to us through the neurological channels of our sensory systems. The greatest events and the most tender interpersonal moments are "experienced" (transformed into internal experiences) as images (visual), sounds (auditory), body sensations (kinesthetic), tastes (gustatory), smells (olfactory) and learned symbols such as these words (digital). Those experiences, furthermore, can be re-membered (put together again) by use of the same sensory information.

Think of a fresh lemon.  Imagine one in front of you now, and feel what it feels like as you pick it up.  Take a knife and cut a slice off the lemon, and hear the slight sound as the juice squirts out.  Smell the lemon as you lift the slice to your mouth and take a bite of the slice.  Taste the sharp taste of the fruit.

If you actually imagined doing that, your mouth is now salivating. Why? Because your brain followed your instructions and thought about, saw, heard, felt, smelled and tasted the lemon. By recalling sensory information, you recreated the entire experience of the lemon, so that your body responded to the lemon you created. Your brain treated the imaginary lemon as if it were real, and prepared saliva to digest it. Seeing, hearing, feeling, smelling and tasting are the natural “languages” of your brain. Each of them has a specialised area of the brain which processes that sense. Another NLP term for these senses is "Modalities". When you use these modalities, you access the same neurological circuits that you use to experience a real lemon. As a result, your brain treats what you’re thinking about as “real”.

Perception is a complex process by which we interact with the information delivered by our senses. There are areas of the neural cortex (the outer layer of the brain) which specialise in information from each of the senses, or modalities: olfactory, gustatory, somatosensory, auditory and visual. However, there is no direct connection between a sense organ (the retina of the eye, for example) and the specialised brain area which handles that sense.

What we see is affected by our emotions, and it also shapes those emotions. Depression, anxiety, confusion, and anger are all related to certain types of perceptual distortion. So are joy, excitement, understanding and love. For example, the person who is depressed often actually takes their visual memories of the day's experiences and darkens them, creating a gloomy world.

Take a memory of a recent experience you enjoyed, and imagine seeing it dull and grey. Usually, this doesn't feel as good, so make sure you change it back to colour afterwards.

Conversely, if a day or a period of life contained no emotionally coloured experience, we would remember nothing of it. We cannot live without bright events and people we love. In their absence, we imagine continuations of stories in ways that would satisfy us, or we recall events with an intensely negative connotation.

In the absence of pleasant, bright, positive experience, for example, a person with Post-Traumatic Stress Disorder recreates vivid and terrifying flashbacks of hurtful events, being frightened and suffering in order to get intense experience.

Some talented people construct bright events in their imagination to substitute for the dullness and boredom of their everyday existence. Our brain does not distinguish between events and people actually seen and those merely imagined.

That is why, without knowing how the brain’s mechanisms function, it is sometimes not easy to distinguish a person’s reality from their created experience.

Sensory Accessing and Representational Cues

As a person goes through their daily activities, information is processed in all the sensory modalities, continuously. However, the person's conscious attention tends to be on one modality at a time. It is clear that some people have a strong preference for "thinking" (to use the term generically) in one sensory modality or another.

As early as 1890, William James, one of the founders of modern psychology, defined four key types of "imagination" based on this fact. He wrote: “In some individuals the habitual “thought stuff”, if one may so call it, is visual; in others it is auditory, articulatory [to use an NLP term, auditory digital], or motor [kinesthetic, in NLP terms]; in most, perhaps, it is evenly mixed. The auditory type... appears to be rarer than the visual.

Persons of this type imagine what they think of in the language of sound. In order to remember a lesson they impress upon their mind, not the look of the page, but the sound of the words....

The motor type remains, perhaps the most interesting of all, and certainly the one of which least is known. Persons who belong to this type make use, in memory, reasoning, and all their intellectual operations, of images derived from movement.... There are persons who remember a drawing better when they have followed its outlines with their finger.” (James, 1950, Volume 2, p 58-61).

Research identifying the neurological bases for these different types of “thought” began to emerge in the mid twentieth century. Much of it was based on the discovery that damage to specific areas of the brain caused specific sensory problems.

By the time NLP emerged in the 1970s, then, researchers already understood that each sensory system had a specialised brain area, and that people had preferences for using particular sensory systems. In their original 1980 presentation of NLP, Dilts, Grinder, Bandler and DeLozier (1980, p 17) point out that all human experience can be coded as a combination of

internal and external vision (V),
audition (A),
kinesthesis (K), and
olfaction/gustation (O/G).

Kinesthetic external is referred to as tactile (somatosensory touch sensations) and kinesthetic internal as visceral (emotional and proprioceptive sensations).

The developers of NLP noticed that we also process information in words and that words too have a specific brain system specialised to process them, as if they were a sensory system.

They described this verbal type of information as “auditory digital”, distinguishing it from the auditory input we get, for example, in listening to music or to the sound of the wind.

In thinking in words (talking to ourselves) we pay attention specifically to the “meaning” coded into each specific word, rather than to the music of our voice.

Robert Dilts (1983, section 3, p 1-29) showed that different brain wave (EEG) patterns were associated with visual, auditory, tactile and visceral thought. The developers of NLP claimed to have identified a number of more easily observed cues which let us know which sensory system a person is using (or "accessing") at any given time.

Amongst these cues are a series of largely unconscious eye movements which people exhibit while thinking (1980, p 81). These "eye movement accessing cues" have become the most widely discussed of all the NLP discoveries. Outside of NLP, evidence that eye movements were correlated with the use of different areas of the brain emerged in the 1960s (M. Day, 1964).

In the absence of fresh impressions, positive and pleasant experience, and bright, meaningful events and people, a person gets stuck in their preferred channels of perceiving and processing information, suffers and experiences ‘problems’, or seeks stimulating information from outside sources. For instance:

Auditory digital – talk to themselves, or think, think and think
Visual – see or construct images: events, people, colouring their experience
Auditory – talk to other people, ‘hear voices’, or listen to music, radio, TV, etc.
Kinesthetic – seek self-stimulating experience: watching erotic films (as simulators), racing in the streets, fighting, etc.

To be continued…

Natalia Levis-Fox,
NLP Practitioner
Tel.: +7 928 266 93 13
Private practice, on-line consultations
License No 314265119000560

  1. Adler, R. “Crowded Minds” in New Scientist, Vol. 164, No. 2217, p 26-31, December 18, 1999.
  2. Bandler, R. Using Your Brain For A Change, Real People Press, Moab, Utah, 1985.
  3. Bolstad, R. and Hamblett, M. “Visual Digital: Modality of the Future?” in NLP World, Vol. 6, No. 1, March 1999.
  4. Bolstad, R. “Putting The “Neuro” Back Into NLP”, Christchurch, NZ, 2003.
  5. Day, M. “An Eye Movement Phenomenon Relating to Attention, Thoughts, and Anxiety” in Perceptual and Motor Skills, 1964.
  6. Dilts, R. Roots Of Neuro-Linguistic Programming, Meta Publications, Cupertino, California, 1983.
  7. Dilts, R., Grinder, J., Bandler, R. and DeLozier, J. Neuro-Linguistic Programming: Volume 1, The Study of the Structure of Subjective Experience, Meta Publications, Cupertino, California, 1980.
  8. Hoffman, D.D. Visual Intelligence, W.W. Norton & Co., New York, 1998.
  9. James, W. The Principles Of Psychology (Volumes 1 and 2), Dover, New York, 1950.
