
Where Do We Feel Different Emotions in the Body? Love Makes Us Warm All Over.

I am fascinated by the gestures of great speakers. I am studying Hitler's body language for a Discovery Channel documentary series. Hitler practiced specific gestures to make when he was giving speeches, and many of them are expansive, weapon-like gestures meant to make him appear large, powerful, omnipotent, and dangerous. In several of his practiced gestures in the famous posed Hoffmann photos, one hand is at head level or above it. Hitler used anger in most of his speeches, and it's interesting that anger activates the upper body: the head, shoulders, upper chest, arms, and hands.
Here is an interesting study about what part of the body is activated when we feel different emotions. The findings were self-reported, so more research needs to be done. But I find it fascinating that we think we feel different emotions in different parts of our body.

http://www.npr.org/sections/health-shots/2013/12/30/258313116/mapping-emotions-on-the-body-love-makes-us-warm-all-over

Mapping Emotions On The Body: Love Makes Us Warm All Over

People drew maps of body locations where they feel basic emotions (top row) and more complex ones (bottom row). Hot colors show regions that people say are stimulated during the emotion. Cool colors indicate deactivated areas.
Image courtesy of Lauri Nummenmaa, Enrico Glerean, Riitta Hari, and Jari Hietanen.
Close your eyes and imagine the last time you fell in love. Maybe you were walking next to your sweetheart in a park or staring into each other's eyes over a latte.
Where did you feel the love? Perhaps you got butterflies in your stomach or your heart raced with excitement.
When a team of scientists in Finland asked people to map out where they felt different emotions on their bodies, they found that the results were surprisingly consistent, even across cultures.
People reported that happiness and love sparked activity across nearly the entire body, while depression had the opposite effect: It dampened feelings in the arms, legs and head. Danger and fear triggered strong sensations in the chest area, the volunteers said. And anger was one of the few emotions that activated the arms.
The scientists hope these body emoticons may one day help psychologists diagnose or treat mood disorders.
"Our emotional system in the brain sends signals to the body so we can deal with our situation," says Lauri Nummenmaa, a psychologist at Aalto University who led the study.
"Say you see a snake and you feel fear," Nummenmaa says. "Your nervous system increases oxygen to your muscles and raises your heart rate so you can deal with the threat. It's an automated system. We don't have to think about it."
That idea has been known for centuries. But scientists still don't agree on whether these bodily changes are distinct for each emotion and whether this pattern serves as a way for the mind to consciously identify emotions.
Basic emotions, such as happiness, sadness and fear, form the building blocks for more complex feelings.
Toddatkins/Wikimedia.org
To try to figure that out, Nummenmaa and his team ran a simple computer experiment with about 700 volunteers from Finland, Sweden and Taiwan.
The team showed the volunteers two blank silhouettes of a person on a screen and then told the subjects to think about one of 14 emotions: love, disgust, anger, pride, etc. The volunteers then painted areas of the body that felt stimulated by that emotion. On the second silhouette, they painted areas of the body that get deactivated during that emotion.
"People find the experiment quite amusing. It's quite fun," Nummenmaa tells Shots. "We kept the questions online so you try the experiment yourself." (You can try it here.)
Not everybody painted each emotion in the same way. But when the team averaged the maps together, signature patterns emerged for each emotion. The team published these sensation maps Monday in the Proceedings of the National Academy of Sciences.
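To make that averaging step concrete, here is a minimal Python sketch of how many individually painted silhouettes could be combined into one group-level sensation map. It illustrates the general idea only, not the authors' actual analysis code; the grid size, the coding of painted pixels, and the toy "anger" maps are all invented for the example.

import numpy as np

# Illustrative sketch only (not the study's analysis code): each participant
# "paints" a body silhouette, represented here as a 2-D grid where
# +1 = reported activation, -1 = reported deactivation, 0 = left blank.
HEIGHT, WIDTH = 100, 40

def average_sensation_map(participant_maps):
    # Average many individual painted maps into one group-level map.
    # Strongly positive cells correspond to the "hot" regions in the
    # published figures, strongly negative cells to the "cool" regions.
    stacked = np.stack(participant_maps, axis=0)
    return stacked.mean(axis=0)

# Toy data: three hypothetical participants painting "anger", which the study
# reports as activating mainly the head, chest, and arms.
rng = np.random.default_rng(0)
maps = []
for _ in range(3):
    m = np.zeros((HEIGHT, WIDTH))
    m[:30, :] = 1                                   # head / upper chest painted active
    m[60:, :] = -rng.integers(0, 2, (40, WIDTH))    # legs sometimes painted deactivated
    maps.append(m)

group_map = average_sensation_map(maps)
print(group_map[:30].mean(), group_map[60:].mean())  # positive upper body, negative legs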
The team still doesn't know how these self-reported sensations match with the physiological responses that occur with emotion.
But previous studies have found marked changes in bodily sensations in mood disorders, Nummenmaa says. "For instance, with depression sometimes people have pain in their chest."
And there's even some evidence that when you change your own body language — like your posture or stance — you can alter your mind.
Neuroscientist Antonio Damasio, who was not involved in this study, says he's "delighted" by Nummenmaa's findings because they offer more support for what he's been suggesting for years: Each emotion activates a distinct set of body parts, he thinks, and the mind's recognition of those patterns helps us consciously identify that emotion.
"People look at emotions as something in relation to other people," Damasio, who is a professor at the University of Southern California, says. "But emotions also have to do with how we deal with the environment — threats and opportunities." For those, Damasio says, you need your body as well as your mind.


Patti Wood, MA, Certified Speaking Professional - The Body Language Expert. For more body language insights go to her website at www.PattiWood.net. Check out Patti's website for her new book "SNAP, Making the Most of First Impressions, Body Language and Charisma" at www.snapfirstimpressions.com. Also check out Patti's YouTube channel at http://youtube.com/user/bodylanguageexpert.

Argo and Audience Interaction


Have you ever been to a movie where the entire audience applauded enthusiastically at the end of the story, then all stayed through the ending credits and clapped again? I saw Argo a few weeks ago and really enjoyed the movie and the audience's response to it. Though my friend Jerry said I had to stop gasping so loudly, I was not the only one who found it an emotionally gripping movie. This was one of those fantastic theater audience experiences. The isopraxism, as we pulled together and expressed the same emotions nonverbally, made the movie more powerful. Yes, sometimes it is better to get off the comfy couch and go to the movies.

It is also great to see a movie about real heroes doing noble deeds. Argo is a terrific movie. Here is the link to the trailer: Trailer


Patti Wood, MA, Certified Speaking Professional - The Body Language Expert. For more body language insights go to her website at www.PattiWood.net. Check out Patti's website for her new book "SNAP, Making the Most of First Impressions, Body Language and Charisma" at www.snapfirstimpressions.com. Also check out Patti's YouTube channel at http://youtube.com/user/bodylanguageexpert.

What Are Emotion Expressions For?


Ever wonder why we raise our eyebrows in surprise? Do you want to know why people smile when they meet a stranger, or why teenage girls scrunch up their noses in disgust at their parents' rules? Why do we have common facial expressions for emotions? Here is a new research study that explains the origin of facial expressions.

What Are Emotion Expressions For?

ScienceDaily (Jan. 3, 2012) — That cartoon scary face -- wide eyes, ready to run -- may have helped our primate ancestors survive in a dangerous wild, according to the authors of an article published in Current Directions in Psychological Science, a journal of the Association for Psychological Science. The authors present a way that fear and other facial expressions might have evolved and then come to signal a person's feelings to the people around him.



The basic idea, according to Azim F. Shariff of the University of Oregon, is that the specific facial expressions associated with each particular emotion evolved for some reason. Shariff cowrote the paper with Jessica L. Tracy of the University of British Columbia. So the fear expression helps you respond to a threat, and the squinched-up nose and mouth of disgust make it harder for you to inhale anything poisonous drifting on the breeze. The outthrust chest of pride increases both testosterone production and lung capacity so you're ready to take on anyone. Then, as social living became more important to the evolutionary success of certain species -- most notably humans -- the expressions evolved to serve a social role as well; so a happy face, for example, communicates a lack of threat, and an ashamed face communicates your desire to appease.

The research is based in part on work from the last several decades showing that some emotional expressions are universal -- even in remote areas with no exposure to Western media, people know what a scared face and a sad face look like, Shariff says. This type of evidence makes it unlikely that expressions were social constructs, invented in Western Europe, which then spread to the rest of the world.

And it's not just across cultures, but across species. "We seem to share a number of similar expressions, including pride, with chimpanzees and other apes," Shariff says. This suggests that the expressions appeared first in a common ancestor.

The theory that emotional facial expressions evolved as a physiological part of the response to a particular situation has been somewhat controversial in psychology; another article in the same issue of Current Directions in Psychological Science argues that the evidence on how emotions evolved is not conclusive.

Shariff and Tracy agree that more research is needed to support some of their claims, but that, "A lot of what we're proposing here would not be all that controversial to other biologists," Shariff says. "The specific concepts of 'exaptation' and 'ritualization' that we discuss are quite common when discussing the evolution of non-human animals." For example, some male birds bring a tiny morsel of food to a female bird as part of an elaborate courtship display. In that case, something that might once have been biologically relevant -- sharing food with another bird -- has evolved over time into a signal of his excellence as a potential mate. In the same way, Shariff says, facial expressions that started as part of the body's response to a situation may have evolved into a social signal.

Patti Wood, MA, Certified Speaking Professional - The Body Language Expert. For more body language insights go to her website at www.PattiWood.net. Check out Patti's website for her new book "SNAP, Making the Most of First Impressions, Body Language and Charisma" at www.snapfirstimpressions.com. Also check out Patti's YouTube channel at http://youtube.com/user/bodylanguageexpert.

Patti Reveals What is Behind Their "Cry Cover" Smiles


This is an expression I call the cry cover smile. Yes, most people who give this expression believe they are covering their true emotions with a smile.
This expression is typically found in men and I think comes from the need to keep a “stiff upper lip.” Many times this expression is an attempt to hide many intense emotions of sadness, fear and anger. I see it in men who typically have very strong egos and power and are caught and brought down. There are several photos of this expression in former Governor Blagojevich.

Congressman Weiner's expression is suppressed fear, disgust, and anger. (If you cover up his mouth and look at just his eyes, you will see the whites around his eyes and his sideways glance. Notice the wrinkled nose, a movement of the face uniquely given in disgust.)
If I knew exactly when he gave that expression, I could tell you whether he was disgusted with himself for what he did, or disgusted with the media over a particular question or for bringing his behavior to light. The wrinkled, upraised chin and tight lips show the suppression of fear and also of anger.

Spitzer also has a cry cover smile. His chin is more raised, more defiant and proud, and more of the bottom lip is raised and held inside the mouth. The corners of the mouth come down significantly in a way that is more common to this expression, showing his need to smile through the pain. Cover his mouth and you see his eyes are more hooded, turned down at the corners, and sad. This combination reminds me of the classic sad clown painted face.


Patti Wood, MA, Certified Speaking Professional - The Body Language Expert. For more body language insights go to her website at http://pattiwood.net/. Also check out the body language quiz on her YouTube Channel at http://youtube.com/user/bodylanguageexpert.

Mind Reading Computers, Computers that Read Facial Expressions

I have blogged before about the research on computers that read facial expressions, paralanguage and gestures. Here is research I am following at Cambridge.

http://www.cl.cam.ac.uk/research/rainbow/emotions/mind-reading.html
Automatic inference of complex mental states

Promotional material for the silent screen star Florence Lawrence displaying a range of emotions
People express their mental states, including emotions, thoughts, and desires, all the time through facial expressions, vocal nuances and gestures. This is true even when they are interacting with machines. Our mental states shape the decisions that we make, govern how we communicate with others, and affect our performance. The ability to attribute mental states to others from their behaviour, and to use that knowledge to guide our own actions and predict those of others is known as theory of mind or mind-reading. It has recently gained attention with the growing number of people with Autism Spectrum Conditions, who have difficulties mind-reading.

Existing human-computer interfaces are mind-blind — oblivious to the user’s mental states and intentions. A computer may wait indefinitely for input from a user who is no longer there, or decide to do irrelevant tasks while a user is frantically working towards an imminent deadline. As a result, existing computer technologies often frustrate the user, have little persuasive power and cannot initiate interactions with the user. Even if they do take the initiative, like the now retired Microsoft Paperclip, they are often misguided and irrelevant, and simply frustrate the user. With the increasing complexity of computer technologies and the ubiquity of mobile and wearable devices, there is a need for machines that are aware of the user’s mental state and that adaptively respond to these mental states.

A computational model of mind-reading
Drawing inspiration from psychology, computer vision and machine learning, our team in the Computer Laboratory at the University of Cambridge has developed mind-reading machines — computers that implement a computational model of mind-reading to infer mental states of people from their facial signals. The goal is to enhance human-computer interaction through empathic responses, to improve the productivity of the user and to enable applications to initiate interactions with and on behalf of the user, without waiting for explicit input from that user. There are difficult challenges:

It involves uncertainty, since a person’s mental state can only be inferred indirectly by analyzing the behaviour of that person. Even people are not perfect at reading the minds of others.
Automatic analysis of the face from video is still an area of active research in its own right.
There is no ‘code-book’ to interpret facial expressions as corresponding mental states.

Processing stages in the mind-reading system
Using a digital video camera, the mind-reading computer system analyzes a person’s facial expressions in real time and infers that person’s underlying mental state, such as whether he or she is agreeing or disagreeing, interested or bored, thinking or confused. The system is informed by the latest developments in the theory of mind-reading by Professor Simon Baron-Cohen, who leads the Autism Research Centre at Cambridge.

Prior knowledge of how particular mental states are expressed in the face is combined with analysis of facial expressions and head gestures occurring in real time. The model represents these at different granularities, starting with face and head movements and building those in time and in space to form a clearer model of what mental state is being represented. Software from Nevenvision identifies 24 feature points on the face and tracks them in real time. Movement, shape and colour are then analyzed to identify gestures like a smile or eyebrows being raised. Combinations of these occurring over time indicate mental states. For example, a combination of a head nod, with a smile and eyebrows raised might mean interest. The relationship between observable head and facial displays and the corresponding hidden mental states over time is modelled using Dynamic Bayesian Networks.
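As a rough illustration of that last step, here is a small Python sketch of inferring a hidden mental state from a stream of observed head and facial gestures. It is not the Cambridge group's model, the Nevenvision tracker, or a full Dynamic Bayesian Network; it is a plain hidden-Markov-style filter, and the states, gestures, and all probabilities below are invented for the example.

import numpy as np

# Hypothetical hidden mental states and observable gestures.
STATES = ["interested", "bored", "confused"]
GESTURES = ["head_nod", "smile", "eyebrows_raised", "head_shake", "frown"]

# P(next state | current state): mental states tend to persist over time.
TRANSITION = np.array([
    [0.8, 0.1, 0.1],
    [0.1, 0.8, 0.1],
    [0.1, 0.1, 0.8],
])

# P(gesture | state): e.g. nods, smiles, and raised eyebrows favour "interested".
EMISSION = np.array([
    [0.35, 0.30, 0.25, 0.05, 0.05],   # interested
    [0.05, 0.10, 0.05, 0.40, 0.40],   # bored
    [0.10, 0.05, 0.30, 0.35, 0.20],   # confused
])

def filter_mental_state(observed_gestures):
    # Forward filtering: update a belief over hidden states after each gesture,
    # so the same gesture can shift the belief differently depending on what
    # the system already believes about the person's recent mental state.
    belief = np.full(len(STATES), 1.0 / len(STATES))    # start uninformed
    for gesture in observed_gestures:
        obs = GESTURES.index(gesture)
        belief = TRANSITION.T @ belief                  # predict the next state
        belief *= EMISSION[:, obs]                      # weight by the observation
        belief /= belief.sum()                          # renormalise
    return dict(zip(STATES, belief.round(3)))

# A nod followed by a smile and raised eyebrows pushes the belief toward "interested".
print(filter_mental_state(["head_nod", "smile", "eyebrows_raised"]))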

Results

Images from the Mind-reading DVD
The system was trained using 100 8-second video clips of actors expressing particular emotions from the Mind Reading DVD, an interactive computer-based guide to reading emotions. The resulting analysis is right 90% of the time when the clips are of actors and 65% of the time when shown video clips of non-actors. The system’s performance was as good as the top 6% of people in a panel of 20 who were asked to label the same set of videos.

Previous computer programs have detected the six basic emotional states of happiness, sadness, anger, fear, surprise and disgust. This system recognizes complex states that are more useful because they come up more frequently in interactions. However, they are also harder to detect because they are conveyed in a sequence of movements rather than a single expression. Most other systems assume a direct mapping between facial expressions and emotion, but our system interprets the facial and head gestures in the context of the person's most recent mental state, so the same facial expression may imply different mental states in different contexts.

Current projects and future work

Monitoring a car driver
The mind-reading computer system presents information about your mental state as easily as a keyboard and mouse present text and commands. Imagine a future where we are surrounded with mobile phones, cars and online services that can read our minds and react to our moods. How would that change our use of technology and our lives? We are working with a major car manufacturer to implement this system in cars to detect driver mental states such as drowsiness, distraction and anger.

Current projects in Cambridge are considering further inputs such as body posture and gestures to improve the inference. We can then use the same models to control the animation of cartoon avatars. We are also looking at the use of mind-reading to support on-line shopping and learning systems. The mind-reading computer system may also be used to monitor and suggest improvements in human-human interaction. The Affective Computing Group at the MIT Media Laboratory is developing an emotional-social intelligence prosthesis that explores new technologies to augment and improve people’s social interactions and communication skills.

We are also exploring the ethical implications and privacy issues raised by this research. Do we want machines to watch us and understand our emotions? Mind-reading machines will undoubtedly raise the complexity of human-computer interaction to include concepts such as exaggeration, disguise and deception that were previously limited to communications between people.

Further projects and links
Demonstrations of the system with volunteers at the CVPR Conference in 2004
Royal Society 2006 Summer Science Exhibition (including video)
Affective computing group at MIT
Autism Research Centre at the University of Cambridge
The mind-reading DVD


Patti Wood, MA, Certified Speaking Professional - The Body Language Expert. For more body language insights go to her website at http://PattiWood.net. Also check out the body language quiz on her YouTube Channel at http://youtube.com/user/bodylanguageexpert.

Americans Focus On That Particular Person To Figure Out Their Emotions, While Japanese Look At The Other People In The Area To Figure Out Emotions

The Japanese may not make as much eye contact with the individual in the conversation. That could make an American feel uncomfortable and lead them to make negative assessments about their Japanese conversational partner.

When It Comes To Emotions, Eastern And Western Cultures See Things Very Differently
Science Daily (Mar. 7, 2008) — A team of researchers from Canada and Japan have uncovered some remarkable results on how eastern and western cultures assess situations very differently.
Across two studies, participants viewed images, each of which consisted of one centre model and four background models. The researchers manipulated the facial emotion (happy, angry, sad) of the centre or background models and asked the participants to determine the dominant emotion of the centre figure.

The majority of Japanese participants (72%) reported that their judgments of the centre person's emotions were influenced by the emotions of the background figures, while most North Americans (also 72%) reported they were not influenced by the background figures at all.

"What we found is quite interesting," says Takahiko Masuda, a Psychology professor from the University of Alberta. "Our results demonstrate that when North Americans are trying to figure out how a person is feeling, they selectively focus on that particular person's facial expression, whereas Japanese consider the emotions of the other people in the situation."

This may be because Japanese attention is not concentrated on the individual, but includes everyone in the group, says Masuda.

For the second part of the study, researchers monitored the eye movements of the participants and again the results indicated that the Japanese looked at the surrounding people more than the westerners when judging the situation.

While both the Japanese and westerners looked to the central figure during the first second of viewing the photo, the Japanese looked to the background figures at the very next second, while westerners continued to focus on the central figure.

"East Asians seem to have a more holistic pattern of attention, perceiving people in terms of the relationships to others," says Masuda. "People raised in the North American tradition often find it easy to isolate a person from its surroundings, while East Asians are accustom to read the air "kuuki wo yomu" of the situation through their cultural practices, and as a result, they think that even surrounding people's facial expressions are an informative source to understand the particular person's emotion."

These findings are published in the upcoming issue of the Journal of Personality and Social Psychology, and the results are replicated in a collaborative study between Huaitang Wang and Takahiko Masuda (University of Alberta, Canada) and Keiko Ishii (Hokkaido University, Japan).


Patti Wood, MA, Certified Speaking Professional - The Body Language Expert. For more body language insights go to her website at http://PattiWood.net. Also check out the body language quiz on her YouTube Channel at http://youtube.com/user/bodylanguageexpert.

Your Voice Communicates Emotions Through Paralanguage, Dogs Understand Our Nonverbal Cues

Your voice communicates emotions to people and to dogs. What is paralanguage?
Understanding the "tone" of someone’s speech well is related to your ability to be empathetic.

The variance in pitch and rhythm of the voice, called prosody (a subset of paralanguage), conveys emotion in the voice. Because dogs are unusually adept at reading human body language and paralanguage, they understand what your prosody is saying. You may have experienced that when praising or chastising your dog. If you say "bad boy" to a dog in a loving, happy voice, your dog will probably respond as if you have praised him, by coming in closer to you, wagging his tail, and/or licking your face. Likewise, saying "You are a great dog" in a loud, mean, strident, attacking voice may cause your dog to retreat and/or crouch low and bring down his head. The pitch and rhythm of the voice can affect how your dog responds to you.
Prosody is not only the way dogs understand and read emotions; humans also understand the emotions communicated through prosody.




Patti Wood, MA, Certified Speaking Professional - The Body Language Expert. Web: http://www.PattiWood.net. I have a new quiz on my YouTube channel. Check it out! YouTube - bodylanguageexpert's Channel