
Smiling Makes You Feel Good!

Research on the positive effects of smiling.
Act now, think later - Fear not, politicians. That elusive feel-good factor can be created in an instant. Just appeal to our primal instincts, advises David Concar
ewscientist.com/article/mg15020279.300-act-now-think-later--fear-not-politicians-that-elusive-feelgood-factor-can-be-created-in-an-instant-just-appeal-to-our-primal-instincts-advises-david-concar.html
27 April 1996
Magazine issue 2027

Department stores opt for nice smells and muzak; impresarios use warm-up acts. But psychologist Sheila Murphy has an infinitely more devious way of getting people in the right frame of mind. First she sits them in front of a screen in her lab at the University of Southern California in Los Angeles. Then she flashes up images of smiling faces.

Nothing obviously devious about that: smiles make people cheerful. The rub is that Murphy's smiles last for just a few thousandths of a second. That's way too fast for the human brain to know what it's looking at. And yet, according to in-depth studies carried out over many years by Murphy, veteran emotions researcher Robert Zajonc and their colleagues, these split-second flashes of teeth and warmly wrinkled eyes induce a measurably more positive frame of mind.

It sounds crazy. How can people respond to facial expressions too short-lived to permeate...?


Patti Wood, MA, Certified Speaking Professional - The Body Language Expert. For more body language insights go to her website at http://PattiWood.net. Also check out the body language quiz on her YouTube Channel at http://youtube.com/user/bodylanguageexpert.

Mind Reading Computers, Computers that Read Facial Expressions

I have blogged before about the research on computers that read facial expressions, paralanguage and gestures. Here is research I am following at Cambridge.

http://www.cl.cam.ac.uk/research/rainbow/emotions/mind-reading.html
Automatic inference of complex mental states

Promotional material for the silent screen star Florence Lawrence displaying a range of emotions
People express their mental states, including emotions, thoughts, and desires, all the time through facial expressions, vocal nuances and gestures. This is true even when they are interacting with machines. Our mental states shape the decisions that we make, govern how we communicate with others, and affect our performance. The ability to attribute mental states to others from their behaviour, and to use that knowledge to guide our own actions and predict those of others is known as theory of mind or mind-reading. It has recently gained attention with the growing number of people with Autism Spectrum Conditions, who have difficulties mind-reading.

Existing human-computer interfaces are mind-blind — oblivious to the user’s mental states and intentions. A computer may wait indefinitely for input from a user who is no longer there, or decide to do irrelevant tasks while a user is frantically working towards an imminent deadline. As a result, existing computer technologies often frustrate the user, have little persuasive power and cannot initiate interactions with the user. Even if they do take the initiative, like the now retired Microsoft Paperclip, they are often misguided and irrelevant, and simply frustrate the user. With the increasing complexity of computer technologies and the ubiquity of mobile and wearable devices, there is a need for machines that are aware of the user’s mental state and that adaptively respond to these mental states.

A computational model of mind-reading
Drawing inspiration from psychology, computer vision and machine learning, our team in the Computer Laboratory at the University of Cambridge has developed mind-reading machines — computers that implement a computational model of mind-reading to infer mental states of people from their facial signals. The goal is to enhance human-computer interaction through empathic responses, to improve the productivity of the user and to enable applications to initiate interactions with and on behalf of the user, without waiting for explicit input from that user. There are difficult challenges:

It involves uncertainty, since a person’s mental state can only be inferred indirectly by analyzing the behaviour of that person. Even people are not perfect at reading the minds of others.
Automatic analysis of the face from video is still an area of active research in its own right.
There is no ‘code-book’ to interpret facial expressions as corresponding mental states.

Processing stages in the mind-reading system
Using a digital video camera, the mind-reading computer system analyzes a person’s facial expressions in real time and infers that person’s underlying mental state, such as whether he or she is agreeing or disagreeing, interested or bored, thinking or confused. The system is informed by the latest developments in the theory of mind-reading by Professor Simon Baron-Cohen, who leads the Autism Research Centre at Cambridge.

Prior knowledge of how particular mental states are expressed in the face is combined with analysis of facial expressions and head gestures occurring in real time. The model represents these at different granularities, starting with face and head movements and building those in time and in space to form a clearer model of what mental state is being represented. Software from Nevenvision identifies 24 feature points on the face and tracks them in real time. Movement, shape and colour are then analyzed to identify gestures like a smile or eyebrows being raised. Combinations of these occurring over time indicate mental states. For example, a combination of a head nod, with a smile and eyebrows raised might mean interest. The relationship between observable head and facial displays and the corresponding hidden mental states over time is modelled using Dynamic Bayesian Networks.
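The temporal inference described above can be sketched as a toy discrete Bayesian filter: each observed gesture shifts a probability distribution over hidden mental states, and states tend to persist from one moment to the next. To be clear, this is not the Cambridge system — the states, gestures, and probability tables below are invented purely for illustration, and a real Dynamic Bayesian Network would track many more variables.

```python
# Toy sketch of inferring a hidden mental state from a stream of observed
# face/head gestures. All states, gestures, and probabilities are invented.

STATES = ["interested", "bored", "confused"]

# P(next state | current state): mental states tend to persist over time.
TRANSITION = {
    "interested": {"interested": 0.8, "bored": 0.1, "confused": 0.1},
    "bored":      {"interested": 0.1, "bored": 0.8, "confused": 0.1},
    "confused":   {"interested": 0.1, "bored": 0.1, "confused": 0.8},
}

# P(observed gesture | state): e.g. a nod plus a smile is likely when interested.
EMISSION = {
    "interested": {"nod_smile": 0.6, "gaze_away": 0.1, "brow_furrow": 0.3},
    "bored":      {"nod_smile": 0.1, "gaze_away": 0.8, "brow_furrow": 0.1},
    "confused":   {"nod_smile": 0.1, "gaze_away": 0.2, "brow_furrow": 0.7},
}

def infer(gestures):
    """Forward-filter: update the belief over mental states per observed gesture."""
    belief = {s: 1.0 / len(STATES) for s in STATES}  # uniform prior
    for g in gestures:
        # Predict step: propagate the belief through the transition model.
        predicted = {
            s: sum(belief[p] * TRANSITION[p][s] for p in STATES) for s in STATES
        }
        # Update step: weight each state by how well it explains the gesture.
        unnorm = {s: predicted[s] * EMISSION[s][g] for s in STATES}
        total = sum(unnorm.values())
        belief = {s: v / total for s, v in unnorm.items()}
    return belief

belief = infer(["nod_smile", "nod_smile", "brow_furrow"])
print(max(belief, key=belief.get))  # most likely mental state so far
```

Note how the filter captures the article's point that the same expression can imply different states in different contexts: a single brow furrow after two smiling nods barely dents the "interested" belief, whereas a run of furrows would tip it toward "confused".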

Results

Images from the Mind-reading DVD
The system was trained using 100 8-second video clips of actors expressing particular emotions from the Mind Reading DVD, an interactive computer-based guide to reading emotions. The resulting analysis is right 90% of the time when the clips are of actors and 65% of the time when the system is shown clips of non-actors. The system's performance was as good as the top 6% of people in a panel of 20 who were asked to label the same set of videos.

Previous computer programs have detected the six basic emotional states of happiness, sadness, anger, fear, surprise and disgust. This system recognizes complex states that are more useful because they come up more frequently in interactions. However, they are also harder to detect because they are conveyed in a sequence of movements rather than a single expression. Most other systems assume a direct mapping between facial expressions and emotion, but our system interprets facial and head gestures in the context of the person's most recent mental state, so the same facial expression may imply different mental states in different contexts.

Current projects and future work

Monitoring a car driver
The mind-reading computer system presents information about your mental state as easily as a keyboard and mouse present text and commands. Imagine a future where we are surrounded with mobile phones, cars and online services that can read our minds and react to our moods. How would that change our use of technology and our lives? We are working with a major car manufacturer to implement this system in cars to detect driver mental states such as drowsiness, distraction and anger.

Current projects in Cambridge are considering further inputs such as body posture and gestures to improve the inference. We can then use the same models to control the animation of cartoon avatars. We are also looking at the use of mind-reading to support on-line shopping and learning systems. The mind-reading computer system may also be used to monitor and suggest improvements in human-human interaction. The Affective Computing Group at the MIT Media Laboratory is developing an emotional-social intelligence prosthesis that explores new technologies to augment and improve people’s social interactions and communication skills.

We are also exploring the ethical implications and privacy issues raised by this research. Do we want machines to watch us and understand our emotions? Mind-reading machines will undoubtedly raise the complexity of human-computer interaction to include concepts such as exaggeration, disguise and deception that were previously limited to communications between people.

Further projects and links
Demonstrations of the system with volunteers at the CVPR Conference in 2004
Royal Society 2006 Summer Science Exhibition (including video)
Affective computing group at MIT
Autism Research Centre at the University of Cambridge
The mind-reading DVD



Americans Focus On That Particular Person To Figure Out Their Emotions, While Japanese Look At The Other People In The Area To Gauge Emotions

The Japanese may not make as much eye contact with the individual in a conversation. That could make an American feel uncomfortable and lead them to make negative assessments about their Japanese conversational partner.

When It Comes To Emotions, Eastern And Western Cultures See Things Very Differently
Science Daily (Mar. 7, 2008) — A team of researchers from Canada and Japan has uncovered some remarkable results on how eastern and western cultures assess situations very differently.
Across two studies, participants viewed images, each consisting of one centre model and four background models. The researchers manipulated the facial emotion (happy, angry, sad) of the centre or background models and asked the participants to determine the dominant emotion of the centre figure.

The majority of Japanese participants (72%) reported that their judgments of the centre person's emotions were influenced by the emotions of the background figures, while most North Americans (also 72%) reported they were not influenced by the background figures at all.

"What we found is quite interesting," says Takahiko Masuda, a Psychology professor from the University of Alberta. "Our results demonstrate that when North Americans are trying to figure out how a person is feeling, they selectively focus on that particular person's facial expression, whereas Japanese consider the emotions of the other people in the situation."

This may be because Japanese attention is not concentrated on the individual, but includes everyone in the group, says Masuda.

For the second part of the study, researchers monitored the eye movements of the participants and again the results indicated that the Japanese looked at the surrounding people more than the westerners when judging the situation.

While both the Japanese and westerners looked to the central figure during the first second of viewing the photo, the Japanese looked to the background figures at the very next second, while westerners continued to focus on the central figure.

"East Asians seem to have a more holistic pattern of attention, perceiving people in terms of the relationships to others," says Masuda. "People raised in the North American tradition often find it easy to isolate a person from its surroundings, while East Asians are accustom to read the air "kuuki wo yomu" of the situation through their cultural practices, and as a result, they think that even surrounding people's facial expressions are an informative source to understand the particular person's emotion."

These findings are published in the upcoming issue of the Journal of Personality and Social Psychology, and the results are replicated in a collaborative study between Huaitang Wang and Takahiko Masuda (University of Alberta, Canada) and Keiko Ishii (Hokkaido University, Japan).



The Study Reveals That In Cultures Where Emotional Control Is The Standard, Such As Japan, Focus Is Placed On The Eyes To Interpret Emotions

In my coaching I find that my clients from Asian cultures have a hard time understanding and being understood by Americans. The findings in the study below lead me to believe that they are looking at a different part of the face for information about emotions.

Culture Is Key To Interpreting Facial Emotions
Science Daily (Apr. 5, 2007) — Research has uncovered that culture is a determining factor when interpreting facial emotions. The study reveals that in cultures where emotional control is the standard, such as Japan, focus is placed on the eyes to interpret emotions, whereas in cultures where emotion is openly expressed, such as the United States, the focus is on the mouth.


Across two studies, using computerized icons and human images, the researchers compared how Japanese and American participants interpreted images that conveyed a range of emotions.

"These findings go against the popular theory that the facial expressions of basic emotions can be universally recognized," said University of Alberta researcher Dr. Takahiko Masuda. "A person's culture plays a very strong role in determining how they will perceive emotions and needs to be considered when interpreting facial expression"

These cultural differences are even noticeable in computer emoticons, which are used to convey a writer's emotions over email and text messaging. Consistent with the research findings, the Japanese emoticons for happiness and sadness vary in terms of how the eyes are depicted, while American emoticons vary with the direction of the mouth. In the United States the emoticons : ) and : - ) denote a happy face, whereas the emoticons :( or : - ( denote a sad face. However, Japanese tend to use the symbol (^_^) to indicate a happy face, and (;_;) to indicate a sad face.
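As a playful sketch of the two reading strategies the emoticons illustrate, here is a toy "emotion reader" in Python: one function judges by the mouth (the Western pattern), the other by the eyes (the Japanese pattern). The feature tables and parsing rules are assumptions made up for this illustration, not anything from the study.

```python
# Toy illustration: Western-style emoticons like :) vary the MOUTH, while
# Japanese-style emoticons like (^_^) vary the EYES. These lookup tables
# are invented for this sketch.
HAPPY_MOUTHS, SAD_MOUTHS = {")"}, {"("}
HAPPY_EYES, SAD_EYES = {"^^"}, {";;"}

def read_western(emoticon):
    """Judge emotion from the mouth: the last character of :) or :-(."""
    mouth = emoticon[-1]
    if mouth in HAPPY_MOUTHS:
        return "happy"
    if mouth in SAD_MOUTHS:
        return "sad"
    return "unknown"

def read_japanese(emoticon):
    """Judge emotion from the eyes: the characters flanking the _ in (^_^)."""
    inner = emoticon.strip("()")      # drop the face outline
    eyes = inner[0] + inner[-1]       # the two eye characters
    if eyes in HAPPY_EYES:
        return "happy"
    if eyes in SAD_EYES:
        return "sad"
    return "unknown"

print(read_western(":)"), read_japanese("(^_^)"))   # both report "happy"
print(read_western(":-("), read_japanese("(;_;)"))  # both report "sad"
# Note: a mouth-focused reader applied to (^_^) would key on the closing ')'
# bracket, which is a face outline rather than a mouth in the Japanese style.
```

The final comment is the study's point in miniature: a reader keyed to the "wrong" facial feature for a given culture's conventions can misread the expression entirely.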

When participants were asked to rate the perceived levels of happiness or sadness expressed through the different computer emoticons, the researchers found that the Japanese still looked to the eyes of the emoticons to determine their emotion.

"We think it is quite interesting and appropriate that a culture that tends to masks its emotions, such as Japan, would focus on a person's eyes when determining emotion, as eyes tend to be quite subtle," said Masuda. "In the United States, where overt emotion is quite common, it makes sense to focus on the mouth, which is the most expressive feature on a person's face."

These findings are published in the current issue of The Journal of Experimental Social Psychology and are a result from a collaborative study between Masaki Yuki (Hokkaido University), William Maddux (INSEAD) and Takahiko Masuda (University of Alberta). The results also suggest the interesting possibility that the Japanese may be better than Americans at detecting "false smiles". If the position of the eyes is the key to whether someone's smile is false or true, Japanese may be particularly good at detecting whether someone is lying or being "fake". However, these questions can only be answered with future research.


High-Testosterone People Feel Rewarded By Others' Anger

High-Testosterone People Feel Rewarded By Others' Anger, New Study Finds
ScienceDaily (May 12, 2007) — Most people don't appreciate an angry look, but a new University of Michigan psychology study found that some people find angry expressions so rewarding that they will readily learn ways to encourage them.


•"It's kind of striking that an angry facial expression is consciously valued as a very negative signal by almost everyone, yet at a non-conscious level can be like a tasty morsel that some people will vigorously work for," said Oliver Schultheiss, co-author of the study and a U-M associate professor of psychology.

The findings may explain why some people like to tease each other so much, he added. "Perhaps teasers are reinforced by that fleeting 'annoyed look' on someone else's face and therefore will continue to heckle that person to get that look again and again," he said. "As long as it does not stay there for long, it's not perceived as a threat, but as a reward."

The researchers took saliva samples from participants to measure testosterone, a hormone that has been associated with dominance motivation.

Participants then worked on a "learning task" in which one complex sequence of keypresses was followed by an angry face on the screen, another sequence was followed by a neutral face, and a third sequence was followed by no face.

Participants who were high in testosterone relative to other members of their sex learned the sequence that was followed by an angry face better than the other sequences, while participants low in testosterone did not show this learning advantage for sequences that were reinforced by an angry face.

Notably, this effect emerged more strongly in response to faces that were presented subliminally, that is, too fast to allow conscious identification. Perhaps just as noteworthy, participants were not aware of the patterns in the sequences of keypresses as they learned them.

While high-testosterone participants showed better learning in response to anger faces, they were unaware of the fact that they learned anything in the first place and unaware of what kind of faces had reinforced their learning.

Michelle Wirth, the lead author of the study and now a postdoctoral researcher at the University of Wisconsin, Madison, added: "Better learning of a task associated with anger faces indicates that the anger faces were rewarding, as in a rat that learns to press a lever in order to receive a tasty treat. In that sense, anger faces seemed to be rewarding for high-testosterone people, but aversive for low-testosterone people."
She said the findings contribute to a body of research suggesting that perceived emotional facial expressions are important signals to help guide human behavior, even if people are not aware that they do so.

"The human brain may have built-in mechanisms to detect and respond to emotions perceived in others," she said. "However, what an emotional facial expression, such as anger, 'means' to a given individual—whether it is something to pursue or avoid, for example—can vary."

U-M psychology researchers Wirth and Schultheiss published their findings in the journal Physiology & Behavior.





Are You Really Listening?


Body Language expert, Patti Wood, is quoted in Club Solutions Magazine about the importance of becoming a GENTLER listener. Do you want your relationships to improve? Check out the body language cues that will help you become a "gentler" listener at the link below.
http://www.scribd.com/doc/35165517/Club-Solutions-Listening



Making Facial Expressions, Do Some of Us Have an Advantage?


The other day I went to an estate sale. A few minutes after entering the house I asked the woman running the sale the price of a Christmas ornament and she smiled and said, "You walked into the house with the biggest smile I have ever seen, then you actually smiled more as you started looking at things. You brought all this positive energy into the house. Can you come back tomorrow?" I know it sounds weird, but I have had these kinds of comments made all my life. I seem to have a big ol' Bozo the Clown smile. I mean it is really big. If you are not old enough to remember Bozo, picture the Joker in Batman, only happy and not quite as much lipstick. Well, maybe more lipstick in my case, but a more flattering color. I have always wondered about my unusually large and expressive smile. My mother swears I came out of the womb smiling. I certainly had the smile as a child. I remember there were ads for the secrets of building big arm muscles in the back of comic books when I was a kid. Aimed at scrawny boys, the ads promised you could build the muscles of a brawny bodybuilder if you only sent them some money. What if only certain people had lots of muscles to build and others, the scrawny ones, had fewer muscles in the first place? Perhaps we would not have a level playing field no matter how hard we worked on building the few muscles we had.

Well, as far as the muscles that control facial expressions go, this lack of a level playing field has been proven. New research by Dr Bridget Waller, a scientist at the University of Portsmouth, has found that not everyone has all the muscles that control our facial expressions. Some of us have more and some of us have fewer--in some cases as few as 60% of the muscles. Think of how having more or fewer expression-forming muscles in your face could affect your ability to interact effectively with others and to bond with people.

In her study published in the American Psychological Association journal, Dr Waller discusses the first systematic study on cadavers to discover the variations in the musculature of the face as it controls facial expressions. It appears that the face is the only part of the body to have this unique set of differences. I have read the study and I am stunned. It seems that these variations in the allotment of muscles cause some people to have a unique facial signature--in my case, a Bozo-like smile. People without certain muscles can compensate for their lack using other muscles, but it may involve extra effort. What if the phenomenon of personality type differences such as introversion and extroversion was affected by having more or fewer of the facial muscles required to make facial expressions? I know research says that task-oriented individuals, especially those in the "Get it Right" DISC personality type of my "techie" audiences, can't understand why the "touchy feelies" in the "Get Along" and "Get Appreciated" categories smile so much or show so much emotion. In the same vein, the "touchy feelies" can't understand why the "techies" don't show any noticeable positive emotions. To which the techies reply, "I am not going to give a fake smile." Could it seem forced and fake to them because it takes more effort? Hmmm, that could be interesting research. What do you think? Oh, one more interesting tidbit. The only other part of the body where they have seen differences in muscles between people is the forearm. I always knew Popeye wasn't really eating all the spinach.

Here is a link to the study. And below that some of the findings.
http://www.sciencedaily.com/releases/2009/02/090211161852.htm
Dr Waller is from the Centre for the Study of Emotion in the Department of Psychology. She collaborated with anatomists at the University of Pittsburgh and Duquesne University in the USA.
They found that all humans have a core set of five facial muscles which they believe control our ability to produce a set of standard expressions which convey anger, happiness, surprise, fear, sadness and disgust. But there are up to nineteen muscles that may be present in the face, and many people do not possess all of them. For example, the risorius muscle, which experts believe controls our ability to create an expression of extreme fear, is found in only two thirds of the population.

Dr Waller said: “Everyone communicates using a set of common signals and so we would expect to find that the muscles do not vary among individuals. The results are surprising - in some individuals we found only 60 per cent of the available muscles.”
She said that everyone is able to produce the same basic facial expressions and movements but we also have individual variations.
“Some less common facial expressions may be unique to certain people,” she said. “The ability to produce subtly different variants of facial expressions may allow us to develop individual ‘signatures’ that are specific to certain individuals.”
She said that there are significant implications for the importance of facial expression in society.
“Facial expression serves an essential function in society and may be a form of social bonding,” Dr Waller said. “It allows us to synchronise our behaviour and understand each other better.”
Dr Waller has completed studies which examined facial expressions in apes. She said that primates who live within social groups have a more elaborate communication repertoire including more complex facial expressions.
“There is a theory that language evolved to help us bond us together in social groups and we may be able to apply the same theory to facial expressions,” she said.
The face is the only part of the human anatomy which has been found to display such a massive variation in muscle structure. In the only other example of muscular differences, the forearm has a muscle which approximately fifteen per cent of the population don’t have.
Dr Anne Burrows from Duquesne University was one of the anatomists on the study. She said: “The problems with quantifying facial musculature is that they're not like other muscles. They're fairly flat, difficult to separate from surrounding connective tissue and they all attach to one another. They are very unlike muscles of the limbs, for example.
“The variation we see in the face is absolutely unique,” said Dr Waller.
Dr Waller said that actors need not worry because people will compensate for a lack of one muscle by using another to develop a similar expression. And people can learn to develop a facial expression by practising in front of a mirror.
“As humans we are able to change the level of control we have over our facial expressions,” said Dr Waller. “There is a great deal of asymmetry in the face and the left side is generally more expressive than the right. But someone who is unable to raise one eyebrow without raising the other could in fact learn to raise just one.”
The implication for actors who have had Botox speaks for itself.
________________________________________
Adapted from materials provided by University of Portsmouth, via AlphaGalileo