Affective Computing: A new way to measure human emotions | GSR research in UX

Udit Maitra
7 min read · May 3, 2022


In this article, you’ll learn how to quantify human emotion and apply it in business to provide a better user experience for your target audience.

During a cafe usability test

Assume you’re conducting UX research (a cafe usability test) to learn how your users interact with your website in real life. While observing, you noticed that participants seemed happy at some moments and worried at others.

Later, you asked them, “How easy was it?”

They said, “Oh! It was not so hard, quite easy”


You asked them, “How did you like it?”

They said, “I liked it”


etc.

You’re happy with such positive feedback, so you share all of your research findings with your product team and stakeholders.

Now let’s look at the research errors that could be hiding in this example:

  1. User bias (attitude vs. behavior): Human psychology has shown that what people think they do and what they actually do aren’t always in sync, so users may say they enjoy the design even when they don’t.
  2. Researcher bias: As a researcher, you can introduce a significant amount of bias during observation, which can distort the research data.
  3. Lack of objectivity: If multiple researchers observe the same study, the pivotal emotional moments they identify may differ depending on their observation skills, leaving us with additional ambiguity and subjective judgments.
  4. The intensity of emotion: Without question, observation is a useful technique for understanding emotion, but it is difficult to determine whether someone is happy and, if so, how happy they actually are. The same goes for negative emotions.

So there are many challenges in trying to perceive users’ actual emotions.

I also got a chance to interview a few UX managers and leads to understand what approaches they take to measure users’ emotions. The statistics are shown below.

N=26. This chart shows what UX leaders do in their organizations to capture user emotion, based on the interview data.

Now I’d like to introduce Affective Computing as a dimension of user experience.

What is Affective Computing?

Affective computing, or Emotion AI, is an emerging technology that allows computers and systems to recognize, process, and imitate human feelings and emotions. It’s a multidisciplinary field spanning computer science, psychology, and cognitive science.

So, how can we apply affective computing in business, specifically in the UX research domain, to understand human emotions better through objective behavior measurement?

Normally, users’ emotions can be captured in two different ways:

  1. Subjective measurement (What they say, user’s attitude)
  2. Objective measurement (What they do, user’s unconscious behavior)

1. Subjective measurement:

We generally focus on subjective measurement when it comes to users’ emotions:

Attitude: You can use self-reported measures such as these:

  1. PANAS (Positive and Negative Affect Schedule)
  2. NPS (Net Promoter Score)
  3. ASQ (After-scenario questionnaire)
  4. BMIS (Brief Mood Introspection Scale)
  5. Probing questions during the interview

2. Objective measurement:

  1. Skin conductance or GSR
  2. Eye-tracking and pupil dilation
  3. ECG to measure heart rate changes
  4. EEG or MRI to check brain activity
  5. Facial recognition software

In this article, we’ll discuss GSR (galvanic skin response) research in user experience, which will help us understand user behavior.

What is GSR (galvanic skin response) research?

Your sweat glands are activated by nerves that are sensitive to emotions, hormones, and other stressors. When you feel stress, your body temperature rises, prompting your sweat glands to kick in.

There is a strong correlation between your skin resistance and brain activity. Your skin acts as an output of the autonomic nervous system, which comprises the sympathetic (“fight or flight”) and parasympathetic (“rest and digest”) systems, as illustrated in the image below.

When such changes occur as a result of emotional shifts, your skin resistance varies.
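
In other words, what a GSR sensor ultimately reports is skin conductance, the inverse of skin resistance. Here is a minimal sketch of that relationship; the numbers are purely illustrative, not measurements from our study:

```python
# Minimal sketch: skin conductance is the inverse of skin resistance, so
# higher arousal (more sweat, lower resistance) shows up as higher conductance.
# The example values below are illustrative only.
def conductance_microsiemens(resistance_ohms: float) -> float:
    """Convert skin resistance (ohms) to skin conductance (microsiemens)."""
    return 1_000_000.0 / resistance_ohms

print(conductance_microsiemens(400_000))  # calmer skin: 2.5 µS
print(conductance_microsiemens(100_000))  # more aroused skin: 10.0 µS
```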

Research Goal:

We wanted to test the hypothesis that emotional changes are reflected in skin resistance and, vice versa, that we can capture variations in skin resistance to detect emotional changes in the brain.

Arousal (or intensity) is the level of autonomic activation that an event creates, ranging from calm (low) to excited (high). Valence, on the other hand, is the type of emotion, from negative to positive.

We will focus solely on the arousal level in this study, that is, on pinpointing moments of excitement, frustration, or increased cognitive load experienced by the participants.

So we set out to create a device that would help us measure human skin resistance and transmit the data wirelessly to a laptop, tablet, or mobile.

Introducing Hmotion…

This is a product video of what the Hmotion watch is capable of doing.

And below is the prototype device:

This is a prototype device we created to measure human skin conductance. As you can see, you wear it on your wrist and fingers, and it sends data to your device via Bluetooth.
Prototype demo video
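
For context, here is a minimal sketch of the kind of logging script we could run on the laptop side, assuming the prototype shows up as a Bluetooth serial port and sends one conductance reading per line. The port name, baud rate, units, and line format here are assumptions for illustration, not the exact firmware protocol.

```python
# Minimal GSR logging sketch (port name, baud rate, and line format are assumed).
import csv
import time

import serial  # pip install pyserial

PORT = "/dev/rfcomm0"  # hypothetical Bluetooth serial port; e.g. "COM5" on Windows
BAUD = 9600

with serial.Serial(PORT, BAUD, timeout=1) as device, \
        open("gsr_log.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["timestamp_s", "conductance_us"])  # assumes device sends µS values
    while True:
        line = device.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue  # nothing arrived within the 1 s timeout
        try:
            value = float(line)
        except ValueError:
            continue  # skip malformed lines
        writer.writerow([round(time.time(), 3), value])
```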

Research Plan:

We chose three different types of video content to help us understand when the participants’ emotional state changes, how intense the feeling is, and where the peak moments are.

My research colleague (Tejal) and I prepared the research plan.

Equipment:

  • Hmotion (our GSR wearable device) to detect and transmit participants’ emotional changes.
  • A DJI Osmo Pocket 2 to record the entire session and capture participants’ facial expressions.
  • A laptop running the Arduino software to monitor and record the live data.
  • A Microsoft Surface Hub 2S, on which participants watched all the video content.
  • And, of course, some chocolate to welcome the participants.
The equipment we used for this study

Location:

We chose a controlled environment (lab) to conduct the research activity, to eliminate factors that could affect our research data.

Where we conducted our study

Who We Tested:

We invited 5 participants from our office.

Independent variable (IV): Video content (emotional trigger points in the video)

Dependent variable (DV): Skin resistance value due to trigger points in the video

Activity:

We gave them an overview of the activity, including the purpose of the study and how we would use the data. Then we helped them put the device on their hand and asked them to watch the content alone in the room with the lights turned off, to help them focus on the video and relax. Along the way, we monitored the participants’ live data on our laptop from outside the lab, and it was incredibly fascinating to get this kind of information and learn more about human behavior.

After the session, we asked them a few questions to collect their subjective opinions.

Research Data:

After the research, we had collected a variety of data:

  1. Participants’ GSR data (which you can open in Google Sheets or Microsoft Excel)
  2. Facial expressions of participants in raw video format.
Example of how we received participants’ live GSR data in a Google Sheet
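
If you export that sheet as a CSV, a few lines of Python are enough to get a first look at the signal. This is just a sketch: the file name and the column names ("timestamp_s", "conductance_us") are assumptions about how your export is laid out, matching the logging sketch above.

```python
# Quick first look at a logged GSR session (file and column names are assumed).
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("gsr_log.csv")
df["elapsed_s"] = df["timestamp_s"] - df["timestamp_s"].iloc[0]

plt.plot(df["elapsed_s"], df["conductance_us"])
plt.xlabel("Time since start of video (s)")
plt.ylabel("Skin conductance (µS)")
plt.title("Participant GSR over the session")
plt.show()
```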

Data Analysis:

Because we built the entire project from scratch and didn’t have access to sophisticated data analysis software, manually analyzing data and identifying patterns was difficult and time-consuming.

Nonetheless, manual analysis is also a rewarding exercise, and I’m sharing the report below to help you understand how we were able to track users’ emotional changes (arousal level) with this technology.

You will see how easily we can identify a person’s emotional peak points while they are watching this video content.
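
If you wanted to automate part of this manual step, a rough sketch like the one below could flag candidate arousal peaks in the logged signal. The smoothing window and prominence threshold here are assumptions that would need tuning per participant and per sampling rate; they are not the values we used in our manual analysis.

```python
# Rough sketch for flagging candidate arousal peaks in a GSR log (thresholds are guesses).
import pandas as pd
from scipy.signal import find_peaks

df = pd.read_csv("gsr_log.csv")
df["elapsed_s"] = df["timestamp_s"] - df["timestamp_s"].iloc[0]

# Smooth out sensor noise with a simple rolling mean (window size is a guess).
smooth = df["conductance_us"].rolling(window=20, center=True).mean().bfill().ffill()

# The prominence threshold (in µS) controls how strong a rise counts as a "peak".
peaks, _ = find_peaks(smooth, prominence=0.05)
for i in peaks:
    print(f"Candidate arousal peak at {df['elapsed_s'].iloc[i]:.1f} s "
          f"({smooth.iloc[i]:.2f} µS)")
```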

I’m sure you found it fascinating how we can use physiological data such as skin conductance to identify the correlation between GSR data and emotional response.

Now the question is: can we put this knowledge into action to improve the user experience?

The answer is an absolute yes!

But how?

Alright, let me give you some quick scenarios:

  1. Assume you are a marketing expert creating content or an advertisement video that you want people to engage with. You might wonder which parts of the content, or which way of presenting it, will make users feel an emotional connection, in other words, what triggers an increased level of arousal and how many emotional peak points there are while they watch the video or ad.
Impact of media advertisements on consumer emotions: you can easily identify how your end customers will be emotionally affected before launching the final ad.

2. Let’s say you’re trying to figure out how your design prototype will work for your end users and you want to make sure you’re reducing cognitive load while they use it, or you’ve designed a game and are doing some research before releasing it on the market. Watch the video below to see how this approach can help you figure out where your users are stressed or anxious while using your app.

The participant was playing Asphalt 8 and wasn’t expecting any new interaction models, so it was interesting to see how his cognitive load increased as well.

Summary

This technology has huge potential in UX research as well as in other fields, and a device like this will enable business leaders, including UX practitioners, to better understand and measure their customers’ emotions in order to improve their services and products.

I look forward to speaking with you about any questions or discussions you may have. :)

Emotions play a huge role in business and in its success. Emotion is what really drives purchasing behavior, and decision making in general.

That’s what the late, great sales trainer Zig Ziglar used to say: “People buy on emotion and justify on logic.”

And a special thanks to Tejal Uchil for your assistance as a researcher.

And please applaud if you like it.

Thank you :)

