Emotion recognition with Azure Cognitive Services and machine learning

Emotion recognition is the process of identifying and interpreting human emotions. It is a complex task that can be challenging to do accurately, but it has the potential to revolutionize the way we interact with computers and each other.

The global emotion detection and recognition market size was valued at $21.7 billion in 2021 and is projected to reach $136.2 billion by 2031. – Allied Market Research

Azure Cognitive Services offers a variety of APIs that can be used to build emotion recognition applications. These APIs use machine learning to analyze facial expressions, speech patterns and other factors to identify different emotions.

In this blog post, we will explore how to implement emotion recognition using Azure Cognitive Services, .NET Core, and Xamarin and discuss its diverse use cases.

Some of the use cases of emotion recognition

  • Effective customer engagement: Recognizing the emotional state of customers via CCTV cameras in a retail shop can enable the sales team to engage more effectively and build lasting relationships.
  • Education: Emotion recognition can be used to improve learning outcomes by providing teachers with real-time feedback on the students’ emotional state. This information can help teachers to tailor their lessons to the students’ needs and to create a more engaging learning environment.
  • Employee productivity monitoring: Emotional intelligence in the workplace can improve employee productivity and well-being.
  • Healthcare systems: Emotion recognition can be used to diagnose and treat mental health disorders. For example, a doctor might use emotion recognition to identify patients who are at risk for depression or anxiety.

You can read more about the benefits of emotion detection.

How to leverage Azure Cognitive Services for emotion recognition

Azure Cognitive Services offers a range of APIs and pre-trained models, making it easy to integrate emotion recognition capabilities into applications. The Face API provides accurate facial emotion detection, allowing developers to identify emotions from facial expressions.

To use Azure Cognitive Services, you need an Azure subscription and should log in to the Azure portal to create the Cognitive Service instance.
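If you prefer the command line, the same Face resource can be provisioned with the Azure CLI. This is a minimal sketch: the resource name, resource group, region and SKU below are placeholders you should adjust to your own subscription.

```shell
# Create a Face (Cognitive Services) resource in an existing resource group.
az cognitiveservices account create \
  --name my-face-resource \
  --resource-group my-resource-group \
  --kind Face \
  --sku S0 \
  --location eastus \
  --yes

# Retrieve the API key and endpoint needed by the code in this post.
az cognitiveservices account keys list \
  --name my-face-resource --resource-group my-resource-group
az cognitiveservices account show \
  --name my-face-resource --resource-group my-resource-group \
  --query properties.endpoint
```

The key and endpoint returned by the last two commands are the values plugged into the `ApiKeyServiceClientCredentials` and `Endpoint` settings shown in the .NET Core sample.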

Integrate this API in .NET Core

To integrate the Face API into a .NET Core application, first install the “Microsoft.Azure.CognitiveServices.Vision.Face” package from NuGet. Then use the following code to read an image, detect the faces in it and determine the dominant emotion for each one.



// Endpoint of your Face resource and the attribute list requested from the API.
// Both are referenced by DetectEmotion below; replace the endpoint with your own.
static readonly string faceEndpoint = "https://<your-region>.api.cognitive.microsoft.com";
static readonly IList<FaceAttributeType> emotionAttribute =
    new List<FaceAttributeType> { FaceAttributeType.Emotion };

static async Task DetectEmotion(string filepath)
{
    using (var client = new FaceClient(new ApiKeyServiceClientCredentials("<your-face-api-key-here>"), new System.Net.Http.DelegatingHandler[] { }))
    {
        client.Endpoint = faceEndpoint;
        using (var filestream = File.OpenRead(filepath))
        {
            var detectionResult = await client.Face.DetectWithStreamAsync(filestream, returnFaceId: true, returnFaceAttributes: emotionAttribute, returnFaceLandmarks: true);
            foreach (var face in detectionResult)
            {
                Console.WriteLine(JsonConvert.SerializeObject(face.FaceAttributes.Emotion));
                var highestEmotion = GetEmotion(face.FaceAttributes.Emotion);
                Console.WriteLine($"This face has emotional traits of {highestEmotion.Emotion} ({highestEmotion.Value} confidence).");
            }
        }
    }
}

// Walks the emotion scores via reflection and returns the highest-scoring one.
static (string Emotion, double Value) GetEmotion(Emotion emotion)
{
    var emotionProperties = emotion.GetType().GetProperties();
    (string Emotion, double Value) highestEmotion = ("Anger", emotion.Anger);
    foreach (var e in emotionProperties)
    {
        if (((double)e.GetValue(emotion, null)) > highestEmotion.Value)
        {
            highestEmotion.Emotion = e.Name;
            highestEmotion.Value = (double)e.GetValue(emotion, null);
        }
    }

    return highestEmotion;
}
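If you would rather avoid reflection, the same "pick the dominant emotion" step can be done over a plain score dictionary. The following helper is a hypothetical alternative, not part of the Face SDK; it assumes you have copied the eight emotion scores (each between 0 and 1) into a dictionary.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical helper: picks the dominant emotion from a name-to-score map,
// avoiding the reflection used in GetEmotion above.
static class EmotionUtils
{
    public static KeyValuePair<string, double> Dominant(IDictionary<string, double> scores)
    {
        if (scores == null || scores.Count == 0)
            throw new ArgumentException("At least one emotion score is required.", nameof(scores));

        // Eight entries at most, so a simple sort is perfectly adequate.
        return scores.OrderByDescending(kv => kv.Value).First();
    }
}
```

For example, `EmotionUtils.Dominant(new Dictionary<string, double> { { "Happiness", 0.92 }, { "Neutral", 0.07 }, { "Anger", 0.01 } })` returns the `"Happiness"` entry.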

Adding emotion capabilities to a native Xamarin app

To create a Xamarin app with emotion recognition capabilities, use the Xamarin.Forms platform and the CrossMedia plugin. The app captures an image with the device camera, processes the emotions in it and displays the results.




private async void captureBtn_Clicked(object sender, EventArgs e)
{
    if (await CrossMedia.Current.Initialize())
    {
        if (CrossMedia.Current.IsCameraAvailable)
        {
            // 50% compression and a unique file name.
            var file = await CrossMedia.Current.TakePhotoAsync(new Plugin.Media.Abstractions.StoreCameraMediaOptions { Name = Guid.NewGuid().ToString(), CompressionQuality = 50 });
            if (file == null)
            {
                await DisplayAlert("Cannot capture", "We cannot store the file in the system, possibly permission issues.", "Okay");
            }
            else
            {
                // Load the image in view
                emotionResults.Text = "Processing the image...";
                imageView.Source = ImageSource.FromStream(() =>
                {
                    return file.GetStream();
                }); 
                // processImage (defined elsewhere in the app) streams the photo to the Face API.
                await processImage(file);
            }
        }
        else
        {
            await DisplayAlert("Cannot select picture", "Your device needs to have a camera, or needs to support photo selection.", "Okay");
        }
    }
    else
    {
        await DisplayAlert("Problem", "Cannot initialize the low-level APIs.", "Okay");
    }
}


Architecture diagram

[Architecture diagram: machine learning with emotion recognition]

Integrating machine learning with emotion recognition

Azure Machine Learning can be used to build and deploy custom emotion recognition models that are more accurate than the default models.

Emotion recognition technology can be used to capture and analyze the emotions of shoppers in a retail store. This information can then be used by sales teams to make more effective decisions.

For example, if a sales associate sees that a customer is feeling angry, they can adjust their approach to de-escalate the situation. Or, if a sales associate sees that a customer is feeling happy, they can use this information to build rapport and make a sale.
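The retail decision logic described above can be sketched as a simple mapping from the detected dominant emotion to a suggested engagement approach. Both the class and the suggestion texts below are illustrative assumptions, not part of any Azure SDK; the emotion names match those returned by the Face API.

```csharp
using System;

// Hypothetical advisor: maps a dominant emotion to a suggested sales approach,
// following the retail scenario described in the text.
static class EngagementAdvisor
{
    public static string Suggest(string dominantEmotion) => dominantEmotion switch
    {
        "Anger" or "Disgust"  => "De-escalate: acknowledge the issue and offer assistance.",
        "Happiness"           => "Build rapport: the customer is receptive to suggestions.",
        "Sadness" or "Fear"   => "Reassure: keep the interaction low-pressure.",
        _                     => "Engage neutrally and observe.",
    };
}
```

In practice such rules would be tuned per store, but wiring the classifier's output to a small, auditable decision table like this keeps the system's behavior easy to explain.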

The use of emotion recognition technology in retail is still in its early stages, but it has the potential to revolutionize the way that sales teams interact with customers. By understanding the emotions of shoppers, sales teams can be more effective at building relationships and closing deals.


Explore the power of emotion recognition

Emotion recognition technology has the potential to revolutionize communication, improve mental health care, enhance workplace productivity and benefit various industries. By leveraging the power of Azure Cognitive Services, .NET Core, and Xamarin, developers can create applications that better understand and respond to human emotions, leading to more empathetic interactions and stronger connections.

Emotion recognition is a powerful technology with the potential to improve our lives in many ways. As the technology continues to develop, we can expect to see even more innovative applications of this powerful technology in the future.

Softweb Solutions has a team of experts with experience in machine learning, facial recognition and natural language processing. Contact our experienced Azure consultants to better understand how emotion recognition can help your organization.
