Emotion API - Cognitive Services - Part 3

In this tutorial we will learn how to use the Emotion API. In the previous cognitive demo we combined two Cognitive Services, the Bing Speech API and the Bing Image Search API, to find images using speech to text (STT). To extend that demo we will use the Emotion API to detect the emotion of the person in the image we found with the Bing Image Search API. I would highly recommend going through the previous two parts before starting this demo.

Part 1, Part 2

To register for the Emotion API, follow these steps (a quick key check is sketched just after the list).

  1. Navigate to https://azure.microsoft.com/en-us/try/cognitive-services/
  2. Navigate to the Vision tab, select the Emotion API and press Get API Key.
  3. Select I agree and choose your region.
  4. Log in with a Microsoft account to get your API key.
  5. Save your key.
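Once the key is saved, you can quickly confirm it works before wiring it into the WPF demo. The console sketch below is not part of the original sample; the class name and image URL are placeholders. It simply POSTs a { "url": ... } body to the same recognize endpoint used later in this post and prints the raw response.

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

// Hypothetical key sanity check (not part of the WPF demo): POST an image URL to the
// Emotion API recognize endpoint and dump the status code and raw JSON response.
class EmotionKeyCheck
{
    static void Main()
    {
        RunAsync().GetAwaiter().GetResult();
    }

    static async Task RunAsync()
    {
        var client = new HttpClient();
        client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "Your_key_Bing_Emotion_API");

        // Placeholder image URL; use any publicly reachable image that contains a face.
        var content = new StringContent("{\"url\":\"https://example.com/some-face.jpg\"}",
            Encoding.UTF8, "application/json");

        var response = await client.PostAsync(
            "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize", content);

        Console.WriteLine((int)response.StatusCode);
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}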

 

From Part 2 we already have the URL of the image found with the Bing Image Search API, so we will reuse that URL to get the emotion of the person in it via the Emotion API. We simply create a method named GetEmotion that takes the image URL as a string (which we already have from Part 2).

 

private async Task GetEmotion(string imageUri)
{
    var client = new HttpClient();
    var queryString = HttpUtility.ParseQueryString(string.Empty);

    // Request headers
    client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "Your_key_Bing_Emotion_API");

    // Request parameters
    var uri = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize?" + queryString;

    // The request body is just { "url": "<image url>" }, serialized from EmotionRequest.
    EmotionRequest request = new EmotionRequest();
    request.url = imageUri;
    byte[] byteData = Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(request));

    using (var content = new ByteArrayContent(byteData))
    {
        content.Headers.ContentType = new MediaTypeHeaderValue("application/json");
        var response = await client.PostAsync(uri, content);
        var json = await response.Content.ReadAsStringAsync();
        if (response.IsSuccessStatusCode)
        {
            // The API returns one entry per detected face.
            List<EmotionResponse> emotionResponse =
                JsonConvert.DeserializeObject<List<EmotionResponse>>(json);
            if (emotionResponse != null && emotionResponse.Count > 0)
            {
                var scores = emotionResponse[0].scores;
                Dictionary<string, double> dScores = new Dictionary<string, double>();
                dScores.Add("anger", scores.anger);
                dScores.Add("contempt", scores.contempt);
                dScores.Add("disgust", scores.disgust);
                dScores.Add("fear", scores.fear);
                dScores.Add("happiness", scores.happiness);
                dScores.Add("neutral", scores.neutral);
                dScores.Add("sadness", scores.sadness);
                dScores.Add("surprise", scores.surprise);

                var highestScore = dScores.Values.OrderByDescending(score => score).First();
                // probably a more elegant way to do this.
                var highestEmotion = dScores.Keys.First(key => dScores[key] == highestScore);

                await Application.Current.Dispatcher.BeginInvoke(
                    DispatcherPriority.Normal,
                    new Action(
                        () =>
                        {
                            this.MySpeechSentiment.Text = $"Emotion: {highestEmotion},";
                            this.MySpeechSentimentConfidence.Text =
                                $"confidence: {Convert.ToInt16(highestScore * 100)}%";
                        }));
            }
            else
            {
                // This code runs off the UI thread, so update the TextBlock via the dispatcher.
                await Application.Current.Dispatcher.BeginInvoke(
                    DispatcherPriority.Normal,
                    new Action(() => { this.MySpeechSentiment.Text = "I'm not able to get the emotion, sorry."; }));
            }
        }
        else
        {
            await Application.Current.Dispatcher.BeginInvoke(
                DispatcherPriority.Normal,
                new Action(() => { this.MySpeechSentiment.Text = "Could not get emotion from this image"; }));
        }
    }
}
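The code above finds the strongest emotion in two steps: it orders the score values and then looks the matching key back up. As the in-code comment hints, there is a tidier way; the sketch below is my own variation (helper name and placement are illustrative, not from the original sample) and orders the dictionary entries once, so the top KeyValuePair carries both the emotion name and its score.

using System.Collections.Generic;
using System.Linq;

// Sketch of a tidier alternative to the two-step highest-score lookup in GetEmotion:
// order the entries once and take the top pair.
static class EmotionScoreHelper
{
    public static KeyValuePair<string, double> GetTopEmotion(Dictionary<string, double> scores)
    {
        return scores.OrderByDescending(pair => pair.Value).First();
    }
}

// Usage inside GetEmotion, replacing the highestScore/highestEmotion lines:
// var top = EmotionScoreHelper.GetTopEmotion(dScores);
// this.MySpeechSentiment.Text = $"Emotion: {top.Key},";
// this.MySpeechSentimentConfidence.Text = $"confidence: {Convert.ToInt16(top.Value * 100)}%";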

 

This method makes a POST call to the Emotion API with the required parameters, and the API returns a JSON result. To hold that JSON result we create a class named EmotionResponse; this is optional, since you could also work with the JSON directly.
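For reference, a successful call returns a JSON array with one element per detected face, each carrying a faceRectangle and a scores block. The sketch below is illustrative only (the numbers are made up); it parses a hand-written sample with Newtonsoft.Json to show the shape that the EmotionResponse classes added later in this post mirror.

using System;
using Newtonsoft.Json.Linq;

// Illustrative only: a hand-written sample of the response shape, parsed with JArray.
// The EmotionResponse, FaceRectangle and Scores classes defined later map onto this structure.
class EmotionResponseShape
{
    static void Main()
    {
        string sampleJson = @"[
          {
            ""faceRectangle"": { ""height"": 100, ""left"": 60, ""top"": 80, ""width"": 100 },
            ""scores"": {
              ""anger"": 0.01, ""contempt"": 0.01, ""disgust"": 0.01, ""fear"": 0.01,
              ""happiness"": 0.92, ""neutral"": 0.02, ""sadness"": 0.01, ""surprise"": 0.01
            }
          }
        ]";

        var faces = JArray.Parse(sampleJson);
        double happiness = (double)faces[0]["scores"]["happiness"];
        Console.WriteLine($"Faces detected: {faces.Count}, happiness of first face: {happiness}");
    }
}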

 

First download the Part 2 source code from here, then follow these steps.

 

  • Replace the design in MainWindow.xaml with the markup below
<Grid>
    <Grid.RowDefinitions>
        <RowDefinition Height="Auto"></RowDefinition>
        <RowDefinition Height="Auto"></RowDefinition>
        <RowDefinition Height="Auto"></RowDefinition>
        <RowDefinition Height="Auto"></RowDefinition>
        <RowDefinition Height="*"></RowDefinition>
    </Grid.RowDefinitions>
    <Grid.ColumnDefinitions>
        <ColumnDefinition Width="Auto"></ColumnDefinition>
        <ColumnDefinition Width="*"></ColumnDefinition>
        <ColumnDefinition Width="3*"></ColumnDefinition>
    </Grid.ColumnDefinitions>
    <TextBlock FontSize="24" Grid.Row="0" Grid.Column="0" Grid.ColumnSpan="3" Margin="4">Bing Speech,Image Search and Emotion Demo</TextBlock>
    <Button x:Name="button" HorizontalAlignment="Center" VerticalAlignment="Center" Width="75" Click="button_Click"
            Grid.Row="1" Grid.Column="0" Grid.RowSpan="3" Margin="4" Height="75">
        <Button.Content>
            <StackPanel Orientation="Vertical">
                <TextBlock FontSize="16">Speak</TextBlock>
            </StackPanel>
        </Button.Content>
    </Button>
    <TextBlock x:Name="status" TextWrapping="Wrap" Text="Not Listening" VerticalAlignment="Center" FontSize="16" Visibility="Collapsed"
               Grid.Row="1" Grid.Column="1" HorizontalAlignment="Center" Margin="4"/>
    <StackPanel Orientation="Horizontal" Grid.Row="1" Grid.Column="1" Grid.ColumnSpan="2">
        <TextBlock x:Name="MySpeechResponse" FontSize="20" Margin="4" TextWrapping="Wrap" VerticalAlignment="Center" />
        <TextBlock x:Name="MySpeechResponseConfidence" FontSize="12" Margin="4" TextWrapping="Wrap" VerticalAlignment="Center" />
    </StackPanel>
    <StackPanel Grid.Row="3" Grid.Column="1" Grid.ColumnSpan="2" Orientation="Horizontal">
        <TextBlock x:Name="MySpeechSentiment" Margin="4" FontSize="16" TextWrapping="Wrap" VerticalAlignment="Center" />
        <TextBlock x:Name="MySpeechSentimentConfidence" FontSize="12" Margin="4" TextWrapping="Wrap" VerticalAlignment="Center" />
    </StackPanel>
    <Image x:Name="searchImage" Margin="4" Stretch="Uniform"
           Grid.Row="4" Grid.Column="0" Grid.ColumnSpan="3"/>
    <StackPanel Orientation="Vertical" HorizontalAlignment="Center" VerticalAlignment="Center" x:Name="RecordingBar" Visibility="Collapsed"
                Grid.Row="0" Grid.Column="0" Grid.RowSpan="5" Grid.ColumnSpan="3">
        <ProgressBar HorizontalAlignment="Left" Width="207" Margin="0,16,0,0" IsIndeterminate="True" />
    </StackPanel>
</Grid>
  • Replace MainWindow.xaml.cs with the code below
namespace BingSpeechImageEmotionDemo
{
    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;
    using System.Threading.Tasks;
    using System.Web;
    using System.Windows;
    using System.Windows.Media.Imaging;
    using System.Windows.Threading;
    using Microsoft.CognitiveServices.SpeechRecognition;
    using Newtonsoft.Json;

    /// <summary>
    /// Interaction logic for MainWindow.xaml
    /// </summary>
    public partial class MainWindow : Window
    {
        private MicrophoneRecognitionClient micClient;

        public MainWindow()
        {
            InitializeComponent();
            this.micClient = SpeechRecognitionServiceFactory.CreateMicrophoneClient(
                SpeechRecognitionMode.ShortPhrase, "en-US", "Your_Key_Bing_Speech_API");
            this.micClient.OnMicrophoneStatus += MicClient_OnMicrophoneStatus;
            this.micClient.OnResponseReceived += MicClient_OnResponseReceived;
        }

        private void MicClient_OnMicrophoneStatus(object sender, MicrophoneEventArgs e)
        {
            // Speech SDK callbacks arrive off the UI thread, so marshal UI updates via the dispatcher.
            Application.Current.Dispatcher.BeginInvoke(
                DispatcherPriority.Normal,
                new Action(
                    () =>
                    {
                        if (e.Recording)
                        {
                            this.status.Text = "Listening";
                            this.RecordingBar.Visibility = Visibility.Visible;
                        }
                        else
                        {
                            this.status.Text = "Not Listening";
                            this.RecordingBar.Visibility = Visibility.Collapsed;
                        }
                    }));
        }

        private async void MicClient_OnResponseReceived(object sender, SpeechResponseEventArgs e)
        {
            if (e.PhraseResponse.Results.Length > 0)
            {
                await Application.Current.Dispatcher.BeginInvoke(
                    DispatcherPriority.Normal,
                    new Action(
                        () =>
                        {
                            this.MySpeechResponse.Text = $"'{e.PhraseResponse.Results[0].DisplayText}',";
                            this.MySpeechResponseConfidence.Text = $"confidence: {e.PhraseResponse.Results[0].Confidence}";
                        }));
                this.SearchImage(e.PhraseResponse.Results[0].DisplayText);
            }
        }

        private async void SearchImage(string phraseToSearch)
        {
            var client = new HttpClient();
            var queryString = HttpUtility.ParseQueryString(string.Empty);

            // Request headers
            client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "Your_Key_Bing_Search_API");

            // Request parameters
            queryString["q"] = phraseToSearch;
            queryString["count"] = "1";
            queryString["offset"] = "0";
            queryString["mkt"] = "en-us";
            queryString["safeSearch"] = "Moderate";

            var uri = "https://api.cognitive.microsoft.com/bing/v5.0/images/search?" + queryString;
            var response = await client.GetAsync(uri);
            var json = await response.Content.ReadAsStringAsync();
            // MessageBox.Show(json.ToString());
            BingImageSearchResponse bingImageSearchResponse = JsonConvert.DeserializeObject<BingImageSearchResponse>(json);
            var uriSource = new Uri(bingImageSearchResponse.value[0].contentUrl, UriKind.Absolute);
            await Application.Current.Dispatcher.BeginInvoke(
                DispatcherPriority.Normal,
                new Action(() =>
                {
                    this.searchImage.Source = new BitmapImage(uriSource);
                }));

            // Pass the found image URL on to the Emotion API.
            await GetEmotion(uriSource.ToString());
        }

        private async Task GetEmotion(string imageUri)
        {
            var client = new HttpClient();
            var queryString = HttpUtility.ParseQueryString(string.Empty);

            // Request headers
            client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "Your_Key_Emotion_API");

            // Request parameters
            var uri = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize?" + queryString;

            EmotionRequest request = new EmotionRequest();
            request.url = imageUri;
            byte[] byteData = Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(request));

            using (var content = new ByteArrayContent(byteData))
            {
                content.Headers.ContentType = new MediaTypeHeaderValue("application/json");
                var response = await client.PostAsync(uri, content);
                var json = await response.Content.ReadAsStringAsync();
                if (response.IsSuccessStatusCode)
                {
                    List<EmotionResponse> emotionResponse =
                        JsonConvert.DeserializeObject<List<EmotionResponse>>(json);
                    if (emotionResponse != null && emotionResponse.Count > 0)
                    {
                        var scores = emotionResponse[0].scores;
                        Dictionary<string, double> dScores = new Dictionary<string, double>();
                        dScores.Add("anger", scores.anger);
                        dScores.Add("contempt", scores.contempt);
                        dScores.Add("disgust", scores.disgust);
                        dScores.Add("fear", scores.fear);
                        dScores.Add("happiness", scores.happiness);
                        dScores.Add("neutral", scores.neutral);
                        dScores.Add("sadness", scores.sadness);
                        dScores.Add("surprise", scores.surprise);
                        var highestScore = dScores.Values.OrderByDescending(score => score).First();
                        // probably a more elegant way to do this.
                        var highestEmotion = dScores.Keys.First(key => dScores[key] == highestScore);
                        await Application.Current.Dispatcher.BeginInvoke(
                            DispatcherPriority.Normal,
                            new Action(
                                () =>
                                {
                                    this.MySpeechSentiment.Text = $"Emotion: {highestEmotion},";
                                    this.MySpeechSentimentConfidence.Text =
                                        $"confidence: {Convert.ToInt16(highestScore * 100)}%";
                                }));
                        // await
                        //     this.Speak(
                        //         $"I'm {Convert.ToInt16(highestScore * 100)}% sure that this person's emotion is {highestEmotion}");
                    }
                    else
                    {
                        // await
                        //     this.Speak(
                        //         $"I'm not able to get the emotion, sorry.");
                    }
                }
                else
                {
                    await Application.Current.Dispatcher.BeginInvoke(
                        DispatcherPriority.Normal,
                        new Action(() => { this.MySpeechSentiment.Text = "Could not get emotion from this image"; }));
                    // await
                    //     this.Speak(
                    //         $"Could not get emotion from this image.");
                }
            }
        }

        private void button_Click(object sender, RoutedEventArgs e)
        {
            this.MySpeechSentiment.Visibility = Visibility.Visible;
            this.MySpeechSentimentConfidence.Visibility = Visibility.Visible;
            this.MySpeechSentiment.Text = string.Empty;
            this.MySpeechSentimentConfidence.Text = string.Empty;
            this.MySpeechResponse.Text = string.Empty;
            this.MySpeechResponseConfidence.Text = string.Empty;
            this.searchImage.Source = null;
            this.micClient.StartMicAndRecognition();
        }
    }
}
  • Also add two classes, EmotionRequest and EmotionResponse, to hold the request body and the JSON response of the Emotion API
public class EmotionRequest
{
    public string url { get; set; }
}

 

using System.Collections.Generic;
using Newtonsoft.Json;

public class FaceRectangle
{
    public int height { get; set; }
    public int left { get; set; }
    public int top { get; set; }
    public int width { get; set; }
}

public class Scores
{
    public double anger { get; set; }
    public double contempt { get; set; }
    public double disgust { get; set; }
    public double fear { get; set; }
    public double happiness { get; set; }
    public double neutral { get; set; }
    public double sadness { get; set; }
    public double surprise { get; set; }
}

public class EmotionResponse
{
    public FaceRectangle faceRectangle { get; set; }
    public Scores scores { get; set; }
}

 

Source Code

Output:

