Determining a persona’s intrinsic value is a herculean feat at best. By asking key questions about the stakeholder and their users in order to examine needs, wants and desired outcomes, systems can now provide an unbiased approach that considers:

  • How do we make sure we are building AI-powered products that take human behavior into consideration?
  • How do we make sure our users/stakeholders are represented accurately in this product?

My perception is not of the world, but of my brain’s model of the world.

 ― Chris Frith, Making Up the Mind, 2007

One of the key factors to examine is the emotional analysis of the users or stakeholders being tested. AI uses image metadata to collect data on how a person communicates verbally and non-verbally in order to understand mood or attitude. The technology, also referred to as emotional analytics, provides insight into how a customer perceives a product, the presentation of a product, or their interactions with a customer service representative.

Artificial Emotional Intelligence

Emotion AI gathers cues about a user’s emotional state from a variety of sources, including facial expressions, muscle tension, posture, hand and shoulder gestures, speech patterns, heart rate, pupil dilation and body temperature. The technology that supports emotion measurement and analysis includes sensors, cameras, big data, and deep-learning analytics engines.
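Because these cues arrive from several channels at once, a system typically has to combine per-channel emotion scores into a single estimate. Here is a minimal sketch of that idea; the channel names, labels, and equal weighting are illustrative assumptions, not any vendor’s actual model:

```python
# Illustrative sketch: fuse per-channel emotion scores (e.g. face, voice)
# into one estimate via a weighted average. Channel names and weights are
# hypothetical, chosen only to demonstrate the technique.

def fuse_emotion_scores(channels, weights=None):
    """Weighted average of per-channel scores for each emotion label."""
    weights = weights or {name: 1.0 for name in channels}  # default: equal weight
    total = sum(weights[name] for name in channels)
    fused = {}
    for name, scores in channels.items():
        w = weights[name] / total
        for emotion, score in scores.items():
            fused[emotion] = fused.get(emotion, 0.0) + w * score
    return fused

channels = {
    "face":  {"happiness": 0.8, "sadness": 0.1},
    "voice": {"happiness": 0.6, "sadness": 0.3},
}
fused = fuse_emotion_scores(channels)
dominant = max(fused, key=fused.get)  # label with the highest fused score
```

Real engines use learned models rather than a fixed average, but the shape of the problem is the same: many noisy signals in, one emotional estimate out.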

Companies that offer artificial emotional intelligence:

  • Affectiva: analyzes complex and nuanced human emotions and cognitive states from face and voice.
  • Humanyze: enables more informed company-wide decisions by using hidden patterns in corporate-owned communication to measure how work gets done.
  • CrowdEmotion: utilizes image metadata to track attention, facial coding to understand engagement, and implicit testing to quantify memorability.
  • Emotient: (now owned by Apple) deploys artificial intelligence to identify and understand emotional expressions.
  • Microsoft Azure: AI tools designed for developers and data scientists to assist in creating data for a wide range of products.

Here is an example of Affectiva in action using their artificial emotional intelligence tools:

APIs:

  • RESTful: (Representational State Transfer) an architectural style designed to take advantage of existing protocols. While REST can be used over nearly any protocol, it usually takes advantage of HTTP when used for web APIs.
  • IBM Watson’s Tone Analyzer: detects emotion/tones within text using linguistic analysis.
  • Microsoft Emotion API: using facial expressions as input, it perceives a set of emotions for each face within a bounding box.
  • Bitext’s API: a linguistic bot that categorizes and cross-analyzes customer reviews, emotions, keywords and users’ online conversations.
  • Qemotion: a weak AI that detects the main emotion of a piece of speech and defines the corresponding emotion in terms of temperature.
  • PreCeive API: (now owned by Oracle) a text analysis processor that covers many different analyses. It is an open-source API, available for Node.js.
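Most of the services above are consumed the same way: a JSON payload is POSTed to a REST endpoint with an API key in the headers. Here is a minimal sketch of building such a request; the endpoint URL, header names, and key are placeholders, not any vendor’s real values:

```python
import json
import urllib.request

# Hypothetical values for illustration only -- substitute your provider's
# real endpoint and credentials.
API_URL = "https://api.example.com/v1/emotion"  # placeholder endpoint
API_KEY = "your-api-key-here"                   # placeholder credential

def build_emotion_request(text):
    """Build (but do not send) a JSON POST request for text-based emotion analysis."""
    payload = json.dumps({"text": text}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = build_emotion_request("I love this product!")
```

Sending the request (and the exact shape of the response) depends on the provider; each vendor’s own API documentation is the authority on field names and authentication.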

Here is a screenshot of a stock image that I ran through Microsoft’s API:
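For each face it detected, Microsoft’s (now retired) Emotion API returned a bounding box plus a dictionary of emotion scores. The sample response below is illustrative, shaped like that format rather than copied from a real call; picking the dominant emotion is then a one-liner:

```python
# Illustrative response shaped like the Microsoft Emotion API's output:
# one entry per detected face, with a bounding box and emotion scores.
sample_response = [
    {
        "faceRectangle": {"left": 68, "top": 97, "width": 64, "height": 97},
        "scores": {
            "anger": 0.001, "contempt": 0.002, "disgust": 0.001,
            "fear": 0.0, "happiness": 0.95, "neutral": 0.04,
            "sadness": 0.003, "surprise": 0.003,
        },
    }
]

def dominant_emotions(faces):
    """Return the highest-scoring emotion label for each detected face."""
    return [max(face["scores"], key=face["scores"].get) for face in faces]

print(dominant_emotions(sample_response))  # -> ['happiness']
```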

Why Does It Matter?

It comes down to an unbiased approach. Artificial emotional intelligence allows you to set aside your own cultural, emotional, physical and spiritual biases and replace them with valuable data.

Human beings, viewed as behaving systems, are quite simple. The apparent complexity of our behavior over time is largely a reflection of the complexity of the environment in which we find ourselves.
 ― Herbert A. Simon, The Sciences of the Artificial

Artificial emotional intelligence data focuses our decisions around each component of the product and adds a layer of unbiased, real-world considerations to any conversation.



Source: https://uxdesign.cc/-user-reactions-using-ai-49a1d20bc7ce?source=rss—-138adf9c44c—4
