04 Feb Body Language in Understanding
How much can you hear without a word being spoken? How often does something about a person's face, posture or hand gestures completely contradict the "normal interpretation" of the words they speak, creating a sense of sarcasm or some other indirect message? To what extent are the Academy Awards influenced by an actor's ability to use face, hands and posture to effectively communicate deeper meanings, subtext and exformation? Consider the other side of the coin: is it possible to differentiate true contrition from an empty apology without the non-verbal elements of body language? The answers to these questions can cast serious doubt on the use of technology, especially technology without human intervention, as a vehicle to understanding. Today I'd like to talk about sentiment and eye contact in the context of automated human language understanding.
Understanding Context Cross-Reference: click on these links to other posts and glossary/bibliography references.

| Prior Post | Next Post |
| --- | --- |
| Mapping a Thought | Rings of Power: Workflow and Business Rules |
| interpretation message | Stanford Project |
| body language subtext | Thomson Reuters |
More and more companies that care about their brand image with consumers use sentiment analysis and other tools to evaluate how people feel about them. In the past, companies would conduct surveys and hire PR firms to gather feedback from individuals and groups for a glimpse into people's perceptions of their brand, products and services. In addition to these services, a crop of new tools and strategies is available to help brand and product managers and marketers track public perception. We hear about scoring individuals as "net promoters" or "detractors" and using the score as a basis for tailoring outreach to address these sentiments effectively. There are contact center technologies that build and mine interaction data for clues that can help a company improve its public image. Using sentiment as a basis for targeted marketing can also help a company improve its bottom line. And although most of these technologies are only good at the emoticon level, rough data is often better than anecdotes alone.
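The "net promoter"/"detractor" scoring mentioned above follows a well-known formula: respondents rating 9-10 count as promoters, 0-6 as detractors, and the score is their difference as a percentage of all respondents. A minimal sketch (the sample ratings are illustrative, not real survey data):

```python
def net_promoter_score(ratings):
    """Return an NPS in the range -100..100 for a list of 0-10 ratings.

    Promoters rate 9 or 10; detractors rate 0 through 6; the 7-8
    "passives" count toward the total but neither bucket.
    """
    if not ratings:
        raise ValueError("no ratings to score")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Hypothetical survey: 4 promoters, 2 passives, 2 detractors -> 25.0
print(net_promoter_score([10, 9, 8, 7, 3, 10, 6, 9]))
```

The score itself is only a rough aggregate, which is exactly the "rough data is often better than anecdotes" point above.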
The social media sphere is a rich source of sentiment. Stanford has a remarkable capability you can read about here: http://nlp.stanford.edu/sentiment/ (be sure to follow the commentary on Bilbo's birthday parting remarks). Big data is a powerful platform for extracting and storing sentiment data from social media. Look at FADS. Yesterday I saw an interesting announcement from Thomson Reuters about a sentiment analysis capability they are offering, tied to Twitter and other social media sources: http://techcrunch.com/2014/02/03/twitter-raises-its-enterprise-cred-with-thomson-reuters-sentiment-analysis-deal/. It is similar to an existing Bloomberg capability, and there are more big players in the industry. Automatically gleaning a person's intent from things said or written is becoming big business, but can it be done sans the ability to read body language? Yes, but not as reliably.
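To make the "emoticon level" of analysis concrete, here is a toy lexicon-based sentiment scorer of the word-counting variety. The word lists are hypothetical placeholders, not any vendor's lexicon, and real systems (like the Stanford one linked above) go well beyond simple counting:

```python
# Illustrative word lists; a production lexicon would hold thousands
# of weighted entries.
POSITIVE = {"good", "great", "love", "excellent", "happy"}
NEGATIVE = {"bad", "terrible", "hate", "awful", "angry"}

def sentiment(text):
    """Return a score in [-1, 1]: positive minus negative word hits,
    normalized by the number of sentiment-bearing words found."""
    words = (w.strip(".,!?") for w in text.lower().split())
    pos = neg = 0
    for w in words:
        if w in POSITIVE:
            pos += 1
        elif w in NEGATIVE:
            neg += 1
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

print(sentiment("I love this brand, the service is great!"))  # 1.0
```

Note what such a scorer cannot do: "Great. Just great." scores as glowing praise, because sarcasm lives in tone and body language, not in the word list.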
Eye Contact and Body Language
Because of the omnipresence of subtext, it is often extremely important to make and maintain eye contact during communication. A Seattle attorney describes this in a helpful blog. He suggests that, as part of the social contract of communication in the courtroom, eye contact is an essential element of mutual understanding. Naturally, the mutuality of this understanding is an important component of what the attorney is seeking. But isn't this what we all seek? Spoken language, body language, social frameworks, cognitive dissonance, subtext, context: it's too much for a system to correlate. The idea of a device being able to outsmart its human creators seems far-fetched to me. But this does not deter me from attempting to build systems and populate ontologies that correlate as many of these factors as possible to deliver the ability to automatically interpret intent. Someday, you and your device may see eye to eye, despite the inevitable cognitive dissonance and distance between "flesh and blood" and "bits and bytes".
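The correlation idea above can be sketched as a simple signal fusion. Everything here is a hypothetical illustration: the cue names, the weights, and the conflict threshold are assumptions, not a description of any real system, and obtaining reliable non-verbal scores is precisely the hard, unsolved part:

```python
def interpret(text_sentiment, eye_contact, posture_openness,
              w_text=0.5, w_eye=0.3, w_posture=0.2):
    """Fuse a verbal sentiment score with non-verbal cue scores
    (each assumed to be in [-1, 1]) into a single intent estimate.

    A large disagreement between the verbal and non-verbal channels
    is flagged as possible subtext: sarcasm, or an empty apology.
    """
    fused = (w_text * text_sentiment
             + w_eye * eye_contact
             + w_posture * posture_openness)
    nonverbal = (eye_contact + posture_openness) / 2
    conflict = abs(text_sentiment - nonverbal)
    return {"intent": round(fused, 3), "possible_subtext": conflict > 1.0}

# The words sound contrite, but the body disagrees -> flag subtext.
print(interpret(text_sentiment=0.8, eye_contact=-0.7, posture_openness=-0.5))
```

The interesting output is not the fused number but the flag: when channels diverge sharply, a words-only system confidently reports the wrong intent.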
Click below to look in each Understanding Context section:

| 4 | Perception and Cognition | 5 | Fuzzy Logic | 6 | Language and Dialog | 7 | Cybernetic Models |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 8 | Apps and Processes | 9 | The End of Code | Glossary | | Bibliography | | |