Companies everywhere are looking for ways to improve customer service. For example, companies with call-in support centers might track how long agents take to answer calls, or how long customers stay on hold. While many companies are looking at every possible method to gain customer insights, many don’t go far enough. How many businesses take facial recognition into account as a way to track the quality of their customer service efforts? Big data could make facial recognition for brick-and-mortar businesses just as ubiquitous as call tracking is for call centers.
The growth of VoIP and cloud computing has sparked a revolution in customer service. While most of us are familiar with the line "This call may be monitored for quality assurance purposes," a whole host of call monitoring tools have, in fact, been deployed—no doubt helped by the advent of hosted PBX systems that allow call centers to scale up quickly. Many call centers now deploy sophisticated analytics that record calls and monitor metrics such as call length and hold time.
Many are also looking at nonverbal cues, such as how stressed a customer sounds. While it's possible for humans to listen to recorded phone calls to establish quality control, doing so takes a lot of time. Instead, many businesses use voice analytics to process these calls automatically. These systems can alert companies when they're fielding an unusually high number of angry calls.
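The alerting side of such a system can be surprisingly simple. As a rough sketch (the function name, the 0.5 anger cutoff, and the idea that an upstream analytics engine emits a per-call anger score are all assumptions for illustration), a monitor might watch the fraction of angry calls in a rolling window and raise a flag when it crosses a threshold:

```python
from collections import deque

def angry_call_monitor(scores, window=100, threshold=0.2):
    """After each call, yield True when the fraction of 'angry' calls
    (anger score above 0.5) in the last `window` calls exceeds `threshold`.

    `scores` is assumed to come from some upstream voice-analytics engine
    that rates each call's anger on a 0-to-1 scale.
    """
    recent = deque(maxlen=window)  # sliding window of recent calls
    for score in scores:
        recent.append(score > 0.5)
        yield sum(recent) / len(recent) > threshold

# Example: 90 mostly calm calls, then a burst of 30 angry ones.
scores = [0.1] * 90 + [0.9] * 30
alerts = list(angry_call_monitor(scores))
```

With these numbers, no alert fires during the calm stretch, but once the angry burst pushes the windowed fraction past 20 percent, the monitor starts flagging.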
With big data, it’s possible to look at faces to get nonverbal cues. Faces are rich with information, according to Joe Clabby, president of Clabby Analytics. He wrote about nViso software that uses IBM’s Watson, the computer that won on Jeopardy!, to analyze the faces of people participating in focus groups.
“This software analyzes digital emotional reactions to products and situations, and once this data has been captured, many questions can be posed, such as, ‘What is happening to make these customers react this way? Why are customers happy or unhappy with a particular product or proposed action? or What is likely to happen if we change a product to reflect this feedback?’ The answers to these questions may require the data to be structured differently for analysis, and this is where the ‘zone’ concept comes in,” said Clabby.
Customer response is emotional rather than rational, and companies try to appeal to emotions instead of product features. If you don’t believe this, watch a bunch of gamers arguing about their favorite systems sometime.
A great use case is analyzing faces in focus groups. Marketers can watch how people react viscerally to products, and use that feedback to make decisions on product features and branding. All those faces demand considerable storage for facial recognition systems, and the systems must be able to make decisions based on trends in the data.
While the focus group is a controlled experience, it’s how people are responding in real time that makes the difference. Fortunately, there’s a way to analyze faces without a bunch of people having to sit in front of screens all day. Big data makes it possible to process the faces, reading the emotions of customers and letting marketers and customer-service staff determine how they’re doing and propose alternatives to keep their customers happy.
Facial analysis also captures a lot of unconscious responses, which is why it's becoming an increasingly valuable marketing tool. Big data makes it possible to see trends in people's emotions over time, which gets at perhaps the most important question in marketing: how do people react viscerally to brands? Humans have been reading faces for thousands of years, but hiring and training human observers takes time, not to mention a significant financial investment.
The Bank of New Zealand is already using facial recognition software to gauge customer reactions to various financial scenarios, from a last-minute plane ticket to planning a wedding. The advantage of using facial recognition is that people are more honest with their faces than when filling out questionnaires. The system can capture subtle responses much faster than a human can.
Amscreen offers another innovative use of facial recognition: cameras mounted at supermarket checkout stands analyze the age and gender of shoppers and deliver targeted advertising to them. The faces of everyone who passes through supermarkets each day add up to a lot of faces to process and even more data to store. There's just no way humans could handle that volume without big data processing and storage.
There’s nothing more unique to us than our emotions, and reading our faces could allow businesses to better tailor their products to our desires. It will also be much cheaper and easier than sending out coupons in the newspaper with fingers crossed, or sending out sales representatives to observe how shoppers behave in person.
As with any data collection, privacy is a big concern. Marketers and customer service departments might end up inadvertently creating George Orwell’s nightmarish vision of the future in 1984, where the “Thought Police” can arrest people because their subtle facial expressions betrayed their “thoughtcrime.” The Snowden revelations have also made a lot of people more wary about privacy. Many people don’t want to be tracked wherever they go.
While people in a focus group can consent to having their likenesses used, what about people in stores? Businesses will want to keep their customers feeling safe with their data, but many of them will find these new tools irresistible. Companies wanting to use big data to look for trends in people’s faces should take care to anonymize personal data.
They could scrub the names off of records before they're put into the system, or remove location information. SQL-on-Hadoop technologies like Apache Drill can query multiple data sources simultaneously. It's even possible to create a view that omits sensitive columns while leaving the original data intact.
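The scrubbing step could look something like the following sketch. The field names (`name`, `address`, `gps`) and the choice of a salted hash are illustrative assumptions, not a prescribed scheme; the idea is simply that direct identifiers are stripped or replaced before records reach the analytics system, while behavioral fields survive:

```python
import hashlib

def anonymize(record, salt="example-salt"):
    """Return a copy of the record with direct identifiers removed.

    The name is replaced with a salted hash so repeat visits can still
    be linked without storing the name itself, and precise location
    fields are dropped entirely. Field names here are hypothetical.
    """
    cleaned = dict(record)
    name = cleaned.pop("name")
    cleaned["visitor_id"] = hashlib.sha256((salt + name).encode()).hexdigest()[:16]
    for field in ("address", "gps"):
        cleaned.pop(field, None)  # drop location data if present
    return cleaned

record = {"name": "Jane Doe", "address": "1 Main St", "emotion": "happy"}
clean = anonymize(record)
```

Note that salted hashing is pseudonymization rather than true anonymization; a production system would need a stronger scheme, but the before-ingestion scrubbing pattern is the same.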
More Accurate Facial Recognition
With the large sample sizes that big data offers, it will be possible to build more accurate facial recognition algorithms. Facebook can already recognize faces about 97 percent of the time, building on its vast network of users. With more mobile phone owners capable of recording good-quality video, the next big step will be video facial recognition. It's not much of a stretch, since a video is simply a long sequence of still images. This means Facebook could offer a powerful marketing tool on top of its existing social media marketing tools.
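That "sequence of still images" observation suggests one simple route to video recognition: run an ordinary still-image recognizer on each frame and accept an identity only when enough frames agree. The sketch below assumes such per-frame labels already exist (the function name and the 60 percent agreement bar are illustrative, not drawn from any named system):

```python
from collections import Counter

def identify_from_frames(frame_labels, min_agreement=0.6):
    """Aggregate per-frame recognition labels from a video.

    `frame_labels` is assumed to be the output of a still-image face
    recognizer applied to each frame. Return the majority identity only
    if it appears in at least `min_agreement` of the frames; otherwise
    return None, treating the clip as inconclusive.
    """
    if not frame_labels:
        return None
    label, count = Counter(frame_labels).most_common(1)[0]
    return label if count / len(frame_labels) >= min_agreement else None

# 10 frames: 8 recognized as "alice", 2 misread as "bob".
frames = ["alice"] * 8 + ["bob"] * 2
```

Majority voting over frames is the appeal of video here: a recognizer that occasionally misreads a single still becomes far more reliable when dozens of frames get a vote.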
With more cameras and better software, marketers and customer service departments can make better use of nonverbal cues to improve their offerings. With big data, it will be like having a thousand sharp-eyed people monitoring banks of cameras in real time, without the expense of actual human beings performing that task. Businesses can redirect the money they would have spent on human monitoring toward making their products better, armed with the emotional information they've gleaned from faces.