It’s spooky season, and between jumpscare animatronics and haunted houses, there’s something worse around the corner.
Doom loops. The elimination of human agents. A company that just knows too much about you.
It makes your skin crawl, sends you into a rage, and tells you the business just doesn’t care about helping you.
CX Dive spoke to three experts about the spookiest customer experiences, what they do to customers, and how they can cost companies.
Doom loops: Worse than a corn maze?
You reach out to customer service and instead of reaching a resolution, you are redirected endlessly.
Most customers have experienced this — and recently too. One-third of customers encountered a doom loop in their last service or support interaction, according to Gartner’s latest State of the Customer survey.
There are three types, Gartner Director Analyst Christopher Sladdin said in an email: “IVR doom loops, where phone menus never lead to a person or resolution; chatbot doom loops, where bots can’t understand intent or escalate; and multichannel doom loops, where customers bounce from web to email to phone and often end up back where they started.”
Doom loops drive customers nuts, but they can also come back to bite companies.
“Doom loops make service journeys high-effort and frustrating, eroding trust and loyalty,” Sladdin said. “In today’s economic climate, a poor service experience can tip the balance between a customer remaining loyal or switching to another brand.”
They are also entirely self-inflicted, Sladdin said. Doom loops are often the outcome of businesses trying to push all customers, no matter their need, toward self-service options and away from costly human representatives.
“We know that, above all else, customers want service and support experiences that deliver on two brilliant basics: resolution on first contact, and easy access to a human agent,” he said. “Doom loops prevent both of these expectations from being realized.”
Customer service without humans can be terrifying
While automation can make customer service inquiries feel quicker and more efficient, cutting people out of the equation can be deadly to the experience.
“Leaning too heavily on AI at the expense of some human touch creates that kind of customer experience horror story,” Mario Matulich, president and managing director of Customer Management Practice, said in an email.
For example, consider a customer with a very complicated billing issue, Matulich said. Solving that kind of problem takes a measure of empathy, and the canned responses of a chatbot often fall flat. If the customer can only reach a live agent after exhausting every other option, they may have already run out of patience.
“This is not a service failure; this is lost business,” Matulich said. “The scariest part is that many of these incidents could have been straightforward if humans had just stepped in at some point.”
Removing human touchpoints leaves customers feeling unheard and less valued, according to Matulich. The cornerstones of great customer service are empathy and accountability, and even the best AI has trouble replicating those qualities.
“It strips away a foundation of trust,” Matulich said. “True, technology can process the data, but it cannot sense the tone, frustration, or sigh in a client’s voice when they say, ‘I just need a real agent to help me.’”
Personalization shouldn’t become a doppelganger
AI-enhanced personalization can be helpful, but there is a fine line between useful and creepy, especially as the technology pushes the limits of personal data sharing.
Take video generator InVideo AI’s new digital twin feature, which lets customers create an AI version of a person that speaks and gestures just like them, Terra Higginson, principal research director at Info-Tech Research Group, said in a LinkedIn post.
While custom avatars created with the tool are supposed to only be available to the customer, InVideo retains rights to use that data to operate and improve its services, according to Higginson. The agreement opens up issues of control and consent that go far beyond making videos feel more personal.
“I like personalization when it actually helps,” Higginson said in an email. “But when I use a tool that asks me to turn my face, voice and gestures into a digital version of me, it starts to feel less about personalization and more like making a copy of a person.”
Higginson questioned whether a customer will get a say in where their personal details are stored, presented or deleted. It can be hard to pull data back after the fact, and most consumers don’t realize how permanent their decision can become after their information is fed into an AI model.
Transparency is the best policy as AI-powered personalization becomes more prevalent, according to Higginson. Customers will be more loyal to companies that are open about how their data is being collected and used in the experience.
“These platforms and tools can be useful, but they need guardrails,” Higginson said. “Companies need to tell users in plain language what they keep, why they keep it, and for how long. Give them real choices to say no or to delete it. Brands that do that keep trust. Brands that don’t lose it.”