Winning Customer Loyalty:

Ensuring AI Transparency and Data Security in Retail

GenAI is changing how we think about sales/service experiences and personalization in digital tools and marketing. Although customers are more willing to engage with AI tools than ever, they have heightened concerns around data privacy and security. How retailers go about addressing these concerns can make or break customer trust.


In a recent Ipsos CX study, we found that over 73% of consumers want retailers to be transparent about their use of AI-powered virtual assistants. We also asked, “When thinking about e-commerce / retail experiences, which of the following statements represents how certain you are about recognizing if the chatbot is AI-powered vs. human-powered?” Only 36% of consumers said that retailers clearly and transparently communicated that the chatbot is AI-powered.


Transparency is critical. When a customer feels forced into using AI or is unsure about how their data is being used, it can lead to uncertainty (learn more about Ipsos’s Forces of CX). When customers feel uncertain, they may become confused, nervous, anxious or vulnerable, and are therefore unlikely to continue engaging with that brand. On the flip side, being transparent signals to customers that the company is reliable and safe, thereby building trust and loyalty.


When developing GenAI-powered tools, here’s how brands can instill a sense of certainty and build customer trust:

  • Be transparent about where GenAI is used to power the experience
  • Clearly communicate how data privacy and security commitments are managed
  • Ensure the information provided by virtual assistants is accurate, concise and reliable
  • Always provide other ways to engage with the brand outside of the GenAI tool

Stephanie Bannos-Ryback

EVP, Experience Service Lines (CX, UX, EX), Ipsos