At Zendesk, we believe that our customers’ feedback helps us improve and grow as a support organization. After each support interaction, we send a customer satisfaction survey to gauge how satisfied the customer was. Replies to this survey constitute our customer satisfaction (CSAT) score, which we use as a measure of how we’re doing as a support team.
However, this isn’t where it ends: comments left in response to the survey may translate into product improvements, enhancements to the customer experience, and reviews of processes and policies. We want to hear what customers have to say through customer satisfaction surveys, whether their responses are positive or not so positive.
Today, as part of our Zendesk on Zendesk discussion series, I’ll shed some light on how we navigate the intricate workings of the customer satisfaction survey and ratings, including a newer Zendesk feature that lets you drill into the reasons for bad satisfaction ratings.
Our discussion is broken into several sections, including:
- A general overview of the customer satisfaction survey we send out, and how we use tags and automations to avoid sending duplicate surveys to a customer
- Surfacing feedback outside of our support organization (for the sake of the customer experience)
- How to add a second question to the survey to better understand the reasons behind bad satisfaction ratings
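To illustrate the tags-and-automations idea from the list above, here is a minimal sketch of an automation payload that surveys a solved ticket only once. The tag name `csat_survey_sent` and the exact field/operator/value strings are assumptions for illustration, not Zendesk's documented values; check the Zendesk Automations API reference before using anything like this.

```python
def build_csat_automation(tag="csat_survey_sent"):
    """Build a Zendesk-style automation payload: offer a satisfaction
    survey on solved tickets, but only if the guard tag is absent.

    The tag name and field values below are hypothetical placeholders.
    """
    return {
        "automation": {
            "title": "Send CSAT survey once per ticket",
            "conditions": {
                "all": [
                    # Only solved tickets qualify for a satisfaction survey.
                    {"field": "status", "operator": "is", "value": "solved"},
                    # Skip tickets that have already been surveyed.
                    {"field": "tags", "operator": "not_includes", "value": tag},
                ]
            },
            "actions": [
                # Offer the built-in satisfaction survey (illustrative value).
                {"field": "satisfaction_score", "value": "offered"},
                # Tag the ticket so this automation never fires on it again.
                {"field": "set_tags", "value": tag},
            ],
        }
    }

payload = build_csat_automation()
```

The key design point is the pairing: the condition excludes tickets carrying the tag, and the action adds that same tag, so each ticket receives the survey at most once.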
Leaving your customers satisfied means asking the right questions that can turn into actionable feedback. A better CSAT survey might just be what your customer service needs.
Read the full post in the forums, ask questions, and tell us how you collect and manage feedback from customer satisfaction surveys.