How does Involve.ai help customer success professionals?
Involve.ai is building a customer intelligence solution that helps organizations take a truly customer-centric approach through data. All customer success professionals want to be customer-centric, but finding the right way to bring information together is a struggle: getting the data in one place, identifying what will actually make customers the most successful, and spotting the customers we may lose, which always feels like a punch in the gut.
Involve can unify and integrate data from all the tools that interact with customers across the organization. With all the data together, and with AI, machine learning, and natural language processing applied to it, customer success professionals can look at that data, understand the nuances across all of their customers, and see what is driving their success: what the organization is doing well, and what needs adjustment to provide additional value.
What is data fluency, and how do Customer Success professionals think about big data?
Customer data has been collected for years, but it has always lived in different systems. Different leaders collect different data about customers across sales, marketing, and customer success, and use it to uncover insights and tell a story. While this may work in some cases, data that exists in silos does not always let the organization turn it into actionable insights and decisions.
If you do this holistically and look across all the data together, you can uncover insights you never thought to ask for. It is a big advantage to get smart insights on your own, in real time, without depending on a data scientist. Customer success professionals can make decisions and back them up with data. With data instead of anecdotal information, you can make a case for funding or support from other functions for your projects.
When is the right time to start deploying Customer Success Tools, especially in smaller companies?
Traditionally we start by figuring out how to deliver the best solution to customers, and then, when we have to put processes in place, we have to go backward and rip off some band-aids. If you set up processes and systems early, you don't have to do a lot of backtracking, so the earlier you can do it, the better. Defining the customer journey, identifying the experience you want customers to have, and setting up the metrics and deliverables that go with it is critical. Then you can wrap a system around it. The earlier you get visibility into what is happening with customers, the earlier you set yourself up for scale, instead of being reactive again and taking steps backward in order to go forward.
So I would say: define the journey, define those deliverables, and put the system in place as quickly as possible. Do not let the system's features drive your process, but put it in as early as you can to get that lift.
Quantitative versus qualitative data: what are the key pieces of data to collect?
Both are especially important. Quantitative data lets you quickly assess averages and trends, which makes it a little easier to get a pulse on what's happening, provided you have a good representative sample. Qualitative data is so critical because, in a lot of cases, it tells a different story.
For example, take a Net Promoter survey, where a promoter rates you nine or ten. Your executive sponsor gives you a ten; the program owner gives you an eight. On average the account looks like a promoter, which is great. But if you do not look at the comments, you may miss that the program owner gave a passive score because they are struggling with change management and adopting the solution. You would miss all that rich data and the opportunity to jump in with this customer, do a better job, and help them adopt. Without the qualitative part of the story, you could be led toward different decisions than if you weighed both of those things.
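As a minimal illustration of the scoring mechanics behind that example (this is just the standard Net Promoter buckets, not Involve.ai's model), the sponsor's ten and the program owner's eight can average out to a healthy-looking number while the eight is a passive score worth a follow-up:

```python
def nps_bucket(score: int) -> str:
    """Standard Net Promoter Score buckets: 9-10 promoter, 7-8 passive, 0-6 detractor."""
    if score >= 9:
        return "promoter"
    if score >= 7:
        return "passive"
    return "detractor"

def nps(scores: list[int]) -> int:
    """NPS = percent promoters minus percent detractors, in [-100, 100]."""
    buckets = [nps_bucket(s) for s in scores]
    promoters = buckets.count("promoter") / len(buckets)
    detractors = buckets.count("detractor") / len(buckets)
    return round(100 * (promoters - detractors))

# The example from the text: the sponsor's 10 and the program owner's 8
# yield a positive NPS, but the 8 is a passive whose comment matters.
```

The point of the example is that the aggregate hides the passive score; only the qualitative comment explains it.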
One of the things we are doing at Involve.ai, I believe for the first time ever, is taking both quantitative and qualitative data into consideration in the health scoring. There is a data-driven component that looks at the sentiment of the customer through Slack messages, Net Promoter comments, customer satisfaction survey comments, Zoom recordings, and other channels of customer interaction.
Qualitative analysis can be run through the AI to understand what the customer is saying and what categories of content are coming through support or through customer satisfaction surveys.
And then, on the quantitative side, it is interesting to look at those trends and see whether they marry up with the qualitative data. There are a lot of interesting insights you can get from that to help inform your process.
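A minimal sketch of the blending idea described above: combining quantitative metrics with qualitative sentiment into a single health score. The inputs, scales, and weights here are assumptions for illustration, not Involve.ai's actual model, which would learn these relationships from data:

```python
def health_score(usage_trend: float, nps_norm: float, sentiment: float,
                 weights: tuple[float, float, float] = (0.4, 0.3, 0.3)) -> float:
    """Blend quantitative and qualitative signals into a 0-100 health score.

    usage_trend: normalized product-adoption trend in [0, 1]
    nps_norm:    account NPS rescaled to [0, 1]
    sentiment:   average text sentiment from emails, Slack, and survey
                 comments, rescaled to [0, 1]
    weights:     illustrative fixed weights; a real model would learn them
    """
    w_usage, w_nps, w_sent = weights
    score = w_usage * usage_trend + w_nps * nps_norm + w_sent * sentiment
    return round(100 * score, 1)
```

A fixed weighted average like this is the simplest possible blend; the value of the approach in the text is that the qualitative inputs (sentiment from comments and recordings) get a seat at the table alongside the usage numbers.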
In terms of the types of data, the usual suspects would be customer satisfaction scoring across the customer journey and at various milestones. I am a believer that the Net Promoter survey can be an informative and helpful piece of the puzzle, but it should never be the only piece of data. Then there is product adoption or service package consumption, training resources that have been adopted and consumed, community metrics if you have a community, and support information and support scoring. Of course, anything else you can incorporate from your CRM or from the other applications across functions that collect customer data is also helpful to bring in, but the core ones are satisfaction, product adoption, and engagement with community and webinars. The one piece I left out is customer touch: those interactions across calls, meetings, and emails also tell a story.
What kinds of leading indicators are important to collect to predict churn? We always think specifically about measuring qualitative and quantitative signals with the champions, trying to understand where they are at, and silence is usually a big indicator.
If you can establish some of the trends and understand them, you can start to build predictions on those leading indicators. Product adoption, for example, is a great indicator. If a customer has stable adoption and suddenly you see a spike, or usage declines for two or three months and then you see a spike, that spike could be an indicator that the customer is exporting data and collecting their aggregate information because they are potentially looking at a competitor.
Those types of leading indicators, any anomalies in usage, are a fantastic way to proactively engage. Then there is sentiment from any angle, whether from emails or meetings the CSM (Customer Success Manager) had, where a competitor's name is mentioned, or product objections where they are struggling to adopt. Sometimes customers explain things in a way that does not make the real issue clear, so being able to flag that and deep-dive into where they are struggling with adoption matters. Or their executive sponsor left and there is no one to replace them right away; how do you keep those valuable conversations going? People leaving the organization is another leading indicator of potential risk.
Support indicators, in terms of escalations or a change in support volume, are another great signal. In some of the analysis we have done for our customers, in one case a high volume of medium-severity support tickets correlated with expansion opportunities. The reason was that those were customers who were adopting and super-engaged, and support was helping them and pointing them in the right direction. You could argue that those support categories should be addressed with a better product or a community, but for that customer, high volume was a good sign. For other customers, we have seen that low engagement correlated significantly with retention, because those customers had a good self-service experience. What is interesting is that it is not one result that applies across the board, or even across your products, so it is super important to be able to dive into those indicators.
Is it possible or useful to measure the rate of sharing within your client company? How much they are interacting with each other and how much they are spurring each other to use the product?
There is the ability to do that. What I have seen work well is when you have collaboration across roles within a single platform, or at least can pull the activities together: for example, who is meeting with the customer, and in what role? If you can capture that information, you can see the impact on the customer over time based on the number of touches by role, and you can start to dig into best practices. For example: if we hold QBRs (quarterly business reviews) more than 90 days before renewal, and the CSM and the account manager together have jointly touched the customer X number of times between the QBR and the renewal, that results in not only renewal but expansion. Those are the types of things you can dig into. It is really just a matter of being able to collect the role and the interactions, which is very possible.
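The collection step described above can be sketched as a simple aggregation: count touches per role inside a window, such as between the QBR and the renewal date. The `(date, role)` record shape is an assumption for illustration; in practice these would come from calls, meetings, and emails logged in a CRM:

```python
from datetime import date

def touches_by_role(touches: list[tuple[date, str]],
                    window_start: date,
                    window_end: date) -> dict[str, int]:
    """Count customer touches per role (e.g. CSM, account manager)
    within a window, such as between the QBR and the renewal date.

    `touches` is a list of (date, role) pairs; the schema is an
    illustrative assumption, not a specific product's data model.
    """
    counts: dict[str, int] = {}
    for when, role in touches:
        if window_start <= when <= window_end:
            counts[role] = counts.get(role, 0) + 1
    return counts
```

With counts like these per account, you can then look for the correlations the transcript describes, such as joint CSM and account-manager touches preceding renewal and expansion.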
How should the emerging phenomenon of CS Ops be defined?
What I have found over time, in my personal experience, is that I leveraged CSMs as my Ops arm before I could make a case and get an actual CS Ops resource. The mistake that is often made, or at least that I made, is choosing to keep hiring revenue-generating or revenue-protecting CSMs (Customer Success Managers) instead of balancing that with an Ops person who would give me the scale to save one or two CSM headcount. If I had balanced that sooner, processes would have been in place faster and systems would have been optimized more quickly. CS Operations is a great title, but one of the things I believe is that it has become a bit of a catch-all. Sometimes they are responsible for all of the internal enablement; they are responsible for all the scaled programs ("here, build this self-serve digital experience"); they are also responsible for all the measurement and tracking of metrics, and then process improvement and systems. It has become a critical and evolving role that needs to be the right hand of the VP of CS or the Chief Customer Officer.