Off Center

Contact Centerfold: DEALERTRACK TECHNOLOGIES

8/13/2013


 
There are companies that talk about being customer-focused, and then there are companies like Dealertrack Technologies that back such talk up with real action.

A few years ago, Dealertrack – a leading provider of web-based software solutions for the automotive industry – implemented a ‘Voice of the Customer’ (VoC) initiative featuring a comprehensive and dynamic customer satisfaction (C-Sat) survey process. The initiative has enabled the company to continuously drive performance improvement, elevate the customer experience and enhance the bottom line.

I recently caught up with Dealertrack’s Senior Manager of Technical Support, Dayna Giles, who was gracious enough to answer my barrage of questions about her center’s VoC and C-Sat success with much eloquence and insight.


When did you implement your current Customer Satisfaction survey process, and what was the main objective for doing so?

The Dealertrack Customer Satisfaction survey process has been in place since early 2009 and was rolled out across the different solution groups and teams through October 2010. The main objective is to understand, from our clients’ perspective, what we are doing well and what we can improve on, as well as whether they would be willing to recommend our solution in the marketplace.


How soon after an interaction with an agent is the customer surveyed? How many questions does the survey feature, and what is the nature of those questions?

The survey is emailed to the client immediately after the case is resolved.

We have a total of six questions on our survey. Most of those questions are specific to the agent and the interaction (empathy, follow-up, understanding and satisfaction with the technical resolution), with the remaining question being whether the client would recommend our support team. There is additional space for clients to provide comments or feedback to help improve our product, our service, or future interactions.


Do you survey only callers, or also customers who interact with Dealertrack via email, IVR and web self-service?

Our surveys are tied to the client’s email address, so we survey any form of client interaction captured in our case-tracking system.


Who evaluates the survey data/feedback, and how often?

We have an internal team dedicated to the VoC process, and we hold monthly debrief meetings in which key leadership team members discuss all VoC metrics and initiatives to improve results.


Do you have a “customer recovery” process in place for customers who indicate notable dissatisfaction following an interaction? How soon after such customers complete a survey does your center contact them, and how do customers typically respond?

Our supervisors and managers call our clients back on all the dissatisfaction alerts or client requests we receive. Once such a client responds to a survey, they are contacted within one business day. Clients typically respond positively to being contacted by a supervisor or manager on a dissatisfaction survey.


Do you incorporate customers’ ratings and direct feedback into agents’ Quality scores and coaching?

Yes, we incorporate customer ratings and feedback into team member quality scores and coaching in a couple of ways. Each team member has a scorecard – the Team Member Performance Index (TMPI) – and a Service Experience Index (SEI) that combines the Quality Performance Assessment (QPA) score with the Transactional Net Promoter Score (TNPS) to give the agent an overall grade and ranking for the month. During monthly review sessions, team members receive feedback on all of these measures.
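For the technically inclined, here’s a minimal sketch of how a composite monthly index along these lines might be computed. The 60/40 weighting, the 0–100 scales and the TNPS normalization are illustrative assumptions; Dealertrack didn’t share its actual formula.

```python
# Minimal sketch of a composite monthly agent index combining a quality
# score with a transactional NPS. The weighting and normalization are
# illustrative assumptions, not Dealertrack's actual formula.

def composite_index(qpa_score: float, tnps: float, qpa_weight: float = 0.6) -> float:
    """qpa_score: 0-100 quality score; tnps: -100 to +100 net promoter score."""
    tnps_normalized = (tnps + 100) / 2  # map -100..+100 onto 0..100
    return qpa_weight * qpa_score + (1 - qpa_weight) * tnps_normalized

print(round(composite_index(qpa_score=92.0, tnps=75.0), 1))  # -> 90.2
```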


How do agents feel about having the Voice of the Customer integrated with your Quality monitoring process?

When we initially rolled out this program, team members were not confident that they would be able to influence client satisfaction. Team members believed challenges with a product or other issues that were outside their control would overshadow the service they could provide. We very quickly learned this was not the case – how a team member delivers the message and manages the interaction is often the determining factor in whether a client is satisfied or not.


What other kinds of actions do you take on the customer data and feedback you receive?

We often use client feedback to improve our internal processes. For example, since supervisors or managers make the callback to our clients, they receive direct feedback they may not otherwise hear. They bring that feedback to daily meetings where we are able to discuss where we are as a team and look to make improvements. It could be a lack of training on the team member’s part, and in discussing this feedback we may find that similar training is needed across the team. We then work with our training team to provide this specific training to improve the team member’s knowledge and confidence.


I hear your center has seen vast improvements to its Net Promoter Score. Care to elaborate? To what do you attribute such an increase?

Over the course of 29 months we saw a great increase in our Transactional Net Promoter Score – from 5% in February 2011 to 75% in June 2013, a 70-percentage-point improvement! The biggest jump occurred between February 2011 and March 2011, when we saw a 15-point improvement (from 5% to 20%). The second biggest occurred between July 2012 and August 2012, when we saw a 14-point improvement (from 46% to 60%).
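For context, a Net Promoter Score is the percentage of promoters (clients rating 9 or 10 on the “would you recommend us?” question) minus the percentage of detractors (those rating 0 through 6). Here’s a minimal sketch of the calculation; the sample ratings are hypothetical, not Dealertrack data.

```python
# Minimal sketch of an NPS calculation from 0-10 "would you recommend
# us?" ratings: promoters score 9-10, detractors 0-6, and
# NPS = %promoters - %detractors.

def net_promoter_score(ratings: list[int]) -> float:
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

ratings = [10, 9, 9, 8, 7, 10, 6, 9, 10, 9]  # hypothetical survey month
print(net_promoter_score(ratings))           # 7 promoters, 1 detractor -> 60.0
```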

We attribute such an improvement to team member focus on VoC. We ran a number of competitions to improve team member awareness that each client interaction could result in a customer survey. It became part of our daily language and part of our culture.


High customer satisfaction doesn’t happen without high agent satisfaction. What kinds of things does your center do to keep agents happy and engaged? 

Rewards & recognition
We have a couple of major awards that we give out on a monthly and quarterly basis, including Service Star of the Month, which is based on Transactional NPS scores and the number of positive customer comments the agent receives via surveys. We also have our quarterly Star Quarterback award, which is based on peer nominations regarding a team member’s demonstration of Dealertrack’s Vision, Mission and Values, as well as internal and external client feedback and overall performance.

In addition, Customer Service Week is one of our favorite weeks here. We do a number of fun, free activities – bingo, funky sock day, favorite sports team day – and some pretty cost-effective ones. Cotton candy machines run around $30 to rent, and the sugar is roughly $8. Minimal cost and effort, but maximum results! The thing our team looks forward to most each year is the breakfast we make – bacon, eggs, pancakes, hash browns, fruit, OJ… the works! The leadership team cooks the breakfast and serves our team members. For a couple hundred dollars we can feed over 200 people and physically serve and thank them for all they do.

Empowerment 
We run multiple focus groups concurrently in which our team members are assigned a topic and given the opportunity to provide feedback and suggest any improvements they see we could make. For this to be successful, our team members have to feel we are genuinely giving them that opportunity – as leaders, we don’t always have the answers. It’s great to get ideas flowing from the team and create a groundswell. The company and leadership recognize that support team members ARE the advocates for our clients and the client experience with our products and service.

Also, our Level 2 agents are encouraged and empowered to train our Level 1 agents. Each L1 agent has an aggressive goal of completing 120 hours of training per year, and L2s are encouraged to deliver many of those hours.

Advancement opportunities
Team members are often selected from the Technical Support department to move up to various roles in the company – from Quality Assurance to Installation to Product Management. We develop and encourage future growth for our team members. Many of our support teams have higher internal turnover (promotion/transfer) than external, which is rare in the contact center industry.

Work-at-home opportunities
We currently have a number of remote employees on our team. We like to give team members, based on their role, the opportunity to work from home.

Stress reduction tactics
When we have a system incident or outage, we often get the team lunch. Or if it’s a Friday, or if it’s hot, or if we simply feel like it, we’ll get ice cream or treats. It doesn’t have to be a great expense to the company to make someone smile.


Dealertrack Technologies – The Big Picture
Contact center locations: Dallas, Texas; South Jordan, Utah; Groton, Conn.
Hours of operation: Mon-Fri 6am-6pm MT; Sat 7am-4pm MT; Sun on-call support.
Number of agents employed: 150+
Products/services supported/provided: Software for the automotive industry.
Channels handled: Phone, IVR, email, web self-service.
What’s so great about them: The ‘Voice of the Customer’ initiative they implemented in 2009 has led to huge increases in customer satisfaction and loyalty, not to mention a highly engaged frontline.



Contact Centerfold: PHILADELPHIA INSURANCE COMPANIES

11/26/2012


 
After embarking on a robust voice-of-the-customer project, Philadelphia Insurance Companies is more agile in identifying and addressing problems, leading to higher customer satisfaction.

(NOTE: This article was originally published in 1to1Magazine in August by 1to1Media, who has granted Off Center permission to use it here. The article, originally titled “Philadelphia Insurance Companies Listens to Its Customers”, was written by 1to1Media’s Senior Writer Cynthia Clark.)

It's not always simple for businesses to sift through the mountains of information that customers are providing, understand what clients are saying, and then build strategies that address problems. Philadelphia Insurance Companies wanted to find an efficient solution to get actionable data based on what its customers were saying. Although the insurance carrier was collecting feedback through several channels – including annual, transaction, and web surveys, as well as mystery shopping – the organization wanted to find a one-stop-shop solution to analyze its customers' feedback.

According to Seth Hall, the company's vice president of operations, customers were filling out surveys, but the data being collected failed to effectively measure performance or identify areas for improvement. "The results were objectively captured," says Hall, "but there wasn't a lot of information as to how customers perceived the company, whether they were pleased with the level of service, and whether they would recommend us to someone."

Additionally, while the company was receiving thousands of calls every month and hundreds of emails daily, these multiple sources of customer feedback made it difficult to develop a holistic, enterprise-wide analysis of what customers were saying. Meanwhile, Philadelphia Insurance Companies was putting a lot of emphasis on internal scorecards to determine whether processes were meeting established standards – e.g., the length of time it took the company to respond to a claim. Since the company was doing well and scorecard results were good, it was challenging to decide what changes needed to be made.



Making the Most of Client Feedback

Soon after joining the company in 2009, Hall noticed that the performance outlined by the scorecards was not always reflected in the organization's actual results. He quickly realized that Philadelphia Insurance Companies was missing out on important feedback because it lacked a reliable VOC program that could provide data on which to base decisions and measure the success of new projects. At the end of 2010, the organization implemented MarketTools' customer satisfaction program, allowing the insurance firm to capture key data from customer feedback, aggregated and visible in real time. This lets the company spot trends quickly and be agile in taking the necessary action. Hall says low scores trigger automatic emails to the responsible person, who can then contact the customer, get more information, and resolve the issue.
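To make that trigger concrete, here's a minimal sketch of such a low-score alert rule. The 0-10 scale, the threshold and the send_email helper are illustrative assumptions, not MarketTools' actual implementation.

```python
# Minimal sketch of a low-score alert: a survey response at or below a
# threshold emails the person responsible for the account. The scale,
# threshold and send_email stand-in are illustrative assumptions.

LOW_SCORE_THRESHOLD = 6  # assumed 0-10 satisfaction scale

def send_email(to: str, subject: str, body: str) -> None:
    # Stand-in for a real mail integration (SMTP, a mail API, etc.).
    print(f"To: {to}\nSubject: {subject}\n\n{body}")

def handle_survey_response(score: int, customer: str, owner_email: str) -> None:
    if score <= LOW_SCORE_THRESHOLD:
        send_email(
            to=owner_email,
            subject=f"Low survey score ({score}/10) from {customer}",
            body="Please contact the customer, gather details, and resolve the issue.",
        )

handle_survey_response(score=3, customer="Acme Brokerage", owner_email="owner@example.com")
```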

The new information allows Philadelphia Insurance Companies to know exactly how it's doing in the eyes of its customers rather than depend solely on the internal scorecards. "We match our internal scorecards with external feedback," says Hall. This new strategy has led to the discovery of problems that might otherwise have cost the organization money. Scorecards, for example, indicated the need to add more customer reps because the average speed of answer had slipped. But upon reviewing its VOC data, the company realized that first-call resolution was a much more important indicator of customer satisfaction. "This data was instrumental in us changing the metrics/goals and thus not hiring the additional staff we thought we needed," Hall explains.

Another incident that reiterated the importance of having a VOC program took place in October 2011, when the company started administering a collector car insurance product. To cut costs, the company had revised its policy of sending auto ID cards with the renewal notice. According to Hall, the VOC feedback was astounding, with about 10 percent of the 130,000 policyholders criticizing the decision. This real-time feedback allowed the organization to quickly reverse course and, just seven weeks after launching the new policy, resume sending out auto ID cards. "Without a VOC program, it would have taken us much longer to react, losing customers and upsetting people," says Hall.

VOC feedback has also indicated that some customers feel it can be difficult to get information from the company and contact the right person. "To remedy this we are now putting together targeted and transactional surveys to find out what exactly we can be doing differently, and obviously better," Hall says.

Although Philadelphia Insurance Companies is retaining its internal scorecards, it no longer needs to base all its decisions on those results. In fact, the scorecards indicated no major billing issues, while the VOC program found more negative comments about billing than about anything else. This was the information the company needed to start working on a new billing system, which is expected to launch next year.

The impact of listening to customers has been noticeable, and Philadelphia Insurance Companies has seen an increase in NPS scores – from the mid to upper 40s before implementing the VOC program to 51 at the end of last year.



A Client-Focused C-Suite

In a bid to solidify customers' trust in the company and improve the one-on-one relationship, C-suite executives have been at the forefront of reaching out to clients who leave less-than-stellar feedback. According to Hall, customers are surprised to receive a call from the company's senior leadership.

Such a policy makes it clear that the company is not afraid to apologize for its mistakes, says Hall. "Although we do a lot of things well, we aren't perfect, but we're going to call you, apologize, and fix the problem. And we'll do it very quickly."

(Reprinted with permission from 1to1Media.)




DELOITTE CallCenter

3/7/2011


 
Every time I talk to long-time colleague Gerry Barber – a man whose sparkling contact center management and consulting career spans three decades – I learn something new about customer care and see things from a fresh perspective.

Gerry currently directs Deloitte’s CallCenter in Nashville, Tenn., where he has used his years of expertise and insight to drive continuous improvement and help set Deloitte apart from its competitors – and most other contact center organizations, for that matter.

I ran into Gerry at a conference in New Orleans last summer, where he told me his center was – not surprisingly – doing some cool and innovative things. I told him I wanted details, and that I would likely blab about them in writing at some point. Always eager to help the collective contact center industry raise the bar and rethink norms, Gerry happily agreed to speak to me on the record.

Here’s everything I asked, and – more importantly – how he responded.  

GL: Deloitte states that it aims to sustain a culture of “distinctive service”. Is this just a buzz phrase, or does it truly have teeth? What’s Deloitte CallCenter’s definition of “distinctive service”, and how is it measured and promoted within the contact center and to your internal customer base?


GB: Deloitte’s focus on distinctive service is evident and ingrained in our CallCenter vision statement, training materials and in the metrics we use to run our centers. In our CallCenter, the definition of distinctive service actually starts with the internal customers we serve. Our customers help us define distinctive service through the positive comments they provide in our customer satisfaction survey responses. With this said, to us distinctive service is the art of delivering knowledge the customer can use, providing solutions that give peace of mind, and delivering service that is beyond expectations.

GL: I am intrigued by your center’s “Wall of Distinction” that you mentioned. Please explain what the Wall is exactly, why you created it, and how your analysts get on it.

GB: The Wall of Distinction was created to recognize consistency in delivering distinctive service. The criteria for securing a place on the wall are high customer satisfaction survey scores combined with a high number of specific customer compliments. An analyst remains on the wall for six months, based on their performance over the preceding six months. We update the wall twice yearly: in October (during Customer Service Week) and again in April.

GL: Do your analysts receive other related awards/recognition when they attain “Wall” status?

GB: We celebrate our analysts’ success in delivering distinctive service in several ways. Daily, we circulate an email to all CallCenter personnel listing all of the positive customer compliments we received the previous day. In the email, we include the name of the analyst along with the customer’s comment identifying how the analyst delivered distinctive service. On a monthly basis, we again review all of our customer compliments, and I send a congratulatory note to the analysts receiving the highest number of compliments during the month; some analysts also receive monetary awards.

Our quality recognition programs have evolved organically over time. Along the way, analysts’ comments have enhanced them, shaping what we do today.

GL: Your center has a somewhat unique quality scoring model in place. Please briefly describe the model and explain the reasons behind it.


GB: We wanted a monitoring program that encourages continuous improvement. For our needs, none of the standard approaches seemed to drive performance – not the 100-point scale, nor the “check the boxes” approach, nor averaging scores. In our model, we’ve removed the idea of numeric scores, and we’ve resisted averaging scores across the whole call. Rather, we look for a set of “quality targets” across four distinct contact quality attributes: 1) Call Ownership; 2) Communications and Courtesy; 3) Documentation; and 4) Resolution Effectiveness. Then, based on what we find within the context of the call, we determine whether, in each of the four quality attribute areas, the analyst “missed the targets”, was “approaching the targets”, “met the targets”, or delivered distinctive service (“exceeded the targets”).

Our quality team then looks at the percentage of instances, over time, that the analyst was either on target or surpassed the target. It is also important to understand and document where the analyst missed targets and where they can make improvements. Our goal for our analysts is to have 80% or more of their monitored calls “on target or above”, with no more than 5% of calls at “missed targets”. This is how we gauge individual analyst improvement.
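To illustrate the tally Gerry describes, here’s a minimal sketch of how an analyst’s monitored-call ratings might be rolled up against the 80%/5% goals. The data layout is an illustrative assumption, not Deloitte’s actual tooling.

```python
# Minimal sketch of rolling up target-based ratings ("missed",
# "approaching", "met", "exceeded") across the four attributes and
# checking the 80% on-target / 5% missed goals described above.

from collections import Counter

ATTRIBUTES = ["call_ownership", "communications_courtesy",
              "documentation", "resolution_effectiveness"]

def analyst_summary(calls: list[dict[str, str]]) -> dict:
    """calls: one dict per monitored call, mapping attribute -> rating."""
    counts = Counter(call[attr] for call in calls for attr in ATTRIBUTES)
    total = sum(counts.values())
    on_target = (counts["met"] + counts["exceeded"]) / total
    missed = counts["missed"] / total
    return {"on_target_pct": 100 * on_target,
            "missed_pct": 100 * missed,
            "meets_goal": on_target >= 0.80 and missed <= 0.05}

calls = [
    {"call_ownership": "met", "communications_courtesy": "exceeded",
     "documentation": "met", "resolution_effectiveness": "met"},
    {"call_ownership": "approaching", "communications_courtesy": "met",
     "documentation": "met", "resolution_effectiveness": "exceeded"},
]
print(analyst_summary(calls))  # 87.5% on target, 0% missed -> meets the goal
```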

Our analysts are coached daily but work with their team leader/coach twice monthly to review quality monitoring results. Most importantly, our analysts are given time off the phones to review their monitored calls prior to their coaching sessions.

Our analysts have bought into the process in a big way. They seem to have better insight as to what it takes to deliver distinctive service. We’ve also seen a movement on the part of analysts to do significant self-coaching.

GL: You’ve mentioned that your center has been exploring “unique ways to mix up random monitoring”. Could you please elaborate?


GB: Our quality team monitors a minimum of five random calls per month. One example I can share is that we asked each analyst to select one recent call that they felt surpassed our stated service targets. Our quality team then included this call as one of the five monthly quality monitoring calls. It was interesting to review these calls to see if the analyst had a solid understanding of what it takes to “exceed targets”. The calls indicated which analysts required more coaching and helped us better define what distinctive service looks like. 
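As a rough sketch, that monthly sample could be assembled like this – one analyst-selected call plus randomly drawn calls to fill the five-call quota. The call IDs here are hypothetical.

```python
# Minimal sketch of assembling the five-call monthly monitoring sample:
# the analyst's own pick plus randomly drawn calls. Call IDs are hypothetical.

import random

def monthly_sample(all_call_ids: list[str], analyst_pick: str, quota: int = 5) -> list[str]:
    pool = [c for c in all_call_ids if c != analyst_pick]
    return [analyst_pick] + random.sample(pool, quota - 1)

calls = [f"call-{n}" for n in range(1, 101)]
print(monthly_sample(calls, analyst_pick="call-42"))
```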

GL: What impact has your “distinctive service” had on such things as customer satisfaction, first-call resolution, analyst engagement/retention, and operations costs?

GB: On a seven-point customer survey scale, we have moved the bar on customer satisfaction consistently over the past three years. First-call resolution has also improved significantly year over year.

Regarding agent engagement and retention, our analyst engagement scores are among the highest within the Deloitte US Firms.

And best of all, over the past three years, while service delivery has greatly improved, our cost of delivering support service has significantly decreased. Better service at a lower cost has been the result.

GL: Is there anything else you would like to add, Gerry?

GB: Yes. One secret ingredient to our success is the collaboration and inclusion produced by our continuous quality program.  In addition to the analysts’ role I mentioned earlier, our team leaders are measured on their team’s success. They are also measured on the amount of time they spend coaching for continuous improvement, which should be about 60% of their work day. And as mentioned, our analysts participate in the process by reviewing the same quality sampling as our service quality team and team leaders do.

Deloitte CallCenter – the Big Picture:
Locations: Three centers – Tennessee, India, and California
Hours of operation: 24 x 7
Number of agents: Approximately 170
Products/services provided: A number of support services, including IT support, business application support and HR benefits support, to name a few.
Channels handled: Phone, email, self-service
What’s so great about them? They sustain a culture of “distinctive service” – highlighted by a unique and comprehensive quality monitoring initiative as well as an employee rewards/recognition program that centers almost entirely around customer satisfaction ratings and feedback.



Contact Centerfold

Here's where I feature “sexy” contact centers – customer care organizations that are doing exciting things and aren’t afraid to reveal some “hot” secrets of their success.

C’mon – you know you want to look.
