Tuesday, June 18, 2013

40 Stats Shaping the Future of Contact Centres

I came across some interesting research that highlights the important role contact centres play in retaining customers.

The article brings home the ongoing importance of focusing on customer satisfaction. At Telnet we call this "The Care Factor" and we measure it with real-time NPS scores obtained via our IVR at the end of each call. Since introducing NPS about two years ago, independent customer satisfaction surveys undertaken by our clients have all recorded significant improvements.
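For readers unfamiliar with how a Net Promoter Score is derived, here is a minimal sketch using the standard 0-10 NPS convention (this illustrates the general calculation, not Telnet's actual IVR implementation):

```python
def nps(scores):
    """Net Promoter Score from 0-10 post-call survey responses.

    Promoters score 9-10, detractors 0-6; passives (7-8) count only
    in the denominator. The result ranges from -100 to +100.
    """
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores), 1)

# Example: 5 promoters, 3 passives, 2 detractors across 10 calls
print(nps([10, 10, 9, 9, 9, 8, 8, 7, 6, 3]))  # 30.0
```

Capturing the score in the IVR immediately after the call means the rating reflects that specific interaction, which is what makes it useful as a real-time quality signal.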

Happy customers stick around and Telnet's clients all love that!

John Chetwynd


Friday, June 14, 2013

New Zealand Calling

It is good to see growing interest from Australian businesses in New Zealand as a location for their contact centres. View this story from Australia's ABC News.

John Chetwynd


Wednesday, April 10, 2013

VIDEO: 2013 Census "Pop-Up" Contact Centre

The 2013 Census was the first full census for 7 years, and to handle the calls, Telnet turned the Ballroom of the Stamford Plaza Hotel in the middle of Auckland City into a 170-seat temporary “Pop-Up” contact centre.

Handling over 40,000 calls on census day itself, with an average wait time of only 4 seconds, took an innovative approach.

Watch the video to see the full story.

See http://www.telnet.co.nz/casestudies.aspx for more Telnet case studies.


Thursday, March 21, 2013

Creating the Census 2013 "Pop-up" Contact Centre

So how does a New Zealand contact centre make space for over 200 additional staff to take a massive hit of additional phone calls over what was predominantly a 3-day period? Oh, and there is to be no disruption to the existing work handled by the centre…

That was the challenge that Telnet’s IS team was given around 18 months ago, when Telnet won the contract to once again provide contact centre services for the New Zealand census.

There were many things on our to-do list, from making sure there were enough telephone lines and enough server capacity through to network and even internet bandwidth, that made this something other than your normal contact centre project.  The bigger challenge though – at least from a systems and resource perspective – was “where do we put everyone?”

Our home for the 2013 Census
The perfect answer to this conundrum turned out to be somewhat unexpected – the ballroom of the 5-star Stamford Plaza hotel, in the heart of Auckland’s CBD (and only 100 metres or so from Telnet’s head office).  Despite the rather unlikely-sounding collaboration, the massively supportive hotel management and staff played host to hundreds of Telnet contact centre staff over the two main weeks of census, and became part of a success story that not only showcased the resourcefulness and innovation of Telnet’s IS team, but really brought home how important it is that our suppliers see themselves as true partners.

Indeed, without our partners, and their engagement and excitement about this project (and them coming up with quite a few non-standard, outside-the-box ways for us to buy, hire or use their products or services), there is no way we could have pulled this off.  Throughout this post I’ll mention a few of those key providers – not because they are paying us – but as my small way of saying thanks for all their help!

Stamford Plaza provided the space for us to work; their in-house AV team from Spyglass worked their magic providing the not-insignificant supply of power to all the desks, and Vector Communications handled the job of providing fibre services to connect our voice and data networks back to HQ. The next step was to install kit at all the desks: almost 1.5km of network patch leads, NEC IP phones, Power over Ethernet (PoE) network switches, and lots and lots of HP laptops and power supplies!

One or two laptops....
Making the decision to use laptops to provide the computer systems for the centre was an important one that saved a lot of time and energy in set-up – though the initial challenge to the team was where we could get them from.  Typical NZ-based “events” don’t call for this quantity of hardware, so the “events rental” companies couldn’t really help.  Luckily, our friends at Public Technology (who supply a lot of our desktop hardware) came through with a great mix of HP EliteBooks and ProBooks that were more than capable of handling our ContactSuite application.

One of our UPS units along with our "Core" switch
Using laptops and not desktop PCs also, almost by accident, solved another challenge of providing systems support to such a high-profile event – by their very nature laptops are tolerant of power loss, so this saved us from having to provide battery-backed power for the desks.  IT Power provided us with UPS units to support our networking (including powering the Vector kit at the hotel), meaning we could continue to operate fully in the event of a short-term (an hour or so) power failure.
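If you are wondering how a runtime figure like "an hour or so" gets estimated, it boils down to usable battery capacity divided by connected load. Here is a rough sketch; the capacity, load and derating figures below are illustrative, not the actual specs of the kit we used:

```python
def ups_runtime_minutes(battery_wh, load_w, inverter_eff=0.9, derate=0.8):
    """Rough UPS runtime estimate.

    battery_wh:   nameplate battery capacity in watt-hours
    load_w:       connected load in watts
    inverter_eff: DC-to-AC conversion efficiency (assumed 90%)
    derate:       usable fraction of nameplate capacity (assumed 80%)
    """
    usable_wh = battery_wh * derate * inverter_eff
    return 60 * usable_wh / load_w

# Illustrative: 2,400 Wh of battery feeding a 1,000 W network load
print(round(ups_runtime_minutes(2400, 1000)))  # 104 minutes
```

In practice you would also leave headroom for battery ageing, which is one reason to size well above the runtime you actually need.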

"Snapshot" stats display
Once we had set up the desks, the final pieces of the puzzle before adding the agents were plugging in all the headsets (provided by Cackle), firing up ContactSuite (developed by our own incredible development team – thanks guys!! – powered by infrastructure from Dimension Data), and delivering some calls (which of course relied on the very excellent Zeacom Communications Center product, with telco cloud-based routing management through Gen-i's Tollfree Self Manage product).

Thanks to some pre-work, we had the contact centre live and taking calls within a couple of hours of getting access to the room.  Over the rest of that day, and the next two days, we fully set up the rest of the centre, moving staff to each newly configured set of workstations to give them some real-world use before the big census day.  We set up a projector and stats display (using the Zeacom Snapshot product) so the whole centre could see the calls we were receiving – but more importantly, it kept our contact centre supervisors “heads up” and walking the floor, rather than tied to their desks watching a screen.

The Pop-Up Centre in full flow
Census day itself was pretty much the monster we expected – we took over 40,000 calls and 6,000 emails – but we had no significant technical issues with our new centre, network utilisation (on the 100Mbit fibre links we spent a lot of time worrying over!) peaked at only around 20-30%, and perhaps best of all, our staff (most of them hired en masse through WINZ – see John Chetwynd’s blog post) answered all these calls with an average wait time of only 4 seconds.
"It's hardly even tickling the fibre link!"
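For anyone curious how you work out the staffing needed to hit wait times like that, the classic tool is the Erlang C queueing formula. Here is a minimal sketch; the call volume, handle time and target figures below are illustrative examples, not the actual Census traffic model:

```python
import math

def erlang_c(traffic, agents):
    """Probability an arriving call has to wait (Erlang C).

    traffic is the offered load in erlangs; agents is the number
    of staff taking calls.
    """
    if agents <= traffic:
        return 1.0  # unstable queue: effectively every call waits
    # Erlang B via its stable recurrence, then convert to Erlang C
    b = 1.0
    for n in range(1, agents + 1):
        b = traffic * b / (n + traffic * b)
    return b / (1 - (traffic / agents) * (1 - b))

def agents_for(calls_per_hour, aht_sec, asa_target_sec):
    """Smallest agent count whose average speed of answer meets target."""
    traffic = calls_per_hour * aht_sec / 3600.0  # offered load in erlangs
    agents = math.ceil(traffic) + 1
    while True:
        wait_prob = erlang_c(traffic, agents)
        asa = wait_prob * aht_sec / (agents - traffic)  # mean queue wait
        if asa <= asa_target_sec:
            return agents
        agents += 1

# Illustrative peak hour: 4,000 calls, 180 s handle time, 5 s ASA target
print(agents_for(4000, 180, 5))
```

A one-day spike like census day is the extreme case: the offered load is enormous for a few hours and near zero the rest of the year, which is exactly why a pop-up centre with temporary staff makes sense.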

By the end of the “day-after-census” we were starting deconstruction.  A day and a half, and a lot of packing, cable-coiling, de-gaffa-taping and stacking later, we’d turned our contact centre back into three pallets of kit to go back to storage or to our various suppliers.  It was an awesome ride, and we’re proud to have been the supplier of services to a project as important as the NZ Census.

If you’d like to hear more about our story, or maybe even have a need for something like this yourself, then drop us a line. In the meantime though, here are some of the people we'd like to thank, and a selection of photos from the pop-up Census 2013 contact centre at the Stamford Plaza.

Steve Hennerley,
GM Information Systems

A big thanks to all our partners!


Thursday, March 14, 2013

Census Helpline: Managing peak call volumes

When you accept the contract to manage the help line for a country's Census you do so with a certain amount of trepidation. The experience is pretty unique. Typically a successful contact centre evolves over time through continuous process improvement both in terms of resourcing to meet expected volumes and delivering a quality experience.

With a project like Census the rules all change. The centre needs to scale from zero to major volumes in a matter of days, and the staff need to hit the ground running, answering a wide range of queries confidently and efficiently.  This is what the team at Telnet has achieved in the last week with the New Zealand Census. In the attached press release we talk about how we answered 75,000 calls over 3 days with an average wait time of 3 seconds. We complemented our existing staff with over 200 temporary staff sourced by our sister company CallCentre People Limited, who worked closely with Work and Income New Zealand (WINZ). Also, knowledge management systems, designed for ease of use, enabled staff to quickly identify and resolve customer queries.

I am pleased to say that with a lot of planning and very hard work by our loyal Telnet team, we pulled it off. Programmes like this test you, and the learnings provide a platform for innovation and further growth.

John Chetwynd

Press release


Monday, October 1, 2012

Are you over-complicating your customer service?

I was reading this article by Schumpeter in last week's Economist and I realised how closely it aligns with our thinking at Telnet.

We are finding that the more we focus on the basics of customer service, the better our customer satisfaction scores become. We agree with Schumpeter when he argues that you need to make it easy for customers to contact you if they really need to. And of course, when they do, make sure their query is understood and resolved as quickly and efficiently as possible.

It isn't rocket science really, but it does take a commitment to getting the basics right.

John Chetwynd
Managing Director


Friday, September 14, 2012

Testing the theory - DR Testing Takeaways

Everyone says you need to do it – hey, I even said you should do it – but how many of us really do test our disaster recovery processes? I mean from start to finish, warts and all.  We've been testing our plans for years, but somehow never really got to putting everything together into a single full-scale "get out there and do it" kind of test.  We've been able to extrapolate and make assumptions, and generally be pretty happy about our capabilities should the world of BAU come to an end, but could never say for sure that, in the end, it would all hang together.

We decided, particularly on the back of some fairly big updates this year, that we really needed to do a full-on, no-holds-barred test by taking our primary centre fully offline and then seeing how it all went. Here are some of the things we learned:

Be Prepared?
Yes, it's a question and not a statement – you really want to make a decision on just how "prepared" you want to be.  When we talked to our partners, we found everyone wanted to "plan" this test, and that's something you need to be a little careful of.  Having everyone and everything in the best places possible to achieve a successful outcome might well be what you are used to doing, but in this case you risk lulling yourself into a false sense of security: in a real disaster, all the prep time you have has already gone.

What you DO need to be prepared for, however, and prepared as well as possible, is a clean and rapid rollback.  If something goes disastrously wrong with the test, or if circumstances mean that safety or the business is put at risk, you really need to make sure that however you simulate the disaster, you can "unsimulate" it as fast as possible.
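One way to make a rollback both fast and trustworthy is to script the checks that tell you it worked. Here is a minimal sketch of a post-rollback connectivity checklist; the service names, hostnames and ports are invented for illustration, not our actual environment:

```python
import socket

# Hypothetical post-rollback checklist: service name -> (host, port).
# Every entry here is an invented example; substitute your own systems.
CHECKS = {
    "telephony": ("pbx.example.internal", 5060),
    "contact-suite": ("apps.example.internal", 443),
    "database": ("db.example.internal", 1433),
}

def is_up(host, port, timeout=3.0):
    """True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def run_checks(checks):
    """Return the names of services that failed their connectivity check."""
    return [name for name, (host, port) in checks.items()
            if not is_up(host, port)]
```

Calling `run_checks(CHECKS)` after a rollback gives you an empty list when everything is reachable again, or the names of the services that still need attention – much quicker than working through a paper checklist at 1am.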

Accept Failure
Whatever you think going into it, there are going to be things that don't work as you thought they should. To be honest, I'd be more worried if everything DID work as it was supposed to – if so, you probably missed something!  If your test is realistic (and not planned primarily to highlight the best bits of your DR plan!) then there should always be things you can learn, even if it's just an opportunity to speed something up.

When writing the test plan, it helps to have someone who didn't design the recovery procedure recommend the scenario. If you can resist it, try not to overthink the situation.

Record everything
Have someone who is not involved in the recovery act as referee. They will be able to avoid the hustle and bustle of trying to make things work, and will actually have the time to write things down as the test progresses. The referee is also a great pair of eyes for other opportunities to improve processes that might be missed by those in the middle of it all.

So.. How did it go?
I guess you are all wondering how it went for us then?  I suppose it would be unfair for me to preach the things above and not tell our story - so here goes...

We planned our DR event for late at night, when we only have a few staff around and the impact to customers would be minimal. Maybe not as big and scary as the middle of the day, but the systems and processes are identical, so it's still a valid test.

We simulated a complete loss of our contact centre and data systems, and at 11:18pm we pulled the plug (quite literally in some cases) on our internet, phone lines and external WAN connections.  Simultaneously we killed the lights and the staff had to get themselves out and into cabs to our DR site (diverting critical lines to mobiles as they went).

Once at the DR site, the fun began....

Overall we had a successful test – it was a great validation of the work we'd done over the last year – but the real value came in the things that maybe didn't go 100% to plan.  It took longer than expected for some of our recovery servers to come up (something only a realistic test would show), and we've since reorganised the startup process.  One of our backup telephony servers also decided not to fail over cleanly, even though the previous 6 tests were flawless. We'd never seen the issue before, but now we know about it we're better placed for next time (test or real event).

One of the most surprising things, though, was how engaged the staff who participated in the exercise were. Even though they were "off the clock" by late in the evening (morning!), many staff were keen to stay on and help out even when they weren't specifically needed anymore. (We had a second crew back at HQ who took over once the testing was complete and we'd "rolled back" – this saved additional delays while we got staff back to base.)

If you want to hear more about our test (particularly if you are a client of ours), drop me a line, I'll be happy to tell you in more gory detail!  But I'd like to leave you with one final, and most important learning from this exercise.  I've said it before, and I'll surely say it again...

Test it.  Test it again
No amount of talking about it, looking at diagrams, or testing parts of your DR process is anywhere near as valuable as taking the risk to do a full-scale test.  If you've never done it (or only done it part way), make a resolution to yourself to prove it.  What's the worst that can happen?  If you keep your primary site/systems ready to go, not a lot – but you sure will learn where you need to focus your efforts. Once that's done, start thinking about doing it all over again... best of luck!!

Steve Hennerley
GM IS, Telnet
