
Dan Huddart heads Web Analytics at RSA Group, a global insurance company. He leads a group of analysts that works as an internal agency across all RSA brands, handling analytics and optimisation of digital properties in 33 countries.



Below he talks about testing, personalisation, a global web dashboard and how showing images of the right breed of dog can significantly improve pet insurance conversion rates.

How important is online business for RSA in terms of percentage of sales made online (both direct and through brokers)?

The online channel has a dominant role already in some of our markets, and a rapidly growing role in others. In the UK a significant volume of insurance is purchased online or with an online assist, and in other regions where that isn’t yet the case, online is playing a critical role in marketing and in the research phase for consumers.

To put it into context, we now transact well over £1bn each year through online channels and that’s up significantly from just a few years ago.


Can you describe the role of your team within RSA, and how you act as an internal agency across the business?

We act as a centralised analytics team, sitting inside a global e-business hub which also contains teams of designers, UX practitioners, front-end and back-end developers and a project management function. Within the analytics team I currently have three web analysts and one AB/personalisation specialist.

The role we play varies, ranging from acting as a full service agency to light touch work and consultancy.

Some of RSA’s global brands have just one or two people as a web team, and for these partners we’d typically offer a full service, including requirements gathering, technical implementation, report generation and insight workshops to help move the needle on whatever their growth metrics are locally. 

At the other end of the scale are RSA’s more established online brands, which already have a strong web team and dedicated analysts. Our role here is more strategic; we renegotiate contracts at group level and provide short-term support for big projects, such as full site redesigns or site improvements.

The economies of scale we get in this arrangement are huge, both in reducing cost from suppliers and in our ability to reuse code and knowledge across the business.


What do you think are the pros and cons of this approach (versus having an external agency or consultancy)?


The biggest pros are around quality and channel knowledge. We build and maintain insurance sites for up to 33 countries, so there’s a raft of specialist channel knowledge on hand and we have a good idea of what works and doesn’t work for each type of market. 

There are some cost savings, particularly when you consider that each brand may bring in its own local agency otherwise. What’s more important is that we have RSA’s best interests at heart – we’re not targeted to make a profit from our services, only to drive up the results of our partners.

The main con, I think, is the risk of the teams becoming too specialised and dropping out of step with the wider digital community. Insurance isn’t pushing digital boundaries as quickly as the retail and telecoms sectors are, and we must ensure we continue to bring elements of those leading industries into our own. I think we solve this through personal development and staff engagement – everyone is encouraged to participate in industry-wide events, and experimentation with new technologies and methodologies is actively supported.


Are there any good examples of insights you have gleaned from one RSA brand or market, which have led to successful changes in other markets or businesses?

A good recent example is personalisation. Our Polish brand www.link4.pl wanted to improve the cross-sell between car and home insurance, so we experimented with Adobe Test & Target to personalise the homepage. This let us show existing customers targeted cross-sell messages rather than the same static content everyone else saw.

I had a similar challenge in the UK but, in this case, our aim was to encourage existing customers to try making changes to their policies online. It was a different brief to the Polish one, but the underlying technology is identical, so we simply lifted and shifted everything we’d learnt from one site to the other. It really boosted some of our key metrics, with a 6% increase in existing customers logging into their accounts, without impeding any new customer sales.
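To make the mechanics concrete, a homepage split like this comes down to classifying the visitor and serving a matching variant. The sketch below is purely illustrative – the cookie keys, segment names and content IDs are invented, and in practice rules like these are configured as Test & Target campaigns rather than written in site code:

```python
# Illustrative decision logic for an audience-split homepage. The cookie
# keys, segment names and content IDs are invented; the real rules live
# in Adobe Test & Target campaign configuration.

def classify_visitor(cookies: dict) -> str:
    """Bucket a visitor into one of three homepage audiences."""
    if cookies.get("customer_id"):      # recognised existing customer
        return "existing_customer"
    if cookies.get("quote_started"):    # began a quote: a warm prospect
        return "prospect"
    return "default"

HOMEPAGE_VARIANTS = {
    "existing_customer": "home_cross_sell",    # cross-sell / self-serve content
    "prospect": "home_prospect_offer",         # nudge towards finishing a quote
    "default": "home_static",                  # generic content for unknowns
}

def choose_homepage(cookies: dict) -> str:
    return HOMEPAGE_VARIANTS[classify_visitor(cookies)]
```

Because the classification logic is brand-agnostic, the same split can be lifted and shifted between sites, with only the content IDs changing per market.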

We discussed those two results on one of the global knowledge sharing calls and then instantly had two more partners requesting it, one in the UK and one in Scandinavia. They both have different metrics they’re trying to hit, but we’re reusing most of the code again and starting to see the same results on some huge sales channels.

Do you use a single web analytics vendor across the business?

We have a list of preferred tools, which is essentially a list of tools we have demonstrated in at least two of our brands already and have the skills in-house to implement and analyse with. Occasionally our partners will opt to use a local supplier but this is quite unusual now, given the buying power and depth of expertise we have as a central function.

[Screenshots: the default homepage, the version shown to prospects, and the version shown to existing customers.]



How do you ensure that your time is spent on driving actionable insights and recommendations for key stakeholders within the business, rather than just web reporting?


Generally if a report can’t be automated, I don’t agree to deliver it, and we can usually offer to do something more valuable with that time such as analysis reviews and insight workshops. Most of the tools we use have great automation capabilities but to be honest it’s quite rare that partners ask for regular reports once they’ve seen our pitch and way of working. 

If you offer someone the choice between a big spreadsheet of numbers versus a list of data-driven improvement ideas, backed with examples from our other brands, they rarely choose the spreadsheet.


Can you talk a bit about the global dashboard you have set up and the visibility this has at the most senior levels of the business?


As we’ve rolled out analytics implementations, we’ve maintained a consistent tag and data structure across all sites, with modifications for special markets such as broker or niche insurance lines. This was initially designed to make it easy for analysts to work across different brands, but the other benefit is that I can roll that data up into country, continent and global views.

We use Adobe Analytics across all sites, and pull data through the API into a central dashboard which is designed to help the group’s senior team identify opportunities and trends at a global level. To say this has been popular would be a huge understatement; the global web dashboard now has a regular slot at quarterly board meetings with our Chief Executive and Chief Financial Officer.
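As a rough illustration of what that roll-up involves (the site IDs, geography mapping and metric names below are invented, and the Adobe Analytics API pull is stubbed out rather than real):

```python
# Minimal sketch of rolling site-level metrics up to country, continent
# and global views. Site IDs, geography mapping and metric names are
# invented; fetch_site_metrics() stands in for the real Adobe Analytics
# API call (endpoint and auth omitted).
from collections import defaultdict

SITE_GEO = {
    "uk_motor": ("UK", "Europe"),
    "link4_pl": ("Poland", "Europe"),
    "ca_home": ("Canada", "North America"),
}

def fetch_site_metrics(site_id: str) -> dict:
    """Stand-in for an API call returning per-site metrics."""
    return {"visits": 0.0, "online_sales": 0.0}  # replace with a real pull

def build_dashboard(site_ids) -> dict:
    """Aggregate identical metric names at country, continent and global level."""
    rollup = defaultdict(lambda: defaultdict(float))
    for site in site_ids:
        country, continent = SITE_GEO[site]
        metrics = fetch_site_metrics(site)
        for level in (("country", country), ("continent", continent), ("global", "all")):
            for name, value in metrics.items():
                rollup[level][name] += value
    return rollup

# e.g. build_dashboard(SITE_GEO)[("continent", "Europe")]["visits"]
```

The consistent tag structure is what makes the aggregation a simple sum; without identical metric names on every site, each roll-up would need its own mapping layer.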

The trends we pull from this dashboard include shifts in device usage, marketing and usability performance, customer shopping behaviour, and external factors such as the activity of insurance aggregators.


How do you use testing and personalisation to improve online performance? What tech do you use for this?


Online testing is something we’ve been pushing for a while and personalisation really kicked off here in 2012, although it was present in other channels (offline marketing for example) before that.

We use Adobe Test & Target in most regions, firstly because it integrates well with our Adobe Analytics data and secondly because it’s flexible enough for us to handle both the simple and complex briefs with one tool.

Creating a culture where change is AB tested can be a challenge, especially as it can add to the complexity of deployment. Our design and UX teams love it though, because it’s the best way of finding out exactly how each change impacts our online metrics. A few years ago I was busy convincing people of the need to test, now I’ve got more demand for testing than we can keep up with.

Personalisation is running a year or two behind that but is on the same journey. From what we’ve learnt so far, it has the capability to really drive up user experience and engagement.

There’s been some bad press about personalisation, centred on its use for third-party retargeting, but used appropriately it simply means the messages you see are more likely to match what you’re looking for on the site.

People seem to love their local shops because they get a personalised service, and the same is true online. I really believe that the concept of a static homepage will be unthinkable in a few years’ time.


Can you give some examples of what changes you have made as a result of testing?

We have implemented a lot of UX changes based on test results, and sometimes it’s surprising to see what customers prefer. 

We don’t do those awful button tests, e.g. changing the colour of a button to see whether you can drive up click-through. I’d question how that really benefits customers in any way.

What we do is look for ways of presenting information, ways of gathering data and ways of navigating which customers find easiest and most intuitive to use. We also test to find out what messaging to use in each stage of the customer journey, because essentially the website should exist to make insurance understandable and accessible if the web is your channel of choice.

The most interesting tests are usually the wacky ones which have some sort of emotive trigger. We have played around with personalising our pet insurance journeys, such as changing images to show the correct breed of dog for owners who are getting a quote for a pure breed.
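A purely illustrative sketch of that image swap (the breed keys and image paths here are made up):

```python
# Purely illustrative: pick a pet-insurance hero image matching the
# breed entered in the quote. Breed keys and image paths are made up.
BREED_IMAGES = {
    "labrador": "/img/quote/hero_labrador.jpg",
    "cocker spaniel": "/img/quote/hero_cocker_spaniel.jpg",
}
DEFAULT_IMAGE = "/img/quote/hero_generic.jpg"

def hero_image_for(quoted_breed: str) -> str:
    return BREED_IMAGES.get(quoted_breed.strip().lower(), DEFAULT_IMAGE)
```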

It seems daft when you explain the concept but that little test ramped up conversion to sale by 4%, which proves that customers respond well to a user experience which feels personal to them.
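For readers wondering how an uplift like that gets validated, a standard check is a two-proportion z-test. The traffic and conversion figures below are invented purely to show the arithmetic, not RSA’s actual numbers:

```python
# Two-proportion z-test for a conversion uplift. All traffic and
# conversion numbers here are invented purely to show the arithmetic.
from math import erf, sqrt

def z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return the z statistic and two-tailed p-value for B vs A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approximation
    return z, p_value

# Control converts at 5.0%, variant at 5.2% -- a 4% relative lift.
z, p = z_test(conv_a=10_000, n_a=200_000, conv_b=10_400, n_b=200_000)
print(f"z = {z:.2f}, p = {p:.3f}")  # here z is about 2.9, p about 0.004
```

At smaller volumes the same relative lift would not reach significance, which is one reason high-traffic insurance quote journeys are good testing ground.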


To what extent are you joining up online and offline data, for example using CRM data (as well as cookie data) to target the right customers at the right time?


We’ve been able to analyse online and offline customer data together for many years; the real step change at the moment is the ability to do it in real time, allowing our websites to personalise using offline data which they can look up instantly.

We’re experimenting with this heavily at the moment and there are a ton of ideas being tested to understand how we can use real-time online/offline integrations to improve customer experience. I see it as a logical progression of the personalisation tests: we should just make that personalisation better and even more relevant to the customer’s needs.
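As a minimal sketch of the kind of real-time lookup this implies (the CRM endpoint, field names and timeout are all invented for illustration):

```python
# Sketch of joining online and offline data in real time: the site asks
# an internal CRM service for known attributes as a page renders. The
# endpoint, field names and timeout are invented for illustration.
import json
from urllib.request import urlopen

CRM_LOOKUP_URL = "https://crm.internal.example/customers/{cid}"  # hypothetical

def offline_profile(customer_id: str, timeout_s: float = 0.2) -> dict:
    """Fetch offline attributes fast enough to use during page render.

    Falls back to an empty profile so personalisation degrades gracefully
    when the lookup is slow or the customer is unknown.
    """
    try:
        with urlopen(CRM_LOOKUP_URL.format(cid=customer_id), timeout=timeout_s) as resp:
            return json.load(resp)
    except Exception:
        return {}

# e.g. offline_profile("12345").get("held_products", []) could decide
# which cross-sell message the homepage shows.
```

The tight timeout and empty-profile fallback reflect the constraint the interview describes: the lookup has to be instant, or the page simply renders the default experience.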



Published 26 February, 2013 by Graham Charlton

Graham Charlton is the former Editor-in-Chief at Econsultancy. Follow him on Twitter or connect via LinkedIn or Google+.


Comments (9)

James Gurd, Owner at Digital Juggler

Thanks Graham,

A really insightful glance at the rigours of analytics/optimisation in a global brand.

One comment really surprised me:

"We don’t do those awful button tests, e.g. changing the colour of a button to see whether you can drive up click-through. I’d question how that really benefits customers in any way."

That jars with the importance that RSA places on the optimisation and testing process. Colour is an important visual stimulus and different colour schemes can play a role in driving action, working in tandem with the underlying content structure. To say that testing button style is awful misses the point in my opinion - I've seen tests that have driven increased conversion simply by changing button design.

Of course, that's a tiny part of a much larger picture, but to embrace testing, then ignore a component of it, seems rather strange.

What does everyone else think?

Thanks
james

about 3 years ago


Emma Durant, Global Marketing Strategist at Lionbridge

Great interview. It would be really interesting to know whether they take the testing down to a local, regional level, in order to take into account cultural differences.

about 3 years ago

Paul Rouke, Founder & Director of Optimisation at PRWD

Thanks for a really insightful Q&A Graham. Only last week I had the pleasure of training the head of testing at RSA along with a raft of other brands, and I have to say they are one of the most mature organisations I have come across when it comes to having an established, fully integrated testing & optimisation strategy.

Some of the results they see from personalisation are exceptional, and they gave other delegates plenty to think about!

@James - just a thought but perhaps Dan and his team prefer to look at more progressive, radical tests rather than the well shouted about "We changed the button colour from red to green and saw a 648% uplift" type tests.

about 3 years ago


Sarah Hughes

Dan Huddart says "Insurance isn’t pushing digital boundaries as quickly as retail and telecoms sectors are" but RSA's use of testing and analytics is impressive. I'd suggest there aren't many Boards that regularly review their global web dashboards as RSA does.

about 3 years ago

James Gurd, Owner at Digital Juggler

Hi Paul,

Yes, I'm sure they do look at more sophisticated testing, as the approach outlined in the blog would suggest.

However, the comment about not testing buttons could be misleading to people who aren't well versed in optimisation.

I'd be interested to hear how they structure their testing, so that the small components are factored into a more sophisticated testing plan.

Perhaps I misread but it came across as advocating ignoring the button test scenario, as opposed to saying "We like to adopt progressive, radical tests which include individual component design as well as higher level concepts".

I think it would really help non-specialists to understand how simple test scenarios can be incorporated into something much grander.

Thanks
james

about 3 years ago

Dan Huddart, Head of Analytics & Web Development at RSA Group

@James A good area for discussion. I'll offer two reasons to explain why I think that:

1) As Paul correctly points out, I prefer tests which I know are likely to drive bigger improvements for customers. If you look at a page design and prioritise all the test opportunities, button colour always comes near the bottom of the list for me because there are so many more proven fruitful techniques.

2) I've never seen a test result (from any company) where changing button colour has delivered an improvement in a real customer measure (e.g. sales, satisfaction, retention) which hasn't degraded another. The only situation where I'd suggest they are valid is where your CTAs aren't easy to find on the page, which we would already know from user testing.

@Emma Yes we do, generally our tests are run at country & brand level. There are some examples where we've gone down to region level but they'd usually be tied to local initiatives. Whilst it can be interesting, it cuts sample sizes considerably and therefore slows the test programme down.

@Paul Good to hear you met Matt, and thanks for the great feedback

@Sarah thanks :)

about 3 years ago

James Gurd, Owner at Digital Juggler

Thanks Dan.

I think your point 2 is why simple tests can actually be really effective, but you only get to a simple test scenario if you've done proper analysis/user testing to pinpoint a specific issue.

I've seen landing pages where people struggled to pick out the CTA due to its design; it blended into the background, so people missed it amongst the content. The content was considered persuasive, but people then had no clear visual signal of where to click. A change in button design had a significant impact on conversion.

I'm not sure how that would degrade another customer measure?

I'm not claiming to be an expert in CRO, so happy to learn from someone more experienced.

It is an interesting discussion, always good to learn from others in the industry.

cheers
james

about 3 years ago

Dan Huddart, Head of Analytics & Web Development at RSA Group

@James

"Simple tests can be effective" - absolutely. My point is that changing the colour of a button is not the best use of your finite test capabilities.

If your site has low contrast buttons, to the extent that customers don't notice them, then that should be picked up in user testing. This is faster, cheaper and more insightful than AB testing for this sort of feedback. You could use AB tests to discover this, but it's the wrong tool for the job, and the ROI is low compared to alternatives.

In button tests I've analysed, bigger buttons with a higher contrast did increase click-through slightly, but that increase simply drops out of the funnel again in a later step. If you imagine it through the customer's eyes, you’re putting a bigger, brighter button in front of them and expecting that to convince them to do something. I’d suggest that using your AB resource to improve the messaging, understanding, simplicity and relevance of your content will deliver far greater benefits.

Ultimately there's no right or wrong answer here; we can each just speak from our own experience. I'd love to hear any views on this.

about 3 years ago


Dan White

Good to see your thinking, Dan. Your team has taken the lead in the general insurance space for this, principally by taking the mentality of a retailer, which is no mean feat. Well done.

about 3 years ago
