Companies whose conversion rates have improved carry out 50% more tests on their websites than companies whose conversion rates didn't improve. Meanwhile, 7% of companies carry out no testing at all.
This difference is even more apparent when looking at sales. Companies with a large increase in sales carried out over two times as many tests as the average.
Of the companies that carry out testing, 60% run one or two A/B or multivariate tests a month, and only 6% perform more than 10 tests a month.
Let’s take a look at which areas and elements our respondents are testing, and which stages they find most challenging.
Over a quarter (28%) of companies are satisfied with their conversion rates (either 'very' or 'quite' satisfied), up by 6% since 2012 and the highest level since 2009.
Additionally, around three-quarters (73%, up from 65% in 2012) indicate they have seen an improvement in conversion rates in the last 12 months.
The fifth annual Conversion Rate Optimization Report, produced in association with RedEye, also found that the proportion of organisations reporting an increase in sales conversion rates has risen significantly, from 60% in 2012 to 70% this year.
The research, based on a survey of almost 1,000 client-side and agency digital marketers, revealed that A/B and multivariate testing, using multiple methods to improve conversion and having a structured approach are among the seven factors most correlated with improved conversion and sales...
Your adverts and their messaging are integral to your PPC success. Like the campaign itself, they need constant optimisation, revision and testing.
Planning is probably the most important part of ad text testing. Without a solid plan you are simply going to stick a load of messages out there and see what comes back.
This can lead to unfair testing practices and ultimately, worse results.
Find out what you should be considering when it comes to ad text testing...
Every ecommerce site needs video, which helps drive interest and engagement from consumers, but how do you know if videos actually bring any money to the bottom line?
As a general rule, shoppers love videos, since they show off products in a way that still images can’t.
But how can site owners know if all the work they’re putting into videos is actually generating a return on investment?
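One simple way to answer that question is to express the return on a video as revenue attributed to it against what it cost to produce. A minimal sketch (the figures below are hypothetical, purely for illustration):

```python
def video_roi(attributed_revenue, production_cost):
    """Return ROI as a fraction: (revenue - cost) / cost."""
    if production_cost <= 0:
        raise ValueError("production_cost must be positive")
    return (attributed_revenue - production_cost) / production_cost

# Hypothetical example: a product video that cost 2,000 to make
# and is credited (via analytics attribution) with 5,000 in sales.
print(video_roi(5000, 2000))  # 1.5, i.e. a 150% return
```

The hard part in practice is the attribution itself: deciding which sales to credit to video viewers, typically via event tracking in your analytics package.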
Here are some statistics we've seen this week, for your delectation.
For more digital marketing stats, check out our Internet Statistics Compendium.
Landing pages are an integral part of paid search. Effective pages mean you convert more visitors to the outcomes you need and in quality-score based search engines they make your ad more competitive.
Securing the click is only the start of the conversion journey. The quality of the user journey after the click will determine your ability to convert paid search traffic into desired outcomes.
A common mistake in paid search programmes is for the focus to be entirely on keyword targeting and CPC management, ignoring the vital role that landing page optimisation plays in converting visits into actions.
I'm referring to PPC landing pages here, as some of these tips are taken from our PPC Best Practice Guide, but many of these factors apply equally to email and other landing pages...
A/B/n and multivariate testing is one of the most important CRO (conversion rate optimisation) activities for continually improving your website, yet for some it can be difficult to get started.
In this post I’ll share three frequently asked questions we hear time and time again from our clients when just starting out with A/B and multivariate testing.
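One question that almost always comes up is "how do I know my result isn't just chance?" A common approach (not specific to any one testing tool) is a two-proportion z-test on the conversion counts of the control and the variant. A minimal sketch, with hypothetical numbers:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from control A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical test: 200/5000 conversions (4%) vs 250/5000 (5%)
z = two_proportion_z(200, 5000, 250, 5000)
print(abs(z) > 1.96)  # True: significant at the 5% level
```

A |z| above 1.96 corresponds to p < 0.05 on a two-sided test; most commercial testing tools perform an equivalent calculation behind the scenes.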
We have to go undercover into shops, speak to sales staff, buy and try products and speak to customer service teams to uncover the objections our visitors face online.
When we delve into the offline world and go beyond surveys and analytics we can find out the hidden causes of abandonment online, remove them and improve our conversion rates.
Here are four simple techniques for finding those hidden gems...
Last week saw the launch of the Econsultancy / WhatUsersDo User Experience Survey Report at an event held in London, with a panel of brands including Hobbs, PhotoBox and UBM.
Below, for the benefit of those who couldn't attend the event, the panellists answer some questions about the research and the approach to user experience within their organisations.
It seems a lot of companies are happy to work on 'hunches' and best guesses when it comes to user experience, with 45% not conducting any UX testing at the moment.
However, of these companies, the majority have begun to see the light, with 73% planning to start testing in the next 12 months.
Here are a few highlights from the survey...
Although digital marketing is considered to be a relatively new industry, many of the theories underlying it have been around for almost 90 years and are still generating sales for some of the web’s biggest brands.
In 1923, Claude C Hopkins wrote Scientific Advertising, one of the most valued resources in the advertising industry.
Hopkins pioneered split testing of his ads and defined a set of principles which, when applied to digital marketing, can increase both traffic and conversions.
The 21st century marketer needs an extensive toolkit. As well as the ‘standard’ skills of creativity, organisation and management, these days they also need to be web literate, social media savvy and equipped with basic data science skills.
Amongst all of these areas of technology competence one that is growing in importance, but is perhaps still misunderstood, is website testing.
Testing is the new intuition in site development and optimisation. Rather than relying on hunches, the modern web marketer will test potential changes to their site before deploying them, thus, we are led to believe, ensuring their efficacy.
However, if all changes are now tested, how come we don’t all have perfect sites? If testing only tells us the truth, how come we still sometimes go down dead ends?
The answer lies not necessarily in the tests, but in the ways that they’re applied. We’ve seen thousands of testing processes run across a huge variety of sites and what’s struck us is that the issues that led to unsuccessful tests were common across industries.