How do you know your website pages are performing as well as they could be if you're not testing them? You don't, is the honest answer.
You could be missing out on a lot of potential sales, shares, clicks, or whatever it is you’re trying to get out of your visitors.
Conversion Rate Optimisation (CRO) is the process of understanding the pain points along a conversion path (e.g. from click to sale, lead or another important business metric) and forming test hypotheses from which to drive incremental growth.
If you analyse incoming traffic sources, device usage and perhaps other segments like country of visit, recency, bounce and exit rates via your analytics tool, you'll quickly discover some interesting data points.
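As a rough illustration of that kind of segment analysis (not code from the article), here is a minimal pandas sketch. It assumes a hypothetical session-level CSV export from your analytics tool; the file name and column names are illustrative assumptions.

```python
# A minimal sketch of segmenting analytics data, assuming a
# hypothetical CSV export with one row per session and illustrative
# columns: source, device, country, bounced (0/1), converted (0/1).
import pandas as pd

sessions = pd.read_csv("sessions.csv")  # hypothetical export file

# Visits, bounce rate and conversion rate per traffic source and device.
segments = (
    sessions.groupby(["source", "device"])
    .agg(
        visits=("converted", "size"),
        bounce_rate=("bounced", "mean"),
        conversion_rate=("converted", "mean"),
    )
    .sort_values("visits", ascending=False)
)

print(segments.head(10))
```

Sorting by visit volume first is a sensible default: segments with very few sessions will show noisy bounce and conversion rates, so the biggest segments are usually where the interesting, trustworthy data points appear.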
A/B Testing is more successful when it is driven by insight from users, but how should you go about incorporating it?
When seeking to optimise a website, what is it that defines whether or not a test has been successful?
It would be easy to fall into the trap of thinking that a test is only successful if it results in a positive uplift of some sort (e.g. a higher conversion rate), but the truth is far more complex.
The increased understanding and use of testing is giving companies a greater ability to tailor customer experiences in a genuinely personalised way.
Serving each segment a tailored experience also has the potential to raise conversion levels even higher than the 7-10% uplift reported in 2014.
In my previous posts about A/B testing, I made the case that you need to consider the math behind A/B testing, or risk having invalid, or even wrong, results.
My first suggestion is to use sample sizing, but that requires a lot of traffic.
Here's how to do something similar with far less.
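To make the traffic demands concrete, here is a minimal sketch of the kind of up-front sample size calculation referred to above, using the standard two-proportion formula. The function name, baseline rate and target uplift are illustrative assumptions, not taken from the original posts.

```python
# A minimal sketch of an up-front sample size calculation for an A/B
# test on conversion rate, using the standard two-proportion formula.
# Baseline rate and target uplift below are illustrative.
import math
from scipy.stats import norm

def visitors_per_variation(p1, p2, alpha=0.05, power=0.80):
    """Visitors needed in each arm to detect a shift from p1 to p2."""
    z_alpha = norm.ppf(1 - alpha / 2)  # two-sided significance level
    z_power = norm.ppf(power)          # desired statistical power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_power) ** 2 * variance / (p1 - p2) ** 2)

# e.g. detecting a lift from a 2.6% baseline to 3.0% needs roughly
# 27,000 visitors in each variation -- hence "a lot of traffic".
print(visitors_per_variation(0.026, 0.030))
```

The result illustrates the point: even a modest uplift on a low baseline rate demands tens of thousands of visitors per variation before the test can be read with confidence.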
Almost two decades ago, Jeffrey and I started evangelizing the notion that your conversion rate is a measure of your ability to persuade visitors to take the action you want them to take.
Good companies know how to persuade visitors, but legendary companies better understand their visitors and their desires, and do more than simply satisfy them.
Great companies find ways to delight visitors along their journey. This is sometimes labelled 'flow' in the UX world.
In other words, conversion rate optimization is a critical discipline, but by itself, will it be able to transform a good company into a legendary one?
A/B testing is now an integral part of digital marketing.
But the tests can produce the wrong results if they are not conducted correctly. Here is part one of a three-part series about how you can use data science techniques to avoid making big mistakes with your A/B tests.
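The series itself is not reproduced here, but a simple example of the kind of check it describes is the two-proportion z-test that sits behind most A/B test read-outs. This is a generic sketch under my own assumptions; the conversion counts are illustrative.

```python
# A minimal sketch of a two-proportion z-test, the basic significance
# check behind most A/B test read-outs. Counts below are illustrative.
from scipy.stats import norm

def ab_test_pvalue(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for a difference in two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled rate under H0
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    return 2 * (1 - norm.cdf(abs(z)))                  # two-sided p-value

# e.g. 260 conversions from 10,000 visitors vs 330 from 10,000
print(ab_test_pvalue(260, 10_000, 330, 10_000))  # ~0.003, significant
```

One classic way tests "produce the wrong results" is peeking: running a check like this repeatedly as data arrives and stopping the moment it dips below 0.05, which inflates the false positive rate well beyond the nominal 5%.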
Marketers live in a world that is creating 2.5 exabytes of data each and every day.
This provides both a challenge and an opportunity to marketers. How can they process and harness big data in faster and more innovative ways to deliver deeper insights and improved business performance?
The average conversion rate in Adobe's Digital Optimization survey is 2.6%. However, 20% of companies have a conversion rate greater than 4.5%.
So what is the secret behind the success of this 20%? What can other companies learn from their approach?
Let's delve into the results, based on a global survey of 1,000 marketers and supported by Econsultancy.
A/B testing has undoubtedly become the buzzword of the marketing world. It has the potential to transform your marketing approach and fundamentally enhance the way you do business online.
It is the only reliable way of establishing cause and effect. In fact, 75% of the internet retailing top 500 are using an A/B testing platform, and 61% of organisations are planning to bolster testing services in the next 12 months.
And yet: poor A/B testing methodologies are costing online retailers up to $13bn a year in lost revenue.
That’s a really big number. It’s no longer enough to say that you use A/B testing. How you do it is far more important. Here are three A/B testing horror stories.
The cases are anonymous, but the scenarios are very real. Avoiding these traps can help you transform an A/B horror story into the marketing fairytale you always dreamed of.
Let’s kiss the toad and turn him into a prince.
Vistaprint has an interesting order and checkout process. There is lots of cross-selling, and a decent number of persuasion tactics are used.
Things have moved on, and I must say that I don't think it's too complicated any more. There are a number of steps in both the order process and the checkout process, but that was to be expected when designing a customised t-shirt (my chosen product).
Cross-sell and upsell is now presented on pages where I already feel assured the design process is going well.
Mainly, there was a lot of clear information and some fairly persuasive copy and design techniques, which I think have been judged correctly.
However, the company must be careful to keep cross-sell relevant. After being offered similar products, stationery and the like, I was then offered website builds and marketing services. This felt wrong and made me think the process might become more tiresome. Had I been busier, I could have abandoned at this point.
See what you think of each stage of the order process.