
We all have an intuition as to the relative merits, or otherwise, of 'the impression'. If a blog post on Econsultancy gets 5,000+ views, I know it's been relatively popular with our audience, considering we get around 1m views a month.

A lot of page views is generally a broad indicator of quality, at least on this blog. Quality could be defined as great entertainment, or helpful best practice.

Quality doesn't necessarily dictate time spent on page; a great post can still be quick to digest. Nor does quality dictate a low bounce rate, especially as we get lots of views from social referrals (of course, if time on page were 30 seconds and bounce rate 96%, there would be mighty cause for concern).

If a post doesn't get many views, but I receive some good comments from learned readers, I'll generally be happy, and hope the post will bring in more traffic over time.

What am I trying to get at here? Well, measurement is a science and an art. There are trends that cannot be repudiated, as well as intuition that must be followed.

There are many ways to skin a cat, but the worst thing you can do is skin the cat without explaining why the cat was flayed in such a way.

In this post I'm going to look at a possible crisis in some areas of measurement and market research. Please add your comments below.

I'm going to be a little negative, and list some things that dismay the level-headed marketer. I think there is a fair amount of misuse and misinterpretation of statistics in marketing.

I've tried to flesh out each sour note as best I can, to bring something constructive to the debate. I'll start with the easy ones.

Market research

Leading questions are asked of a favourable audience.

See South Bank Centre’s campaign to redevelop the ‘skate park’ beneath them. Here’s the excellent Dan Barker on the survey that went out only to South Bank Centre members (supporters) and included questions listing only the benefits of the proposals.

There's not much to add here. When there's such an emotive agenda as redevelopment of a heritage site, it's easy to spot leading questions and flawed data capture.

But when market research is carried out on more mundane subjects, the same things can occur if a marketing agency (for example) has a vested interest in showing the importance of their field.

This could be an issue for every company surveying its customers. If these inaccuracies are pointed out prominently enough, there may be consequences for the brand image.

PRs report findings of research ahead of publication, with no access to methodology.

This is particularly relevant for marketing services companies that are content marketing with research, but can also apply to B2C companies releasing survey results.

In most of academia, it's considered very bad form to release the results, let alone the conclusions, of a study without also releasing the methodology and details of the dataset.

However this still goes on, nowhere more often than in market research. When we do the Econsultancy 'digital marketing statistics of the week' blog posts, we see this a lot. We are sent conclusions and stats with nothing to back them up. As much as we can, we try to weed these out, but the practice is rife.

What it does is devalue accurate reporting, and encourage people to believe that minimal research effort can easily yield a set of 'conclusive' findings to shout about.

Research is done with too small a sample size. Categories have significantly different numbers of data points in them.

Rigour of statistical analysis is often not evident. Weighting, probability values and the like should be applied if statistics are to be accurate and representative. The visualisation of results should not disguise important facets of the dataset.

Don't use bar charts when each category's sample size differs by orders of magnitude.
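To make that concrete, here is a minimal sketch using hypothetical survey numbers (not from any real study) of why categories with wildly different sample sizes shouldn't sit side by side on a bar chart as if they were equally reliable: the margin of error on a proportion only shrinks with the square root of the sample size.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion p measured on n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical survey: share of respondents agreeing with a statement, per category
categories = {
    "Enterprise": (0.62, 40),    # 62% agree, but only 40 respondents
    "SMB":        (0.58, 4000),  # 58% agree, from 4,000 respondents
}

for name, (p, n) in categories.items():
    print(f"{name}: {p:.0%} agree (n={n}), margin of error ±{margin_of_error(p, n):.1%}")

# Roughly:
#   Enterprise: 62% agree (n=40), margin of error ±15.0%
#   SMB: 58% agree (n=4000), margin of error ±1.5%
# The 4-point gap between the bars is swamped by the ±15% uncertainty on the
# smaller category, so two equal-looking bars would mislead.
```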

Publishing 

PRs and publishers mention research without linking to it (we are occasionally guilty).

This often propagates poor research, which in turn propagates a low level of diligence in statistics and measurement in general.

Simple statistics are misreported.

E.g. we once reported that X% of people shop on a mobile. The actual stat was X% of people with a smartphone shop on a smartphone.

Two very different things, especially globally. We try not to make these mistakes, but if they happen it's imperative that audiences point this out. Often, our readers do. 
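As an illustration of just how different, the figures below are invented purely to show the arithmetic (they are not from the report in question): the gap between the two statements comes down to multiplying by the smartphone ownership rate.

```python
# Invented figures, purely to illustrate the difference between the two claims.
smartphone_owners = 0.60         # assumed: 60% of all people own a smartphone
shop_given_owner = 0.45          # assumed: 45% of smartphone owners shop on one

shop_overall = smartphone_owners * shop_given_owner
print(f"{shop_given_owner:.0%} of smartphone owners shop on a smartphone, "
      f"but only {shop_overall:.0%} of all people do.")
# -> 45% of smartphone owners shop on a smartphone, but only 27% of all people do.
```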

Data is being devalued

Raw qualitative data is often written, literally, in a language anyone can understand - tweets, feedback from surveys etc. Quantitative data, on the other hand, is often impenetrable in its raw form, and sadly, when the data is analysed, the results can be manipulated at so many stages, not least when choosing what findings to report.

Of course, if quantitative data is gathered correctly and diligently, analysed accurately and reported in full, it trumps all. It is incontrovertible and can be acted on with confidence, e.g. in conversion rate optimisation.
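'Acting on it with confidence' in conversion rate optimisation usually means some kind of significance check before declaring a winner. The sketch below uses made-up A/B test numbers and a plain two-proportion z-test; it is not a description of any particular tool's method.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z statistic and two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided, via the normal distribution
    return z, p_value

# Made-up test: 12,000 visitors per variation, 480 vs 552 conversions
z, p = two_proportion_z(conv_a=480, n_a=12000, conv_b=552, n_b=12000)
print(f"z = {z:.2f}, p = {p:.3f}")  # roughly z = 2.29, p = 0.022
# A p-value this small suggests the uplift is unlikely to be noise, which is
# what lets you act on the quantitative result with some confidence.
```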

Until measurers in the marketing industry are fully aware of best practice, qualitative data, along with our intuition and our PR and branding hunches, is perhaps our most faithful instrument.

Publishers have an important part to play in championing proper analysis and in refusing to publish spurious stats that devalue the industry.

Sentiment analysis

Positive sentiment isn't conveyed efficiently to upper management

The problem with data can also impact on the softer metrics. When targets are handed out, they are necessarily quantitative and well-defined. Does this impact on communication of brand success to the board?

Edited retweets, @replies, comments, favourites: these are all things that can be counted up. In theory, reach can be ascertained too, with many of the social analytics SaaS products available.

And yet, accurate reporting of how much benefit social brings to the brand is still a thorny issue, particularly for brands that don't have to provide customer service on social, and for B2B brands. It's that fairly old issue of social ROI.

(Matt Owen has written much on the Econsultancy blog about this issue - see this post on the difficulty of attribution).

The only solution to this is to view the influential and positive tweet as one would a news cutting from a high-profile rag. In the 'old days' one would circulate details of mentions in articles etc. in the monthly reporting, and I feel the same thing is entirely appropriate with social.

Doubtless some Social Managers do this. 'This week @AshleyFriedlein tweeted one of our articles' or similar.

There may also be a greater cause for concern. Is PR and content becoming more nebulous and consequently undervalued through lack of understanding from upper management of softer metrics? More persuasive reporting is needed to start addressing these issues.

When salespeople sell, they talk about all the soft, qualitative, emotional metrics, as well as the hard stuff. They’ll give an accurate picture of our audience, reach, sentiment etc., because they know this is the important, human side of the business.

We make some money with this rhetoric, but the same rhetoric about love from our audience is often, paradoxically, not conveyed accurately to stakeholders (on paper at least).

B2B sales attribution 

Salespeople don't care enough about the brand and attribution is difficult. Can we kill two birds with a social CRM stone?

There's no getting around the fact that salespeople have to be incentivised on the contracts they bring in.

However, if salespeople are using 'contract value' as their only metric, there's no doubt that some brands can harm their image by being too salesy. What if every salesperson was encouraged to push their own personal brand, through integration of all public communications channels (social, email, phone)?

Social CRM software (see What is social CRM?), and opening up all social networks to each member of a sales team, is good for the salespeople. They know that if they can concentrate on the relationship and the conversation, they're more likely to sell, and more likely to increase the customer lifetime value. It's also good for the brand if sales aren't intimidating prospects, but are soft selling.

So there's a new softer metric born - 'customer value as a product of contract value, customer goodwill and consequent amplification of brand'. OK, it can't be measured yet, but it's how sales teams should be thinking.

See Minter Dial and Eric Mellet's free report on The Sales Organization of the Future.

If salespeople are doing more of the relationship building, pointing customers to relevant content, engaging with them and essentially curating a service, then more of that value can safely be attributed to sales. And other teams will be less sceptical about commissioned teams taking their money.

So, what are my conclusions?

  1. We need to question data, even if it nicely fits our agenda.
  2. On the other side of the coin, we need to be transparent when we present data, methodology and conclusions (without hiding behind confidentiality BS).
  3. Although professional development and remuneration will doubtless entail KPIs, hard metrics etc, qualitative findings and positive sentiment should be included more in reporting.
  4. Encouraging your staff to pursue their own 'personal branding' will not only lead to more sales (directly or otherwise, depending on the department) but will also lead to greater appreciation of PR and sentiment, business-wide.

This isn’t a paean to other, more rigorous industries. This isn’t a eulogy for the departing hopes of a digital, transparent market. This is an invitation to all of you, to discuss what you are happy with, and what you aren’t.

We should probably start being better at all these numbers AND report some of the love in the room, too.

Ben Davis

Published 20 August, 2013 by Ben Davis @ Econsultancy

Ben Davis is a senior writer at Econsultancy. He lives in Manchester. You can contact him at ben.davis@econsultancy.com, follow at @herrhuld or connect via LinkedIn.


Comments (12)

Stuart McMillan

Stuart McMillan, Deputy Head of Ecommerce at Schuh

Great post Ben, I wish it was something that was talked about more. If there is a crisis, most people won't see it; all they'll see is the shiny infographic...

over 2 years ago

Ben Davis

Ben Davis, Senior Writer at Econsultancy

@Stuart

Cheers.

Infographics! Yes!

We stopped doing 'Infographic of the week', which was painful because it always did well, views-wise.

I think they can be done really well, but the art of the poster has been around for yonks. Cramming in too many spurious stats doesn't always make things *clearer*.

over 2 years ago

Alec Cochrane

Alec Cochrane, Head of Optimisation at Blue Latitude

Hi Ben,

The mainstream media is particularly guilty of parroting press releases that are put out without access to the underlying data, and with biased surveys anyway. They do this because they need to be first to the story for the additional page views, and putting in the extra effort of criticising an underlying dataset for a largely apathetic audience isn't worth it.

That said, I often see similar problems when producing insight reports for my clients. The analysis says 'X% of people with a smartphone shop on a smartphone', but the stat on its own is meaningless - you need to change it into an insight that you can tell your boss/client/CEO. Well, the insight is that 'more people are buying on mobiles than you originally thought'. Before you know it your stat has changed and you are being misquoted, because you're reporting an insight, not a fact.

There are no easy solutions to this beyond really forcing fact and insight to be two separate things.

Cheers,
Alec

over 2 years ago

Ben Davis

Ben Davis, Senior Writer at Econsultancy

Thanks, Alec. Clear and senseful thoughts.

over 2 years ago


Sam Fox

Time on page 30 seconds, 96% bounce rate = a concern?

Typically you won't know how long a visitor spent on the page if they bounced though... So to the point you made previous to that, high bounce rate may not be of such concern since you probably can't link it to the low time on page...

Even more so, with a high bounce rate the time on page for even the other 4% is not going to be 30 seconds on average. This is true at least if you have Google Analytics, which will class your bounces as 0 seconds, thereby skewing your time metric even more...

For a post about meaningful measurement, that paragraph makes me bounce...

I'm viewing on mobile, so maybe you have an event firing when a user scrolls a certain length down the page, which would give more credibility to your point about 'mighty concern'. But given the title of the article, the fact that you use that combination of metrics is, in my view, more concerning.

over 2 years ago

Ben Davis

Ben Davis, Senior Writer at Econsultancy

@Sam

Thanks for commenting.

I was talking about the metrics separately (though I accept that bounce rate necessarily affects time on page).

ANYWAY, I was just making a general point that analytics tends to give us a good idea of what content trends are hottest, and outside of that there are any number of signifiers of a 'worthwhile post'.

I hope you can ignore that paragraph and read on. The thrust is that healthy scepticism is a good thing. In that regard, I +1 your comment for calling me out.

Best,

over 2 years ago

Malcolm Duckett

Malcolm Duckett, CEO at Magiq

Ben, there's a lot to discuss here - to take one point...

As a man who lies naturally at the "tech/stat" end of the community I want to just say "WHO-RAY", this stuff is close to my heart.

... too often stats and results of measurement are just used in an attempt to bludgeon potential customers into purchase (often by the marketing / PR community), instead of improving understanding and prompting/informing further analysis.

However, we have to recognize that many of the net consumers of this measurement are not analytically/numerically minded, and I think this accounts for the attraction of the Infographic, which seems to promise more accessible information.

My wife (a primary school teacher) continually reminds me that the world splits into people who think and learn through pictures and those who think in words/numbers/lists. I think this also explains the rise/importance of the infographic, as I suspect the marketing community is strongly biased towards the former group, and I think this type of approach is linked to the creative/conceptual thinking that is so central to marketing.

We have just embarked on a development project to attempt to provide an "infographic overlay" to our reporting, to improve accessibility for the graphically-minded community. I am really keen to see how this goes.

over 2 years ago

Stuart McMillan

Stuart McMillan, Deputy Head of Ecommerce at Schuh

@Malcom, a really interesting read on the subject of graphically representing data: http://www.fivesimplesteps.com/products/a-practical-guide-to-designing-with-data

over 2 years ago

Ben Davis

Ben Davis, Senior Writer at Econsultancy

@malcolm

Absolutely. I want to re-assert Alec's point above. Fact and insight have to be separated. At the moment, they're used together, as you put it 'to bludgeon the customer'.

Visualisation, one need only look to religion, is canonising, or ossifying (there's probably a better word), so I think it has to be used particularly carefully. It's always appropriate to try to make data more digestible, but the correct charts have to be used.

For example, none of that cleverly placing the X axis above Y=0 to try to make trends look bigger than they are. Likewise no using non-linear axes to try to disguise trends.

I digress.

For analytics, I think it's definitely a goal to properly display trends over time, and fairly easy to do.

As for infographics, they are usually fairly straightforward in the ways they present data, often not through visualisation (though they purport to) but through putting a stat on a picture, which helps recall and as you say, speaks to the creative.

over 2 years ago

Malcolm Duckett

Malcolm Duckett, CEO at Magiq

@stuart - thanks for the link... browsing my copy now - looks like a good publication...

@ben - I could betray my age by telling you I started using computer graphics for business data representation in 1976, and later (1980) worked on developing "graphical information systems". It all seems like just a moment ago :-( .

... and since then I am constantly reminded that (as you say) we need to take great care that we are both using the right data, and representing it in the right (accessible) way.

Now the whole "big-data" thing and on-line business has meant that marketing has no option but to wrestle with the analysis and exploitation of this data - driven by what we have come to know as the "marketing technology arms race".

Over the last 10 years we have had some moments of blinding success when we have managed to get the marketing and tech folk communicating properly, providing the right data in usable ways and then using data-driven creative brilliance to exploit this data via marketing programs that blew everyone away with their performance - but those moments were too rare.

I think we are just entering a phase where we can make some of the required tech/data/visualization solutions "commodity" and start seeing amazing things happen every day...

over 2 years ago

Ben Davis

Ben Davis, Senior Writer at Econsultancy

@Malcolm

As a newb(ish), I doff my cap. Would be great to get some of your specific insight into 'data visualisation through the ages' (planting a seed for your next blog post).

over 2 years ago

Malcolm Duckett

Malcolm Duckett, CEO at Magiq

Age is no predictor of insight Ben! But maybe there's a topic worth some words :-)

over 2 years ago
