Google algorithm updates are a fact of life for brands and search specialists.
With so many Google updates over the course of a year, there are periodic collective outbursts of stress around these changes.
However, the effects of these changes are generally much more subtle and are not the cataclysmic events some fear.
How can businesses find out that they have been penalised by Google?
I've been asking a number of search experts for the tell-tale signs to look out for...
It’s nearly Christmas so it only seems right that we give people a chance to share in the greatest gift of all: hope.
Specifically, it’s the hope that Google will listen to the SEO community and make its dreams come true in 2015.
We’re about to leave 2014 behind us and fly headlong into 2015, so it’s a good time to get the crystal ball out and predict what the next 12 months will bring for search.
I’m not much good at soothsaying, so I asked some of our good friends from the SEO world what they think is on the horizon.
Everybody talks about the need to provide quality content on your site if you want to rank well in searches. But how do search engines identify quality content?
Successive Google algorithm updates (culminating in the recent Panda 4.1) aim to refine results so that they match the intent of the search query and deliver the most comprehensive, accessible and well-written answer.
Panda 4.1, as it has been dubbed, was released by Google last week with the aim of identifying low quality content more easily.
According to Google, it affects between 3% and 5% of search queries and will result 'in a greater diversity of high-quality small- and medium-sized sites ranking higher'.
I asked Marcus Tober, CTO and co-founder of SearchMetrics, about the new update.
In my experience, severe Panda-related hits tend to boil down to a root cause of either duplicate content, thin content, or extremely poor user experience.
As I’ve already covered many of the other areas involved in recovering from Panda this month, I wanted to focus on thin content: what it is, how to spot it, and most importantly, how to fix it using Google Analytics.
A few weeks have passed since Google’s long awaited and much speculated Penguin 2.0 update, and with the dust beginning to settle, we took a look at its impact in the UK.
There’s been no shortage of hype in the run-up to Penguin 2.0, with everybody’s favourite Google spokesperson and distinguished engineer Matt Cutts describing the forthcoming update as ‘a big one’ back in March.
But, so far at least, has it lived up to its billing as Google’s most advanced piece of spam-fighting technology to date?
Thanks again to Panda, Penguin et al, it seems many webmasters are panicking about links they obtained in the past, or about having been pulled up by Google for over-zealous link building.
As a result, we are receiving far more link removal requests than we ever used to: ten or so in the past couple of months alone.
To be frank, these requests are annoying, and I'm also a little put out that they see this blog as a risk to them. Chris Lake touched upon this recently and, as he says, 'a lot of folks seem to have a bad case of The Fear'.
I thought it was worth exploring this issue in more detail, so I've asked a few SEO experts for their views...
It sometimes sucks, being a publisher in a post-Penguin, post-Panda world. It’s great that Google is cleaning up webspam, but it’s not so great to be on the receiving end of stupid demands from people who give the SEO industry a bad name.
What am I talking about? Dubious links, that’s what. Or should I say dubious links on a supposedly authority website (ours), that have been flagged up by dubious SEO tools. Emails with ‘please remove this link’ make our hearts sink.
What else? Dubious expectations. Why is it that publishers like Econsultancy are expected to clean up the mess? This is the last thing I want us to be doing. “It will be good for both of us,” they say, with various degrees of menace. No it won’t. It’s a cost to our business, and to the publishing industry more broadly.
We have always been hugely supportive of the SEO industry, and as a web business we’ve always tried to stay on top of SEO best practice. As such, it is deeply frustrating to be on the receiving end of requests to remove ‘suspicious’ links, or to add nofollow to links that I think are perfectly acceptable.
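For readers unfamiliar with the mechanics: the nofollow requests mentioned above refer to the standard HTML `rel="nofollow"` link attribute, which asks search engines not to treat the link as an endorsement of (or pass ranking credit to) the target page. A minimal illustration, using a placeholder URL:

```html
<!-- An ordinary link, which search engines may count as an endorsement -->
<a href="https://example.com/">Example</a>

<!-- The same link marked nofollow: search engines are asked not to
     pass ranking credit through it -->
<a href="https://example.com/" rel="nofollow">Example</a>
```

This is why link builders now ask publishers to add the attribute retroactively: it neutralises the link in Google's eyes without removing it from the page.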
I’m not planning on revealing any names here, but let me explain what I’m talking about. There are three areas of concern. The first two are linked to stupid, short-term thinking and needless panic. The last one might indicate that Google is moving the goalposts around guest blogging.
Is this the tip of the iceberg, or a few isolated incidents that we’re experiencing?
We've seen a lot of changes in the SEO world over the last six months, with content marketing in particular becoming a hotter topic almost by the day.
But if you really think about what a good SEO campaign should look like, it's pretty obvious that link manipulation and over-optimisation were never what Google was looking for when assessing site quality.
In fact, in Google's own words: creating quality content is the single biggest thing you can do!