Google may be the dominant search engine in much of the world, but that doesn't mean it's perfect. On the contrary: there are plenty of examples indicating that Google struggles to detect even the most unsophisticated web spam and, as a result, is driving its users to sites they'd probably rather not visit.

Increasingly, Google's flaws in this area are attracting attention. While it's not yet clear if the attention is a reflection of the fact that consumers are finding that Google's results aren't meeting expectations or an indication that Google's glow has simply worn off, it is clear that Google has a lot to lose if it doesn't pay attention to web spam.

And pay attention to web spam it seems to be doing. Yesterday, Barry Schwartz at Search Engine Roundtable detailed how Google appears to be engaging in a more proactive enforcement effort using Google Webmaster Tools:

In early December, Google started sending out notices of doorway pages via Google Webmaster Tools and now webmasters are receiving more spam notifications.

A paid WebmasterWorld thread has reports of Google warning over "unnatural links" and a Google Webmaster Help thread has reports of warnings over "cloaking". Both are against Google's Webmaster guidelines.

The email notices from Google, which Schwartz published, warn recipients that Google has found "unnatural" links and cloaked content. The emails remind the recipients about Google's Webmaster guidelines, and instruct them on how they can submit their sites for reconsideration.

Unfortunately, the published email warning of unnatural links provides far less detail about what Google has detected than the cloaking email does, which may make it difficult for recipients to locate the links in question and remove them to Google's satisfaction.

But nonetheless, the fact that Google is sending out these notifications to Google Webmaster Tools users is a good thing, and it provides another good reason for publishers to sign up for Google Webmaster Tools.

As Google itself notes in one of the emails, many times violations of Google's guidelines are a result of a hack. In many cases, publishers won't even know they've been hacked until they see their Google traffic drop off a cliff.

The real question, however, is just how effective Google can be in detecting and dealing with web spam, even with a more proactive approach. There are plenty of publishers who aren't signed up with Google Webmaster Tools, and the true measure of Google's efforts to deal with this increasingly large problem will not be how the search giant deals with small-time violations.

Rather, the true measure will be how Google addresses the inconvenient fact that major companies, many of which also advertise with Google, are using web spam and forbidden techniques to boost their rankings.

Photo credit: Robert Scoble via Flickr.

Published 7 January, 2011 by Patricio Robles

Patricio Robles is a tech reporter at Econsultancy. Follow him on Twitter.

Comments (3)



Surely, until Google is able to accurately attribute web spam creation to the publisher rather than to a hack or malicious third party, prejudicing the target site would be unacceptable behaviour by Google? Publishers following best practice could be undermined by ruthless competitors launching a programme of spammy linkbuilding pointing at sites with otherwise legitimate backlink profiles. Is anyone aware of any examples of this having happened?

over 7 years ago



@Dan I've experienced this with an affiliate site, where my unique content was spammed and spun onto various other sites (hint: with .ru extensions!) and my site was blacklisted alongside the other sites. Whilst they weren't a competitor, they'd still managed to take my site out with their spammy ways. I can't see how Google can evaluate the true source unless all sites are somehow verified as 'original'. Things like Copyscape help, but I'd hazard a guess that the majority of sites don't use it. If Google started to use Copyscape, that might help, though the problem right now is that even if Copyscape identifies a page as 'copied', Google may blacklist all of the sites involved. As with all things, the search engines need to look at other available signals to help determine quality. It's far too easy to create a new domain, spam some copy and gain short-term benefits, followed by a rinse-and-repeat cycle.

over 7 years ago


John Nagle

Google seems to be doing a little to improve things. But as long as there's content from Marchex, Associated Content, eHow, and Demand Media showing up on the first page of Google searches, it's not working. As long as nonexistent locksmith and carpet cleaner locations show up, it's not working. As long as sites with ads but no business behind them show up, it's not working.

over 7 years ago
