Google Changes Algorithm To Penalize Site Scrapers

Google updated its search algorithm this week to help reduce webspam in its search results.

These changes were made in response to increased criticism of Google and its search engine results. The criticism has been fueled in part by newer forms of webspam emerging alongside the traditional kind (pages stuffed with keywords and phrases, devoid of context or meaning, that “cheat” their way to higher search rankings).

The latest webspam outbreaks commonly come from content farms and sites that syndicate content. Earlier this month, Stack Overflow’s Jeff Atwood pointed out that over the last year, some content syndicators have routinely begun outranking Stack Overflow on Google. In other words, the copies are outranking the originals.

In Stack Overflow’s case, the problem was bad enough that a community member built a Google Chrome extension designed to redirect users from spammier syndicates back to Stack Overflow.
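The core idea behind such an extension is simple: detect when the current page is a known scraper mirror, recover the original question ID from the URL, and navigate to the canonical Stack Overflow page instead. Here is a minimal sketch of that rewrite logic; the scraper domain names and URL patterns below are hypothetical illustrations, not the actual sites or code the extension used.

```typescript
// Hypothetical list of scraper hosts the extension might target.
const SCRAPER_HOSTS = new Set<string>([
  "example-scraper.com",
  "qa-mirror.example.net",
]);

// Map a scraper URL back to the canonical Stack Overflow question URL,
// assuming (hypothetically) that the scraper preserves the numeric
// question ID somewhere in its path.
function canonicalUrl(rawUrl: string): string | null {
  const url = new URL(rawUrl);
  if (!SCRAPER_HOSTS.has(url.hostname)) return null; // not a known scraper
  const match = url.pathname.match(/questions\/(\d+)/);
  if (!match) return null; // no recognizable question ID
  return `https://stackoverflow.com/questions/${match[1]}`;
}
```

In a Chrome content script, this would be wired up with something like `const target = canonicalUrl(location.href); if (target) location.replace(target);` so the browser jumps to the original page before the scraper copy finishes loading.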

Matt Cutts, principal engineer at Google and head of the webspam team, responded to some of the criticism in a blog post, saying Google would be “evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others’ content and sites with low levels of original content.” On his personal blog, Cutts confirmed that those changes have indeed gone into effect.

Cutts writes that this was a “pretty targeted launch” and that the “net effect is that searchers are more likely to see the sites that wrote the original content rather than a site that scraped or copied the original site’s content.”
