Beating Google’s Panda Update – 5 Deadly Content Sins

Posted by Cyrus Shepard

Was SEOmoz affected by Google’s Panda Updates? It depends how you look at it. Since the first update hit in February of 2011, organic search traffic to SEOmoz has increased by 49%.

SEOmoz and panda

To be fair, the dates that Panda hit don’t line up neatly with the periods when we saw traffic gains. Over the same stretch we’ve rolled out original content and popular blog posts, introduced new tools, and made several SEO improvements. In a way, you could say we’re good at not getting penalized by Panda.

Looking at the sites that lost the most traffic when Panda hit, I’m amazed at how poorly formatted most of them remain, even months later. A few have made improvements, but it feels like many webmasters decided it just wasn’t worth the effort, gave up, or simply didn’t know what to do.

Panda – The 2-Minute Nutshell Version

Panda starts off with human quality raters who look at hundreds of websites. Computers, using machine learning, are then brought in to mimic the human raters. When the algorithm becomes accurate enough at predicting the humans’ scores, it’s unleashed on millions of sites across the Internet.

The point is: Panda starts off from a human point of view, not a machine’s. We can look at these sites with human eyes and see the obvious.
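
To make that concrete, here’s a minimal sketch of the general approach in Python – train a classifier on pages that humans have already rated, then let it score pages no rater has seen. Everything in it is illustrative: Panda’s real features, model, and thresholds aren’t public, and the feature names and numbers below are hypothetical.

    # Conceptual sketch only -- Panda's real features and model are not public.
    from sklearn.ensemble import RandomForestClassifier

    # Hypothetical per-page features: [original content ratio, ad ratio,
    # word count, pages per visit]. Labels come from human quality raters:
    # 1 = rated "quality", 0 = rated "low quality".
    rated_pages = [
        [0.80, 0.10, 1200, 3.2],
        [0.15, 0.55,  150, 1.1],
        [0.60, 0.20,  700, 2.4],
        [0.10, 0.60,   90, 1.0],
    ]
    human_ratings = [1, 0, 1, 0]

    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(rated_pages, human_ratings)

    # Once the model agrees closely enough with the humans, it can score
    # pages no rater has ever seen.
    unseen_page = [[0.25, 0.45, 200, 1.2]]
    print(model.predict(unseen_page))  # e.g. [0] -> flagged as low quality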

Remember, Panda is a site-wide penalty, not a page penalty. So if a certain percentage of your pages fall below Panda’s quality algorithm, then the whole site suffers. Fix enough of these pages and you may recover.

Note: I’ve used actual screenshots taken from sites well known to have been hit by Panda. I don’t mean to call anybody out, and I’m not picking on any particular site. On the contrary, we all have a lot to learn from these examples. Also, a lot of these topics are covered in Dr. Pete’s excellent post, Fat Pandas and Thin Content. It’s well worth a look.

1. Heavy Template Footprint

Do you ever look at a site and ask yourself, “Where’s the Beef?” Consider the page below. How much original content exists above the fold?

Heavy Template Footprint

This template footprint creates a low ratio of original content. No one knows the exact threshold for what qualifies as duplicate content from a machine point of view, but this clearly violates all human standards.

Here at SEOmoz, our PRO platform uses a 95% threshold to judge duplicate content. This means if 95% of all the code on your page matches another page, then it’s flagged as a duplicate. To check your own ratios, try this nifty duplicate content tool.
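
If you want a rough, do-it-yourself version of that kind of check, the sketch below compares the raw HTML of two URLs with Python’s standard difflib and reports a similarity ratio. It isn’t the PRO tool’s or Google’s actual comparison – just an approximation, and the URLs and the 95% cut-off are only there to mirror the example above.

    # Rough approximation of a duplicate-content check, not the PRO tool's
    # actual algorithm: compare the raw HTML of two pages and report a ratio.
    import difflib
    import urllib.request

    def duplicate_ratio(url_a, url_b):
        """Similarity of two pages' full source, from 0.0 to 1.0."""
        html_a = urllib.request.urlopen(url_a).read().decode("utf-8", "ignore")
        html_b = urllib.request.urlopen(url_b).read().decode("utf-8", "ignore")
        return difflib.SequenceMatcher(None, html_a, html_b).ratio()

    # Placeholder URLs -- swap in two pages from your own site.
    ratio = duplicate_ratio("http://example.com/page-1", "http://example.com/page-2")
    if ratio >= 0.95:
        print(f"{ratio:.0%} match -- these would be flagged at a 95% threshold.")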

2. Empty Content

Do you see pages that exist simply to link to other pages? I found this high-authority page from faq.org in less than 10 seconds, which suggests there are plenty more like it.

Empty Panda content

Yep, that’s the whole page right there. Or how about this page from Suite101? Eliminating these types of empty content pages, or simply adding good material to them, will go a long way toward removing a Panda penalty.
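
One quick way to hunt for these on your own site is to fetch each URL, strip the markup, and flag anything with only a handful of words. The sketch below assumes BeautifulSoup is installed; the 100-word cut-off and the example URL are arbitrary placeholders, not a Google threshold.

    # Flag "empty" pages: fetch, strip markup, count words.
    import urllib.request
    from bs4 import BeautifulSoup  # pip install beautifulsoup4

    def word_count(url):
        html = urllib.request.urlopen(url).read()
        text = BeautifulSoup(html, "html.parser").get_text(separator=" ")
        return len(text.split())

    # Placeholder list -- feed in URLs from your own crawl or sitemap.
    urls_to_check = ["http://example.com/some-page"]
    for url in urls_to_check:
        if word_count(url) < 100:  # arbitrary cut-off, not a Google number
            print(f"Thin page worth reviewing: {url}")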

3. Overlapping and Redundant Articles

Each page of your site should address a specific topic, instead of addressing a slightly different variation of a keyword phrase. How many variations of "Acai Berry Cleanse" does a website need?

How much acai berry?

The example above is from a high-profile "content farm." This is one of the many reasons I believe Panda hit article sites so hard – pages and pages of overlapping articles that targeted keywords instead of humans. Combining these articles into a few highly usable resources would cut down on the confusion.
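
A simple way to find merge candidates is to compare article titles (or the keywords they target) and group the ones that overlap heavily. The sketch below uses plain word overlap; the titles are invented examples in the spirit of the screenshot, and the 0.5 cut-off is a guess you would tune for your own site.

    # Group articles whose titles overlap heavily -- candidates for combining.
    def title_overlap(a, b):
        """Share of words two titles have in common, from 0.0 to 1.0."""
        words_a, words_b = set(a.lower().split()), set(b.lower().split())
        return len(words_a & words_b) / len(words_a | words_b)

    titles = [
        "Acai Berry Cleanse Benefits",
        "Benefits of an Acai Berry Cleanse",
        "Acai Berry Cleanse Side Effects",
        "How to Train Your Puppy",
    ]

    for i in range(len(titles)):
        for j in range(i + 1, len(titles)):
            if title_overlap(titles[i], titles[j]) >= 0.5:
                print(f"Consider combining: '{titles[i]}' / '{titles[j]}'")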

4. High Ad Ratio

I understand the temptation. We can’t escape it. Google even tells us to plaster ads all over our site. But the AdWords team is separate from the spam team led by Matt Cutts. Don’t expect them both to give the same advice.

Don't be like AdWords

Optimizing for AdSense does not mean optimizing for search. I’m glad Google keeps their departments separate, but more consistent messaging from the company as a whole would reduce webmaster frustration.
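
If you want a rough read on your own ad footprint, the sketch below counts ad-serving iframes and scripts against the amount of readable text on a page. The ad-domain hints, the placeholder URL, and the one-ad-per-hundred-words warning line are all illustrative guesses, not official limits from Google.

    # Rough ad-to-content check: count ad-serving tags versus readable words.
    import urllib.request
    from bs4 import BeautifulSoup

    AD_HINTS = ("googlesyndication", "doubleclick", "adserver")  # examples only

    def ad_footprint(url):
        html = urllib.request.urlopen(url).read()
        soup = BeautifulSoup(html, "html.parser")
        ad_tags = [t for t in soup.find_all(["iframe", "script"])
                   if any(hint in (t.get("src") or "") for hint in AD_HINTS)]
        words = len(soup.get_text(separator=" ").split())
        return len(ad_tags), words

    ads, words = ad_footprint("http://example.com/")  # placeholder URL
    if words and ads / words > 0.01:  # i.e. more than one ad unit per 100 words
        print(f"{ads} ad units against {words} words -- worth a second look.")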

5. Affiliate Links and Auto Generated Content

If a machine built your pages with minimal human intervention, Google wants to devalue you. We see this time and time again with multiple affiliate sites across the web.

Ouch!

Disclaimer: I predict that someday machines will be able to produce web pages indistinguishable from human-generated content. Until that day comes, avoid auto-generated content.
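
A related quick audit: count how many outbound links on a page carry affiliate-style tracking parameters. The parameter fragments and the URL below are common examples I’ve picked for illustration, not a complete or authoritative list, so tune them to the networks you actually use.

    # Count outbound links that look like affiliate links.
    import urllib.request
    from bs4 import BeautifulSoup

    AFFILIATE_HINTS = ("tag=", "affid=", "aff_id=", "clickid=")  # examples only

    def affiliate_link_count(url):
        html = urllib.request.urlopen(url).read()
        links = BeautifulSoup(html, "html.parser").find_all("a", href=True)
        return sum(1 for a in links
                   if any(hint in a["href"].lower() for hint in AFFILIATE_HINTS))

    # Placeholder URL -- run this across your own page list.
    print(affiliate_link_count("http://example.com/"), "affiliate-style links found")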

Beating Panda – Reduce the Sins

It’s my personal belief that the above 5 sins, alone or in combination, account for the vast majority of Panda penalties. Yet webmasters seem perplexed when faced with fixing these problems.

Dr. Pete pointed me to a post on Webmasterworld from Duane Forrester. Although he’s talking about Bing, he explains the Panda situation well.

If a user searches on Bing, we return the best SERP we can for the query. The user clicks on the first result (as we’d expect, normally). They hit YOUR website, and…

1 – are so engaged they forget about the rest of the Internet for a few minutes and bask in your glory, getting the information they wanted, and more.

2 – are so dismayed they immediately hit their back button, suddenly popping back onto our radar, alerting us to the fact they were displeased with the result we just showed them – why else would they suddenly come back, without consuming your content? Displeased.
– Duane Forrester

This is the future of the web – when the equivalent of human eyes looks at every page of your site. Would you want your father to visit your website? Would you want your mother to shop there?

Rand had it right when he told us to improve engagement metrics. Make legitimate, on-site fixes that actually improve your visitors’ experience. You’ll find that reducing bounce rate and increasing page views and time on site have the side effect of making your visitors, and you, happier.
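
Your analytics package already reports those numbers, but if it helps to see exactly what they mean, here is a minimal sketch that computes them from a made-up list of visits (the pageviews and seconds on site are invented figures).

    # Engagement metrics from a hypothetical visit log.
    visits = [
        {"pageviews": 1, "seconds": 8},    # bounced almost immediately
        {"pageviews": 4, "seconds": 260},
        {"pageviews": 2, "seconds": 95},
    ]

    bounce_rate = sum(1 for v in visits if v["pageviews"] == 1) / len(visits)
    pages_per_visit = sum(v["pageviews"] for v in visits) / len(visits)
    avg_time_on_site = sum(v["seconds"] for v in visits) / len(visits)

    print(f"Bounce rate: {bounce_rate:.0%}")                  # 33%
    print(f"Pages per visit: {pages_per_visit:.1f}")          # 2.3
    print(f"Average time on site: {avg_time_on_site:.0f}s")   # 121s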

