6 Ways to Recover from Bad Links

Posted by Dr. Pete

It’s a story we hear too often: someone hires a bad SEO, that SEO builds a bunch of spammy links and cashes the check, and then bam – penalty! Whether you got bad advice, “your friend” built those links, or you’ve got the guts to admit you did it yourself, undoing the damage isn’t easy. If you’ve sincerely repented, I’d like to offer you 6 ways to recover and hopefully get back on Google’s Nice list in time for the holidays.

This is a diagram of a theoretical situation that I’ll use throughout the post. Here’s a page that has tipped the balance and has too many bad (B) links – of course, each (B) and (G) could represent 100s or 1000s of links, and the 50/50 split is just for the visual:

Hypothetical link graph

Be Sure It’s Links

Before you do anything radical (one of these solutions is last-ditch), make sure it’s bad links that got you into trouble. Separating a link-based penalty from a devaluation, a technical issue, a Panda “penalty”, etc. isn’t easy. I created a 10-minute audit a while back, but that’s only the tip of the iceberg. In most cases, Google will only devalue bad links, essentially turning down the volume knob on their ability to pass link-juice. Here are some other potential culprits:

  1. You’ve got severe down-time or latency issues.
  2. You’re blocking your site (Robots.txt, Meta Robots, etc.).
  3. You’ve set up bad canonicals or redirects.
  4. Your site has massive duplicate content.
  5. You’ve been hacked or hit with malware.
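Culprit #2 is more common than you might think, and it can be as simple as one leftover line. A hypothetical example of a Robots.txt file that was never updated after a staging launch (the paths here are made up for illustration):

```
# Left over from the staging site – this blocks EVERYTHING:
User-agent: *
Disallow: /
```

One stray "Disallow: /" will block crawlers from your entire site, which can look an awful lot like a penalty if you don’t know to check for it.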

Diagnosing these issues is beyond the scope of this post, but just make sure the links are the problem before you start taking a machete to your site. Let’s assume you’ve done your homework, though, and you know you’ve got link problems…

1. Wait It Out

In some cases, you could just wait it out. Let’s say, for example, that someone launched an SQL injection attack on multiple sites, pointing 1000s of spammy links at you. In many cases, those links will be quickly removed by webmasters, and/or Google will spot the problem. If it’s obvious the links aren’t your fault, Google will often resolve it (if not, see #5).

Even if the links are your responsibility (whether you built them or hired someone who did), links tend to devalue over time. If the problem isn’t too severe and if the penalty is algorithmic, a small percentage of bad links falling off the link graph could tip the balance back in your favor:

Link graph with bad links removed

That’s not to say that old links have no power, but just that low-value links naturally fall off the link-graph over time. For example, if someone builds a ton of spammy blog comment links to your site, those blog posts will eventually be archived and may even drop out of the index. That cuts both ways – if those links are harming you, their ability to harm will fade over time, too.

2. Cut the Links

Unfortunately, you can’t usually afford to wait. So, why not just remove the bad links?

Link graph with all bad links cut

Well, that’s the obvious solution, but there are two major, practical issues:

(a) What if you can’t?

This is the usual problem. In many cases, you won’t have control over the sites in question or won’t have login credentials (because your SEO didn’t give them to you). You could contact the webmasters, but if you’re talking about 100s of bad links, that’s just not practical. The kind of site that’s easy to spam typically isn’t the kind of site that’s going to hand-remove a link, either.
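If you do go down the removal road, at least verify your progress in bulk rather than by hand. A minimal sketch in Python – assuming you’ve already exported the linking URLs and fetched their HTML (fetching is omitted here), the function name and sample markup are my own invention:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkFinder(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def page_links_to(html, target_domain):
    """Return True if any anchor on the page still points at target_domain."""
    parser = LinkFinder()
    parser.feed(html)
    return any(urlparse(h).netloc.endswith(target_domain)
               for h in parser.hrefs)

# Example: a spammy page that still links to you
html = '<p>Cheap stuff! <a href="http://www.example.com/page">click</a></p>'
print(page_links_to(html, "example.com"))   # still linked
print(page_links_to(html, "other.com"))     # not linked
```

Run something like this over your exported link list periodically and you’ll know which webmasters actually honored your removal requests, instead of guessing.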

(b) Which links do you cut?

If you thought (a) was annoying, there’s an even bigger problem. What if some of those bad links are actually helping you? Google penalizes links based on patterns, in most cases, and it’s the behavior as a whole that got you into trouble. That doesn’t mean that every spammy link is hurting you. Unfortunately, separating the bad from the merely suspicious is incredibly tough.

For the rest of this post, let’s assume that you’re primarily dealing with (a) – you have a pretty good idea which links are the worst offenders, but you just can’t get access to remove them. Sadly, there’s no way to surgically remove the link from the receiving end (this is actually a bit of an obsession of mine), but you do have a couple of options.

3. Cut the Page

If the links are all (or mostly) targeted at deep, low-value pages, you could pull a disappearing act:

Link graph with page removed

In most cases, you’ll need to remove the page completely (and return a 404). This can neuter the links at the target. In some cases, if the penalty isn’t too severe, you may be able to 301-redirect the page to another, relevant page and shake the bad links loose.
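On an Apache server, for example, the 301 option can be a one-liner in .htaccess – this is just a sketch with hypothetical paths, using the standard mod_alias directive:

```
# 301-redirect the penalized deep page to a relevant replacement
Redirect 301 /old-spammed-page.html /relevant-page.html
```

Deleting the file (so it returns a 404) is the more aggressive option; the 301 tries to keep whatever legitimate value the page had while shaking the bad links loose.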

If all of your bad links are hitting a deep page, count yourself lucky. In most cases, the majority of bad links are targeted at a site's home-page (like the majority of any links), so the situation gets a bit uglier.

4. Build Good Links

In some sense, this is the active version of #1. Instead of waiting for bad links to fade, build up more good links to tip the balance back in your favor:

Link graph with good links added

By “good”, I mean relevant, high-authority links – if your link profile is borderline, focus on quality over quantity for a while. Rand has a great post on link valuation that I highly recommend – it’s not nearly as simple as we sometimes try to make it.

This approach is for cases where you may be on the border of a penalty or the penalty isn’t very severe. Fair warning: it will take time. If you can’t afford that time, have been hit hard, or suspect a manual penalty, you may have to resort to one of the next two options…

5. Appeal to Google

If you’ve done your best to address the bad links, but either hit a wall or don’t see your rankings improve, you may have to appeal to Google directly. Specifically, this means filing a reconsideration request through Google Webmaster Tools. Rhea at Outspoken had an excellent post recently on how to file for reconsideration, but a couple of key points:

  • Be honest, specific and detailed.
  • Show that you’ve made an effort.
  • Act like you mean it (better yet: mean it).

If Google determines that your situation is relevant for reconsideration (a process which is probably semi-automated), then it’s going to fall into the hands of a Google employee. They have to review 1000s of these requests, so if you rant, provide no details, or don’t do your homework, they’ll toss your request and move on. No matter how wronged you may feel, suck it up and play nice.

6. Find a New Home

If all else fails, and you’ve really burned your home to the ground and salted the earth around it, you may have to move:

Link graph with site moved

Of course, you could just buy a new domain, move the site, and start over, but then you’ll lose all of your inbound links and off-page ranking factors, at least until you can rebuild some of them. The other option is to 301-redirect to a new domain. It’s not risk-free, but in many cases a site-to-site redirect does seem to neuter bad links. Of course, it will very likely also devalue some of your good links.
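For the site-to-site 301, a common Apache approach (again, a hedged sketch – the domain names are placeholders, and your server setup may differ) uses mod_rewrite to send every request on the old domain to the same path on the new one:

```
# Redirect the entire old domain, path-for-path, to the new domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]
```

The path-for-path redirect matters: a blanket redirect of every URL to the new home-page tends to look sloppy to both users and search engines.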

I’d recommend the 301-redirect if the bad links are old and spammy. In other words, if you engaged in low-value tactics in the past but have moved on, a 301 to a new domain may very well lift the penalty. If you’ve got a ton of paid links or you’ve obviously built an active link farm (that’s still in play), you may find the penalty comes back and all your efforts were pointless.

A Modest Proposal

I’d like to end this by making a suggestion to Google. Sometimes, people inherit a bad situation (like a former SEO’s black-hat tactics) or are targeted with bad links maliciously. Currently, there is no mechanism to remove a link from the target side. If you point a link at me, I can’t say: “No, I don’t want it.” Search engines understand this and adjust for it to a point, but I really believe that there should be an equivalent of nofollow for the receiving end of a link.

Of course, a link-based attribute is impossible from the receiving end, and a page-based directive (like Meta Robots) is probably impractical. My proposal is to create a new Robots.txt directive called “Disconnect”. I imagine it looking something like this:

Disconnect: www.badsite.com

Essentially, this would tell search engines to block any links to the target site coming from “www.badsite.com” and not consider them as part of the link-graph. I’d also recommend a wild-card version to cover all sub-domains:

Disconnect: *.badsite.com
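To be clear about placement, here’s how I imagine the proposed directive sitting alongside the directives that do exist today – this is purely hypothetical, since no search engine supports “Disconnect” (the domains are made up):

```
User-agent: *
Disallow: /private/

# Proposed (NOT a real directive) – ignore any links from these sources:
Disconnect: www.badsite.com
Disconnect: *.spamfarm.example
```

Existing parsers should already ignore unrecognized lines, which is part of why I think the Robots.txt level would be a low-risk place to introduce it.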

Is this computationally possible, given the way Google and Bing process the link-graph? I honestly don’t know. I believe, though, that the Robots.txt level would probably be the easiest to implement and would cover most cases I’ve encountered.

While I recognize that Google and Bing treat bad links with wide latitude and recognize that site owners can’t fully control incoming links, I’ve seen too many cases at this point of people who have been harmed by links they don’t have control over (sometimes, through no fault of their own). If links are going to continue to be the primary currency of ranking (and that is debatable), then I think it’s time the search engines gave us a way to cut links from both ends.

Update (December 15th)

From the comments, I wanted to clarify a couple of things regarding the "Disconnect" directive. First off, this is NOT an existing Robots.txt option. This is just my suggestion (apparently, a few people got the wrong idea). Second, I really did intend this as more of a platform for discussion. I don't believe Google or Bing are likely to support the change.

One common argument in the comments was that adding a "Disconnect" option would allow black-hats to game the system by placing risky links, knowing they could be easily cut. While this is a good point, theoretically, I don't think it's a big practical concern. The reality is that black-hats can already do this. It's easy to create paid links, link farms, etc. that you control, and then cut them if you run into trouble. Some SEO firms have even built up spammy links to get a short-term boost, and then cut them before Google catches on (I think that was part of the JC Penney scheme, actually).

Almost by definition, the "Disconnect" directive (or any similar tool) would be more for people who can't control the links. In some cases, these may be malicious links, but most of the time, it would be links that other people created on their behalf that they no longer have control over.

