
How to Stop Competitors Copying Your Links

Sep 1, 2010   //   by John McElborough   //   Contests, SEO Blog

This is a guest post from John McElborough. It is part of The “Bad Ass” SEO Guest Blogging Contest.

One of the first things we do when building links for a new client is to identify their competitors, work out who's linking to them using link research tools, and try to copy as many of their links as possible. But if you're an established site, how do you stop SEOs like me from mining and cloning your links?

A while ago I wrote about a technique I use to throw competitors off the scent using a redirect. It's certainly not foolproof, but from discussions I had with other SEOs after writing that post, it's clear there's real concern about link data mining. So below I've compiled a definitive guide to the methods you can use to keep your links private and prevent competitors from cloning your backlink profile.

(If you read all the way to the end, I'll share the results of some recent testing I've done which may give you a new technique to try out.)

Blocking crawlers and backlink analysis tools

[Screenshot: zero links - what's wrong with my backlink tool?]

The simplest way to stop link mining is to block your site from the crawlers used by link research tools. There are three common data sources that power most backlink analysis tools:

Majestic SEO

Majestic SEO uses data from the Majestic12 crawler. It adheres to robots protocol so you can block it using this line in your robots.txt file:

User-agent: MJ12bot
Disallow: /

SEOmoz / Linkscape

SEOmoz's Linkscape index is used by a growing number of SEOs via the Open Site Explorer tool and the API. You can block SEOmoz from showing your links using a meta tag in the <head> of your pages.

<META NAME="SEOMOZ" CONTENT="NOINDEX" />

Yahoo

This one is subject to change when Yahoo moves fully over to Microsoft.

The only real way to stop your data appearing in Yahoo Site Explorer, and in the multitude of third-party tools it powers like LinkDiagnosis and Market Samurai, is to block the Yahoo crawler. This can be done in robots.txt:

User-agent: Slurp
Disallow: /

This, however, will remove your site entirely from Yahoo's search results. Most of my sites get less than 5% of their search traffic from Yahoo these days, but even so, it's hard to justify cutting that out completely. I've got a better solution for dealing with Yahoo data below.

Obfuscate your link data

Even if you can't block competitors from seeing your backlinks entirely, you can take steps to make their lives more difficult. Building high volumes of low quality links to your site is a good way to obfuscate your data, making it harder for competitors to locate and copy your best links.

[Screenshot: that's a lot of links - where do I start?]

Yahoo Site Explorer, and data accessed via its API, will only show the first 1,000 links pointing to a page. While Yahoo does tend to show the most important links near the top of its data, the list isn't limited to unique domains and doesn't exclude nofollowed links. So to render your site's backlink data useless to competitors, you need to build a few sitewide links on big, well established sites. Blogroll links are good for this, and they're pretty easy to buy because you can use the nofollow tag to avoid penalties. Facebook pages and forum signatures also work well.
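For illustration, a blogroll link of this kind is just an ordinary anchor carrying rel="nofollow" (example.com stands in here for your own site):

<!-- sitewide blogroll link: nofollowed to avoid paid link penalties, -->
<!-- but it still fills slots in Yahoo's 1,000-link report -->
<li><a href="http://www.example.com/" rel="nofollow">Example Site</a></li>

Because the link appears on every page of the host site, a single placement can generate thousands of entries in a competitor's link export.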

Build links which can’t be mined

Create an un-copyable profile

This is certainly the best practice approach. When building links, aim for links which will be hard or impossible for your competitors to replicate. Typically these are the types of links which come from great content and personal relationships. Your competitors won't be able to offer cash to replicate them.

[Image: requests for paid links will be ineffective against this type of link]

Run your own link hubs

The hardest links in the world to copy are links on sites which you own (you're not going to link to your competitor now, are you?!). If you're part of a group of sites, you might already be getting links this way. Beyond that, you can look at everything from building a single microsite to developing a fully fledged distributed link network.

Build links downstream

One thing no link analysis tool does is look at the links pointing to the pages which link to you. If you've got some good quality links on authoritative domains, think about building links to those pages. You can bet your competitors aren't thinking about who's linking to your links.

[Diagram: the downstream linking process]

Buy presell pages

If you’re in a competitive space and are buying links you should be thinking about how you can buy links which both minimise the paid link footprint and which are hard for competitors to replicate. The latter I feel lends itself to buying links on dedicated ‘presell pages‘. If you negotiate a presell page you can stipulate that the webmaster can’t add any extra links (to your competitors) on the page you’re renting. You can then build some links to the presell page. That way even if a competitor rents a page on the same site as you, they don’t get the same value from it as you do.

Place links on noindex pages

I’m planning on covering this in more detail on my blog soon but an idea I’ve been testing is building links on pages which aren’t indexed. A page which uses the noindex, follow robots meta tag will still flow PageRank and anchor text to sites it links to, but from the tests I’ve been playing around with at least,  these pages won’t show up in any major link analysis tools.

[Diagram: the process for adding links to noindex pages]

This has huge potential for anyone concerned about competitors spying on their backlinks. Not only will it stop competitors finding and copying your links (which also means they can't report your paid links), it creates an opportunity to distribute content without duplicate content penalties. Here are some possible applications:

  • Buy presell pages which are noindex, followed (see the sketch after this list)
  • Let webmasters republish your content on their sites (with links)
  • Hide links on other sites you own to avoid network detection
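As a rough sketch of the first idea, a noindex, followed presell page might look something like this (yoursite.com and the page copy are placeholders):

<html>
<head>
<!-- keeps the presell page out of the index and, per my tests, out of link tools, -->
<!-- while the link below still flows PageRank and anchor text -->
<META NAME="ROBOTS" CONTENT="NOINDEX, FOLLOW" />
</head>
<body>
<p>Editorial copy written around your target topic...</p>
<p>Read more at <a href="http://www.yoursite.com/">your anchor text</a>.</p>
</body>
</html>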

Questions on any of these techniques or have your own methods? Please ask or share in the comments.

John McElborough

John McElborough is a search marketing consultant from London, England who runs the PPC management agency inbound360. Read more of his posts at johnmcelborough.com.
