Duplicate Content over Multiple Domains – SEO Issues?

Duplicate Content

I recently purchased some domains for SEO purposes, and I wanted to research best practices for pointing multiple domains at the same content.

The good news? Google does not punish duplicate content; they just sort it out themselves. From the Duplicate Content Summit at SMX:

  • Google wants to serve up unique results and does a great job of picking a version of your content to show if your site includes duplication. If you don’t want to worry about sorting through duplication on your site, you can let us worry about it instead.
  • Duplicate content doesn’t cause your site to be penalized. If duplicate pages are detected, one version will be returned in the search results to ensure variety for searchers.
  • Duplicate content doesn’t cause your site to be placed in the supplemental index. Duplication may indirectly influence this, however: if links to your pages are split among the various versions, each version ends up with lower per-page PageRank.

Duplicate content links

This information is from the Official Google Webmaster Central Blog, which is a great resource.  A couple of articles relating to duplicate content and multiple domains:

The take-away

So after doing more in-depth reading, Google’s only worried about people using duplicate content on multiple domains for spamming purposes.  From Deftly Dealing with Duplicate Content:

In the rare cases in which we perceive that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we’ll also make appropriate adjustments in the indexing and ranking of the sites involved. However, we prefer to focus on filtering rather than ranking adjustments … so in the vast majority of cases, the worst thing that’ll befall webmasters is to see the “less desired” version of a page shown in our index.

As long as you aren’t scheming, you’ll be ok. If you want to get the most use out of two or more domains, you’ll need to have unique content for each domain — so get writing!

By Zack Katz

Zack Katz is the founder of GravityKit and TrustedLogin. He lives in Leverett, Massachusetts with his wife Juniper.

27 replies on “Duplicate Content over Multiple Domains – SEO Issues?”

We had a client who had the same content on multiple URLs. They got good traffic and a lot of sales through the oldest URL and still seemed to get some slight traffic on the secondary URLs. And then they had a snafu that caused them to lose a bunch of business.

Their webmaster got their main site de-indexed because she blocked the whole site in robots.txt (robots.txt uses a site-wide disallow rather than a “nofollow”). They realized three months later that there were major problems.
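For reference, the kind of site-wide block described above looks like this in robots.txt (a sketch, not the commenter’s actual file):

```
# robots.txt at the site root.
# This tells ALL crawlers to skip the ENTIRE site.
# Left in place by accident, it will eventually get every page de-indexed.
User-agent: *
Disallow: /
```

A single stray slash is all it takes, which is why a forgotten staging-server robots.txt is such a common cause of this exact disaster.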

In the meantime, one of the secondary sites got first dibs on the content and the main site (with an older URL) lost a ton of organic traffic, along with several hundred thousand dollars in sales.

We came in and the first place we started was rewriting the content for the oldest site and concentrating on different keywords, which helped them get back some of the traffic they had lost.

Sounds like quite the debacle. When things like that happen, it can cripple a business.

Were you not able to recover any of the previously well-ranked keywords?

Google likes unique content in general — if you have the same content as other websites, they know you’ve probably got default text in there.

It turns out that a lot of websites have the exact same copy. That’s gotta hurt!

I just recently moved my blog from one domain to another because the new domain is more relevant to the site’s purpose. I deleted the posts from the old one about a week ago. Should I wait longer or do anything before I repost them on my new domain? Thanks for the info… trekking through Google’s pages isn’t something I wanted to do right now.

@thenetofinter Here’s what I would do:
Right now, you have deleted what you had without providing a new target. Get the posts back up as soon as possible, so Google doesn’t declare them gone. You want Google to transfer the pages, not delete and re-discover them.

Ideally you would already have made them live on the new domain and, if possible, pointed the articles to their new location using a 301 redirect (via .htaccess).
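A minimal .htaccess sketch of that kind of move, assuming an Apache server (the paths and domain here are hypothetical placeholders, not anyone’s real site):

```apache
# .htaccess on the OLD domain.
# Redirect a single moved post to its new home; 301 means "moved permanently".
Redirect 301 /my-old-post/ http://newdomain.example/my-old-post/

# Or, to redirect every URL on the old domain to the same path
# on the new domain in one rule, using mod_rewrite:
RewriteEngine On
RewriteRule ^(.*)$ http://newdomain.example/$1 [R=301,L]
```

A 301 tells Google the move is permanent, so the old URLs hand their link equity to the new ones instead of splitting it between two domains.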

Once you’ve set up the 301 redirects and Google has indexed the new posts, use Google Webmaster Tools to request that the old posts (and the old site, if appropriate) be removed from the index.

Google is very smart, but even so, the old posts will hang around on the new domain for a while, depending on how often your site gets crawled. Don’t worry about it; if someone clicks on a properly redirected page, it will take them to the proper location.

An obvious solution everyone should be aware of, and not one to ignore. Good job pointing that out 🙂

Wow… that easy, huh? What do you mean when you say old posts hanging around on the new domain? They’ll automatically be brought to the new posts with no old content, right? What if I don’t want any users of the old site to see any old content, just as if they were never on the old site? I guess that would degrade SEO by not transferring any authority to the new one in the first place, eh? And to Google, would the new site always look like the “stolen”, or duplicate, content?

Zack, hope you’re still monitoring this blog. Here’s what I’m wondering. We have a client who has an ecommerce website, let’s call it We’re thinking of setting up eight other sites like and, etc. Each site would be just one page with content and a link back to We feel that multiple links to is good. Having come up well in search engines will be good and hopefully bring visitors to is on a totally different server than the eight others. They all will be on the same server. Good idea or bad?

Hey Zack,
first time caller…
first time visitor…

Here is my question, and I think it is on topic; forgive my noobishness…

I have a website that I am currently in the process of building…

It has a URL/domain name. I have three (3), count them, three (3) other domain names that I want to tie into my main URL/site so that I can host them all on the same server. Too cheap to pay for more hosting.

Nonetheless, they all tie together, and all are very, very related in products and information.

Now, as for getting spiders out there to locate the individual sites per se: do I create individual sites within the site structure and then point the domains to pages within the main site? I will have different keywords and content on each index.html page, yada, yada, yada…

Help me, Obi-Wan, you are my only hope.


Help. There is a ton of emphasis on local searches now. I have a client who has a domain name/brand-new site. She’d like to purchase additional domains with the to take advantage of local searches. Is this going to hurt her if they all redirect to the same main domain? If so, is it better if each domain resolves to a separate page in the same site, written specifically for that town, linking back to the main site?

Hi guys… I really appreciate the great info here. I have a question: I want to delete about 30 different posts on my WordPress blog. Will deleting these pages hurt my SEO rankings? The pages I want to delete are all indexed, but they are duplicate-content posts. I don’t want to take a chance and leave that content on my blog, but I am not sure if I should just delete them, and if so, should I delete them all at once?

You guys mentioned deleting PLR content, which is another form of duplicate content, but you did not touch on how deleting those duplicate posts will affect your site, especially if the pages are indexed.

How does Google handle deleted posts once they have been indexed?

Please let me know.


Hi Zack,
Thanks for the good info on this site; this is really helpful.

I actually have a question. I am about to set up a complex website, let’s call it:

I registered that one at a domain reseller. Then I registered and at a totally different company, just to be safe, so that no one bothers me later on. Now what I am about to do is set up a blog on (the reason I briefly explain further below).

The server which is linked to the domain reseller account of doesn’t have the latest MySQL version, so I decided to use the webspace of and (pointing), because the latest version of MySQL is there.
The blog is supposed to test a bit how SEO works in the sector, to get some experience with the segment, etc. Later on I imagine it could be helpful: when I set up the “real” site, it will benefit from the blog, as the blog already exists on the web, is spread around, and the name is at least a bit known. I imagine that it somehow prepares users and Google for what comes next.
Unfortunately, now got indexed, which I never planned (the company puts a standard HTML page on every recently registered site that says “this is the site of, if you are the admin, please log in here”).
Now here are my questions; it would be great if you could give me some advice:
1. Is there a danger of getting Google ‘confused’ if I have three different domains (that’s what I have read and somehow know)? I only want one entry to the website later on, namely; the others I don’t want to use.
2. I only want to point the DNS of to the webspace of / because that webspace has the latest MySQL version I need for the blog, which the server linked to the reseller account doesn’t have. Is that already really silly, and is there a more reasonable way? Just to play it really safe.
3. Should I de-index then?
4. And does it make sense to build the (I guess it does, regarding experience and so on; later on it can be a way to bring traffic to the main site, that’s the plan)?
5. And if I point the domain to the webspace of /, can I make sure that / won’t even appear, that the blog will NOT be found on those domains, and that Google exclusively sees as the only entry to the blog?
Thanks a lot, I hope this makes sense to you somehow.

This is promising to know. I have bought multiple domains whose www A records point to the same web server. What I don’t understand is how Google will differentiate between spam and legitimate duplicate content. Can anyone shed some light?
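One common way to keep extra domains from ever showing up as duplicates is to answer for them at the server but immediately 301 every request to the canonical host, rather than serving the same pages under every name. A sketch of an Apache virtual host doing this (the domain names are hypothetical placeholders):

```apache
# The extra domains resolve to this server, but every request
# is permanently redirected to the one canonical domain, so only
# a single copy of the content is ever indexable.
<VirtualHost *:80>
    ServerName extradomain1.example
    ServerAlias extradomain2.example
    Redirect permanent / http://maindomain.example/
</VirtualHost>
```

With a bare A record pointing at the same server, each domain serves an apparent duplicate of the site; with the 301 in place, Google sees one site and several permanent pointers to it.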

What about duplicate content with different TLDs?

I have .eu and along with about 30 other keyword-rich domains with the same TLD variations.

Is regional variation within the content enough?

I knew it! I did a test some time ago, and this is how it panned out… I had some content that I posted to 4 article directories with about 98% of the same content, and I tweaked the TITLES a bit.

For instance, if the title was “breast enhancement – the best in the business”, I replaced “enhancement” with “augmentation” and then “enlargement” for the copy on each article directory, and lo, all 4 articles, which already had about 98% of the same content, appeared on Google for the “breast enhancement” keyword! WOW! I believe the TITLES play a major role there…

Hi, I’m having a website built and I’m trying to think of ways to get us some traffic. Please let me know if my plan sounds like it will work for me or against me. I’ve purchased about 10 search-word-specific domains that I plan on using to redirect to one main site. It’ll be a retail site. A good example of what I’m doing would be: if I were selling Toro lawn mower replacement parts, I would own,,, (you get the picture). Is this good or bad? Should each domain point to a unique page within the main site? We will only be selling approximately 30–40 different items.
I’d greatly appreciate a response. Thank you!

Thanks for your post. Our company has a .com site and a .ca site, both with the same content (in English). In September we saw a 30% decrease in traffic to our Canadian site, and now our US site is displaying in’s results. How can we fix this problem? Our Webmaster Tools settings are geographically targeted for the correct country for each. Our site is hosted in the US, and I don’t see any way that will change. Any advice?

I also want to add that we are trying to target visitors specifically in each country for services located in each unique country; we’re not trying to show multiple domains in each SERP.

Comments are closed.