Duplicate content across multiple domains
-
I have come across a situation where we have discovered duplicate content across multiple domains. We have access to each domain and, within the past two weeks, added a dynamic 301 redirect that sends each page to the corresponding page on the desired domain.
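In case it's useful context, the redirect rule looks roughly like the following. This is a minimal sketch assuming Apache with mod_rewrite enabled, and the domain names are placeholders rather than our real ones:

    RewriteEngine On
    # Any request arriving on the duplicate domain (placeholder name) is sent
    # to the same path on the main domain with a permanent 301 redirect.
    # The query string is preserved by default.
    RewriteCond %{HTTP_HOST} ^(www\.)?duplicate-domain\.com$ [NC]
    RewriteRule ^(.*)$ http://www.main-domain.com/$1 [R=301,L]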
My question relates to the removal of these pages. There are thousands of these duplicate pages.
I have gone back and looked at a number of these cached pages in Google and found that the cached copies are roughly 30 days old or older. Will these pages ever get removed from Google's index? Will Google even read the 301 redirect and follow it to the proper domain and page? If so, when will that happen?
Are we better off submitting a full site removal request for the sites that carry the duplicate content at this point? These smaller sites do bring traffic on their own, but I'd rather not wait three months for the content to be removed, since my assumption is that this content is competing with the main site.
I suppose another option would be to include a no-cache (noarchive) meta tag on these pages.
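For reference, that tag would be something like the line below (a sketch; as I understand it, noarchive only stops Google from storing a cached copy, it doesn't remove the page from the index):

    <meta name="robots" content="noarchive">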
Any thoughts or comments would be appreciated.
-
I went ahead and added the links to the sitemap; however, when Google crawled the links I received this message:
When we tested a sample of URLs from your Sitemap, we found that some URLs redirect to other locations. We recommend that your Sitemap contain URLs that point to the final destination (the redirect target) instead of redirecting to another URL.
However, I do not understand how adding the redirected links to the sitemap will remove the old links.
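For anyone following along, here is roughly what an entry in the sitemap looks like. The URL is a placeholder, not one of our real pages; each listed URL is an old page on a duplicate domain that now returns a 301:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- Old URL on the duplicate domain; it now 301s to the main domain -->
      <url>
        <loc>http://duplicate-domain.com/some-product-page</loc>
      </url>
    </urlset>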
-
Worth a shot. Crawl bots usually work by following links from one page to the next. If links no longer exist to those pages, Google will have a tough time finding those pages and de-indexing them in favor of the correct pages.
Good luck!
-
One of the previous developers left a hole that caused this issue. The system shares code between sites.
-
Andrew,
The links were removed from the offending sites, but if I understand the gist of your suggestion, Google won't remove them as quickly if they are no longer linked. And yes, I am using canonical tags. So I should create a sitemap with the previous links and, once Google follows these links to the main site, remove the sitemap. Is that your recommendation?
I suppose I can try this first before filing a request to remove the entire site.
-
Ah, I thought he was saying the dupe content does still exist but no more duplication is taking place after the fix. That's where I was going wrong then lol.
-
As long as the duplicate content pages no longer exist and you've set up the 301 redirects properly, this shouldn't be a long-term problem. It can sometimes take Google a while to crawl through thousands of pages and index the correct ones. You might want to include these pages in a Sitemap to speed up the process, particularly if there are no longer any links to them from anywhere else. Are you using canonical tags? They might also help point Google in the right direction.
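If you do use canonical tags, each duplicate page (while it still resolves) would carry something like this in its head section, pointing at the preferred URL. This is just a sketch with a placeholder URL:

    <link rel="canonical" href="http://www.main-domain.com/some-product-page" />

The idea is that the canonical tag and the 301 both tell Google the same thing: which URL is the one that should be indexed.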
I don't think a no-cache meta tag would help. It only takes effect once the page is crawled anyway, and by that point Google should follow the 301 and cache the destination page instead.
Hope this helps! Let me know how the situation progresses.
Andrew
-
Do you want the smaller sites to still exist? If they don't matter at all, you could always take them offline, though that's not recommended for obvious reasons (but it would get them out of the index fairly quickly).
If they still need to exist, then we're back to the same thing: changing the content on them. If the problem has been fixed so no further duplication takes place, that's fine. You could limit the damage by making all of the smaller sites duplicates of each other, but not of the main site, by rewriting either the smaller ones with one set of content or the main one. At least that way they will only be competing with each other and not with the main site any more.
Or have I still got the wrong end of the stick?
-
I am referring to an e-commerce site, so yes, it's dynamic. The hole has been plugged (so to speak), but the content still exists in the Google cache.
-
Ah I see, so it's a CMS which pumps out content then?
But it pumps it to other sites?
-
Steve, maybe I haven't explained the issue in enough detail. The duplicate content is the result of a technical problem with the site that caused content to be duplicated when it should not have been. It's not a matter of rewriting content. My issue deals with purging this content from these other domains so that the main domain can be indexed with this content.
-
You could always just rewrite the content so it's not duplicate; that way you get to keep the pages cached and maybe focus on some different but still targeted long-tail traffic, turning a negative into a positive. I accept that thousands of pages is a lot of work, but there are a million and one online copywriters who are pretty good (and cheap) that you could assign projects to. Google "copywriters for hire" or "freelance copywriters"; you could have it done in no time and not spend that much.