Duplicate Content with ADN, DNS and F5 URLs
-
In my duplicate content report, there are URLs showing as duplicate content.
All of the pages work and do not redirect; they are used for IT debugging, QAing the site, or as part of a legacy system using split DNS.
They aren't linked (or at least shouldn't be) from any pages, and I'm not seeing them in search results, but Moz is picking them up. Should I be worried about duplicate content here, and how should I handle them? They are replicas of the current live site, just on different subdomains.
We are doing cleanup before migrating to a new CMS, so I'm not sure it's worth fixing at this point, or if it's even an issue at all. But should I make sure they are blocked in robots.txt, or take any other action to address these?
Thanks!
-
A couple more thoughts here, based on your revised question.
You'll want to figure out how those links to the rogue subdomains were generated, so you don't simply carry them over to the new CMS (for example, if they're in body text that gets copied wholesale without being examined).
If those old subdomains are not needed at all anymore, I'd remove them entirely if you can, or at the very least block them in robots.txt. You can verify each subdomain as its own site in Google Webmaster Tools, then request removal of those subdomains once the content is gone or excluded in robots.txt.
You might suggest that the dev team password-protect environments like these so they don't get accidentally crawled in the future, block them in robots.txt, and so on.
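As a minimal sketch of that robots.txt approach (the subdomain name here is a hypothetical stand-in for your dev/QA hosts), a file served at the root of each subdomain that blocks all crawling looks like this:

# robots.txt at http://dev.example.com/robots.txt (hypothetical subdomain)
# blocks all well-behaved crawlers from the entire subdomain
User-agent: *
Disallow: /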
If you have known dev subdomains that are still needed, and as the SEO you know about them and have made sure they each have a robots.txt, you might want to use a code monitoring service like https://www.polepositionweb.com/roi/codemonitor/ to watch the contents of each robots.txt file. It will let you know if the file has been changed or removed (a good idea for the main site, too). I've seen dev sites copied over to live sites with the robots.txt copied along, so everything on the new live site was suddenly blocked. I've also seen dev sites get a data refresh from the live site, so the live site's robots.txt ended up on the dev site and the dev site got indexed.
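If you'd rather roll your own check than use a service, here is a minimal sketch of the same idea (the URL and state-file path are hypothetical; assumes Python and its standard library), run on a schedule via cron or similar:

import hashlib
import urllib.error
import urllib.request

URL = "http://dev.example.com/robots.txt"  # robots.txt to watch (hypothetical)
STATE = "robots_last_hash.txt"             # stores the last-seen content hash

def fetch_hash(url):
    # download robots.txt and return a hash of its contents
    with urllib.request.urlopen(url) as resp:
        return hashlib.sha256(resp.read()).hexdigest()

try:
    current = fetch_hash(URL)
except urllib.error.URLError:
    current = "MISSING"  # a removed file counts as a change too

try:
    with open(STATE) as f:
        previous = f.read().strip()
except FileNotFoundError:
    previous = None  # first run, nothing to compare against

if previous is not None and current != previous:
    print("ALERT: robots.txt changed or removed at", URL)  # swap in an email/Slack alert

with open(STATE, "w") as f:
    f.write(current)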
-
Thanks Keri, I received your note!
-
Hi! I have a couple of ideas, and sent you a quick email to the account on your Moz profile.
You may also find it helpful to do a Google search for:
site:ourdomain.com -inurl:www
This will show you all the non-www subdomains that Google has indexed, in case others have slipped in that you don't want indexed.
Related Questions
-
We have two versions of URLs, mobile and desktop. Is that duplicate content?
Hi, Our website has two versions of its URLs. Desktop: www.myexample.com, and mobile: www.myexample.com/m. If you go to our site from a mobile device you will land on the mobile URL; if you go from a desktop computer you will land on the regular URL. Both URLs have the same content. Is that considered duplicate? If yes, what can I do to fix it? Also, both URLs are indexed by Google. We have two separate XML sitemaps, one for desktop and one for mobile. Is that good SEO practice?
Technical SEO | Armen-SEO
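For a separate-mobile-URL setup like that one, a common pattern (sketched here with the question's hypothetical domain and a hypothetical page name) is a rel="alternate" tag on the desktop page pointing to the mobile URL, paired with a rel="canonical" tag on the mobile page pointing back to the desktop URL, so search engines treat the pair as one page rather than duplicates:

<!-- on the desktop page, e.g. http://www.myexample.com/page.html -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://www.myexample.com/m/page.html">

<!-- on the mobile page, e.g. http://www.myexample.com/m/page.html -->
<link rel="canonical" href="http://www.myexample.com/page.html">
-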
Subdomain Severe Duplicate Content Issue
Hi, A subdomain for our admin site has been indexed, and it has caused over 2,000 instances of duplicate content. To fix this issue, is a 301 redirect or a canonical tag the best option? http://www.example.com/services http://admin.example.com/services Really appreciate your advice. J
Technical SEO | Metricly-Marketing
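As a sketch of the two options named in that question (using its example URLs), a canonical tag leaves the admin pages reachable but points ranking signals at the www versions, while a 301 sends visitors and crawlers away entirely (the redirect line assumes the admin subdomain is served by Apache):

<!-- option 1: canonical tag in the <head> of http://admin.example.com/services -->
<link rel="canonical" href="http://www.example.com/services">

# option 2: in the admin subdomain's Apache config or .htaccess (assumption: Apache)
Redirect permanent / http://www.example.com/
-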
Duplicate content question...
I have a serious duplicate content issue on my website, but I'm not sure how to handle or fix it. I have two different URLs landing on the same page content: http://www.myfitstation.com/tag/vegan/ and http://www.myfitstation.com/tag/raw-food/. In this situation I cannot redirect one URL to the other, since in the future I will probably be adding posts to either the "vegan" tag or the "raw food" tag. What is the solution in this case? Thank you
Technical SEO | myfitstation
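When overlapping tag archives like those can't be redirected, one common approach (a sketch, not the only option) is to keep them crawlable but out of the index with a meta robots tag in each tag page's <head>:

<!-- in the <head> of each tag archive, e.g. /tag/vegan/ and /tag/raw-food/ -->
<meta name="robots" content="noindex, follow">
-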
Why are these two URLs showing in Moz as duplicate content?
Here is the first URL - http://www.flagandbanner.com/Products/FBPP0000012376.asp Here is the 2nd URL - http://www.flagandbanner.com/Products/flag-spreader.asp Granted I am new to this issue on this website, but what is Roger seeing that I'm not? A lot of our duplicate pages are just like this example.
Technical SEO | Flaglady
Duplicate Content in Dot Net Nuke
Our site is built on DotNetNuke. SEOmoz shows a very large amount of duplicate content because, early on, each page got an extension in the following format: www.domain.com/tabid/110/Default.aspx. The site additionally exists without the tabid... part. Our web developer says an easy fix with a canonical tag or 301 redirect is not possible. Does anyone have DNN experience who can point us in the right direction? Thanks, Ricarda
Technical SEO | jsillay
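Without knowing that DNN install, one hedged sketch: DNN typically runs on IIS, and if the IIS URL Rewrite module is available (an assumption), the tabid URLs can be 301-redirected to their friendly equivalents with per-page rules in web.config. The tab ID and target path below are hypothetical examples:

<!-- inside <system.webServer><rewrite><rules> in web.config (assumes IIS URL Rewrite) -->
<rule name="Redirect tabid 110" stopProcessing="true">
  <match url="^tabid/110/Default\.aspx$" />
  <action type="Redirect" url="/about-us" redirectType="Permanent" />
</rule>
-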
Duplicate Content Question (E-Commerce Site)
Hi All, I have a page that ranks well for the keyword "refurbished Xbox 360". The ranking page is an eCommerce product details page for a particular XBOX 360 system that we do not currently have in stock (currently, we do not remove a product details page from the website even if it sells out; as we bring similar items into inventory, e.g. more XBOX 360s, new additional pages are created for them). Long story short, given this way of doing things, we have now accumulated 79 "refurbished XBOX 360" product details pages across the website that currently, or at some point in time, reflected some version of a refurbished XBOX 360 in our inventory. From an SEO standpoint, it's clear that we have a serious duplicate content problem with all of these nearly identical XBOX 360 product pages. Management is beginning to question why our latest, in-stock XBOX 360 product pages aren't ranking and why this stale, out-of-stock XBOX 360 product page still is. We are in obvious need of a better process for retiring old, irrelevant product content and eliminating duplicate content, but the question remains: how exactly is Google choosing to rank this one versus the others, since they are primarily duplicate pages? Has Google simply determined this one to be the original? What would be the best-practice approach to solving a problem like this from an SEO standpoint: 301 redirect all out-of-stock pages to in-stock pages, or remove the irrelevant pages? Any thoughts or recommendations would be greatly appreciated. Justin
Technical SEO | JustinGeeks
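As a sketch of the 301 option raised in that question (assuming Apache; both product paths are hypothetical), each retired product page would be permanently redirected to the closest in-stock equivalent so its accumulated signals consolidate there:

# in .htaccess or the vhost config (assumption: Apache; paths are hypothetical)
Redirect permanent /products/refurbished-xbox-360-250gb.html /products/refurbished-xbox-360.html
-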
Duplicate Content - Just how killer is it?
Yesterday I received my ranking report and was extremely disappointed that my high-priority pages dropped in rank for a second week in a row for my targeted keywords. This is after running them through the gradecard and getting As for each of them on the keywords I wanted. I looked at my Google Webmaster Tools and saw new duplicate content pages listed, which were the ones I had just modified to get my keyword targeting better. In my hastiness to improve the keyword usage, I neglected to prevent these descriptions from coming up when viewing the page with filter parameters, sort parameters, and page parameters... so Google saw these descriptions as duplicate content (since myurl.html and myurl.html?filter=blah are seen as different). So my question: is this the likely culprit for some pretty drastic hits to ranking? I've fixed this now, but are there any ways to prevent this in the future? (I know of canonical tags, but have never used them, and am not sure if this applies in this situation.) Thanks! EDIT: One thing I forgot to ask as well: has anyone inflicted this upon themselves? And how long did it take you to recover?
Technical SEO | Ask_MMM
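Canonical tags do apply to the parameter situation described there: one common remedy (a sketch; myurl.html is the question's example, the domain is a hypothetical stand-in) is a self-referencing canonical tag on the base page, emitted unchanged on the filtered, sorted, and paginated variants, so they all consolidate to one URL:

<!-- in the <head> of myurl.html, also emitted on myurl.html?filter=blah etc. -->
<link rel="canonical" href="http://www.example.com/myurl.html">
-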
50+ duplicate content pages - Do we remove them all or 301?
We are working on a site that has 50+ pages that all have duplicate content (one for each state, pretty much). Should we 301 all 50 of the URLs to one URL, or should we just get rid of all the pages? Are there any steps to take when removing pages entirely (submit a sitemap to Google Webmaster Tools, etc.)? Thanks!
Technical SEO | Motava
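As a sketch of the 301 option for a set like that (assuming Apache; the /locations/ URL pattern is a hypothetical stand-in for the actual state-page paths), a single pattern rule can sweep all the near-duplicate state pages into one canonical URL:

# in .htaccess or the vhost config (assumption: Apache; pattern is hypothetical)
RedirectMatch permanent ^/locations/[a-z-]+/?$ /locations/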