Best method for blocking a subdomain with duplicated content
-
Hello Moz Community
Hoping somebody can assist.
We have a subdomain, used by our CMS, which is being indexed by Google.
http://www.naturalworldsafaris.com/
https://admin.naturalworldsafaris.com/
The page content is identical on both, so we can't add a noindex or nofollow to just the subdomain.
I have both set up as separate properties in Webmaster Tools. I understand the best method would be to update robots.txt with a user-agent disallow for the subdomain - but the robots.txt file is only accessible on the main domain: http://www.naturalworldsafaris.com/robots.txt. Will this work if we add the subdomain exclusion to this file?
It means it won't be accessible at https://admin.naturalworldsafaris.com/robots.txt (where we can't create a file), and therefore won't be seen within that specific Webmaster Tools property.
I've also asked the developer to add password protection to the subdomain, but this does not look possible.
What approach would you recommend?
-
Many thanks for taking the time to reply.
Ryan - I am talking to CMS Source about these options; it's not possible with the current configuration, but it looks like we may be able to change this with a bit of development work.
Sean - Thanks, I had considered the canonical, which would at least prevent a duplicate content issue, although I think we also need to take measures to stop this admin. subdomain being accessible at all, which a canonical won't help with.
Lynn - Thanks, I'll do that as an interim measure.
Thanks
Kate -
Hi,
If you have the subdomain set up in GWT then you could do a URL removal request (Google Index -> Remove URLs). Just put in the root of the admin subdomain (i.e. https://admin.naturalworldsafaris.com/). This usually works pretty quickly and, given your description, will likely be the quickest way to drop admin URLs that are showing up in the SERPs.
It is not an ideal solution, and the tool will tell you that for permanent exclusion you should do it through your robots.txt file. Regarding your question about that: the robots.txt file for the subdomain has to be served from the subdomain itself. Adding the subdomain to the main domain's robots.txt file won't work, because crawlers only ever request robots.txt from the root of the host they are crawling.
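As an illustration, if the developers do find a way to serve a file at the subdomain root, the subdomain's own robots.txt would simply block everything on that host:

```
# Served at https://admin.naturalworldsafaris.com/robots.txt
# Blocks all compliant crawlers from the entire admin host
User-agent: *
Disallow: /
```

By contrast, a Disallow line added to http://www.naturalworldsafaris.com/robots.txt can only match paths on the www host - robots.txt has no syntax for referring to a different hostname.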
-
One alternative you could try is to place a rel="canonical" tag on the pages, pointing to the www version. While this is not as ideal as adding noindex/nofollow, it could help tell Google that the admin. subdomain is not the preferred version of the content.
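As a sketch of how that canonical would look (the page path below is a placeholder, not a real URL on the site): the shared template would emit an absolute URL pointing at the www host, so the copy served from the admin. subdomain declares the www page as preferred:

```html
<!-- In the shared page <head>. The href must be absolute: a relative
     canonical would resolve to the admin host when served from there. -->
<link rel="canonical" href="http://www.naturalworldsafaris.com/example-page/" />
```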
-
It's very strange for the admin portion of a site not to be behind password protection. If the current developer is unable to add it, I'd get a second opinion on that, because once you block the admin portion with password protection, even if it's still indexed in Google it will rank much lower, since Google can no longer crawl it.
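As a sketch of what password protection might look like (assuming the admin subdomain runs on Apache and allows .htaccess overrides - the actual hosting setup isn't known here):

```apacheconf
# .htaccess in the document root of admin.naturalworldsafaris.com
# The password file path is hypothetical; create it first with:
#   htpasswd -c /etc/apache2/.htpasswd-admin someuser
AuthType Basic
AuthName "Admin area"
AuthUserFile /etc/apache2/.htpasswd-admin
Require valid-user
</IfModule>
```

Once the host returns 401 Unauthorized, Googlebot can no longer fetch the pages at all.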
Are you able to contact someone at CMS Source to get some support? That might help resolve this, as well as provide guidance on getting a robots.txt uploaded to the subdomain and adding noindex to the admin pages only.
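One more option worth raising with the developer, if server configuration can be changed even though files can't be uploaded: an X-Robots-Tag response header scoped to the admin host only. A sketch, assuming an Apache virtual host with mod_headers enabled:

```apacheconf
# Inside the VirtualHost block for admin.naturalworldsafaris.com only,
# so the www site is unaffected:
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```

This achieves the same effect as a meta noindex tag without touching the shared page templates that serve both hosts.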