Best method for blocking a subdomain with duplicated content
-
Hello Moz Community
Hoping somebody can assist.
We have a subdomain, used by our CMS, which is being indexed by Google.
http://www.naturalworldsafaris.com/
https://admin.naturalworldsafaris.com/
The page is the same on both, so we can't add a noindex or nofollow tag to the subdomain without it also appearing on the main site.
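For reference, the tag we would need in the admin subdomain's page templates (but not the main site's) is the standard meta robots directive, shown here as a hypothetical snippet for the head of each admin page:

    <meta name="robots" content="noindex, nofollow">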
I have both set up as separate properties in Webmaster Tools. I understand the best method would be to update robots.txt with a disallow rule for the subdomain, but the robots.txt file is only accessible on the main domain: http://www.naturalworldsafaris.com/robots.txt. Will this work if we add the subdomain exclusion to this file?
It means the rule won't be accessible at https://admin.naturalworldsafaris.com/robots.txt (where we can't create a file), and therefore won't be seen within that specific Webmaster Tools property.
I've also asked the developer to add password protection to the subdomain, but this does not look possible.
What approach would you recommend?
-
Many thanks for taking the time to reply.
Ryan - I am talking to CMS Source about these options; it's not possible with the current configuration but it looks like we may be able to change this with a bit of development work.
Sean - Thanks, I had considered the canonical, which would at least prevent a duplicate content issue. However, I think we also need to take measures to stop the admin subdomain being accessible, which the canonical won't help with.
Lynn - Thanks, I'll do that as an interim measure.
Thanks
Kate -
Hi,
If you have the subdomain set up in GWT then you could do a URL removal request (Google Index -> Remove URLs). Just put in the root of the admin subdomain (i.e. https://admin.naturalworldsafaris.com/). This usually works pretty quickly and, given your description, will likely be the quickest way to drop the admin URLs that are showing up in the SERPs.
It is not an ideal solution, and the tool will tell you that for permanent exclusion you should do it through your robots.txt file. Regarding your question about that: the robots.txt file for the subdomain has to be placed on the subdomain itself; adding the subdomain to the main domain's robots.txt file won't work.
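If you can eventually get a file served at the subdomain root, a minimal robots.txt to block crawling of the whole admin subdomain would just be the following (assuming it sits at https://admin.naturalworldsafaris.com/robots.txt):

    User-agent: *
    Disallow: /

Bear in mind that Disallow only stops crawling; URLs already in the index can linger until they are dropped via the removal tool or fall out naturally.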
-
One alternative you could try is to place a rel=canonical tag on the admin subdomain pages pointing back to the equivalent pages on the main www domain. While this is not as ideal as adding the noindex/nofollow, it could help tell Google that the admin subdomain is not the preferred version.
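As a rough sketch, and assuming the CMS can inject the main-domain URL into the head of whatever page it is rendering (the path below is just a hypothetical example), each admin page would carry something like:

    <link rel="canonical" href="http://www.naturalworldsafaris.com/example-page" />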
-
It's very unusual for the admin portion of a site not to be behind password protection. If the current developer is unable to add it, I'd push on that, because once the admin area is behind password protection Google can no longer crawl it, and even if it stays indexed for a while it will rank much lower.
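As an illustration only, and assuming the subdomain is served by Apache (the server type, file paths, and username below are all assumptions), HTTP basic authentication could be added with something like this in an .htaccess file at the subdomain root:

    # Protect the admin subdomain with HTTP basic auth (hypothetical paths)
    AuthType Basic
    AuthName "Restricted CMS admin"
    AuthUserFile /path/to/.htpasswd
    Require valid-user

The matching credentials file would be created with Apache's htpasswd utility, e.g. htpasswd -c /path/to/.htpasswd adminuser. If the subdomain runs on a different server, the equivalent basic-auth configuration for that server would be needed instead.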
Are you able to contact someone at CMS Source to get some support? That might help resolve this as well as provide guidance on getting the robots.txt uploaded and being able to add noindex to admin only.