Blocking Test Pages En Masse on a Sub-domain
-
Hello,
We have thousands of test pages on a sub-domain of our site. Unfortunately, at some point these pages became visible to search engines and were indexed. We subsequently changed the robots.txt file for the test sub-domain, and over a period of a few weeks the impressions and clicks reported by Google Webmaster Tools for the test sub-domain gradually fell off.
We are not able to implement the noindex tag in the head section of the pages given the limitations of our CMS.
Would blocking Googlebot via the firewall en masse for all the test pages have any negative consequences for the main domain that houses the real live content for our sites (which, of course, we would like to remain in the Google index)?
Many thanks
-
If you want nothing on that test subdomain indexed, verify the subdomain as its own site in Google Webmaster Tools, disallow crawling of the subdomain in robots.txt, then request removal of that site (subdomain) in GWT.
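For reference, a blanket robots.txt served at the root of the test subdomain (e.g. test.example.com/robots.txt — hostname is hypothetical here) that blocks all well-behaved crawlers looks like:

```
User-agent: *
Disallow: /
```

Keep in mind robots.txt only blocks crawling, not indexing; URLs that are already indexed can linger as URL-only entries, which is why the verify-then-request-removal step in GWT matters.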
And consider setting up a page monitor like https://polepositionweb.com/roi/codemonitor/index.php on the robots.txt of your test site (and live site). It'll check the contents of those pages once a day, and email you if there's a change. Handy if there are multiple people working on the site.
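If you'd rather roll your own monitor instead of a hosted service, a minimal sketch in Python (URL is a placeholder; run it from cron and send yourself an email when it reports a change) could look like this:

```python
import hashlib
import urllib.request


def fingerprint(body: bytes) -> str:
    """Return a stable fingerprint of a robots.txt body."""
    return hashlib.sha256(body).hexdigest()


def robots_changed(url, last_hash):
    """Fetch robots.txt and report whether it differs from last_hash.

    Returns (changed, new_hash); persist new_hash between runs.
    """
    body = urllib.request.urlopen(url, timeout=10).read()
    new_hash = fingerprint(body)
    return new_hash != last_hash, new_hash
```

The hash comparison is what does the work, so the fetch and the alerting (cron + email) can be swapped for whatever your stack already uses.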
-
I'm a bit confused. Didn't blocking the test subdomain with the robots.txt already accomplish what you are trying to do? Or are the test pages still somehow indexed? Or is your main site affected by the robots.txt? Anyway, I would suggest using the .htaccess file to block search engines from accessing the subdomain rather than a firewall - http://stackoverflow.com/questions/6738896/excluding-testing-subdomain-from-being-crawled-by-search-engines-w-svn-reposit
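If the test subdomain has its own document root, one sketch of the .htaccess approach (assuming Apache with mod_headers and mod_rewrite enabled) is to send an X-Robots-Tag response header, which sidesteps the CMS limitation on editing the page head, and optionally to refuse known bots outright:

```
# Tell engines not to index anything on this subdomain,
# without touching the page markup (requires mod_headers).
Header set X-Robots-Tag "noindex, nofollow"

# Optionally refuse the major crawlers outright (requires mod_rewrite).
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|slurp) [NC]
RewriteRule .* - [F,L]
```

One caveat: if robots.txt already disallows crawling, Googlebot never fetches the pages and so never sees the X-Robots-Tag header; for the noindex to be honored, crawling has to be allowed.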