Need only tens of pages indexed out of hundreds: is robots.txt okay for Google?
-
Hi all,
We have two subdomains with hundreds of pages each, of which only about 50 important pages need to be indexed. Unfortunately, the CMS behind these subdomains is very old and does not support adding a "noindex" tag at the page level. So we are planning to block the entire sites via robots.txt and allow only the 50 pages we need. But we are not sure this is the right approach, since Google has been suggesting relying mostly on "noindex" rather than robots.txt. Please suggest whether we can proceed with the robots.txt file.
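For reference, the allow-list setup described would look roughly like the sketch below (the paths are hypothetical placeholders; the real file would list the actual 50 pages, or the directories that contain them). For Googlebot, the most specific (longest) matching rule wins, so the Allow lines override the site-wide Disallow:

```
# robots.txt at the root of each subdomain (paths are illustrative)
User-agent: *
Disallow: /

# Pages that should stay crawlable
Allow: /important-page-1.html
Allow: /category/key-landing-page/
```

One caveat worth keeping in mind: robots.txt controls crawling, not indexing, so a disallowed URL can still appear in search results (without a snippet) if other sites link to it.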
Thanks
-
Hi vtmoz,
Given the limitations you describe, I'd give a noindex rule in robots.txt a try.
I've run some experiments and found that a noindex rule in robots.txt works. It won't immediately remove those pages from the index, but it will stop them from showing up in search results. I'd suggest using that rule with care.
Also, run some experiments of your own. My first test would be to add only one or two pages, starting with the ones causing the most trouble by being indexed (perhaps due to undesired traffic, or ranking for undesired search terms). Hope it helps.
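That rule would look something like this in robots.txt (the paths below are just placeholders for your own URLs). Note that Noindex has never been an officially documented robots.txt directive, so treat it as experimental:

```
# Unofficial Noindex directive (experimental; paths are hypothetical)
User-agent: Googlebot
Noindex: /unwanted-page.html
Noindex: /old-section/
```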
Best of luck!
GR
Related Questions
-
Google Country Redirection Change
Analytics is showing a substantial decrease in referring traffic from Google's regional domains (.ca, .co.uk, .de, etc.) versus an uptick from .com, starting in March 2018. Did anyone note when Google stopped directing traffic to their regional domains? Was there any press about it? (I couldn't find any.) Using a VPN for different countries, I compared region-specific SERPs vs. .com and they're pretty much identical. Thanks!
-
On page vs Off page vs Technical SEO: Priority, easy to handle, easy to measure.
Hi community, I am just trying to figure out which should take priority among on-page, off-page, and technical SEO. Which would you go after first? Which is easiest to handle? Which is easiest to measure? Your opinions and suggestions, please. I'm expecting realistic answers rather than the usual checklist. Thanks
-
Page Rank Metrics Disappearing
Hi Everyone. I keep hearing from different people that Google's PageRank will soon be eliminated. Has anyone heard anything more about this, or is it a myth? It would make things more complex for the average person, who would have to rely on DA, TF, etc. Would love to hear from you guys. I actually like the PR rating system. I also have websites more than a year old which are still not showing actual PR yet. Maybe delayed, or maybe not. Robert
-
Google Penguin update
When will the Google Penguin update run again? The last time was October 2013, and I'm still really curious. Or have they stopped it, and is it now running continuously, just like Panda?
-
Next Google PR update
When is the next Google PageRank update expected to arrive? I know it takes anywhere from one month to one year for Google to update it, but many people here at Moz surely know some secrets.
-
How Can I Prevent Duplicate Page Title Errors?
I am working on a website that has two different sections, one for consumers and one for businesses. The products and product pages are essentially the same, but of course the pricing and quantities may differ. We just have different paths based on the kind of customer, and we get feeds from manufacturers for the content, so it's difficult to change it. We want Google to index both sections of the site, but we don't want to get hammered for duplicate page titles and content. Any suggestions? Thanks!
-
Need help with some duplicate content.
I have some duplicate content issues on my blog I'm trying to fix. I've read lots of different opinions online about the best way to correct them, but they all contradict each other. I was hoping I could ask this community and see what the consensus is. It looks like my category and page numbers are showing duplicate content. For instance, when I run the report I see things like this: http://noahsdad.com/resources/ http://noahsdad.com/resources/page/2/ http://noahsdad.com/therapy/page/2/ I'm assuming that's just the categories being duplicated, since the page numbers only show on the report at the end of a category. What is the best way to correct this? I don't use tags at all on my blog, using categories instead. I also use the Yoast SEO plugin, and I have a check mark in the box that disables tags. However, it says, "If you're using categories as your only way of structure on your site, you would probably be better off when you prevent your tags from being indexed." There is a box that allows you to disable categories also, but the description above makes it seem like I don't want to block both tags and categories. Any ideas what I should do? Thanks.
-
Why is a link considered active, but is no longer on the page?
How come links sometimes show up in OSE or Yahoo Site Explorer, but when you go to the page, they're not there anymore? Why is a link indexed or considered active when it's no longer on the page?