Need only tens of pages indexed out of hundreds: is robots.txt okay to proceed with for Google?
-
Hi all,
We have 2 subdomains with hundreds of pages, of which only 50 important pages need to get indexed. Unfortunately, the CMS behind these subdomains is very old and does not support deploying a "noindex" tag at the page level. So we are planning to block the entire sites via robots.txt and allow only the 50 pages we need, along the lines of the sketch below. But we are not sure if this is the right approach, as Google has been suggesting relying on "noindex" rather than robots.txt. Please suggest whether we can proceed with the robots.txt file.
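For illustration, a minimal robots.txt along those lines (the Allow paths are hypothetical placeholders, one line per page we want crawlable):

```
User-agent: *
# Block crawling of everything on this subdomain by default
Disallow: /
# Explicitly allow the important pages (hypothetical paths;
# repeat one Allow line for each of the 50 pages)
Allow: /services/overview.html
Allow: /pricing/plans.html
```

As I understand it, Google resolves conflicting rules by the most specific (longest) matching path, so these Allow lines should override the blanket Disallow. Each subdomain serves its own robots.txt from its root, so the file would be repeated on both.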
Thanks
-
Hi vtmoz,
Given the limitations you describe, I'd give the noindex rule in robots.txt a try.
I've run some experiments and found that the noindex rule in robots.txt works. It won't immediately remove those pages from the index, but it will stop them from being shown in search results. I'd suggest using that rule with care.
Also, run some experiments of your own. My first test would be adding only one or two pages, the ones that cause the most trouble by being indexed (maybe due to undesired traffic or to ranking on undesired search terms); a sketch is below. Hope it helps.
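To be clear, a minimal sketch of what that first test might look like. Noindex: in robots.txt is not an official, documented directive, so treat any results as experimental; the paths here are hypothetical:

```
User-agent: *
# Unofficial directive observed to work in experiments, not a documented standard
Noindex: /old-page-with-undesired-traffic.html
Noindex: /page-ranking-on-wrong-terms.html
```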
Best luck!
GR
Related Questions
-
Home Page not ranking?
Hey guys, I'm working on a relatively new client and have noticed that all the initial rankings we have point towards a subpage rather than the home page. Only a branded search appears to bring up the home page, which seems really strange. Any ideas? Home page: www.geraldmiller.com. Subpage (ranking): www.geraldmiller.com/dwi-dui-defense
Algorithm Updates | Webrevolve
-
Google News Results
This is more of an observation than anything else. Has anyone noticed any strange results in Google News, in terms of very old content hitting page 1? My example is football: I support Newcastle, so I keep checking for the latest transfer failure or humiliation. For a couple of days, the first page has been showing old articles (April, May) from the same source rather than the usual spread of tabloid and broadsheet news.
Algorithm Updates | MickEdwards
-
7-Pack Google SERPs?
What is the best way to get into the 7-pack of Google SERPs? I have a site that ranked well before this change but was then pushed back to page 2. I have unique content, and I have provided my info to all the standard local sites, like Yelp, Manta, Local.com, and others. I already have a Google Local page, and I also have links from local sites. What else can be done?
Algorithm Updates | bronxpad
-
Has Google had problems indexing pages that use <base href=""> in the last few days?
For the past couple of days I've had the problem that Google Webmaster Tools is showing a lot more 404 errors than normal. If I go through the list, I find very strange URLs that look like two paths put together. For example: http://www.domain.de/languages/languageschools/havanna/languages/languageschools/london/london.htm If I check on which page Google found that path, it shows me the following URL: http://www.domain.de/languages/languageschools/havanna/spanishcourse.htm If I check the source code of that page for the link leading to the London page, it looks like the following: [...](languages/languageschools/london/london.htm) So to me it looks like Google is ignoring the <base href="..."> and putting the path together as follows: Part 1) http://www.domain.de/languages/languageschools/havanna/ (the current page's directory, used instead of the base href) + Part 2) languages/languageschools/london/london.htm The result is the wrong path: http://www.domain.de/languages/languageschools/havanna/languages/languageschools/london/london.htm I know finding a solution is not difficult; I can use absolute paths instead of relative ones. But: has anyone had the same experience? Do you know other reasons that could cause such a problem? P.S.: I am quite sure that the CMS (Typo3) is not generating these paths randomly, and I would like to be sure before we change the CMS's settings to absolute paths! A sketch of the markup pattern is below.
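For illustration, a minimal sketch of the markup pattern being described, assuming the base href points at the site root (the actual base value is not shown in the post, and the anchor text is hypothetical):

```
<!-- On http://www.domain.de/languages/languageschools/havanna/spanishcourse.htm -->
<head>
  <!-- Relative links should resolve against this base, i.e. the site root -->
  <base href="http://www.domain.de/">
</head>
<body>
  <!-- Expected resolution (base respected):
       http://www.domain.de/languages/languageschools/london/london.htm
       Observed resolution (base ignored, resolved against the current directory):
       http://www.domain.de/languages/languageschools/havanna/languages/languageschools/london/london.htm -->
  <a href="languages/languageschools/london/london.htm">London</a>
</body>
```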
Algorithm Updates | SimCaffe
-
Are multiple domains for my website hurting my Google ranking?
Hello, I currently have two domains showing up in Google search: shwoodshop.com and shop.shwoodshop.com. These domains are currently ranked in the #2 and #3 spots; however, my page is much more trafficked than the current #1 ranking. I am wondering if the fact that I have two domains competing for the #1 spot is hurting my search ranking. If so, what is the best way to remedy this issue and get back my #1 spot? I'm rather new to SEO and teaching myself as I go, so I appreciate the feedback!
Algorithm Updates | shwoodshop
-
Google Algo Update in Queue: What constitutes over-optimization?
http://www.pcmag.com/article2/0,2817,2401732,00.asp According to this, Google is bringing the hammer down soon on another 10-20% of the search results. While we don't advocate keyword stuffing, exchanging links, or anything too risky, I am still concerned. Do we know if the example "perfectly optimized page" (http://www.seomoz.org/blog/perfecting-keyword-targeting-on-page-optimization) is now going to be penalty bait? Is this over-stuffing? Also, how might this affect ecommerce sites in particular?
Algorithm Updates | iAnalyst.com
-
Was I Kicked Off Google Page One by Panda/Farmer?
Took over this site in March. Got a panicked call from the client mid-March that, all of a sudden, keywords that put the site on page one weren't working. There are still 9 that work, but apparently there were more. A large percentage of the backlinks are from article directories and link farms. Is this my problem? Also, a large percentage of the 149 pages suffer from keyword stuffing and were obviously written for search engines and not people. How much of a difference does that make?
Algorithm Updates | reeljerc
-
Will Google punish us for using formulaic keyword-rich content on different pages on our site?
We have 100 to 150 words of SEO text per page on www.storitz.com. Our challenge is that we are a storage property aggregator covering hundreds of metros, and we have to distinguish each city with relevant and unique text. If we use a modular approach, where we mix and match pre-written (by us) content with demographic and location-oriented text in an attempt to create relevant and unique copy for hundreds of pages on our site, will we be devalued by Google? A sketch of the kind of approach we mean is below.
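For illustration, a minimal sketch of such a modular approach (all module text, city names, and figures are hypothetical placeholders):

```python
import random

# Hypothetical pre-written modules; placeholders get filled per city.
INTROS = [
    "Finding storage in {city} is easier than you think.",
    "{city} offers a wide range of self-storage options.",
]
DEMOGRAPHICS = [
    "With roughly {population:,} residents, demand varies by neighborhood.",
    "A population of about {population:,} means unit availability shifts quickly.",
]

def render_city_copy(city: str, population: int) -> str:
    """Pick one module from each pool and fill in the city data.

    Seeding the RNG with the city name keeps each page's text stable
    across rebuilds while still differing between cities.
    """
    rng = random.Random(city)  # str seeds are deterministic in Python's Random
    parts = [rng.choice(INTROS), rng.choice(DEMOGRAPHICS)]
    return " ".join(parts).format(city=city, population=population)

print(render_city_copy("Austin", 950_000))
print(render_city_copy("Denver", 715_000))
```

Whether Google treats that output as unique enough is exactly the open question in the post.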
Algorithm Updates | Storitz