Is it hurting my SEO ranking if robots.txt is forbidden?
-
My robots.txt is forbidden (it cannot be accessed). I have read up on what the robots.txt file does and how to configure it, but what happens if the file is not able to be accessed at all?
-
Yes, excluding certain pages can benefit your rankings if the excluded pages could be considered duplicate content of your marketing pages or of each other.
This is usually the case for blogs (think WordPress categories) or webshops (pagination, as well as single product pages reachable by different paths and thus having different URLs). As Ryan pointed out, control that at the page level via noindex,follow to allow PageRank to flow. Use noindex,nofollow for "internal" pages you don't want crawled.
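For reference, the page-level directives described above look like this in a page's HTML head (a minimal sketch; which pages get which directive depends entirely on your site):

<!-- Keep this page out of the index, but let crawlers follow its links so PageRank can flow -->
<meta name="robots" content="noindex,follow">

<!-- Keep this page out of the index and tell crawlers not to follow its links -->
<meta name="robots" content="noindex,nofollow">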
I am not sure, but having 9,950 pages indexed that are considered duplicate content might hurt the rankings of other pages on that domain. Google might consider the domain spammy.
If you need a specific hint for your domain, send me a PM and I'll have a look if time permits.
-
In general, I do not use robots.txt. It is a better practice to use "noindex" for the pages you do not wish to have indexed.
If I had a 10k page site with 50 marketing pages, I would either want to index the entire site, or question why the other 99% of the site exists if it does not help market the products. Your scenario presents numerous challenges. If you block 99% of your site with robots.txt or the noindex meta tag, you are severely disrupting the flow of PageRank throughout your site. You are also either blocking content that should be indexed, or wasting time and resources creating junk pages on your site.
If the content truly should not be indexed, it likely should be moved to another site. I would need a lot more details about the site, its purpose, and the pages involved. Whatever the proper solution is, it is not likely to involve using robots.txt to block 99% of the site.
-
So, with regards to increasing rankings, is there a benefit to using the robots.txt file to index only certain "marketing" pages and exclude other content that may dilute your site? For example, let's say I have 10,000 pages but only about 50 or so are my marketing pages. Would using robots.txt to have only my main marketing pages crawled help place emphasis on that content?
-
Sebes is correct. To add a bit more: it is not necessary to provide a robots.txt file. In most cases it is actually preferable not to use one, but it becomes necessary if you do not have direct control over the code used in every page of your site. For example, on a CMS or ecommerce based site you likely do not have control over the many pages that are automatically generated by the software. In these cases, the only way to control how crawlers treat your site's pages is either to pay for custom modifications to your site's code or to use a robots.txt file.
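If you do end up relying on robots.txt for that, the file is just a list of crawl directives. A minimal sketch might look like the following (the paths are hypothetical placeholders for the auto-generated sections you would want kept out of the crawl):

# Hypothetical robots.txt for a CMS/ecommerce site
User-agent: *
Disallow: /tag/
Disallow: /search/
Disallow: /cart/

Sitemap: https://www.example.com/sitemap.xml

Note that Disallow only prevents crawling; pages blocked this way can still show up in the index if they are linked from elsewhere, which is one reason the page-level noindex approach is usually preferred.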
-
If the robots.txt cannot be read by Google or Bing, they assume that they can crawl as much as they want to. Check Google Webmaster Tools to see whether Google can "see" and access your robots.txt.
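A quick way to check this yourself outside of Webmaster Tools is to request the file and look at the HTTP status code. Here is a minimal Python sketch (example.com is a placeholder for your own domain):

# Check whether crawlers can fetch your robots.txt
import urllib.request
import urllib.error

url = "https://www.example.com/robots.txt"  # placeholder domain

try:
    with urllib.request.urlopen(url, timeout=10) as response:
        print(f"HTTP {response.status}: robots.txt is readable")
        print(response.read().decode("utf-8", errors="replace"))
except urllib.error.HTTPError as e:
    # A 403 (forbidden) or 404 lands here; crawlers generally fall back to "crawl everything"
    print(f"HTTP {e.code}: robots.txt could not be fetched")
except urllib.error.URLError as e:
    print(f"Request failed: {e.reason}")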