Is it hurting my SEO ranking if robots.txt is forbidden?
-
My robots.txt is forbidden (the server returns a 403). I have read up on what the robots.txt file does and how to configure it, but what happens if it cannot be accessed at all?
-
Yes, excluding certain pages can benefit your rankings if the excluded pages could be considered duplicate content, either of your marketing pages or of each other.
This is usually the case for blogs (think WordPress categories) or webshops (pagination, as well as single product pages reachable via different paths and thus having different URLs). As Ryan pointed out, control that on the page level via noindex,follow to allow PageRank (PR) to flow. Use noindex,nofollow for "internal" pages you don't want to see crawled.
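To illustrate, a minimal sketch of the two meta tag variants (the content values are standard; which pages get which tag depends on your templates):

```html
<!-- Duplicate-prone pages (category archives, pagination, alternate
     product URLs): keep them out of the index, but let link equity
     flow through the links they contain. -->
<meta name="robots" content="noindex,follow">

<!-- Purely internal pages: neither index them nor follow their links. -->
<meta name="robots" content="noindex,nofollow">
```

Both tags belong in the page's head element.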
I am not sure, but having 9,950 pages indexed yet considered duplicate content might hurt rankings for the other pages on that domain; Google might consider the whole domain spammy.
If you need a specific hint for your domain, send me a PM and I'll have a look if time permits.
-
In general, I do not use robots.txt. It is better practice to use "noindex" on the pages you do not wish to have indexed.
If I had a 10k-page site with 50 marketing pages, I would either want to index the entire site, or question why the other 99% of the site exists if it does not help market the products. There are numerous challenges your scenario presents. If you block 99% of your site with robots.txt or the noindex meta tag, you are severely disrupting the flow of PR throughout your site. Also, you are either blocking content that should be indexed, or you are wasting time and resources creating junk pages on your site.
If the content truly should not be indexed, it likely should be moved to another site. I would need a lot more details about the site, its purpose, and the pages involved. Whatever the proper solution, it is not likely going to be using robots.txt to block 99% of the site.
-
So with regard to increasing rankings, is there a benefit to using the robots.txt file to index only certain "marketing" pages and exclude other content that may dilute your site? For example, let's say I have 10,000 pages but only about 50 or so are my marketing pages. Would using robots.txt so that only my main marketing pages get crawled help place emphasis on that content?
-
Sebes is correct. To add a bit more: it is not necessary to provide a robots.txt file, and in most cases it is actually preferable not to use one, but it becomes necessary if you do not have direct control over the code used on every page of your site. For example, if you have a CMS- or ecommerce-based site, you likely do not have control over the many pages that are automatically generated by the software. In these cases, the only ways you can control how crawlers will treat your site's pages are either to pay for custom modifications to your site's code or to use a robots.txt file.
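A hypothetical sketch of what those robots.txt rules might look like; the paths are placeholders for whatever your CMS or cart software actually generates:

```
# Hypothetical robots.txt rules for software-generated pages whose
# templates you cannot edit to add meta robots tags.
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /search/
Disallow: /*?sort=
```

Note that wildcard patterns like the last line are honored by Google and Bing but are not part of the original robots.txt standard.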
-
If the robots.txt cannot be read by Google or Bing, they assume that they can crawl as much as they want. (More precisely, Google treats a 4xx response such as 403 Forbidden as if no robots.txt exists and crawls without restrictions, while a persistent 5xx server error can cause it to hold off on crawling instead.) Check Google Webmaster Tools to see whether Google can "see" and access your robots.txt.
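If you want to check this outside of Webmaster Tools as well, here is a minimal Python sketch (standard library only; example.com is a placeholder for your own domain):

```python
# A minimal sketch: check whether a robots.txt is reachable and what
# HTTP status it returns. Replace example.com with your own domain.
import urllib.error
import urllib.request

url = "https://www.example.com/robots.txt"  # placeholder domain

try:
    with urllib.request.urlopen(url, timeout=10) as resp:
        print(f"{url} returned HTTP {resp.status}")  # 200 = readable
        print(resp.read().decode("utf-8", errors="replace"))
except urllib.error.HTTPError as e:
    # A 4xx such as 403 Forbidden is generally treated by Google as if
    # no robots.txt exists (crawl everything); a persistent 5xx can
    # instead cause Google to hold off on crawling.
    print(f"{url} returned HTTP {e.code}")
except urllib.error.URLError as e:
    print(f"Could not reach {url}: {e.reason}")
```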
Related Questions
-
Sizable decrease in number of pages indexed, but no drop in clicks, impressions, or rankings.
Hi everyone, I've run into a worrying phenomenon in GSC and I'm wondering if anyone has come across something similar. Since August, I have seen a steady decline in the number of pages indexed from my site, from 1.3 million down to about 800,000 in two months. Interestingly, my clicks/impressions continue to increase gradually (at the same pace they have been for months) and I see no other negative side effects resulting from this drop in coverage. In total I have 1.2 million URLs that fall into one of three categories: "Crawled - currently not indexed", "Crawl anomaly", and "Discovered - currently not indexed". Some other notes: all of my valid, error, and excluded pages are https://www. , so I don't believe there is an issue with different versions of the same site being submitted. Also, my rankings have not changed, so I tentatively believe that this is unrelated to the Medic Update. If anyone else has experienced this or has any insight into the problem, I would love to know. Thanks!
Algorithm Updates | | Jason-Reid0 -
Indexed, though blocked by robots.txt: do we need to bother?
Hi, we have intentionally blocked some website files that had been indexed for years. Now we receive the message "Indexed, though blocked by robots.txt" in GSC. As far as I know we can ignore this, but are any actions required? We thought of blocking them with meta tags, but these are PDF files. Thanks
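(For PDFs, which cannot carry a meta robots tag, the usual alternative is the X-Robots-Tag HTTP response header. A hypothetical Apache sketch, assuming mod_headers is enabled:

```
# Hypothetical Apache config (requires mod_headers): send a noindex
# directive for PDF files via an HTTP header instead of a meta tag.
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>
```

Crawlers can only see this header if the files are not blocked in robots.txt, since a blocked URL is never fetched.)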
Algorithm Updates | | vtmoz1 -
Ranking For Synonyms Without Creating Duplicate Content.
We have 2 keywords that are synonyms we really need to rank for, as they are pretty much interchangeable terms. We will refer to the terms as Synonym A and Synonym B. Our site ranks very well for Synonym A but not for Synonym B. Both of these terms carry the same meaning, but the search results are very different. We actively optimize for Synonym A because it has the higher search volume of the 2 terms. We had hoped that Synonym B would get similar rankings due to the fact that the terms are so similar, but that did not pan out for us. We have lots of content that uses Synonym A predominantly and some that uses Synonym B. We know that good content around Synonym B would help, but we fear that it may be seen as duplicate if we create a piece that's "Top 10 Synonym B" because we already have that piece for Synonym A. We also don't want to make too many changes to our existing content for fear we may lose our great ranking for Synonym A. Has anyone run into this issue before, or does anyone have any ideas for things we can do to increase our position for Synonym B?
Algorithm Updates | | Fuel0 -
Not a mobile friendly website, will it hurt my rankings?
Unfortunately my website is not mobile friendly. As it is based on clickable links within an image, there is no way to adapt it either. Now, I have heard Google is getting serious about mobile-friendly design; how will this impact my rankings? My current analytics show 57% desktop, 24% mobile, and 19% tablet. I really like the design of my site with the clickable images and would hate to have to change it because Google says so :-(. My website is http://tamarindobeachinfo.com
Algorithm Updates | | ijb0 -
Recent Algorithm Update Impact on Rankings
I've read that the most recent algorithm update by Google is targeting dodgy links. I have a client's website which within the last few days has been smashed out of top positions for the most competitive keywords (and many others). I'm worried that the site has been penalised; however, I can't understand why it would be. The site only has 11 domains linking to it (65 links total) and a lot of these links are coming from the same websites that link to all of our other web clients, and none of them have experienced this sudden and significant drop in rankings. Does anyone know if Google is targeting a specific type of site, or how I can determine if my client's website has been penalised? I've not made any significant changes recently to the site's content or meta data; however, rankings had remained steady for months now. It just seemed to happen overnight that they dropped off everything (e.g. middle of page 2 to page 8 of search results for some of the better keywords). Thank you in advance for any assistance!
Algorithm Updates | | JuiceBoxOM0 -
Recent Rank drop after Penguin 2.1?
Recently, a lot of pages from our website have moved from page one or ranking number one to page ten or so. We got a manual penalty message from the Google team; we removed a lot of unnatural links pointing to our pages and disavowed the rest. This got the penalty removed, and we got a message from Google confirming the same. Before the manual penalty we were getting about 140,000 visits per day, after the penalty about 80,000. However, after Hummingbird or Penguin 2.1 all our rankings have vanished. We are nowhere in Google for our primary keywords and we are getting about 40,000 visits per day. Most are direct or from sources other than Google. We had another look at the links we disavowed, a list of about 11,000 domains, and found about 3,000 domains to be good. We fixed the disavow file about one week back, but no changes in traffic since. We are checking the domains again to see if we have missed more good domains in there; yes, we have. There are still a very few good domains in there. But we are not touching the disavow list; we are waiting to see the change from the last submission. We have a dedicated user base, good likes on Facebook, and all the stats in Analytics look good: about 40% repeat visits, about 30% direct. About 3,000 people search for the site using our brand name, as reported in Analytics. I suspect the on-page optimization; the pages could be over-optimized. But the on-page factors for other pages ranking for the keywords are similar. The keyword density is similar, as is the usage of headings and such. We have not made any recent changes to these on-page patterns. Our team is not able to figure out what could have gone wrong.
Algorithm Updates | | Develop410 -
Why does my Rank Checker result differ from the SERPs?
Hello SEOmoz members, I've got yet another naive question for you. Rank Checker is telling me that my client has risen to page 1, position 7, whilst the SERPs are telling me they are still in position 14. I know that SERP results are variable depending on many factors, but this holds true for separate searches on other computers in various far-flung locations. Please give me some insight into what is happening. I'm waiting to open the bubbly! Thanks
Algorithm Updates | | catherine-2793880 -
Ranking Tracking Tool Not Accurate?
Is Google still updating the algorithm on a daily basis, or, for the most part, are you other Mozzers seeing your rankings stick? I ask because the rank-tracking software I use locally on my laptop shows me inaccurate rankings, as does the SEOmoz tool (which was just updated yesterday). I am not sure why this is happening, but as of yesterday I lost four page-one rankings, which I didn't deserve anyway and which were apparently a fluke until they did the PR update. So I don't mind, but I am curious whether they are still tweaking on a daily basis or if it's safe to continue link building. I don't want them to make another change and have it affect my rankings in a negative way.
Algorithm Updates | | getbigyadig0