Why did blocking a subfolder drop indexed pages by 10%?
-
Hi guys,
Maybe you can help me understand this better:
On 17.04 I had 7,600 pages indexed in Google (WMT showing 6,113).
I added Disallow: /account/ to the robots.txt file; that folder contains the registration page, wishlist, and other such pages, and I'm not interested in ranking with a registration form.
On 23.04 I had 6,980 pages indexed in Google (WMT showing 5,985).
I understand that this way I'm telling Google I don't want that section indexed, but why so many pages? Because of the faceted navigation?
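For reference, a quick way to sanity-check exactly what that Disallow rule blocks, using Python's standard-library robots.txt parser (a sketch; the test URLs are placeholders):

```python
import urllib.robotparser

# The rule described above, parsed in place of fetching a live robots.txt:
rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /account/",
])

# Anything under /account/ is blocked from crawling...
print(rp.can_fetch("Googlebot", "https://www.example.com/account/register"))  # False
# ...while the rest of the site stays crawlable.
print(rp.can_fetch("Googlebot", "https://www.example.com/products/widget"))   # True
```

Note that Disallow blocks crawling rather than indexing as such, which is why already-indexed URLs tend to drop out gradually rather than all at once.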
Cheers
-
The thing is, I check indexed pages on a regular basis, and usually the fluctuations are not big; only a few pages change, never this many. Organic traffic did drop, but only slightly, and rankings were never affected.
But as you said, I will keep an eye on this.
-
Hi,
If nothing significant happened, and there's no noticeable loss in rankings (e.g. no pages that were bringing in legitimate traffic were affected), I would wait this out and keep an eye on indexed pages. I've definitely seen similar rises and falls in indexed pages, but if the activity doesn't coincide with "real world" traffic or ranking consequences, it tends to be Google removing unnecessary pages (pagination, etc.) or even a reporting error.
-
Hi Jane,
It was a small drop in traffic, only a few visits, nothing significant.
-
Hi,
The drop could be unrelated to your disallowing the account pages (but perhaps check whether the CMS allows random query strings, and look into whether it could have created any upon user action, etc., just in case). It's pretty common to see fluctuations in the number of indexed pages, especially when the counts run into the thousands or higher. Have you noticed a decrease in traffic from search that you can match with the deindexation of pages that were previously bringing in visitors?
-
I don't think so, because the URLs are static (www.domain.com/account/register); these URLs don't have parameters.
-
Maybe multiple URL variations are being created. For example, URL parameters will create multiple URLs that can each be indexed in Google.
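To illustrate with made-up URLs: every distinct query string is a distinct URL to Google, even when a single underlying page serves them all. A sketch of collapsing such variations to count the real pages:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical variations Google might pick up for one page:
urls = [
    "https://www.example.com/account/register",
    "https://www.example.com/account/register?sessionid=abc123",
    "https://www.example.com/account/register?ref=header",
]

# Strip query strings and fragments to see how many pages actually
# underlie the indexed variations.
pages = {urlunsplit(urlsplit(u)._replace(query="", fragment="")) for u in urls}
print(f"{len(urls)} indexed URLs -> {len(pages)} underlying page(s)")  # 3 -> 1
```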
Related Questions
-
Should I "no-index" one of two identical pages in Google results?
Hello everyone, I recently started a new WordPress website and created a static homepage. I noticed that in Google search results there are two different URLs landing on the same content page. I've attached an image (Google url.JPG) to explain what I saw. In this picture, the first result is the homepage, which is the page I'm trying to rank. The last result lands on the same content at a different URL. So, should I "no-index" the last result, as shown in the image?
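If it helps while you decide: a quick check that the two results really serve identical content (a sketch; the URLs are placeholders for the two results in the screenshot):

```python
import hashlib
import urllib.request

# Placeholders for the two URLs that appear as separate results:
urls = [
    "https://www.example.com/",
    "https://www.example.com/home/",
]

for url in urls:
    html = urllib.request.urlopen(url, timeout=10).read()
    # Matching hashes mean Google is seeing one page under two URLs.
    print(hashlib.sha256(html).hexdigest()[:12], url)
```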
Technical SEO | | amanda59640 -
An informational product page AND a shop page (for same brand)
Hi all, This is my first foray into e-commerce SEO. I'm working with a new client who sells upscale eBikes online. Since his products are expensive, he wants to have informational pages about the brands he sells, e.g. www.example.com/brand. However, these brands are also category pages for his online shop, e.g. www.example.com/shop/brand. I'm worried about keyword cannibalization, and about adding an extra step/click to get to the shop (right now the navigation menu takes you to the information page, and from there you have to click through to the shop). I'm pretty sure it would make more sense to have ONE killer shopping page that includes all the brand information, but I want to be 100% sure before I advise him to take this big step. Thoughts?
Technical SEO | | MouthyPR1 -
New site: More pages for usability, or fewer more detailed pages for greater domain authority flow?
Ladies and gents! We're building a new site. We have a list of 28 professions, and we're wondering whether to include them all on one long and detailed page, or to keep them on their own separate pages. Thinking about the flow of domain authority, I could see 28 pages diluting it quite heavily, but at the same time I think having the separate pages would be better for the user. What do you think?
Technical SEO | | Muhammad-Isap1 -
Home page not indexed by any search engines
We are currently having an issue with our homepage not being indexed by any search engines. We recently transferred our domain to GoDaddy and there was an issue with the DNS. When we typed our URL into Google like this, "https://www.mysite.com", nothing from the site came up in the search results, only our social media profiles. When we typed our URL into Google like this, "mysite.com", we were sent to a GoDaddy parked page. We've been able to fix the issue over at GoDaddy, and the URL "mysite.com" is now being redirected to "https://mysite.com", but Google and the other search engines have yet to respond. I would say our fix has been in place for at least 72 hours. Do I need to give this more time? I would think that at least one search engine would have picked up on the change by now and would start indexing the site properly.
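One way to double-check the fix from your end while you wait (a sketch using the third-party requests library; "mysite.com" stays a placeholder):

```python
import requests

# Follow the apex-domain URL and print every redirect hop; the chain
# should now end with a 200 at the canonical https homepage.
resp = requests.get("http://mysite.com/", timeout=10)
for hop in resp.history:
    print(hop.status_code, hop.url)
print(resp.status_code, resp.url)
```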
Technical SEO | | bcglf1 -
Huge number of indexed pages with no content
Hi, We have accidentally had Google index lots of our pages with no useful content at all on them. The site in question is a directory site, where we have tags and we have cities. Some cities have suppliers for almost all the tags, but there are lots of cities where we have suppliers for only a handful of tags. The problem occurred when we created a page for each city, where we list the tags as links. Unfortunately, our programmer listed all the tags, not only the ones where we have businesses offering their services, but all of them! We have 3,142 cities and 542 tags. I guess you can imagine the problem this caused! Now I know that Google might simply ignore these empty pages and not crawl them again, but when I check a city (city site:domain) with only 40 providers, I still have 1,050 pages indexed. (Yes, we have some issues between the 550 and the 1,050 as well, but first things first.) These pages might not be crawled again, but they will be clicked, and the bounces and the whole user experience will be terrible. My idea is that I might use a meta noindex on all of these empty pages, and perhaps also 301 redirect each empty category page directly to the main page of the given city. Can this work the way I imagine? Any better solution to cut this really bad nightmare short? Thank you in advance. Andras
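A minimal sketch of the redirect half of that idea (Flask-style; the route shapes and the supplier lookup are hypothetical stand-ins for the real directory code):

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical lookup; the real site would query the directory database.
SUPPLIER_COUNTS = {("budapest", "plumber"): 12}  # (city, tag) -> providers

@app.route("/<city>/<tag>")
def tag_page(city, tag):
    count = SUPPLIER_COUNTS.get((city, tag), 0)
    if count == 0:
        # Empty tag page: 301 visitors (and crawlers) to the city page
        # instead of serving a contentless page.
        return redirect(f"/{city}", code=301)
    return f"{count} suppliers for {tag} in {city}"

@app.route("/<city>")
def city_page(city):
    return f"City page for {city}"
```

A page that 301s never renders anything, so the redirect makes a separate meta noindex on the same page redundant; noindex only matters for empty pages you still intend to serve.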
Technical SEO | | Dilbak0 -
My homepage+key pages have dropped 40+ positions after implementing redirects and canonical changes. HELP!
Hi SEOMozers,
I work for a web-based nonprofit at www.tisbest.org. A professional contact recommended that we work on the redirects to our homepage because we were losing valuable rank benefit. That, combined with being sick of seeing our weekly SEOMoz crawl reports show 304 duplicate page and title errors for months. No one could figure out what was happening (we think it had to do with session stuff; we were seeing several versions of each page in the form www.tisbest.org/default.aspx/(random character string)).
My developer and I read a bunch of articles and started making changes 10 days ago:
- He set up 301 redirects from http://tisbest.org to http://www.tisbest.org (set the canonical domain).
- We did a redirect from http://www.tisbest.org/default.aspx to the root with "/".
- I set the canonical setting to www.tisbest.org in our Webmaster Tools.
- In our web config (we're running ASP.NET), we changed our session detection from auto-detect, then saw some session funkiness, so we changed it back. Though we do think the character strings we were seeing were session GUIDs.
- He forced lowercase URLs to reduce duplicate page content/titles.
I got my weekly crawl report 9 days ago and we had dropped from 340 duplicate page title and page content errors to one. We went nuts and felt like the kings of SEO.
Then, yesterday (9/28), the SEO grim reaper came knocking when I received my weekly SEOMoz ranking report. It said we had dropped 40+ spots for all 9 of our keywords. Sure enough, I searched our keywords and our website was gone. Then I searched our company name, tisbest, and only a few of our pages show, but not the homepage. I searched for our URL, www.tisbest.org, and originally got the expanded view (with 8 links to various webpages - can't remember what this view is called), but now, today (Saturday), the expanded view is gone from this search result.
Also, when I run the On-Page Report Card for our homepage, I get the following error message with no results: "We were unable to grade that page. The page did not load. Curl::Err::TooManyRedirectsError: Number of redirects hit maximum amount."
When I run the Open Site Explorer report, I get this message at the top: "Oh Hey! It looks like that URL redirects to www.tisbest.org/?AspxAutoDetectCookieSupport=1. Would you like to see data for that URL instead?" If I go to the report for that URL, it says that "No information is available for that URL."
Just tonight (the night of 9/29), our developer added rel="canonical" href="http://www.tisbest.org" to our homepage to see if that would help; we had not done that originally.
In our Google Webmaster Tools, I am seeing that the number of URL Errors - Not Followed has skyrocketed (screen capture attached). There are also a large number of URL Errors - Not Found. I did some research tonight, then downloaded and ran the Screaming Frog SEO crawler; I have attached a screen capture of that report, plus a couple of questions I sent our developer that may be helpful to you.
Also, not sure if this is relevant, but we use a master page that all of our pages inherit from, so every page gets the same meta data: a keywords meta tag ("charitable gift card, charitable gift certificate, non profit gift card, charity donation, giftcard, charity gift card, donation gift card, donation gift, charity gift, animal gift card, animal gift, environmental gift card, environmental gift, humanitarian gift card, humanitarian gift, christian gift card, christian gift, catholic gift card, catholic gift, religious gift card, religious gift"), a description meta tag (id="ctl00_metaDescription": "Award winning Charity Gift Card, for over 250 premier charities. A customized donation gift that makes the world better. TisBest is BBB Accredited."), and a google-site-verification meta tag.
Can anyone help me/us identify the issue that obliterated our rankings? I am happy to give any information needed. Thank you!
Chad Edwards
Attachments: Bqcu1.png (http://i.imgur.com/Bqcu1.png), ZXQ8d.png (http://i.imgur.com/ZXQ8d.png)
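A quick way to trace the kind of loop that Curl error implies, hop by hop (a sketch using the third-party requests library; the starting URL is taken from the post):

```python
import requests
from urllib.parse import urljoin

# Walk the redirect chain one hop at a time and stop if a URL repeats,
# which is the loop TooManyRedirectsError is complaining about.
url = "http://tisbest.org/default.aspx"
visited = set()
for _ in range(20):  # cap the hops, as a crawler would
    if url in visited:
        print("Loop detected: already visited", url)
        break
    visited.add(url)
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(resp.status_code, url)
    if resp.status_code not in (301, 302, 303, 307, 308):
        break  # non-redirect response: the chain ends normally
    url = urljoin(url, resp.headers["Location"])
```

Watching where the ?AspxAutoDetectCookieSupport=1 URL enters the chain should show whether the session-detection setting is what's bouncing crawlers around.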
Technical SEO | | TisBest0 -
I know I'm missing pages with my page-level 301 redirects. What can I do?
I am implementing page-level redirects for a large site, but I know that I will inevitably miss some pages. Is there an additional safety-net, root-level redirect that I can use to catch these pages and send them to the homepage?
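For what it's worth, a root-level catch-all usually means handling the 404 case and redirecting it, rather than a redirect rule per se. A sketch (Flask-style; whether a blanket homepage redirect is wise is debatable, since Google tends to treat mass redirects to the homepage much like soft 404s):

```python
from flask import Flask, redirect

app = Flask(__name__)

@app.errorhandler(404)
def redirect_missed_pages(error):
    # Safety net: any URL the page-level redirect map missed gets
    # 301'd to the homepage instead of returning a 404.
    return redirect("/", code=301)
```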
Technical SEO | | VMLYRDiscoverability0 -
What is the best method to block a sub-domain, e.g. staging.domain.com/ from getting indexed?
Now that Google considers subdomains as part of the TLD, I'm a little leery of testing robots.txt on staging.domain.com with something like:

User-agent: *
Disallow: /

in fear it might get the www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
Technical SEO | | fthead90
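One pattern that avoids touching www.domain.com at all (a sketch, Flask-style; host names are placeholders): serve a different robots.txt depending on the requesting hostname, so only the staging host gets the blanket Disallow.

```python
from flask import Flask, request, Response

app = Flask(__name__)

BLOCK_ALL = "User-agent: *\nDisallow: /\n"   # staging: block everything
ALLOW_ALL = "User-agent: *\nDisallow:\n"     # production: allow everything

@app.route("/robots.txt")
def robots():
    host = request.host.split(":")[0]  # drop any port
    body = BLOCK_ALL if host.startswith("staging.") else ALLOW_ALL
    return Response(body, mimetype="text/plain")
```

Since crawlers fetch robots.txt per hostname, staging.domain.com/robots.txt and www.domain.com/robots.txt are evaluated independently, so the production site is never at risk from the staging rules.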