Why did blocking a subfolder drop indexed pages by 10%?
-
Hi guys,
Maybe you can help me understand this better:
On 17.04 I had 7,600 pages indexed in Google (WMT showing 6,113).
I added Disallow: /account/ to my robots.txt file. That folder contains the registration page, wishlist, and similar pages, since I'm not interested in ranking with a registration form.
On 23.04 I had 6,980 pages indexed in Google (WMT showing 5,985).
I understand that this way I'm telling Google I don't want that section indexed, but why so many pages? Is it because of the faceted navigation?
Cheers
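For anyone wanting to double-check a rule like this before deploying it, the behaviour can be tested locally with Python's built-in robotparser (the domain and paths below are placeholders, and the rules are inlined rather than fetched, just to keep the sketch self-contained). Note that Disallow blocks crawling rather than indexing as such:

```python
from urllib import robotparser

# The same rules the site's robots.txt would serve,
# inlined here for a self-contained example.
rules = """
User-agent: *
Disallow: /account/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Pages under /account/ are blocked from crawling...
print(rp.can_fetch("*", "https://www.example.com/account/register"))  # False
# ...while the rest of the site is unaffected.
print(rp.can_fetch("*", "https://www.example.com/products/rings"))    # True
```

This only confirms what the rule blocks; it can't tell you which already-indexed URLs Google will drop, or when.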
-
The thing is that I check indexed pages on a regular basis and the fluctuations are usually small, only a few pages change. But never this many pages. Organic traffic did drop, but only slightly, and rankings were never affected.
But as you said, I will keep an eye on this.
-
Hi,
If nothing significant happened, and there's no noticeable loss in rankings (e.g. no pages that were bringing in legitimate traffic were affected), I would wait this out and keep an eye on indexed pages. I've definitely seen similar rises/falls in indexed pages, but if the activity doesn't coincide with "real world" traffic/ranking consequences, it tends to be Google removing unnecessary pages (pagination, etc.) or even a reporting error.
-
Hi Jane,
It was a small drop in traffic, only a few visits, nothing significant.
-
Hi,
The drop could be unrelated to your disallowing the account pages (but perhaps check if the CMS allows random query strings, and look into whether it could have created any upon user action, etc. just in case). It's pretty common to see fluctuations in the number of indexed pages, especially with numbers of pages in the thousands or higher. Have you noticed a decrease in traffic from search that you can match with deindexation of pages that were previously bringing in visitors?
-
I don't think so, because the URLs are static (www.domain.com/account/register); these URLs don't have parameters.
-
Maybe there are multiple URL variations being created, for example via URL parameters, which would create multiple URLs for Google to index.
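To illustrate how parameter-driven variations (such as faceted navigation filters) can inflate the crawlable URL count, here's a rough sketch; the category path and facet names are made up for the example:

```python
from itertools import product
from urllib.parse import urlencode

base = "https://www.example.com/rings"

# Hypothetical facets a category filter might expose as query parameters.
facets = {
    "color": ["gold", "silver", "rose"],
    "size":  ["6", "7", "8", "9"],
    "sort":  ["price", "newest"],
}

# Every combination of facet values yields a distinct crawlable URL,
# even though they all point at essentially the same content.
urls = [
    f"{base}?{urlencode(dict(zip(facets, combo)))}"
    for combo in product(*facets.values())
]

print(len(urls))  # 3 * 4 * 2 = 24 URL variations for one category page
```

Just three small filters turn one page into two dozen URLs, which is why parameterized navigation is a common cause of large swings in indexed-page counts.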
Related Questions
-
Drop in Rankings
Hello Webmasters, My site has incurred a sudden dip in rankings across sections. We conducted an analysis and have observed the following two major issues: Unnatural Links Penalty: Our site was issued an Unnatural Links Penalty on May 23. Basically, we have both 'http' and 'https' versions of our website registered on Webmaster tools. Initially, the warning showed up on the 'http' version and thus we started a cleanup by extracting the linking domains and have also filed a reconsideration request once all the spammy domains were removed and rightly disavowed. Recently, we got another manual action warning on the 'https' version regarding the unnatural links. So we have started with the cleanup activity right away. While analyzing this issue, we came across another major problem regarding the two versions which is our next concern and is mentioned below. https Canonical Issue: For more insights, we went through our site’s content and found that our website is following the below pattern Our 'http' version of the webpages get 301 redirected to the 'https' version. This 'https' version again has a canonical pointing to the 'http' version thus creating a loop. To conclude, I request your valuable learnings and thoughts on the following: Which of these issues are likely to have affected our website’s ranking Which version is likely to be preferred by Google (https or http) in our case
Technical SEO | Starcom_Search
-
How do I get my pages to go from "Submitted" to "Indexed" in Google Webmaster Tools?
Background: I recently launched a new site and it's performing much better than the old site in terms of bounce rate, page view, pages per session, session duration, and conversions. As suspected, sessions, users, and % new sessions are all down. Which I'm okay with because the the old site had a lot of low quality traffic going to it. The traffic we have now is much more engaged and targeted. Lastly, the site was built using Squarespace and was launched the middle of August. **Question: **When reviewing Google Webmaster Tools' Sitemaps section, I noticed it says 57 web pages Submitted, but only 5 Indexed! The sitemap that's submitted seems to be all there. I'm not sure if this is a Squarespace thing or what. Anyone have any ideas? Thanks!!
Technical SEO | Nate_D
-
Why google indexed pages are decreasing?
Hi, my website had around 400 pages indexed, but since February I've noticed a huge decrease in indexed numbers, and it is continually decreasing. Can anyone help me find out the reason, and where I can get a solution for that? Will it affect my web page ranking?
Technical SEO | SierraPCB
-
SEOMOZ and Webmaster Tools showing Different Page Index Results
I am promoting a jewelry e-commerce website. The website has about 600 pages and the SEOMOZ page index report shows this number. However, webmaster tools shows about 100,000 indexed pages. I have no idea why this is happening and I am sure this is hurting the page rankings in Google. Any ideas? Thanks, Guy
Technical SEO | ciznerguy
-
Best way to handle indexed pages you don't want indexed
We've had a lot of pages indexed by Google which we didn't want indexed. They relate to an AJAX category filter module that works fine for front-end customers, but under the bonnet Google has been following all of the links. I've put a rule in the robots.txt file to stop Google from following any dynamic pages (with a ?) and also any AJAX pages, but the pages are still indexed on Google. At the moment there are over 5,000 indexed pages which I don't want in there, and I'm worried this is causing issues with my rankings. Would a redirect rule work, or could someone offer any advice? https://www.google.co.uk/search?q=site:outdoormegastore.co.uk+inurl:default&num=100&hl=en&safe=off&prmd=imvnsl&filter=0&biw=1600&bih=809#hl=en&safe=off&sclient=psy-ab&q=site:outdoormegastore.co.uk+inurl%3Aajax&oq=site:outdoormegastore.co.uk+inurl%3Aajax&gs_l=serp.3...194108.194626.0.194891.4.4.0.0.0.0.100.305.3j1.4.0.les%3B..0.0...1c.1.SDhuslImrLY&pbx=1&bav=on.2,or.r_gc.r_pw.r_qf.&fp=ff301ef4d48490c5&biw=1920&bih=860
Technical SEO | gavinhoman
-
Un-Indexing a Page without robots.txt or access to HEAD
I am in a situation where a page was pushed live (Went live for an hour and then taken down) before it was supposed to go live. Now normally I would utilize the robots.txt or but I do not have access to either and putting a request in will not suffice as it is against protocol with the CMS. So basically I am left to just utilizing the and I cannot seem to find a nice way to play with the SE to get this un-indexed. I know for this instance I could go to GWT and do it but for clients that do not have GWT and for all the other SE's how could I do this? Here is the big question here: What if I have a promotional page that I don't want indexed and am met with these same limitations? Is there anything to do here?
Technical SEO | DRSearchEngOpt
-
Block a sub-domain from being indexed
This is a pretty quick and simple (I'm hoping) question. What is the best way to completely block a subdomain from being indexed by all search engines? One item I cannot use is the meta "nofollow" tag. Thanks! - Kyle
Technical SEO | kchandler
-
Google News not indexing /index.html pages
Hi all, we've been asked by a blog to help them better indexing and ranking on Google News (with the site being already included in Google News with poor results) The blog had a chronicle URL duplication problem with each post existing with 3 different URLs: #1) www.domain.com/post.html (currently in noindex for editorial choices as showing all the comments) #2) www.domain.com/post/index.html (currently indexed showing only top comments) #3) www.domain.com/post/ (very same as #2) We've chosen URL #2 (/index.html) as canonical URL, and included a rel=canonical tag on URL #3 (/) linking to URL #2.
Also, we submitted a Google News sitemap yesterday containing the list of type #2 URLs from the last 48 hours. The sitemap has been properly "digested" by Google and shows that all URLs have been sent and indexed. However, if we use the site:domain.com command on Google News, we see something completely different: Google News has actually indexed only some news items, and more specifically only the type #3 URLs (ending with the trailing slash instead of /index.html). Why? What's wrong? a) Does the Google News bot have problems indexing URLs ending with /index.html? While figuring out what's wrong, we found that http://news.google.it/news/search?aq=f&pz=1&cf=all&ned=us&hl=en&q=inurl%3Aindex.html gives no results... it seems that the Google News index overall does not include any URLs ending with /index.html. b) Does the Google News bot recognise the rel=canonical tag? c) Is it just a matter of time before Google News picks up the right URLs (/index.html), and/or shall we notify the Google News team of the changes? d) Any suggestions? Or shall we do it the other way around, meaning make URL #3 the canonical one? While Google News is showing these problems, Google web search has actually received the changes well, so we don't know what to do. Thanks for your help, Matteo
Technical SEO | H-FARM