Why do we have so many pages scanned by bots (over 250,000) while our biggest competitors have about 70,000? It seems like something is very wrong.
-
We are trying to figure out why, last year, we had a huge (80%) and sudden (within two days) drop in our Google search traffic. The only "outlier" we can find for our site is the huge number of pages Moz reports as scanned by search engines. Is this a problem? How did we end up with so many pages reported? What can we do to bring the number of crawled pages back to a "normal" level?
BT
-
Hi. A mystery indeed! Have you recently upgraded or changed web platforms, or changed what you are using for your site navigation?
-
Stewart_SEO
Have you looked at your competitors' robots.txt files? They are probably blocking the very same crawls you are talking about. If there is a particular bot you don't want visiting your site, for example the Chinese crawler Baidu, you can block it in robots.txt with:
User-agent: Baiduspider
Disallow: /
-
Thanks for your quick response. We did review our competitors' robots.txt files. Not line by line; they took surprisingly different approaches, but there were the usual exclusions for wish lists, etc. We've gone back and tightened up our own robots.txt and haven't yet seen any changes. Several months ago we were at about 600,000 pages, and the count is dropping. Very mysterious.
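One way to sanity-check rules like the Baiduspider block before deploying them: a minimal sketch using Python's standard urllib.robotparser, with an illustrative rule set rather than any real site's file. Note that blocking a bot takes both a User-agent line and a Disallow directive:

```python
import urllib.robotparser

# Illustrative robots.txt rules (not any real site's file):
# block Baiduspider entirely; keep everyone else out of wish lists.
rules = """\
User-agent: Baiduspider
Disallow: /

User-agent: *
Disallow: /wishlist/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Baiduspider", "https://example.com/products/"))   # False
print(rp.can_fetch("Googlebot", "https://example.com/products/"))     # True
print(rp.can_fetch("Googlebot", "https://example.com/wishlist/123"))  # False
```

If the Baiduspider group contained only the User-agent line and no Disallow, can_fetch would still return True for it; the Disallow directive is what actually blocks the crawl.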
Related Questions
-
Meta robots on every page rather than robots.txt for blocking crawlers? How will pages get indexed if we block crawlers?
Hi all, The suggestion to use the meta robots tag rather than the robots.txt file is meant to ensure pages do not get indexed when hyperlinks to them are available anywhere on the internet. I don't understand how the pages can be indexed if the entire site is blocked. Even though links to the pages are available, will Google really index those pages? One of our sites has been blocked by its robots.txt file, and internal links to it have been available on the internet for years, yet the pages have not been indexed. So technically the robots.txt file is quite enough, right? Please clarify and guide me if I'm wrong. Thanks
Algorithm Updates | vtmoz -
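The distinction this question turns on: robots.txt blocks crawling, not indexing, so a disallowed URL can still appear in the index (URL-only) if external links point to it, while a meta robots noindex directive only works when crawlers are allowed to fetch the page and read it. As a rough illustration, here is a minimal sketch using Python's standard html.parser to detect the directive; MetaRobotsParser and is_noindex are hypothetical helper names, not part of any SEO tool:

```python
from html.parser import HTMLParser

class MetaRobotsParser(HTMLParser):
    """Collect the directives of any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives += [d.strip().lower()
                                for d in a.get("content", "").split(",")]

def is_noindex(html: str) -> bool:
    # The directive only takes effect if crawlers can fetch the page
    # and read it; a robots.txt-blocked URL can still be indexed
    # (URL-only) from external links, because this tag is never seen.
    parser = MetaRobotsParser()
    parser.feed(html)
    return "noindex" in parser.directives

print(is_noindex('<head><meta name="robots" content="noindex, follow"></head>'))  # True
print(is_noindex('<head><meta name="description" content="About us"></head>'))    # False
```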
How to take down a subdomain that is receiving many spammy backlinks?
Hi all, We have a subdomain that has had low engagement for the last few years. Over time, many spammy backlinks have come to point to this subdomain, though there are relevant backlinks too. We have deleted most of the pages that contain spammy content or have spammy backlinks. Still, I'm unsure whether to take this subdomain down or keep it; we're torn between "the relevant backlinks might be helping our website" and "the spammy backlinks may be causing our drop in rankings." Thanks
Algorithm Updates | vtmoz -
Which is better: all FAQ pages in one place, or split across different sections of the website?
Hi all, We have a lot of FAQ sections on our website, split across different places depending on products, technologies, etc. If we want to optimize our content for Google's featured snippets, voice search, and so on, what is the best option: to combine them all into one FAQ section? Or does it not matter to Google that this type of content is not in one place? Thank you!
Algorithm Updates | lgrozeva -
301'ing an old (2000), high-PR, heavily indexed domain
Hi, I have an old (registered in 2000), very high-PR domain with 20M+ pages indexed by Google which... got AdSense banned. The domain has taken a few hits over the years from Penguin/Panda but has come out pretty well compared to many competitors. The problem is that it was banned in the big AdSense account ban of 2012 for invalid activity. No, I still have no idea what the issue was. I'd like to start using a new domain if I can safely get Google to pass along the PageRank and indexing love so I can run AdSense and AdX. What are your initial thoughts? Am I out of my mind to try?
Algorithm Updates | comfortsteve -
Page 1 all of a sudden for two clients
Hello, For many months, a couple of my clients had a handful of terms ranking on page 2. In the past month, both clients have suddenly moved up to page 1, position #2, for most of those terms. I have been running some optimization tests and made minor changes, but I am struck by the consistency of the #2 position across both clients for all of the previously page-2 keywords. I have seen this type of Google increase for clients before, and my experience suggests it is a test from Google; from Google's perspective: "we're going to move your rankings up to page 1 and see what you do with the position to prove to us that your site is worth it." Has anyone had experience with this kind of movement? Thanks so much in advance.
Algorithm Updates | lfrazer -
New Google SERPs page title lengths, 60 characters?
It seems that the new Google SERPs show shorter page titles. From what I can gather, they are about 60 characters long. Does this mean we all now need to optimize our page titles to 60 characters? Has anyone else noticed this and made changes to their page title lengths?
Algorithm Updates | Adam_SEO_Learning -
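For what it's worth, Google truncates display titles by rendered pixel width (roughly 600px) rather than a hard character count, so ~60 characters is a rule of thumb, not a guarantee. A minimal sketch of that heuristic; truncate_title is a hypothetical helper, and the 60-character budget is just the rule of thumb from the question:

```python
def truncate_title(title: str, limit: int = 60) -> str:
    """Trim a page title to a rough character budget at a word boundary,
    appending an ellipsis. Purely a character-count heuristic: Google
    actually truncates by rendered pixel width, not characters."""
    if len(title) <= limit:
        return title
    cut = title.rfind(" ", 0, limit)
    if cut == -1:          # no space found; hard-cut instead
        cut = limit - 1
    return title[:cut].rstrip() + "\u2026"

title = "A Very Long Ecommerce Category Page Title That Will Certainly Be Cut Off"
short = truncate_title(title)
print(short)
print(len(short) <= 60)  # True
```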
Is using WPML (WordPress Multilingual Plugin) OK for on-page SEO?
Hi Mozzers, I'm investigating multilingual site setup and content translation for a small website of 15-20 pages, and I came across WPML (WordPress Multilingual Plugin), which looks like it could help, but I am curious whether it has any major international SEO limitations before trialing/buying it. It seems to offer the option to automatically set up language folder structures such as www.domain.com/it/ or www.domain.com/es/, which is great, and it seems to offer an easy way of linking out to translators (for an extra fee), which could be convenient. But what about on-page optimization: URL names, title tags, and other on-page elements? I wonder if anyone has experience with this plugin or any alternatives to it. Hoping for your valued advice!
Algorithm Updates | emerald -
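On the on-page side, the main thing a subfolder setup like /it/ or /es/ needs is correct hreflang alternate tags linking each language version of a page to the others (WPML can manage these; the sketch below only illustrates the markup). A minimal sketch assuming a hypothetical example.com domain and three languages:

```python
# Hypothetical domain and language folders mirroring a WPML-style
# subdirectory setup (example.com/it/, example.com/es/, ...).
BASE = "https://www.example.com"
LANGS = {"en": "", "it": "/it", "es": "/es"}

def hreflang_tags(path: str) -> list:
    """Build the hreflang alternates that every language version of a
    page should carry, plus an x-default fallback."""
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{BASE}{prefix}{path}" />'
        for lang, prefix in LANGS.items()
    ]
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="{BASE}{path}" />'
    )
    return tags

for tag in hreflang_tags("/contact/"):
    print(tag)
```

Each of the three language versions of /contact/ would carry this same set of tags, so the alternates are reciprocal.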
When Google crawls and indexes a new page, does it show up immediately in a Google "site:" search?
We made changes to a site, including adding a new page and making corresponding link/text changes to existing pages. The changes are not yet showing in the Google index ("site:"/cache), but approximately 24 hours after we made them, the SERPs for this site jumped up. We obtained a new backlink a couple of weeks ago, but it is not yet showing in OSE, Webmaster Tools, or other tools. I'm just wondering whether Google's SERP changes run ahead of what the "site:" or cache views actually show us. Has Google made a significant SERP adjustment recently? Thanks.
Algorithm Updates | richpalpine