Why do we have so many pages scanned by bots (over 250,000) when our biggest competitors have about 70,000? It seems like something is very wrong.
-
We are trying to figure out why, last year, we had a huge (80%) and sudden (within two days) drop in our Google search traffic. The only "outlier" we can find on our site is the huge number of pages Moz reports as scanned by search engines. Is this a problem? How did we end up with so many pages reported? What can we do to bring the number of crawled pages back to a "normal" level?
BT
-
Hi. A mystery indeed! Have you recently upgraded or changed web platforms, or changed or upgraded what you are using for your site navigation?
-
Stewart_SEO
Thanks for your quick response. We did review our competitors' robots.txt files. Not line by line; they took surprisingly different approaches, but there were the usual exclusions for wish lists, etc. We've gone back and tightened up our robots.txt and haven't yet seen any changes. Several months ago we were at about 600,000 pages, and the count is still dropping. Very mysterious.
-
Have you looked at your competitors' robots.txt files? They are probably blocking the very same crawls you are talking about. If there is a particular bot you don't want visiting your site, for example Baidu's Chinese crawler, you can block it in robots.txt with the directive: User-agent: Baiduspider, followed by Disallow: /
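Rules like the Baiduspider block above can be sanity-checked before going live with Python's standard-library robots.txt parser. A minimal sketch; the wish-list path and URLs below are illustrative assumptions, not BT's actual rules:

```python
from urllib import robotparser

# Illustrative robots.txt: block Baidu's crawler entirely,
# and keep all crawlers out of a hypothetical wish-list section.
RULES = """\
User-agent: Baiduspider
Disallow: /

User-agent: *
Disallow: /wishlist/
"""

parser = robotparser.RobotFileParser()
parser.parse(RULES.splitlines())

# Baiduspider is blocked from every path:
print(parser.can_fetch("Baiduspider", "http://www.example.com/some-page"))   # False
# Other crawlers are only blocked from the wish-list section:
print(parser.can_fetch("Googlebot", "http://www.example.com/wishlist/123"))  # False
print(parser.can_fetch("Googlebot", "http://www.example.com/products/led"))  # True
```

Checking rules this way is cheap insurance: an overly broad `Disallow` is a common cause of sudden indexing swings.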
Related Questions
-
How is this possible? #2 ranking with NO on-page keywords, no backlinks, no sitemap...
Hi everybody. I have a question ... I'm totally stumped. This question is being asked today (November 16th, 2015), just after Google updated something in their algorithm. Nobody seems to know what they did, and it has something to do with the new "RankBrain" system they're now using. My niche is logo design software (https://www.thelogocreator.com). I had the keywords "logo creator" on the page roughly 7 times. After Google updated, I lost about 10 spots and, as of this writing, I've dropped to #15. So maybe I over-optimized; fine. But I noticed that for the keyword "logo creator", NONE of the top 14 spots actually have "logo creator" in their page title, and NONE of them have more than 2 instances (if any) of the keyword "logo creator" on the actual page. So I removed ALL instances of my keyword "logo creator" from my home page, used the Webmaster's Fetch tool, and moved up a few spots instantly. So what the heck? And the #2 spot for that keyword is www.logomakr.com; they have NO words at all on their pages, no blog, no sitemap, and far fewer links than anybody in the top 10. Can anybody reading this shed some light? Marc Sylvester
Algorithm Updates | Laughingbird Software
-
Delay between being indexed and ranking for new pages.
I've noticed with the last few pages I've built that there's a delay between them being indexed and them actually ranking. Is anyone else finding that? And why is it like that? It's not much of an issue, as they tend to pop up after a week or so, but I am curious. Isaac.
Algorithm Updates | isaac663
-
Panda...Should I consolidate...Like this...
I'm torn. Many of our 'niche' ecommerce products rank OK, but I'm concerned that duplicate content is negatively affecting our overall rankings via the Panda algorithm. Here is an example that can be found across quite a few products on the site: this sub-category page (http://www.ledsupply.com/buckblock-constant-current-led-drivers) in our 'led drivers' --> 'luxdrive drivers' section has three products that are virtually identical, with much of the same content on each page, except for their 'output current'; sort of like a shirt selling in different size attributes: S, M, L and XL. I could realistically condense 44 product pages (similar to the example above) down to 13 within this sub-category section alone (http://www.ledsupply.com/luxdrive-constant-current-led-drivers). Again, we sell many of these products and rank OK for them, but given the outline of how Panda works, I believe this structure could be compromising our overall Panda 'quality score' and consequently keeping our traffic from increasing. Has anyone had similar issues and found that it's worth the risk to condense product pages by adding attributes? If so, do I make the new pages and just 301 all the old URLs, or is there a better way?
Algorithm Updates | saultienut
-
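If the pages above were consolidated, the usual approach is indeed to 301 each old variant URL to the new combined page. A minimal Apache `.htaccess` sketch; the per-variant paths below are hypothetical illustrations, not the site's real URLs:

```apache
# Hypothetical sketch: permanently redirect old per-variant product URLs
# to the consolidated page that offers output current as an attribute.
# All paths are illustrative.
Redirect 301 /buckblock-350ma-driver /buckblock-constant-current-led-drivers
Redirect 301 /buckblock-500ma-driver /buckblock-constant-current-led-drivers
Redirect 301 /buckblock-700ma-driver /buckblock-constant-current-led-drivers
```

A 301 passes most link equity to the target and removes the near-duplicate URLs from the index over time.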
Does the page title keyword count in anchor text when the link is a web address?
If someone links to my plumbing site with the URL http://www.plumbers.com/austin-plumbers.html as the anchor text, does the key phrase "austin plumbers" get counted in the anchor text by Google, or is this a sample of anchor text that Google ignores? Thanks, mozzers! Ron
Algorithm Updates | Ron10
-
Can a Google data refresh knock your pages out of the rankings?
I see that around mid-November 2013 a handful of my site's pages dropped off of Google completely. It was around the data refreshes in November, and while everyone says those don't affect that much, I was wondering if anyone knew whether one could knock some of my pages out of the rankings for a specific keyword. Note: we had previously held multiple listings for different pages on our site for this particular keyword. Google kept the highest-ranking page and knocked the lower ones off. See the attached image of our keyword ranking history to see what I mean.
Algorithm Updates | franchisesolutions
-
When was the last algorithm update? One of my pages has dropped significantly this week
One of my pages dropped 22 places last week and I'm not sure why. Can anybody give me some suggestions as to why this might have happened?
Algorithm Updates | lindsayjhopkins
-
Site name appended to page title in google search
Hi there, I have a strange problem concerning how the search results for my site appear in Google. The site is Texaspoker.dk, and for some strange reason that name is appended at the end of the page title when I search for it in Google. The site name is not added to the page titles on the site itself. If I search in Google.dk (the relevant search engine for the country I am targeting) for "Unibet Fast Poker", I get the following page title displayed in the search results: Unibet Fast Poker starter i dag - få €10 og prøv ... - Texaspoker.dk. If you visit the actual page, you can see that there is no site name added to the page title: http://www.texaspoker.dk/unibet-fast-poker. It looks like the name is only being appended to the pages that contain rich snippets markup, and not the forum threads where the rich snippets for some reason don't work. If I do a search for "Afstemning: Foretrukne TOPS Events", the title appears as it should, without the site name being added: Afstemning: Foretrukne TOPS Events. Anybody have any experience with this, or an idea as to why it is happening? Maybe the rich snippets are automatically pulling the publisher name from my Google+ account... Edited: it doesn't seem to have anything to do with rich snippets after all; if I search for "Billeder og stuff v.2" the site name is also appended, and if I search for "bedste poker bonus" the site name is not.
Algorithm Updates | MPO
-
Any ideas why our category pages got de-indexed?
Hi all, I work for eVenues, a directory website that provides listings of meeting rooms and event spaces. Things seemed to be chugging along nicely with our link-building effort (mostly through guest blogging using a variety of anchor text). Then I woke up on Monday morning to find that our city pages have been de-indexed. This page, http://www.evenues.com/Meeting-Spaces/Seattle/Washington, used to be at the top of page #2 in the SERPs for the keyword "Meeting Rooms in Seattle". I doubt we got de-indexed because of our link-building efforts, as it was only a few blog posts and links from profile pages on community websites. My guess is that since our recent 2.0 release of the site, there are now several "filters", or subcategory pages, with latitude and longitude parameters in the URL plus different page titles based on the categories, like: "Meeting Rooms and Event Spaces in Seattle" (main page), "Meeting Rooms in Seattle", "Classroom Venues in Seattle", and "Party Venues in Seattle". There was a bit of pushback when I suggested that we put a rel="canonical" on these, because ideally we'd like to rank for all 4 queries (meeting rooms, party venues, classrooms, in city). These are new changes, and I have a sneaking suspicion this is why we got de-indexed. We're presenting generally the same content. Thoughts?
Algorithm Updates | eVenuesSEO
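For reference, if the team did decide to canonicalize the filtered subcategory pages back to the main city page, the tag would look something like the sketch below; the filter parameter shown is hypothetical, not the site's actual URL structure:

```html
<!-- Placed in the <head> of a hypothetical filtered URL such as
     /Meeting-Spaces/Seattle/Washington?filter=classroom,
     pointing search engines at the main city page: -->
<link rel="canonical" href="http://www.evenues.com/Meeting-Spaces/Seattle/Washington" />
```

The trade-off noted in the question is real: canonicalized filter pages generally stop ranking on their own, which is why consolidating only true duplicates (rather than all four query targets) is the usual compromise.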