Crawl Diagnostics - Crawling way more pages than my site has?
-
Hello all,
I'm fairly new here, more of a paid search guy dabbling in SEO on the side. I have a client set up in SEOmoz, and the Crawl Diagnostics report is showing 10,000+ pages crawled, but I think the site has at most 800 pages (it's an e-commerce site using freewebstore.org as the platform).
Any reason this would be happening?
-
Ok, here is an update. I found that the crawl has a basketful of entries for each category, and the site has a fairly long list of categories.
Attached is an image showing what is happening in one category. There is an entry for each sort option (Sort Name, Sort Price Ascending, Sort Price Descending), which I understand. What I don't understand are all the "rw=1" entries, and why they stack up like they do.
Is this an issue? I am assuming it is, because there seems to be no real reason for it.
-
Thanks to both of you. I will start to dig into your suggested steps later today.
I just took this client on, and they really don't have anything set up. I just got them set up on Webmaster Tools as well, so I'm not even sure whether their site was indexed before.
The Crawl Diagnostics report doesn't show much duplicate content (60 pages?), but the Too Many On-Page Links, Overly Dynamic URL, Duplicate Title, and Long URL warnings are all showing 6,000-10,000 pages.
The site sells crystals and each item is unique, and from my first review they don't really even have item descriptions written, let alone page titles and meta descriptions.
I am in analysis mode, working up my review comments and detailing an action plan to help them focus moving forward. I was just shocked by the 10,000 pages listed in one of the crawl warnings.
Anyway, I'll dig into this info and let you know what I find. It's an adventure!
-
I'm guessing that, as an e-commerce site, you've got multiple ways to browse your content: by category, brand, special offers, etc. The thing to watch out for is URL variations (different category paths, lots of parameters) that all serve the same content. As a result, chances are you've got a duplicate content problem.
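Just to illustrate how quickly that multiplies: if, say, 100 category pages each expose 3 sort links, and each of those URLs can also appear with and without the rw=1 parameter, that one browse path alone produces roughly 100 x 3 x 2 = 600 crawlable URLs before a single product page is counted. A few hundred real pages can easily balloon into 10,000+ crawled URLs this way.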
As Nakul mentioned, a good first step is to take a look at your crawl report, or use one of the tools he suggested, to see if you've got the same content being indexed multiple times.
Once you've done that, the next check is to see how many of these crawled pages are appearing in Google's index. Is Google doing a reasonable job of identifying the right version? How many pages are in the index? Are recently added products being discovered quickly?
The site: operator will be your friend here, and Dr. Pete wrote a great article on ways you can use it:
http://www.seomoz.org/blog/25-killer-combos-for-googles-site-operator
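For example (with example.com standing in for the real domain, and the parameter names swapped for whatever the store actually uses), queries along these lines show roughly what Google has picked up:
site:example.com (a rough count of everything indexed)
site:example.com inurl:rw=1 (how many of the rw=1 variations made it into the index)
site:example.com inurl:sort (the same check for the sort-parameter pages)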
Once you understand what is being crawled and what's making it into the index, decide which pages you really do want indexed, make sure those become the canonical versions, and consider blocking the leftover variations with robots.txt. (But understand the problem and what you want to achieve before you start doing this.)
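As a rough sketch only (rw=1 comes from your screenshot, but the exact paths and parameter names depend on how freewebstore builds its URLs, so treat these as examples rather than drop-in rules): each sort/rw variation could carry a canonical tag pointing back at the clean category URL, something like
<link rel="canonical" href="http://www.example.com/category/crystals/" />
and, if you decide you don't want those variations crawled at all, robots.txt pattern rules along these lines (Google supports the * wildcard):
User-agent: *
Disallow: /*rw=1
Disallow: /*sort=
One caveat: if robots.txt blocks a URL, Google can't see the canonical tag on it either, so decide which approach fits each case before combining them.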
Hope this helps.
-
You can download the entire crawl and see whether there are actually that many pages (there's a quick sketch below for sizing up the export), or post the URL here.
You can also run a crawl with a tool like Xenu or Screaming Frog to double-check it.
You can also post/private message the link here and I can take a look.
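If you do export the crawl to CSV, a quick way to see how much of the 10,000 is parameter noise is to strip the query strings and count unique paths. Here is a minimal Python sketch; it assumes the export has a column called "URL" holding the crawled address, so rename that to match whatever your export actually uses:

import csv
from collections import Counter
from urllib.parse import urlsplit

# Count crawled URLs and the unique paths left after dropping query strings
# (?rw=1, sort parameters, and so on). Assumes a CSV column named "URL";
# adjust the column name and file name to match the actual crawl export.
urls = []
with open("crawl_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        url = row.get("URL")
        if url:
            urls.append(url)

paths = Counter(urlsplit(u).path for u in urls)

print("Total URLs crawled:", len(urls))
print("Unique paths after removing query strings:", len(paths))
print("Paths with the most parameter variations:")
for path, count in paths.most_common(10):
    print(" ", count, path)

If the second number lands near the 800 pages you expect while the first is up around 10,000, that pretty much confirms the sort/rw parameters are what's inflating the crawl.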