Pages crawled
-
I noticed there is a limit on the number of pages crawled on galena.org. Will this number increase over time?
-
How many pages are on your site? I'm seeing over 500 indexed on Google. Trying to get a feel for how many pages aren't indexed.
Do you use a sitemap file at all? If you have one in place, and submit it via Google Webmaster Tools, it will show you how many pages are in your sitemap and how many of those pages Google has indexed.
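For reference, a minimal sitemap file looks like the sketch below. The galena.org URL is just the site discussed in this thread; a real sitemap would list one `<url>` entry per page you want indexed, following the sitemaps.org protocol.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page you want indexed -->
  <url>
    <loc>http://www.galena.org/</loc>
  </url>
</urlset>
```

Once submitted in Google Webmaster Tools, the "Sitemaps" report compares the URL count in this file against the number Google has actually indexed.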
A secondary note, though: some of those pages are Ajax File Manager pages, which you probably want to exclude from crawling via robots.txt.
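A hedged sketch of what that robots.txt exclusion could look like. The directory name below is a placeholder, since the actual path of the Ajax File Manager pages isn't given in the thread; substitute the real directory on your site.

```text
# robots.txt - keep crawlers out of the Ajax File Manager
# (the path below is a placeholder; use the actual directory on your site)
User-agent: *
Disallow: /ajaxfilemanager/
```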
-
It is advisable to check Google Webmaster Tools for any crawl errors. If there are any, try to fix them ASAP.
Don't pin too much hope on other site audit tools.
Related Questions
-
Virtual Hub Page Impact
I currently have a website structure that has multiple subfolders. One of the primary subfolders has hundreds of pages within it (e.g. www.mydomain.com/subfolder1/page). The pages are all accessible through other subfolders, as contextually appropriate, but there is no existing hub page for them. In other words, while www.mydomain.com/subfolder1/page1....n are all valid URLs, www.mydomain.com/subfolder1/ is a 404. My question: given that the pages within the subfolder are accessible through multiple other subfolders, how much of an issue is it that the specific subfolder these pages sit in 404s? Does this negatively impact SEO in any way?
On-Page Optimization | APFM
Too many page links warning... but each link has canonical back to main page? Is my page OK?
The Moz crawl warns me that many of my pages have too many links. For example, http://www.webjobz.com/jobs/industry/Accounting has 269 links, but many of them look like /jobs/jobtitles/Accounting?k=&w=3&hiddenLocationID=463170&depth=2 and are used to refine search criteria. When you click on those links, each page has a canonical link back to http://www.webjobz.com/jobs/industry/Accounting. Is my page being punished for this? Do I have to put nofollow tags on every link I do not want the bots to follow, and if I do so, will Roger (the Moz bot) stop counting them as links?
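For illustration, the two tags being discussed look like this. The URLs are copied from the question; whether the crawler counts nofollowed links toward the per-page total is the open question here, so treat this as a sketch of the markup rather than a confirmed fix.

```html
<!-- in the <head> of each filtered URL, pointing back to the main page -->
<link rel="canonical" href="http://www.webjobz.com/jobs/industry/Accounting" />

<!-- a filter link marked nofollow -->
<a href="/jobs/jobtitles/Accounting?k=&amp;w=3&amp;hiddenLocationID=463170&amp;depth=2"
   rel="nofollow">Refine search</a>
```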
On-Page Optimization | Webjobz
Why is SEOmoz showing it crawled 3 pages when I only have 2 pages?
I had SEOmoz crawl my site. I only have 2 pages. The site URL is www.autoinsurancefremontca.com.
On-Page Optimization | Greenpeak
Keyword quantity per page
I have a website, http://www.versaillesdentalclinic.com, with 20 pages in total, but I need it to be on the top page of Google for 65 keywords, and maybe more. How many keywords per page should I use? Currently I am promoting around 12 highly competitive keywords on the main page. Is that OK? I will appreciate a good answer 😉 Thanks, Russel
On-Page Optimization | smokin_ace
Page Rank
I have just made a 301 redirect on one of our product pages, which had a PR of 4. Now that Google has indexed the new page, it has a PR of 0. I'm struggling to understand why this could be. I know you may see a drop of 1, which has happened in the past, but this drop just does not make sense. Any ideas why this could be? Kind regards
On-Page Optimization | Paul78
Page titles and descriptions
A website has several widgets to show, each widget with its own page. The widgets mostly just vary in size. How would you suggest the titles be done? Example: Widget 1 ft, Widget 2 ft, Widget 3 ft, and so on. Would this trigger a duplicate content issue, given that "Widget" leads in every page title?
On-Page Optimization | APICDA
Are you an SEO master? What would you change on my page?
Take a look at this page and tell me what you would change from an SEO perspective. It's always easy to criticise, so here is your chance. Good luck. http://www.traxnyc.com/mens_jewelry.html
On-Page Optimization | DiamondJewelryEmpire
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for:
- Course (starter, main, salad, etc.)
- Cooking Method (fry, bake, boil, steam, etc.)
- Preparation Time (Under 30 min, 30 min to 1 hour, Over 1 hour)
Here are some examples of how URLs may look when searching for a recipe:
- find-a-recipe.php?course=starter
- find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
- find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour
There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30. There can be any combination of these variables, meaning there are hundreds of possible search-results URL variations. This all works well on the site; however, it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions, such as:
- Setting a canonical tag
- Adding these URL variables to Google Webmaster Tools to tell Google to ignore them
- Changing the title tag in the head dynamically based on what URL variables are present
However, I am not sure which of these would be best. As far as I can tell, the canonical tag should be used when you have the same page available at two separate URLs, but that isn't the case here, as the search results are always different. Adding these URL variables to Google Webmaster Tools won't fix the problem in other search engines, and we will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across it before, but I cannot find the ideal solution. Any help would be much appreciated. Kind regards
On-Page Optimization | smaavie
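One way to reduce the URL-variation explosion is to treat pagination as presentation-only and canonicalise every paginated URL back to its unpaginated, parameter-sorted form. A minimal sketch in Python: the parameter name "start" comes from the question above, but which parameters count as presentation-only is an assumption you would adjust per site.

```python
from urllib.parse import parse_qs, urlencode, urlparse, urlunparse

# Query parameters that only change how results are presented (pagination),
# not which result set is shown. "start" is from the question; extending
# this set is a per-site judgment call.
PRESENTATION_PARAMS = {"start"}

def canonical_url(url: str) -> str:
    """Strip presentation-only parameters and sort the rest, producing a
    stable URL to use in a <link rel="canonical"> tag."""
    parts = urlparse(url)
    params = parse_qs(parts.query, keep_blank_values=True)
    # Sorting normalises parameter order, so course=salad&start=30 and
    # start=30&course=salad map to the same canonical URL.
    kept = {k: params[k] for k in sorted(params) if k not in PRESENTATION_PARAMS}
    return urlunparse(parts._replace(query=urlencode(kept, doseq=True)))
```

With this, find-a-recipe.php?course=salad&start=30 canonicalises to find-a-recipe.php?course=salad, so each distinct filter combination keeps exactly one canonical target even though its paginated variants differ.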