What is a good crawl budget?
-
Hi Community!
I am in the process of updating sitemaps and am trying to find a standard for what is considered a "strong" crawl budget. All the documentation I've found covers how to improve it or what to watch out for. However, I'm looking for a target to aim for (e.g., 60% of the sitemap has been crawled, 100%, etc.).
-
@blueprintmarketing I have a large website with WordPress image folders going back to 2009.
I am currently redesigning my website and trying to determine whether there is any benefit to shrinking down or deleting the images and image folders I am no longer using.
I really do not have time to go through all of those image folders, and see which ones I am still using, and which ones I am not using anymore. I am hoping this does not matter?
Does anyone here know if this matters when it comes to Google's Crawl Budget?
All of the images are completely optimized and compressed. My question is whether it would be worth the time investment to go through every single folder and thousands of images and try to delete the ones that are not referenced on any of my pages.
Does anyone have a definitive answer regarding Crawl Budget?
-
Can you give some input on the site https://indiapincodes.net/? I've tried all the recommendations, but only 30% of the URLs have been indexed. I would appreciate your time.
-
@yaelslater
Unless you have a huge site, and I'm talking about half a million to a million pages, I would not worry about true Google crawl budget anymore. However, if only 60% of the URLs in your XML sitemap are being indexed, make sure they are actually indexable URLs. Check the Coverage section of Search Console; it will tell you why a URL submitted via your XML sitemap was or was not indexed.
A recent study showed about 20% of URLs across all the websites in the study were not indexed for one reason or another. But make sure the sitemap contains only URLs that return a 200 status code: no 301 or 302 redirects, no 404s, and no noindex/nofollow URLs, because Google obviously will not put those into the index. If Search Console does not tell you the issue and you would like to share your domain with me, I'm sure I could figure it out.
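The status-code check above is easy to automate. A hypothetical Python sketch (standard library only, no Moz or Google API) that parses a standard `<urlset>` sitemap and flags redirects, errors, and likely-noindexed pages; `sitemap_url` and the crude `noindex` substring check are assumptions, not a production audit:

```python
# Hypothetical sketch: verify that every URL in an XML sitemap looks indexable.
# Assumes a standard sitemaps.org <urlset> document.
import urllib.request
import urllib.error
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_xml: bytes) -> list[str]:
    """Extract the <loc> entries from a <urlset> sitemap."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

def check_url(url: str) -> str:
    """Return a short verdict: OK, REDIRECT, NOINDEX?, or the HTTP error code."""
    try:
        # urlopen follows redirects by default, so compare the final URL.
        with urllib.request.urlopen(url, timeout=10) as resp:
            if resp.geturl() != url:
                return "REDIRECT"   # 301/302 targets don't belong in the sitemap
            body = resp.read(65536).decode("utf-8", errors="ignore")
            if "noindex" in body.lower():
                return "NOINDEX?"   # crude substring check for a robots meta tag
            return "OK"
    except urllib.error.HTTPError as e:
        return f"HTTP {e.code}"     # 404s etc. don't belong in the sitemap either
```

Anything other than "OK" is a URL you would want to remove from the sitemap or fix before worrying about crawl budget.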
I don't know if you're using a CDN. If you could share a little more with me, especially the domain, I could be a lot more helpful.
You could also use a tool like Screaming Frog to generate a new sitemap and make sure that is not the issue. If you're using Yoast, you can toggle its sitemap off and on to regenerate it.
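If you just need a quick replacement sitemap and don't want to rely on any tool, a minimal one can also be built by hand. A hypothetical sketch, assuming you already have your list of canonical URLs; the example URLs are placeholders:

```python
# Hypothetical sketch: build a minimal sitemaps.org-compliant <urlset>
# from a plain list of URLs.
import xml.etree.ElementTree as ET

def build_sitemap(urls: list[str]) -> str:
    """Return an XML sitemap document as a string."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    # Prepend the declaration ourselves for broad Python-version compatibility.
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            + ET.tostring(urlset, encoding="unicode"))
```

Write the result to `sitemap.xml`, submit it in Search Console, and you've ruled out the generator as the source of the problem.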
You can generate a sitemap of up to 500 URLs for free with the Screaming Frog SEO Spider; it is paid after that: https://www.screamingfrog.co.uk/xml-sitemap-generator/
Or, if you want to generate over 1,000 URLs for free online, I would recommend https://www.sureoak.com/seo-tools/google-xml-sitemap-generator
However, please keep in mind that the SureOak site also offers things like a "keyword density checker," which makes me wary of the information it gives out. Keyword density is not something Google actually considers, unless you literally use the same word for every word in the document; it is one of those metrics that simply isn't real.
But the XML sitemap generator works just fine. I hope this was of help,
Tom