What is a good crawl budget?
-
Hi Community!
I am in the process of updating sitemaps and am trying to find a standard for what is considered a "strong" crawl budget. All the documentation I've found covers how to improve crawl budget or what to watch out for; however, I'm looking for a target to aim for (e.g., 60% of the sitemap has been crawled, 100%, etc.).
-
@blueprintmarketing I have a large website with WordPress image folders going back to 2009.
I am currently redesigning my website and am trying to determine whether there is any benefit to shrinking down or deleting the images and image folders I am no longer using.
I really do not have time to go through all of those folders to see which ones I am still using and which ones I am not. I am hoping this does not matter.
Does anyone here know if this matters when it comes to Google's crawl budget?
All of the images are completely optimized and compressed. However, my question is whether it would be worth the time investment to go through every single folder and thousands of images to delete the ones that are not referenced on any of my pages.
Does anyone have a definitive answer regarding Crawl Budget?
-
Can you give some input on the site https://indiapincodes.net/? I have tried all the recommendations, but only 30% of the URLs have been indexed. I would appreciate your time.
-
@yaelslater
Unless you have a huge site, and I'm talking about half a million to a million pages, I would not worry about true Google crawl budget anymore. However, if only 60% of the URLs in your XML sitemap are being indexed, make sure they are actually indexable URLs. If they're not, you should be able to click into the Coverage section of Search Console, and it will give you a reason why a URL submitted via your XML sitemap was or was not indexed.
A recent study showed that about 20% of URLs across all websites in the study were not indexed for one reason or another. But make sure the sitemap contains only URLs that return 200: no 301 or 302 redirects, no 404s, and no noindex/nofollow URLs, because Google obviously does not put those into the index. If Search Console does not tell you the issue and you would like to share your domain with me, I'm sure I could figure it out.
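A quick way to audit a sitemap for this is a short script that pulls every loc URL and checks its HTTP status without following redirects. This is a minimal sketch using only the Python standard library; the sitemap URL in the usage note is a placeholder you would replace with your own:

```python
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def parse_sitemap(xml_bytes):
    """Return the URLs listed in a sitemap's <loc> elements."""
    root = ET.fromstring(xml_bytes)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", SITEMAP_NS)]

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Report 301/302 responses as their own status instead of silently following them."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def check_status(url):
    """Return the HTTP status code for a URL, without following redirects."""
    req = urllib.request.Request(url, method="HEAD")
    opener = urllib.request.build_opener(NoRedirect())
    try:
        return opener.open(req).status
    except urllib.error.HTTPError as e:
        return e.code

# Usage (requires network; replace with your real sitemap URL):
#   with urllib.request.urlopen("https://www.example.com/sitemap.xml") as resp:
#       urls = parse_sitemap(resp.read())
#   for url in urls:
#       if check_status(url) != 200:
#           print(url)  # anything other than 200 does not belong in the sitemap
```

Anything the script flags (redirects, 404s, errors) is a URL you should remove from the sitemap or fix.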
I don't know if you're using a CDN; if you could share a little more with me, especially the domain, I can be a lot more helpful.
You could also use a tool like Screaming Frog to generate a new sitemap and make sure that is not the issue. If you're using Yoast, you can turn its sitemap feature off and on to regenerate the sitemap.
You can generate up to 500 pages for free using the Screaming Frog XML Sitemap Generator; it is paid after that: https://www.screamingfrog.co.uk/xml-sitemap-generator/
Or, if you want to generate over 1,000 URLs for free online, I would recommend https://www.sureoak.com/seo-tools/google-xml-sitemap-generator
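If you'd rather not depend on a third-party generator at all, the sitemap format is simple enough to emit yourself. Here is a bare-bones sketch; the URL list would come from your own crawl or CMS, and it should contain only canonical, indexable 200-status URLs:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap string from a list of canonical, indexable URLs."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

# Example:
#   print(build_sitemap(["https://www.example.com/", "https://www.example.com/about"]))
```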
However, please keep in mind that the SureOak site also offers things like a "keyword density checker," which makes me wary of the advice it gives out, because keyword density is not something Google actually considers (unless you literally use the same word for every word in the document). Keyword density is one of those things that simply isn't real.
But the XML sitemap generator works just fine. I hope this was of help,
Tom
Related Questions
-
Customer Reviews on Product Page / Pagination / Crawl 3 review pages only
Hi experts, I present customer feedback, basically reviews, on my website for the products that are sold, with the ability to read the reviews and, obviously, pagination to display them. I want users to be able to flick through and read the reviews to satisfy whatever curiosity they have. My only concern is that the pages containing the reviews will present roughly the same content with each click of the pagination; the only thing that changes is the title tag and the page number in the H1. I'm thinking this could be duplication, but I have yet to be notified by Google in my Search Console... Should I block crawlers from crawling beyond page 3 of reviews? Thanks
Technical SEO | Train4Academy.co.uk | 0
-
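One way to implement the "only crawl the first three pages" idea is with robots.txt wildcards, which Google supports. This is only a sketch and assumes a hypothetical ?reviews_page=N URL parameter; adjust it to the site's actual pagination URLs. For Google, the most specific (longest) matching rule wins, so the Allow lines take precedence for pages 1 through 3:

```
# robots.txt sketch; ?reviews_page= is a hypothetical parameter name
User-agent: *
Disallow: /*?reviews_page=
Allow: /*?reviews_page=1$
Allow: /*?reviews_page=2$
Allow: /*?reviews_page=3$
```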
Pages crawled per day has gone drastically down, is it a Google issue?
Hello Experts, in Search Console under Crawl Stats, pages crawled per day has been dropping day by day, from about 400,000 (4 lakh) pages per day to about 200,000 (2 lakh) over the last 15 days. So where is the issue? Am I doing something wrong, or is it an issue on Google's end? Thanks!
Technical SEO | Johny12345 | 0
-
On our site, some wrong links were entered by mistake and Google crawled them. We have fixed those links, but they still show up as Not Found errors. Should we just mark them as fixed, or what is the best way to deal with them?
A parameter was not sent, so the links were generated as null/city and null/country instead of cityname/city.
Technical SEO | Lybrate0606 | 0
-
Webmaster console: increase in the number of URLs we were blocked from crawling due to authorization permission errors.
Hi guys, I received this warning in my Webmaster console: "Google detected a significant increase in the number of URLs we were blocked from crawling due to authorization permission errors." So I went to the "Crawl Errors" section and found errors like these under the "Access denied" status: ?page_name=Cheap+Viagra+Gold+Online&id=471 ?page_name=Cheapest+Viagra+Us+Licensed+Pharmacies&id=1603 and many more URLs like these. Does anybody know what this is and where it comes from? Thanks in advance!
Technical SEO | odmsoft | 0
-
Take a good number of existing landing pages offline because of low traffic, cannibalization, and thin content
Hello Guys, I decided to take about 20% of my existing landing pages offline (about 50 of 250, which were launched about 8 months ago). The reasons are: (1) these pages sent no organic traffic at all in these 8 months; (2) often really similar landing pages exist (with just minor keyword-targeting differences and what I would call "thin" content); (3) moreover, I had some Panda issues in October. Basically, I ranked with multiple landing pages for the same keyword in the top ten, and in October many of these pages dropped out of the top 50. I also noticed that for some keywords one landing page dropped out of the top 50 while another climbed from 50 into the top 10 in the same week; the next week the new landing page dropped to 30, then out of the top 50, and the old landing page came back to the top 20, but not the top ten. This all happened in October. Did anyone observe such things as well? Those are the reasons why I came to the conclusion to take these pages offline and integrate some of the good content into the other similar pages, to target more broadly with one page instead of two, and I hope my remaining landing pages will benefit. I hope all agree? Now to the real question: should I redirect all the pages I take offline? They send basically no traffic at all, and none of them should have external links, so I would not give away any link juice. Or should I just remove the URLs in Google Webmaster Tools and then take them offline? Like I said, the pages are basically dead, and personally I see no reason for these 50 redirects. Cheers, Heiko
Technical SEO | _Heiko_ | 0
-
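On the redirect question above: where a page's content is merged into another page, a 301 preserves the mapping, and where a page is simply gone, serving 410 tells Google explicitly that the removal is intentional. A minimal Apache .htaccess sketch, with hypothetical paths:

```
# .htaccess sketch (Apache mod_alias); the paths are hypothetical examples
Redirect 301 /landing/old-variant/ https://www.example.com/landing/merged-page/
Redirect gone /landing/dead-page/
```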
Numerous 404 errors in crawl diagnostics (non-existent pages)
As new as they come to SEO, so please be gentle... I have a WordPress site set up for my photography business. Looking at my crawl diagnostics, I see several 4xx (client error) alerts. These all point to non-existent pages on my site, e.g.: | http://www.robertswanigan.com/happy-birthday-sara/109,97,105,108,116,111,58,104,116,116,112,58,47,47,109,97,105,108,116,111,58,105,110,102,111,64,114,111,98,101,114,116,115,119,97,110,105,103,97,110,46,99,111,109 | Totally lost on what could be causing this. Thanks in advance for any help!
Technical SEO | Swanny811 | 0
-
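As a side note, the long comma-separated number in that URL decodes as a list of decimal character codes. A quick sketch shows it is a mangled email-obfuscation (mailto) link that got picked up as a page URL, which is a common artifact of WordPress email-protection plugins:

```python
def decode_char_codes(s):
    """Decode a comma-separated list of decimal character codes into a string."""
    return "".join(chr(int(code)) for code in s.split(","))

codes = ("109,97,105,108,116,111,58,104,116,116,112,58,47,47,109,97,105,108,116,"
         "111,58,105,110,102,111,64,114,111,98,101,114,116,115,119,97,110,105,"
         "103,97,110,46,99,111,109")
print(decode_char_codes(codes))  # a doubled-up "mailto:" email link, not a real page
```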
Is it a good idea to use the rel canonical tag to refer to the original source?
Sometimes we also publish our blog posts on an external site, in which case the post is duplicated. From that post we link to the original source, but is it also possible to use the rel canonical tag on the external site? For example: the original blog post is published on http://www.original.com/post, and the same blog post is published on http://www.duplicate.com/post. In this case, is it wise to put a rel canonical on http://www.duplicate.com/post pointing to the original? What do you think? Thanks for the help! Robert
Technical SEO | Searchresult | 0
-
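For reference, a cross-domain canonical like the one described above is just a link element in the head of the duplicate page (using the example URLs from the question):

```
<!-- placed in the <head> of http://www.duplicate.com/post -->
<link rel="canonical" href="http://www.original.com/post">
```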
Moz crawl diagnosis
Hi Mozzers, we are creating a brand-new website for a client, and I would like to run a Moz crawl to fix what has been done wrong. So my question: is it OK to run a Moz crawl with a dev URL? Will final URLs and dev URLs give me the same results or not? Basically, should I wait for the final URL, or is it OK to run a crawl on a dev URL such as www.dev2.example.com or http://183.2564.2864? Thank you 🙂
Technical SEO | Ideas-Money-Art | 0