Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies. More details here.
How many links can you have on sitemap.html
-
We have a lot of pages that we want to create crawlable paths to. How many links can be crawled on a single page of sitemap.html?
-
Sitemaps are limited to 50MB (uncompressed) and 50,000 URLs from Google's perspective.
All formats limit a single sitemap to 50MB (uncompressed) and 50,000 URLs. If you have a larger file or more URLs, you will have to break it into multiple sitemaps. You can optionally create a sitemap index file (a file that points to a list of sitemaps) and submit that single index file to Google. You can submit multiple sitemaps and/or sitemap index files to Google.
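The splitting-plus-index approach described above can be sketched in a few lines. This is a minimal illustration, not a production generator: the `example.com` domain, the `base` filename pattern, and the bare-bones XML (no `<lastmod>`, no escaping of special characters in URLs) are all assumptions for the sake of the example.

```python
# Sketch: split a large URL list into sitemap files of at most
# 50,000 URLs each, plus a sitemap index file that points to them.
MAX_URLS = 50_000  # per-sitemap URL cap from the sitemaps.org protocol

def build_sitemaps(urls, base="https://example.com/sitemap"):
    """Return (index_xml, [sitemap_xml, ...]) for the given URLs."""
    chunks = [urls[i:i + MAX_URLS] for i in range(0, len(urls), MAX_URLS)]
    sitemaps = []
    for chunk in chunks:
        entries = "".join(f"<url><loc>{u}</loc></url>" for u in chunk)
        sitemaps.append(
            '<?xml version="1.0" encoding="UTF-8"?>'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            f"{entries}</urlset>"
        )
    # The index file lists one <sitemap> entry per generated file.
    index_entries = "".join(
        f"<sitemap><loc>{base}-{n}.xml</loc></sitemap>"
        for n in range(1, len(sitemaps) + 1)
    )
    index = (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
        f"{index_entries}</sitemapindex>"
    )
    return index, sitemaps

urls = [f"https://example.com/page-{i}" for i in range(120_000)]
index, files = build_sitemaps(urls)
print(len(files))  # 3 files: 50,000 + 50,000 + 20,000 URLs
```

With 120,000 URLs this yields three sitemap files plus a single index file, which is the one file you would submit to Search Console.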
Just for everyone's reference - here is a great list of 20 limits that you may not know about.
-
Hi Imjonny,
As you know, Google crawls all pages even without a sitemap, so you don't strictly need to create an HTML sitemap; an XML sitemap is sufficient to get all pages crawled. If you have millions of pages, create an HTML sitemap organized by category and keep up to 1,000 links on each page. An HTML sitemap is created for users rather than for Google, so you don't need to worry about it too much.
Thanks
Rajesh -
We break ours down to 1,000 per page - a simple setting in Yoast SEO, if you decide to use their sitemap tool. It's worked well for us, though I may bump that number up a bit.
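The 1,000-links-per-page approach mentioned above is easy to sketch for a hand-rolled HTML sitemap. This is a minimal illustration under assumptions: the `sitemap-N.html` filenames and the bare `<h1>`/`<ul>` layout are hypothetical, and a real page would sit inside your site template.

```python
# Sketch: paginate an HTML sitemap so each page carries at most
# 1,000 links, as discussed above.
PER_PAGE = 1_000

def html_sitemap_pages(links):
    """Yield (filename, html) pairs, one per sitemap page.

    `links` is a list of (url, anchor_text) tuples.
    """
    for n, start in enumerate(range(0, len(links), PER_PAGE), start=1):
        items = "\n".join(
            f'<li><a href="{url}">{text}</a></li>'
            for url, text in links[start:start + PER_PAGE]
        )
        html = f"<h1>Sitemap (page {n})</h1>\n<ul>\n{items}\n</ul>"
        yield f"sitemap-{n}.html", html

links = [(f"/post-{i}", f"Post {i}") for i in range(2_500)]
pages = list(html_sitemap_pages(links))
print(len(pages))  # 3 pages: 1,000 + 1,000 + 500 links
```

Grouping the links by category before paginating, as suggested elsewhere in this thread, would slot in naturally before the loop.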
-
Well, rather, the number of links each page of the sitemap.html is allowed to have. For example, if I have a huge site, I don't want to place all the links on one page; I would probably break them out to give the crawlers some breathing room between different links.
-
Hello!
I take it you are referring to the maximum size and/or the limit on URLs a sitemap file can have. That is answered in the FAQ at sitemaps.org: (link here)
Q: How big can my Sitemap be?
A: Sitemaps should be no larger than 50MB (52,428,800 bytes) and can contain a maximum of 50,000 URLs.
Best of luck!
GR
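The two limits quoted from the FAQ can be checked mechanically before submitting a file. A minimal sketch, purely illustrative - it counts `<loc>` tags as a proxy for URL entries rather than parsing the XML properly:

```python
# Sketch: check a sitemap string against the published limits
# (50MB uncompressed, 50,000 URLs).
MAX_BYTES = 52_428_800  # 50MB, per the sitemaps.org FAQ
MAX_URLS = 50_000

def within_limits(sitemap_xml: str) -> bool:
    """True if the sitemap stays inside both published limits."""
    size_ok = len(sitemap_xml.encode("utf-8")) <= MAX_BYTES
    count_ok = sitemap_xml.count("<loc>") <= MAX_URLS
    return size_ok and count_ok

small = "<urlset><url><loc>https://example.com/a</loc></url></urlset>"
print(within_limits(small))  # True
```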