Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
How many links can you have on sitemap.html?
-
We have a lot of pages that we want to create crawlable paths to. How many links can be crawled on one page of sitemap.html?
-
Sitemaps are limited to 50MB (uncompressed) and 50,000 URLs from Google's perspective.
All formats limit a single sitemap to 50MB (uncompressed) and 50,000 URLs. If you have a larger file or more URLs, you will have to break it into multiple sitemaps. You can optionally create a sitemap index file (a file that points to a list of sitemaps) and submit that single index file to Google. You can submit multiple sitemaps and/or sitemap index files to Google.
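For anyone who wants to automate that split, here's a minimal sketch in Python, assuming a flat list of URLs; the filenames and base URL are illustrative placeholders, not a required layout:

```python
# Minimal sketch: split a large URL list into <=50,000-URL sitemap files
# plus a sitemap index, per the limits quoted above. Filenames and the
# base_url are placeholders for illustration only.
from xml.sax.saxutils import escape

MAX_URLS = 50_000  # Google's per-sitemap URL limit

def write_sitemaps(urls, base_url="https://www.example.com"):
    """Write sitemap-1.xml, sitemap-2.xml, ... plus sitemap-index.xml."""
    chunks = [urls[i:i + MAX_URLS] for i in range(0, len(urls), MAX_URLS)]
    for n, chunk in enumerate(chunks, start=1):
        with open(f"sitemap-{n}.xml", "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in chunk:
                f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
            f.write("</urlset>\n")
    # The index file simply points at each child sitemap.
    with open("sitemap-index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for n in range(1, len(chunks) + 1):
            f.write(f"  <sitemap><loc>{base_url}/sitemap-{n}.xml</loc></sitemap>\n")
        f.write("</sitemapindex>\n")
```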
Just for everyone's reference, here is a great list of 20 limits that you may not know about.
-
Hi Imjonny,
As you know, Google can crawl all of your pages without any sitemap, so you don't strictly need an HTML sitemap; an XML sitemap is sufficient for getting all pages crawled. If you have millions of pages, create an HTML sitemap organized by category and keep up to 1,000 links on each page. And since an HTML sitemap is made for users, not for Google, you don't need to worry about it too much.
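For illustration, here's a rough sketch of that 1,000-links-per-page pagination in Python; the `links` list, filenames, and markup are hypothetical placeholders:

```python
# Rough sketch of the "up to 1,000 links per HTML sitemap page" idea.
# `links` is a hypothetical list of (url, anchor_text) tuples; the
# filenames and markup are illustrative, not a required format.
from html import escape

LINKS_PER_PAGE = 1_000

def write_html_sitemap(links):
    """Write sitemap-page-1.html, sitemap-page-2.html, ..."""
    pages = [links[i:i + LINKS_PER_PAGE] for i in range(0, len(links), LINKS_PER_PAGE)]
    for n, page in enumerate(pages, start=1):
        items = "\n".join(
            f'    <li><a href="{escape(url)}">{escape(text)}</a></li>'
            for url, text in page
        )
        html = (f"<html><body>\n  <h1>Sitemap (page {n} of {len(pages)})</h1>\n"
                f"  <ul>\n{items}\n  </ul>\n</body></html>\n")
        with open(f"sitemap-page-{n}.html", "w", encoding="utf-8") as f:
            f.write(html)
```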
Thanks
Rajesh -
We break ours down to 1,000 per page; it's a simple setting in Yoast SEO, if you decide to use their sitemap tool. It's worked well for us, though I may bump that number up a bit.
-
Well, rather, the number of links each page of the sitemap.html is allowed to have. For example, if I have a huge site, I don't want to place all the links on one page; I would probably break them out to give the crawlers some breathing room between different links.
-
Hello!
I gather you are referring to the maximum size and/or the number of URLs the sitemap file can have. That is answered in the FAQ at sitemaps.org: (link here)
Q: How big can my Sitemap be?
Sitemaps should be no larger than 50MB (52,428,800 bytes) and can contain a maximum of 50,000 URLs.
Best of luck!
GR
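As a quick sanity check against those two limits, here's a minimal sketch; the file path is a placeholder for your own sitemap:

```python
# Quick sanity check of an existing sitemap file against the two limits
# quoted above (50MB uncompressed, 50,000 URLs). "sitemap.xml" is a
# placeholder path; point it at your own file.
import os
import xml.etree.ElementTree as ET

MAX_BYTES = 52_428_800   # 50MB, as quoted in the sitemaps.org FAQ
MAX_URLS = 50_000

def check_sitemap(path="sitemap.xml"):
    size = os.path.getsize(path)
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    url_count = len(ET.parse(path).getroot().findall("sm:url", ns))
    print(f"{path}: {size:,} bytes, {url_count:,} URLs")
    if size > MAX_BYTES or url_count > MAX_URLS:
        print("Over the limit -- split into multiple sitemaps plus an index.")
```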
Related Questions
-
Spam Score & Redirecting Inbound Links
Hi, I recently downloaded a spreadsheet of inbound links to my client sites and am trying to 301 redirect the ones that are formatted incorrectly or are just bad links in general (they all link to the site domain, but they used to have differently formatted URLs on their old site, or the link URL in general has strange stuff on it). My question is: should I even bother redirecting these links if their spam score is a little high (i.e., 20-40%)? Each already links to the existing domain, just with a differently formatted URL. I just want to make sure it goes to a valid URL on the site, but I don't want to redirect to a valid URL if it's going to harm the client's SEO. Also not sure what to do about the links with the --% spam score. I really appreciate any input, as I don't have a lot of experience with how to deal with spammy links.
White Hat / Black Hat SEO | AliMac260
-
Sitewide nav linking from subdomain to main domain
I'm working on a site that was heavily impacted by the September core update. You can see in the attached image the overall downturn in organic traffic in 2019, with a larger hit in September bringing Google organic traffic down around 50%. There are many concerning incoming links, from 50-100 obviously spammy porn-related websites to just plain old unnatural links. There was no effort to purchase any links, so it's unclear how these were created. There are also 1,000s of incoming external links (most without nofollow and with similar/same anchor text) from yellowpages.com. I'm trying to get this fixed with them and have added it to the disavow in the meantime. I'm focusing on internal links as well, with a more specific question: if I have a sitewide header on a blog located at blog.domain.com that has links to various sections on domain.com without nofollow tags, is this a possible source of the traffic drops and algorithm impact? The header with these links is on every page of the blog on the previously mentioned subdomain. **More generally, any advice as to how to turn this around?** The website is in the travel vertical.
White Hat / Black Hat SEO | ShawnW0
-
Can a Self-Hosted Ping Tool Hurt Your IP?
Confusing title, I know, but let me explain. We are in the middle of programming a lot of SEO "action" tools for our site. These will be available for users to help better optimize their sites in SERPs. We were thinking about adding a "ping" tool based in PHP so users can ping their domain and hopefully get some extra attention/speed up indexing of updates. This would be hosted on a subdomain of our site. My question is: if we get enough users using the product, could that potentially get us blacklisted with Google, Bing, etc.? Technically it needs to send out the ping request, and that would be coming from the same IP address that our main site is hosted on. If we end up getting over 1,000 users all trying to send ping requests, I don't want to potentially jeopardize our IP. Thoughts?
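For context, such a "ping" usually amounts to a single GET request to a search engine's ping endpoint with the sitemap URL. A minimal sketch is below; Google's endpoint shown here was real at the time of this thread but has since been deprecated:

```python
# Hedged sketch of what a sitemap "ping" amounts to: one GET request.
# Google's /ping endpoint was real when this thread was written but has
# since been deprecated, so treat this as historical illustration.
import urllib.parse
import urllib.request

def ping_google(sitemap_url):
    # The sitemap URL must be URL-encoded before being passed as a parameter.
    endpoint = ("https://www.google.com/ping?sitemap="
                + urllib.parse.quote(sitemap_url, safe=""))
    with urllib.request.urlopen(endpoint) as resp:
        return resp.status  # 200 meant the ping was received
```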
White Hat / Black Hat SEO | David-Kley0
-
Should I submit a sitemap for a site with dynamic pages?
I have a coupon website (http://couponeasy.com). Being a coupon website, my content is always changing automatically (as new coupons are added and expired deals are removed). I wish to create a sitemap, but I realised that there is not much point in creating a sitemap for all pages, as they will be removed sooner or later and/or are canonical. I have about 8-9 pages which are static, and hence I can include them in a sitemap. Now the question is: if I create the sitemap for these 9 pages and submit it to Google Webmaster Tools, will the Google crawlers stop indexing other pages? NOTE: I need to create the sitemap for getting expanded sitelinks. http://couponeasy.com/
White Hat / Black Hat SEO | shopperlocal_DM
-
Are All Paid Links and Submissions Bad?
My company was recently approached by a website dedicated to delivering information and insights about our industry. They asked us if we wanted to pay for a "company profile" where they would summarize our company, add a followed link to our site, and promote a giveaway for us. This website is very authoritative and definitely useful to its audience. How can this website get away with paid submissions like this? Doesn't that go against everything Google preaches? If I were to pay for a profile with them, should I request a "nofollow" link back to my site?
White Hat / Black Hat SEO | jampaper1
-
Internal Links to Ecommerce Category Pages
Hello, I read a while back, and I can't find it now, that you want to add internal links to your main category pages. Does that still apply? If so, for a small site (100 products) what is recommended? Thanks
White Hat / Black Hat SEO | BobGW0
-
Is it worth getting links from .blogspot.com and .wordpress.com?
Our niche ecommerce site has only one thing going for it: We have numerous opportunities on a weekly basis to get reviews from "mom bloggers". We need links - our domain authority is depressing. My concern is that these "mom bloggers" tend to have blogs that end with .blogspot.com or .wordpress.com. How do I screen for "reviewers" that are worth getting links from and how can I make the most of the community we have available to us?
White Hat / Black Hat SEO | Wilkerson1
-
Why would links that were deleted by me 3 months ago still show up in reports?
I inadvertently created a mini link farm some time back by linking all of my parked domains (2,000 plus) to some of my live websites (I was green and didn't think linking between same-owner sites/domains was an issue). These websites were doing well until Penguin, and although I did not get any 'bad link' notices from Google, I figure I was hit by Penguin. So about 3 or 4 months ago I painstakingly deleted ALL links from all of those domains that I still own (only 500 or so - the others were allowed to lapse). None of those domains have any outbound links at all, but old links from those domains are still showing up in WMT, in SEOmoz, and in every other link tracking report I have run. So why would these links still be reported? How long do old links stay in the internet archives? This may sound like a strange question, but do links 'remain with a domain' for a given period of time regardless? Are links archived before being 'thrown out' of the web? I know Google keeps archives of data that has expired, been deleted, websites that have closed, etc., for about 3 years or so(?). In an effort to correct the situation I have spent countless hours manually deleting thousands of links, but they won't go away. Looking for some insight here please. cheers, Mike
White Hat / Black Hat SEO | shags380