Is there a suggested limit to the number of links in a sitemap?
-
Currently, I have an error on my Moz dashboard indicating there are too many links on one of my pages. That page is the sitemap. It was my understanding that all internal pages should be linked from the sitemap.
Can any mozzers help clarify the best practice here?
Thanks,
Clayton
-
Your HTML sitemap is mainly for website visitors, so best practice is to list only the most important sections and pages. Google can use your HTML sitemap page to crawl the rest of your site as long as the structure can be followed.
If you have lots of pages, it's best to use an XML sitemap submitted through Google Webmaster Tools. Once your XML sitemap is in the root directory of your website, you can also let search engines know its location through your robots.txt file, like this:
User-agent: *
Sitemap: http://www.SomeDomain.com/sitemap.xml
If your site changes over time, it's a good idea to create fresh sitemaps - just set reminders for yourself in a calendar.
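For reference, a minimal XML sitemap file following the sitemaps.org protocol looks like this (the domain and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.SomeDomain.com/</loc>
    <lastmod>2012-01-01</lastmod>
  </url>
</urlset>
```

Only the `<loc>` element is required per URL; `<lastmod>` and the other optional tags are hints that crawlers may or may not use.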
-
That makes sense. Thanks.
-
Thanks for the help BrewSEO, Darin, and Zora.
-
I believe he is referring to an actual HTML sitemap page, not an XML file to submit to Google.
-
Don't worry about it. The "too many links" message is based on Google's suggestion to keep pages to fewer than 100 links. Sitemaps are obviously an exception to this rule, and with good reason. You are fine.
-
The answer is "technically" 50,000 URLs per sitemap file.
However, file size matters too: a sitemap must be no bigger than 50MB.
If you have more URLs than Google's guidelines allow, break them up into multiple sitemaps and reference them from a sitemap index file.
Here are Google's guidelines on sitemaps:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=183668
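Splitting a large URL list into multiple sitemaps plus an index file can be scripted. Here is a rough sketch in Python; the helper names and the 50,000-URL chunk size reflect the limits discussed above, but the function names themselves are illustrative, not any standard API:

```python
from xml.sax.saxutils import escape

SITEMAP_URL_LIMIT = 50_000  # Google's per-file URL limit


def chunk_urls(urls, limit=SITEMAP_URL_LIMIT):
    """Split a URL list into chunks that each fit in one sitemap file."""
    return [urls[i:i + limit] for i in range(0, len(urls), limit)]


def build_sitemap(urls):
    """Render one <urlset> sitemap document for a chunk of URLs."""
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>"
    )


def build_index(sitemap_urls):
    """Render the <sitemapindex> file that points at each child sitemap."""
    entries = "\n".join(
        f"  <sitemap><loc>{escape(u)}</loc></sitemap>" for u in sitemap_urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</sitemapindex>"
    )
```

You would write each chunk to its own file (sitemap1.xml, sitemap2.xml, ...) and submit only the index file; the same 50,000-entry limit also applies to the index itself.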
Related Questions
-
How can I stop a tracking link from being indexed while still passing link equity?
I have a marketing campaign landing page and it uses a tracking URL to track clicks. The tracking links look something like this: http://this-is-the-origin-url.com/clkn/http/destination-url.com/ The problem is that Google is indexing these links as pages in the SERPs. Of course when they get indexed and then clicked, they show a 400 error because the /clkn/ link doesn't represent an actual page with content on it. The tracking link is set up to instantly 301 redirect to http://destination-url.com. Right now my dev team has blocked these links from crawlers by adding Disallow: /clkn/ in the robots.txt file, however, this blocks the flow of link equity to the destination page. How can I stop these links from being indexed without blocking the flow of link equity to the destination URL?
Technical SEO | UnbounceVan
-
Sitemap
I have a question about the links in a sitemap. WordPress generates a sitemap index that first links to the sitemaps for the different kinds of pages: pagesitemap.xml, categorysitemap.xml, productsitemap.xml, etc. The links on that first page are clickable. We have a website whose index also links to the different sitemaps, but the links are not clickable, just flat text. Is this an issue?
Technical SEO | Happy-SEO
-
Recovering from Sitemap Issues with Bing
Hi all, I recently took over SEO efforts for a large e-commerce site (I would prefer not to disclose). About a month ago, I began to notice a significant drop in traffic from Bing and uncovered in Bing Webmaster Tools that three different versions of the sitemap were submitted and Bing was crawling all three. I removed the two out of date sitemaps and re-submitted the up to date version. Since then, I have yet to see Bing traffic rebound and the amount of pages indexed by Bing is still dropping daily. During this time there has been no issue with traffic from Google. Currently I have 1.3 million pages indexed by Google while Bing has dropped to 715K (it was at 755K last week and was on par with Google several months ago). I know that no major changes have been made to the site in the past year so I can't point to anything other than the sitemap issue to explain this. If this is indeed the only issue, how long should I expect to wait for Bing to re-index the pages? In the interim I have been manually submitting important pages that aren't currently in the index. Any insights or suggestions would be very much appreciated!
Technical SEO | tdawson09
-
Webmaster tools not showing links but Moz OSE is showing links. Why can't I see them in the Google Search Console
Hi, Please see attached photos. I have a website that shows external follow links when performing a search on Open Site Explorer. However, they are not recognised or visible in Search Console. This is the case for both internal and external links. The internal links are 'nofollow', which I am getting our developer to rectify. Any ideas why I can't see the 'follow' external links? Thanks in advance to those who help me out. Jesse
Technical SEO | jessew
-
Is it possible to export Inbound Links in a CSV file categorized by Linking Root Domains ?
Hi, I am performing an analysis of the total inbound links to my homepage, and I would like to have the total amount of inbound links categorized by linking root domain. For example, Open Site Explorer offers a feature to show you the linking root domains to your page. When you click on the first linking root domain, it also shows you the top linking pages (meaning all the pages that link to your page from that particular root domain). Now I would like to export this data to a CSV file, but Open Site Explorer only exports the total amount of linking root domains. Does anyone have a solution to this problem? Thank you very much for the help in advance!
Technical SEO | Feweb
-
Do we need to manually submit a sitemap every time, or can we host it on our site as /sitemap and Google will see & crawl it?
I realized we don't have a sitemap in place, so we're going to get one built. Once we do, I'll submit it manually to Google via Webmaster tools. However, we have a very dynamic site with content constantly being added. Will I need to keep manually re-submitting the sitemap to Google? Or could we have the continually updating sitemap live on our site at /sitemap and the crawlers will just pick it up from there? I noticed this is what SEOmoz does at http://www.seomoz.org/sitemap.
Technical SEO | askotzko
-
Could somebody suggest a GOOD Wordpress XML sitemap generator?
We have been putzing around with Google XML Sitemaps Generator (a plug-in for WordPress) for our WordPress blog and we cannot get it to write an XML sitemap! Could somebody suggest a viable alternative that actually works? Thank you for your help! Jay
Technical SEO | theideapeople
-
External Sitewide Links and SEO
I have one big question about the potential SEO value -- and possibly also dangers? -- of "followed" external sitewide links. Examples of these would be: a link to your site from another site's footer; a blogroll link; a link to your site from another site's global navigation. Aside from the link's position in the HTML file (the higher the better, presumably), are these links essentially the same from an SEO point of view, or different (and how)? There used to be an influential view that the link juice value of a sitewide link was the same as that of a single link (presumably from the linking site's home page), even though a sitewide link may in fact result in a huge number of individual links. Is this true or false? What is the math here? Should one worry about having "too many" sitewide links, in the sense that this may raise red flags with the algorithm? I talked to someone a few months ago (before the recent algo updates) who believed he had got a minus-10 penalty, or whatever it was, for getting too many sitewide links. We offer website design and development as well as SEO, and we put a keyworded link to ourselves in the footer. I think this is a fairly common practice. Is this a good or bad idea SEO-wise? One opinion is that for external sitewide footer links, you should have a dofollow link on the home page but nofollow it on all other pages. What is your opinion about that? Is there anything else that is distinct, interesting or important about sitewide links' SEO value and pitfalls? Thank you!
Technical SEO | Philip-SEO