Sitemap Page - HTML and XML
-
Hi there,
I have a domain which has a sitemap in HTML for regular users and a sitemap in XML for the spiders.
I have a warning via SEOmoz saying that I have too many links on the HTML version.
What do I do here?
Regards,
Stef
-
Sorry for the late reply, guys. Great advice from both of you.
@Alan, great display of how PageRank flows. A great illustration which I could never explain to clients myself.
-
220 links on a page is absolutely not too many on any level. Many of the highest-ranked sites on the internet present more than 220 links.
The particular page in question is simply a sitemap, and the page is being offered to help users navigate the site. The VerizonWireless.com sitemap I shared has 370+ links on it.
The SEOmoz "warning" is a simple feature which will be set off on any internet page with 100+ links. The SEOmoz tool does not care how well those links are presented, whether they are footer links, whether they are on a content page, what the PA of the page is nor any other SEO factor. It is simply a >100 or not warning. As such, it offers very little value.
I am in the process of compiling a list of suggested features for the tool which will help improve its usefulness. One of the feature recommendations I am proposing is to allow users to adjust the 100 count to any number they want. Each SEO can then choose to use the default 100, or use a number more suited to the particular site.
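A hypothetical sketch of that proposal in Python (the function name and defaults here are my own, not anything from the SEOmoz tool):

```python
# Hypothetical sketch of the proposed feature: let each SEO set their own
# link-count threshold instead of the tool's fixed ">100 links" rule.
def too_many_links(link_count, threshold=100):
    """Return True when a page exceeds the (user-adjustable) link limit."""
    return link_count > threshold

print(too_many_links(220))                 # default 100-link rule fires: True
print(too_many_links(220, threshold=250))  # custom limit for a sitemap: False
```

The point is only that the check is a bare number comparison; making the number configurable per site is the whole feature.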
The link Alan shared is a nice explanation of PR flow. It is a nice page for learning PR, but with respect to this topic it over-complicates an otherwise very simple and straightforward question. The simple point is: the more links on a page, the less link juice will flow to each link.
The goals for any web page links should be as follows:
1. Ensure all links are useful for your site. For example, you probably want PR flowing to your most profitable product/service, and to your latest additions.
2. Ensure your links are actually used. Check analytics.
3. If a link is not used or not useful, remove it.
4. Along the lines above, your links should be presented in a very user-friendly manner. You don't want a page to look like a list of nothing but links as users will have a difficult time choosing what they want. An exception would be a sitemap.
With the above in mind, keep as many links as you see fit on the page. If it is 40, that is fine. If there are 250 links on the page, that is fine as well. When you start down a path of chasing numbers such as forcing your content into "500 words" or forcing your links into "100 maximum" you fall into a pit of SEO fallacies. You are not providing the best experience for your users nor SEO.
TL;DR - Provide your links in a manner which is visually appealing, non-spammy and helpful to users. Keep in mind your need to flow PR to important pages such as your money pages. Otherwise remove unnecessary links. Whatever that number of links is, so be it. Don't try to fit your links into a "I must be under 100" or any other number mindset.
-
Too many according to Google; make of it what you will. It does not look like it is for any technical reason anymore, but obviously there is a limit to how much of a page they will crawl.
http://www.mattcutts.com/blog/how-many-links-per-page/
You see how PageRank flows; having a lot of links on your home page works to your advantage. Using numbers from Google's original algorithm:
Assuming every page starts with 1 PR, a page passes 85% of its link juice, so if you have 100 links that's 0.0085 each. To 100 internal pages, making them 1.0085 each. Now they all pass back 85%, that's 0.857225 each, x 100 = 85.7225 back to your home page. Now we do the sums all over again and again until the numbers stay static. This calculation relies on the internal pages having no other links, so you are unlikely to get figures as good as this, but you get the idea.
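The back-of-envelope sums above can be sketched as an iterative calculation. This is a hypothetical illustration using the classic PageRank formula (PR = (1 - d) + d × sum of incoming PR / outgoing links) with damping d = 0.85, so the converged figures differ from the rough running totals in the post:

```python
# Hub-and-spoke PageRank sketch: a home page linking to 100 internal pages,
# each of which links only back to home. Classic formula with damping 0.85.
d = 0.85          # damping: each page passes on 85% of its PR
n_spokes = 100    # home page links to 100 internal pages

home, spoke = 1.0, 1.0   # every page starts with PR 1
for _ in range(100):     # do the sums over and over until the numbers stay static
    new_home = (1 - d) + d * n_spokes * spoke   # each spoke's single link points home
    new_spoke = (1 - d) + d * home / n_spokes   # home splits its PR across 100 links
    home, spoke = new_home, new_spoke

print(round(home, 2), round(spoke, 4))
```

At the fixed point the home page ends up with far more PR than it started with, which is the "works to your advantage" point, and it only holds because the spokes have no other links.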
See link for better explanation.
http://www.webworkshop.net/pagerank.html (check out the calculator)
Remember, don't stuff up your linking structure for the users just for the sake of PageRank. I see it like a golf swing after a lesson: if you try to do what you just learnt too much, you will get all stiff and unnatural. It's better to swing naturally with what you have learnt in the back of your head.
-
Yes, ignore the warning.
It is possible to present 220 links in a neat, categorized manner. It is also possible to present 100 links as a jumble which is not user friendly.
You shared that your presentation is similar to the example I gave, which means it is user friendly, so ignoring the warning is fine.
-
Nice, I really like that example you gave. Mine is similar and categorized too. The question still remains: do I ignore this warning for this page?
-
I have about 220 links
-
Well, how many do you have?
A quick way of checking is with IE: press F12, go to the View menu, then Link Report.
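If you'd rather not rely on IE's dev tools, here is a minimal sketch that counts links with Python's standard-library HTMLParser (it assumes you already have the page's HTML as a string):

```python
# Count the links on a page: every <a> tag that actually has an href attribute.
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs; only count anchors with href.
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

html = '<p><a href="/home">Home</a> <a href="/sitemap">Sitemap</a> <a name="x">no href</a></p>'
counter = LinkCounter()
counter.feed(html)
print(counter.count)  # 2 — the named anchor without an href is skipped
```

Feed it your sitemap page's source and compare the count against whatever threshold you care about.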
-
Your HTML sitemap is for users. It should present your links in such a manner as to be useful for users who are looking for a page on your site.
An example sitemap for a large site: http://www.verizonwireless.com/b2c/sitemap.jsp
It does not contain a link to every last page. It is more of a helpful directory. I would suggest you adjust your HTML sitemap in a similar manner. Treat it as a page of links for users.
-
So do you think that i should ignore this warning for the sitemap html page?
-
Well, have a look if you can move a few out. It is good to link to as many pages as you can from the home page for the sake of PR flow, but not to go over the limit. Some say the limit is 100, some say 150.