Do you need an on-page site map as well as an XML sitemap?
-
Do on-page site maps help with SEO, or are they more for user experience? We submit and update our XML sitemaps for the search engines, but we're wondering whether a /sitemap page for users is necessary.
-
In my experience, an HTML sitemap does not affect SEO as far as getting the site indexed, assuming you have a good XML sitemap and a good link structure.
I would say that the sole reason to provide an HTML sitemap is UX, but that is a huge reason to do it. Anything that upgrades your UX also has the potential to boost your SEO.
Making the full site easy to see and easy to explore will encourage longer site visits and more page views, and that affects your value in the eyes of the search engines.
To expand on what Branden said, specialized sitemaps are a very good idea. Since you were hoping to get rid of a sitemap, this may not be what you want to hear, but you really should have a sitemap for each type of content you have:
Mobile | Video | News | Images | TV Shows and of course your general web sitemap
Take a moment to learn about each of them, as you will want to understand each map's requirements. For example, you should never have anything on a news sitemap that is older than two days, and Google will reject anything that does not meet its criteria.
Here is the Google Resource that explains it all.
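As a sketch of how the specialized sitemaps tie together: a sitemap index file simply lists each child sitemap, and the search engines discover them from there. A minimal example using Python's standard library (the domain and file names here are hypothetical, not from the original question):

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap_index(sitemap_urls):
    """Build a sitemap index document pointing at each child sitemap."""
    ET.register_namespace("", NS)
    index = ET.Element(f"{{{NS}}}sitemapindex")
    for url in sitemap_urls:
        sm = ET.SubElement(index, f"{{{NS}}}sitemap")
        loc = ET.SubElement(sm, f"{{{NS}}}loc")
        loc.text = url
    return ET.tostring(index, encoding="unicode")

# Hypothetical child sitemaps, one per content type
children = [
    "https://www.example.com/sitemap-pages.xml",
    "https://www.example.com/sitemap-video.xml",
    "https://www.example.com/sitemap-news.xml",
    "https://www.example.com/sitemap-images.xml",
]
print(build_sitemap_index(children))
```

Each child sitemap then follows its own content-type rules (news freshness, video metadata, and so on) while the index stays stable.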
-
Yes, you should use both.
You should also have an XML sitemap for video. I recommend using Wistia if you have a good budget, since it automatically creates the video XML; otherwise use Vimeo Pro if you are on a budget. http://www.seomoz.org/blog/hosting-and-embedding-for-video-seo
And use an XML sitemap for your blog (a Google News XML sitemap) if you publish lots of articles and want them included in Google News. http://www.seomoz.org/blog/how-seomoz-gained-1000s-of-visits-from-google-news
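One concrete rule worth automating here: as noted above, a Google News sitemap should only contain very recent articles (roughly the last two days), so a generator should filter the article list before writing the file. A minimal sketch, where the article fields and URLs are hypothetical:

```python
from datetime import datetime, timedelta, timezone

def news_eligible(articles, now=None, max_age_days=2):
    """Keep only articles recent enough for a Google News sitemap."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=max_age_days)
    return [a for a in articles if a["published"] >= cutoff]

# Hypothetical article records with publish timestamps
now = datetime(2024, 1, 10, tzinfo=timezone.utc)
articles = [
    {"url": "https://www.example.com/fresh-story",
     "published": datetime(2024, 1, 9, tzinfo=timezone.utc)},
    {"url": "https://www.example.com/old-story",
     "published": datetime(2024, 1, 1, tzinfo=timezone.utc)},
]
eligible = news_eligible(articles, now=now)
print([a["url"] for a in eligible])  # only the fresh story remains
```

Running this filter on each regeneration keeps stale entries out, so Google has no reason to reject the feed.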
Related Questions
-
Google Search Console Site Map Anomalies (HTTP vs HTTPS)
Hi, I've just done my usual Monday morning review of a client's Google Search Console (previously Webmaster Tools) dashboard and was disturbed to see that for one client the Sitemaps section is reporting 95 pages submitted yet only 2 indexed (last week it was reporting an expected level of indexed pages). It says the sitemap was submitted on the 10th of March and processed yesterday. However, the 'Index Status' section shows a graph of growing indexed pages up to and including yesterday, where they numbered 112 (so it looks like all pages are indexed after all). The 'Crawl Stats' section also shows 186 pages crawled on the 26th.

It then lists sub-sitemaps, all of which are HTTP, which seems very strange since the site has been HTTPS for a few months now and the main sitemap index URL is HTTPS: https://www.domain.com/sitemap_index.xml. The sub-sitemaps are:

http://www.domain.com/marketing-sitemap.xml
http://www.domain.com/page-sitemap.xml
http://www.domain.com/post-sitemap.xml

There are no 'Sitemap Errors' reported, but there are 'Index Error' warnings for the above post-sitemap, copied below: "When we tested a sample of the URLs from your Sitemap, we found that some of the URLs were unreachable. Please check your webserver for possible misconfiguration, as these errors may be caused by a server error (such as a 5xx error) or a network error between Googlebot and your server. All reachable URLs will still be submitted."

Also, for the sitemap URLs below: "Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page" for:

http://domain.com/en/post-sitemap.xml
https://www.domain.com/page-sitemap.xml
https://www.domain.com/post-sitemap.xml

I take it from all the above that the HTTPS sitemap is mainly fine, that despite the reported 0 pages indexed in the GSC sitemap section the pages are in fact indexed as per the main 'Index Status' graph, and that somehow some HTTP sitemap elements have been accidentally attached to the main HTTPS sitemap and are causing these problems. What's the best way forward to clean up this mess? Resubmitting the HTTPS sitemap sounds like the right option, but seeing as the master indexed URL is an HTTPS URL, I can't see it making any difference until the HTTP aspects are deleted/removed. How do you do that, or even check that that's what's needed? Or should Google just sort this out eventually? I see the graph in 'Crawl > Sitemaps > Web Pages' shows a consistent blue line of submitted pages, but the red line of indexed pages drops to 0 for 3-5 days every 5 days or so. So fully indexed pages are reported for 5-day stretches, then zero for a few days, then indexed for another 5 days, and so on!? Many thanks, Dan
Technical SEO | Dan-Lawrence
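One quick sanity check for a situation like this is to confirm whether the live sitemap index itself still lists HTTP child sitemaps, or whether Search Console is just showing stale data. A minimal sketch with Python's standard library, run here against an inline example rather than a live fetch (the URLs are illustrative):

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def non_https_children(sitemap_index_xml):
    """Return child sitemap URLs in a sitemap index that do not use HTTPS."""
    root = ET.fromstring(sitemap_index_xml)
    locs = [loc.text.strip() for loc in root.findall("sm:sitemap/sm:loc", NS)]
    return [url for url in locs if not url.startswith("https://")]

example = """<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.example.com/page-sitemap.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/post-sitemap.xml</loc></sitemap>
</sitemapindex>"""

print(non_https_children(example))  # the http:// entry is the one to fix
```

If the live index is clean, the HTTP entries in Search Console are leftovers from an earlier submission and should age out; if not, the sitemap generator is still emitting HTTP URLs and needs fixing at the source.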
Test site got indexed in Google - What's the best way of getting the pages removed from the SERPs?
Hi Mozzers, I'd like your feedback on the following: the test/development domain where our sitebuilder works got indexed, despite all warnings and advice. The content on these pages is in active use by our new site. Thus, to prevent duplicate content penalties, we have put a noindex in our robots.txt. However, of course, the pages are currently visible in the SERPs. What's the best way of dealing with this? I did not find related questions, although I think this is a mistake that is often made. Perhaps the answer will also be relevant for others besides me. Thank you in advance, greetings, Folko
Technical SEO | Yarden_Uitvaartorganisatie
Mobile site content and main site content
Help, please! I have one main site and a mobile version of that site (m.domain.com). The main site has more pages, more content, and differently named URLs. The main site has consistently done well in Google. The mobile site has not: the mobile site is buried. I am working on adding more content to the mobile site, but am concerned about duplicate content. Could someone please tell me the best way to deal with these two versions of our site? I can't use rel=canonical because the URLs do not correspond to the same names on the main site, or can I? Does this mean I need to change the URL names, offer different (abridged) content, etc.? I really am at a loss as to how to interpret Google's rules for this. Any help or tips would be greatly appreciated! Thanks!
Technical SEO | lfrazer
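For separate mobile URLs, Google's documented setup is a bidirectional annotation: the desktop page carries a rel="alternate" link pointing at its mobile equivalent, and the mobile page carries a rel="canonical" pointing back at the desktop page. So rel=canonical can be used even when the URL names differ, but it must point from mobile to desktop and the pages must be paired one-to-one. A sketch that generates both tags from a hypothetical URL mapping:

```python
def mobile_annotations(desktop_url, mobile_url):
    """Return the (desktop, mobile) link tags for a paired URL set."""
    desktop_tag = (
        '<link rel="alternate" '
        'media="only screen and (max-width: 640px)" '
        f'href="{mobile_url}">'
    )
    mobile_tag = f'<link rel="canonical" href="{desktop_url}">'
    return desktop_tag, mobile_tag

# Hypothetical pairing: page names differ between the two sites,
# so the mapping has to be maintained explicitly.
pairs = {
    "https://www.example.com/products": "https://m.example.com/p",
}
for desktop, mobile in pairs.items():
    d_tag, m_tag = mobile_annotations(desktop, mobile)
    print(d_tag)  # goes in the desktop page's <head>
    print(m_tag)  # goes in the mobile page's <head>
```

With the pairing in place, the mobile pages are treated as alternates of the desktop pages rather than duplicates.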
Need for a modified meta-description every page for paginated content?
I'm currently working on a site where the URL structure is something like www.domain.com/catagory?page=4, with ~15 results per page. The pages all canonical to www.domain.com/catagory, with rel=next and rel=prev to www.domain.com/catagory?page=5 and www.domain.com/catagory?page=3. Webmaster Tools flags these all as duplicate meta descriptions, so I wondered if there is value in appending the page number to the end of the description (as we have with the title, for the same reason), or if I am using a sub-optimal URL structure. Any advice?
Technical SEO | My-Favourite-Holiday-Cottages
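Appending the page number is a common way to de-duplicate these descriptions, mirroring what is already done with the titles. A minimal sketch of such a template (the wording and suffix format are illustrative, not prescribed):

```python
def paginated_description(base_description, page):
    """Vary the meta description per page of a paginated category."""
    if page <= 1:
        return base_description  # page 1 keeps the canonical wording
    return f"{base_description} - Page {page}"

base = "Browse our holiday cottages by category."
print(paginated_description(base, 1))
print(paginated_description(base, 4))
```

Page 1 stays untouched so the canonical page keeps its hand-written description, while deeper pages become unique automatically.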
Keyword targeting by page, site, or both?
Hi, We recently discovered that a product we sell has a misnomer, and that a ton of people take to Google and use variations of that misnomer while trying to find us. Unfortunately we don't rank in Google for this keyword, and it's costing us thousands in lost sales. I've been slowly building the misnomer into the content of our site in hopes that the spiders will pick up on it. It has started to work in the last couple of weeks, but we're nowhere near the top (and we are #1 and #2 for most of our other prime keywords). The site which sells the product is specialized and only sells this specific product (in different models, but they're all essentially the same product). With that in mind, I'm trying to figure out the best way to attack a new keyword. I know that normally you would dedicate a specific page (in an eCommerce store, probably that product's own page) to employ your SEO tactics. However, because this site specializes in this product and offers different models and information about it, I'm confused about the best approach. Does Google take into consideration the entire site as a whole, or are the pages within my site competing against each other for rank?
Technical SEO | ninjaprecision
Should I ask third-party sites to erase their links pointing at my site?
Good morning, SEOmoz fans. Let me explain what is going on: a surfing site has included a link to my site in their footer. Apparently this could be good for my site, but as it has nothing to do with my site, I ask myself if I should tell them to erase it. Site A (the surfing site) points at Site B (my marketing site) in its footer, so Site B is receiving backlinks from every single page on Site A. But Site B has nothing to do with Site A: different markets. Should I ask them to erase the link in their footer, since surfing people will not find my marketing site interesting? Thanks in advance.
Technical SEO | Tintanus
Google indexing fewer URLs than contained in my sitemap.xml
My sitemap.xml contains 3,821 URLs, but Google (Webmaster Tools) indexes only 1,544 of them. What may be the cause? There is no technical problem. Why does Google index fewer URLs than are contained in my sitemap.xml?
Technical SEO | Juist
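Before looking for indexing causes, it's worth confirming the sitemap actually contains the distinct URLs you think it does; duplicate entries inflate the submitted count without adding indexable pages. A sketch using Python's standard library, run against an inline example (a real check would load the live sitemap file instead):

```python
import xml.etree.ElementTree as ET
from collections import Counter

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_url_stats(sitemap_xml):
    """Count total and duplicated <loc> entries in a urlset sitemap."""
    root = ET.fromstring(sitemap_xml)
    urls = [loc.text.strip() for loc in root.findall("sm:url/sm:loc", NS)]
    counts = Counter(urls)
    dupes = [u for u, n in counts.items() if n > 1]
    return len(urls), dupes

example = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/a</loc></url>
  <url><loc>https://www.example.com/b</loc></url>
  <url><loc>https://www.example.com/a</loc></url>
</urlset>"""

total, dupes = sitemap_url_stats(example)
print(total, dupes)  # 3 entries, one URL duplicated
```

If the counts check out, the gap is more likely a quality or crawl-budget question than a sitemap problem, since submission never guarantees indexing.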
Keywords Ranking Dropped from 1st Page to Beyond 5th Page
Hello, my site URL is http://bit.ly/161NeE, and our site ranked on the first page for over a hundred keywords before March 30. But all of a sudden, all the first-page keywords dropped to the 5th or 6th page. When we search for our site name without ".com", the results appearing on the first page are all from other sites, and our page can only be seen on the 6th page. We think we have been penalized by Google, but we don't know the exact reason. Can anyone please help? Some extra info on our site:

1. We have been building links by posting blogs, articles, and PR. All the articles are unique, written by writers we hire. It has been working fine all along. We also varied the anchor text a lot.
2. We didn't make any changes to the website. But one real problem with our site is that the server has been very slow recently, and when Google crawls our website, many errors are found, mostly 503 and 404 errors. The total number of errors has reached over 50,000. Do you think this might be a reason Google is not displaying us on the first page? Our technicians are working hard to solve the server problem. If it is solved, will our rankings come back?

Please advise. Thanks.
Technical SEO | Milanoocom