Do you need an on-page site map as well as an XML sitemap?
-
Do on-page site maps help with SEO, or are they more for user experience? We submit and update our XML sitemaps for the search engines, but we're wondering whether a /sitemap page for users is necessary.
-
In my experience, the HTML sitemap does not affect SEO as far as indexing the site, assuming you have a good XML sitemap and a good link structure.
I would say that the sole reason to provide an HTML sitemap is UX, but that is a huge reason to do it. Anything that upgrades your UX also has the potential to boost your SEO.
Making the full site easy to see and easy to explore will encourage longer site visits and more page views, and that affects your value in the eyes of the search engines.
To expand on what Branden said, specialized sitemaps are a very good idea. Seeing as you were hoping to get rid of a sitemap, this may not be what you want to hear, but you really should have a sitemap for each type of content you have:
Mobile | Video | News | Images | TV Shows, and of course your general web sitemap.
Take a moment to learn about each of them, as you will want to understand each map's requirements. For example, you should never have anything in a news sitemap that is older than 2 days, and Google will reject anything that does not meet its criteria.
Here is the Google Resource that explains it all.
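The two-day freshness rule for news sitemaps can be sketched as a simple filter before the sitemap is written out. This is a minimal sketch, not a full Google News sitemap generator: the article data, URLs, and the `build_news_sitemap` helper are all hypothetical, and the real news namespace includes additional required tags such as publication name.

```python
from datetime import datetime, timedelta, timezone

def build_news_sitemap(articles, now=None):
    """Build a simplified news sitemap, keeping only articles
    published within the last 2 days (Google's freshness limit)."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=2)
    fresh = [a for a in articles if a["published"] >= cutoff]
    entries = "\n".join(
        "  <url>\n"
        f"    <loc>{a['url']}</loc>\n"
        "    <news:news>\n"
        f"      <news:publication_date>{a['published'].isoformat()}</news:publication_date>\n"
        f"      <news:title>{a['title']}</news:title>\n"
        "    </news:news>\n"
        "  </url>"
        for a in fresh
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"\n'
        '        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">\n'
        f"{entries}\n</urlset>"
    )

now = datetime.now(timezone.utc)
articles = [
    {"url": "https://example.com/fresh", "title": "Fresh story",
     "published": now - timedelta(hours=6)},
    {"url": "https://example.com/stale", "title": "Old story",
     "published": now - timedelta(days=5)},
]
sitemap = build_news_sitemap(articles, now=now)
```

Regenerating the file this way on each publish keeps stale URLs from ever reaching the news sitemap in the first place.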
-
Yes, you should use both.
You should also have an XML sitemap for video. I recommend using Wistia if you have a good budget, since it automatically creates the video XML; otherwise use Vimeo Pro if you are on a budget. http://www.seomoz.org/blog/hosting-and-embedding-for-video-seo
And an XML sitemap for your blog (a Google News XML sitemap) if you create lots of articles and want them included in Google News. http://www.seomoz.org/blog/how-seomoz-gained-1000s-of-visits-from-google-news
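For reference, a minimal video sitemap entry looks roughly like this; the URLs are placeholders, and Google's spec allows many more optional tags (duration, publication date, and so on):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://example.com/videos/intro</loc>
    <video:video>
      <video:thumbnail_loc>https://example.com/thumbs/intro.jpg</video:thumbnail_loc>
      <video:title>Product intro</video:title>
      <video:description>A short walkthrough of the product.</video:description>
      <video:content_loc>https://example.com/media/intro.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>
```

Hosts like Wistia generate this for you, which is the main reason they get recommended for video SEO.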
Related Questions
-
Site architecture: I've got a free user report that shoots back a page with their data for them to share with co-workers and friends.
Hi, I have a site about to go online where users can run a free report that connects to their calendar app to get 12 months of statistics for their meetings, and then it shoots out a report. So they go to a.com/freereport and they get back a.com/freereport/report/xxxxxx. The content of those reports is different, but the structure is the same, as it is a fun way to show off meeting stats to co-workers and friends. I don't see the point of Google indexing those, as the traffic to those pages is going to come from social networks and viral sharing, but I do want the backlink credit. Will I get backlink credit if I nofollow that folder? I am having a hard time deciding what to do SEO-wise and would love some thoughts and advice. What would you recommend? Do nothing fancy? Mark the report folder nofollow? Try to do something with rel=canonical to point those pages to the root page? Thoughts?
Technical SEO | | bwb0 -
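A sketch of the two options being weighed above, using the hypothetical URLs from the question. Note these are illustrative snippets, not a recommendation for this specific site:

```html
<!-- Option 1, in the <head> of each report page: keep the reports out of
     the index while still letting crawlers follow the links on them.
     (A "noindex, follow" robots meta does this; nofollowing the folder's
     links is a different mechanism and affects how equity flows.) -->
<meta name="robots" content="noindex, follow">

<!-- Option 2, in the <head> of each report page: suggest that signals be
     consolidated into the root report page. rel=canonical is a hint, and
     Google may ignore it when the pages are not near-duplicates. -->
<link rel="canonical" href="https://a.com/freereport">
```

Either snippet goes on the generated report pages themselves, so the decision can be made once in the report template.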
Can you use a separate URL for an interior product page on a site?
I have a friend who has a health insurance agency site. He wants to add a new page for child health care insurance to his existing site. But the issue is, he bought a new URL, insurancemykidnow.com, and he wants to use it for the new page. Now, I'm not sure I'm right on this, but I don't think that can be done. Am I wrong? Thanks in advance.
Technical SEO | | Coppell0 -
Google Search Console Site Map Anomalies (HTTP vs HTTPS)
Hi, I've just done my usual Monday morning review of a client's Google Search Console (previously Webmaster Tools) dashboard and was disturbed to see that for one client the Sitemaps section is reporting 95 pages submitted yet only 2 indexed (last time I looked, last week, it was reporting an expected level of indexed pages). It says the sitemap was submitted on the 10th March and processed yesterday. However, the 'Index Status' section is showing a graph of growing indexed pages up to and including yesterday, where they numbered 112 (so it looks like all pages are indexed after all). Also, the 'Crawl Stats' section is showing 186 pages crawled on the 26th. Then it lists sub-sitemaps, all of which are non-HTTPS (http), which seems very strange since the site is HTTPS and has been for a few months now, and the main sitemap index URL is HTTPS: https://www.domain.com/sitemap_index.xml. The sub-sitemaps are:
http://www.domain.com/marketing-sitemap.xml
http://www.domain.com/page-sitemap.xml
http://www.domain.com/post-sitemap.xml
There are no 'Sitemap Errors' reported, but there are 'Index Error' warnings for the above post-sitemap, copied below: "When we tested a sample of the URLs from your Sitemap, we found that some of the URLs were unreachable. Please check your webserver for possible misconfiguration, as these errors may be caused by a server error (such as a 5xx error) or a network error between Googlebot and your server. All reachable URLs will still be submitted."
Technical SEO | | Dan-Lawrence
Also, for the below sitemap URLs: "Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page," for:
http://domain.com/en/post-sitemap.xml
AND https://www.domain.com/page-sitemap.xml
AND https://www.domain.com/post-sitemap.xml
I take it from all the above that the HTTPS sitemap is mainly fine, and despite the reported 0 pages indexed in the GSC sitemap section the pages are in fact indexed as per the main 'Index Status' graph, and that somehow some HTTP sitemap elements have been accidentally attached to the main HTTPS sitemap and are causing these problems. What's the best way forward to clean up this mess? Resubmitting the HTTPS sitemap sounds like the right option, but seeing as the master indexed URL is an HTTPS URL, I can't see it making any difference until the HTTP aspects are deleted/removed. But how do you do that, or even check that that's what's needed? Or should Google just sort this out eventually? I see the graph in 'Crawl > Sitemaps > Web Pages' is showing a consistent blue line of submitted pages, but the red line of indexed pages drops to 0 for 3-5 days every 5 days or so. So fully indexed pages are reported for 5-day stretches, then zero for a few days, then indexed for another 5 days, and so on!?
Many Thanks
Dan0 -
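One quick way to check whether HTTP entries really are attached to the HTTPS sitemap index is to fetch the index and look at the child `<loc>` URLs directly. A minimal sketch, assuming the sitemap index XML has already been downloaded as a string (the `domain.com` URLs here mirror the anonymized ones in the question):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def find_http_children(sitemap_index_xml):
    """Return the child sitemap URLs in a sitemap index that still use http://."""
    root = ET.fromstring(sitemap_index_xml)
    locs = [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]
    return [u for u in locs if u.startswith("http://")]

index_xml = """<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://www.domain.com/page-sitemap.xml</loc></sitemap>
  <sitemap><loc>https://www.domain.com/post-sitemap.xml</loc></sitemap>
</sitemapindex>"""

stale = find_http_children(index_xml)
```

If this flags HTTP children, fixing the sitemap generator (so every `<loc>` is HTTPS) and resubmitting the index is usually the cleanup step; GSC then re-processes the corrected children on its own schedule.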
Linking shallow sites to flagship sites
We have hundreds of domains that we are either doing nothing with, or that are very shallow. We do not have the time to build enough quality content on them, since they are ancillary to our flagship sites, which are already in need of attention and good content. My question is: should we redirect them to the flagship site? If yes, is it OK to do this from root domain to root domain, or should we redirect the root domain to a matching/similar page (e.g. gymfranchises.com to http://www.franchisesolutions.com/health_services_franchise_opportunities.cfm)? Or should we do something different altogether? Since we have many to redirect (if this is the route we go), should we redirect gradually?
Technical SEO | | franchisesolutions0 -
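The domain-to-matching-page option above could be sketched as a server-level 301. This assumes an Apache setup with mod_alias, which may not match the actual hosting; the domains are the ones named in the question:

```apache
# Hypothetical vhost for the shallow domain: 301 every request,
# whatever its path, to the closest-matching flagship page
# rather than the flagship homepage.
<VirtualHost *:80>
    ServerName gymfranchises.com
    ServerAlias www.gymfranchises.com
    RedirectMatch 301 ^/.* http://www.franchisesolutions.com/health_services_franchise_opportunities.cfm
</VirtualHost>
```

`RedirectMatch` is used instead of a plain `Redirect` so that deep paths on the shallow domain collapse to the single target page instead of having the path appended.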
Will getting backlinks to landing page from low quality sites negatively affect SEO?
I've recently started an initiative at my company to get our customers to publish a blog post about our company and include a link to a landing page that sits on a subdomain attached to our main domain. The reason for directing visitors from the post to a landing page is to help with conversion. I've recently been thinking: couldn't the backlinks to this landing page from our customers' blogs (generally small sites) have a negative impact on the overall SEO of my company's domain? Thanks in advance.
Technical SEO | | JustinButlion0 -
Meta data & xml sitemaps for mobile sites when using rel="canonical"/rel="alternate" annotations
When using rel="canonical" and rel="alternate" annotations between mobile and desktop sites (rel="canonical" on mobile, pointing to desktop, and rel="alternate" on desktop pointing to mobile), what are everyone's thoughts on using meta data on the mobile site? Is it necessary? And also, what is the common consensus on using a separate mobile xml sitemap?
Technical SEO | | 4Ps0 -
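For reference, the annotation pair described in the question looks like this in practice; the URLs are placeholders, and the 640px breakpoint is the value Google's separate-URLs documentation uses as its example:

```html
<!-- On the desktop page (https://example.com/page): -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the mobile page (https://m.example.com/page): -->
<link rel="canonical" href="https://example.com/page">
```

Because the canonical consolidates indexing signals onto the desktop URL, duplicating full metadata on the mobile pages buys little; the separate mobile XML sitemap question tends to hinge on whether the mobile URLs need to be discovered independently.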
Secondary Pages Indexed over Primary Page
I have 4 pages for a single product. Each of the pages links to the main page for that product. Google is indexing the secondary pages above my preferred landing page. How do I fix this?
Technical SEO | | Bucky0 -
International Site, flow of page rank?
OK, I'm working on an international site. The site is set up with folders for UK, US, and AU, e.g. www.site.com/UK/index.aspx. The root (non-folder-based) is the international version of the site, e.g. www.site.com/index.aspx. www.site.com/index.aspx has the lion's share of links; therefore, the pages immediately linked from www.site.com/index.aspx have PageRank distributed between them. My UK, US, and AU home pages are linked via a country selector from the www.site.com/index.aspx page, through an .aspx redirect page that 301s to the appropriate country home page. Therefore, the home pages of UK, US, and AU are receiving some of the 'juice' that is coming in to www.site.com/index.aspx (but only a fraction, via the redirect links). Am I right in thinking that pages on the international version of the site will have much more potential to rank (because of their 'juice') than the pages on the UK, US, and AU versions of the site? If so, am I right in thinking that these will tend to rank over the equivalent UK, US, and AU versions of the pages in each country's version of Google, despite having set directory-level geo-targeting in GWT?
Technical SEO | | QubaSEO1