Do you need an on-page sitemap as well as an XML sitemap?
-
Do on-page sitemaps help with SEO, or are they more for user experience? We submit and update our XML sitemaps for the search engines, but we're wondering whether a /sitemap page for users is necessary.
-
In my experience, an HTML sitemap does not affect SEO as far as getting the site indexed, assuming you have a good XML sitemap and a good link structure.
I would say that the sole reason to provide an HTML sitemap is UX, but that is a huge reason to do it. Anything that upgrades your UX also has the potential to boost your SEO.
Making the full site easy to see and easy to explore will encourage longer site visits and more page views, and that affects your value in the eyes of the search engines.
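For what it's worth, an on-page sitemap does not need to be fancy: a plain, crawlable page of links grouped by section does the job. A minimal sketch, with made-up URLs:

```html
<!-- Hypothetical /sitemap page: plain links grouped by section -->
<h1>Site Map</h1>
<h2>Products</h2>
<ul>
  <li><a href="/products/widgets">Widgets</a></li>
  <li><a href="/products/gadgets">Gadgets</a></li>
</ul>
<h2>Blog</h2>
<ul>
  <li><a href="/blog/xml-vs-html-sitemaps">Do you need both sitemap types?</a></li>
</ul>
```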
To expand on what Branden said, specialized sitemaps are a very good idea. Seeing as you were hoping to get rid of a sitemap, this may not be what you want to hear, but you really should have a sitemap for each type of content you have:
Mobile | Video | News | Images | TV Shows, and of course your general web sitemap.
Take a moment to learn about each of them, as you will want to understand each map's requirements. For example, you should never have anything in a News sitemap that is older than two days, and Google will reject anything that does not meet its criteria.
Here is the Google Resource that explains it all.
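To make the "one sitemap per content type" point concrete, a sitemap index file is the usual way to tie the separate maps together. A rough sketch, with hypothetical file names and domain:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap index: one entry per content-type sitemap -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://www.example.com/sitemap-pages.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap-video.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap-news.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap-images.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap-mobile.xml</loc></sitemap>
</sitemapindex>
```

Each child sitemap then only has to follow the rules for its own content type (e.g. the two-day freshness rule for News).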
-
Yes, you should use both.
You should also have an XML sitemap for video. I recommend using Wistia if you have a good budget, since they automatically create the video XML; otherwise use Vimeo PRO if you are on a tighter budget. http://www.seomoz.org/blog/hosting-and-embedding-for-video-seo
And an XML sitemap for your blog (a Google News XML sitemap) if you create lots of articles and want them included in Google News. http://www.seomoz.org/blog/how-seomoz-gained-1000s-of-visits-from-google-news
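If you ever have to hand-roll the video XML rather than letting Wistia or Vimeo generate it, a single entry looks roughly like this (the URLs, title and duration below are made up):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical video sitemap with one entry -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.example.com/videos/product-demo.html</loc>
    <video:video>
      <video:thumbnail_loc>http://www.example.com/thumbs/product-demo.jpg</video:thumbnail_loc>
      <video:title>Product demo</video:title>
      <video:description>A two-minute walkthrough of the product.</video:description>
      <video:content_loc>http://www.example.com/videos/product-demo.mp4</video:content_loc>
      <video:duration>120</video:duration>
    </video:video>
  </url>
</urlset>
```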
Related Questions
-
Getting a high-priority issue for xxx.com and xxx.com/home as duplicate pages and duplicate page titles; can't seem to find anything that needs to be corrected. What might I be missing?
I am getting a high-priority issue reporting our xxx.com and xxx.com/home as both duplicate pages and duplicate page titles in the crawl results. I can't seem to find anything that needs to be corrected, so what am I missing? Has anyone else had a similar issue, and how was it corrected?
Technical SEO | tgwebmaster
-
Off-site company blog linking to company site or blog incorporated into the company site?
Kind of an SEO newbie, so be gentle. I'm a beginner content strategist at a small design firm. Currently, I'm working with a client on a website redesign. Their current website is a single-page dud with a page authority of 5. The client has a WordPress blog with a solid URL name, a domain authority of 100 and a page authority of 30. My question is this: from an SEO perspective, would it be better for my client to (1) re-skin their existing blog and link to the new company website with it, hopefully passing on some of its "Google juice," or (2) create a new blog on their new website (and maybe do a 301 redirect from the old blog)? Or are there better options that I'm not thinking of? Thanks for whatever help you can give a newbie. I just want to take good care of my client.
Technical SEO | TheKatzMeow
-
50,000 pages or a page with parameters
I have a site with about 12k pages on a topic... each of these pages could use another several pages to go into deeper detail about the topic. So I am wondering: for SEO purposes, would it be better to have something like 50,000 new pages for each sub-topic, or to have one page that I would pass parameters to, with the page built on the fly in the code-behind? The drawback to the one-page-with-parameters approach is that the URL would be static, but the effort to implement would be minimal. I am also not sure how Google would index a single page with parameters. The drawback to the 50k-pages model is the dev effort and possibly committing some faux pas by unleashing so many links to my internal pages. I might also have to mix .aspx with .html because my project can't be that large. Has anyone here ever had this sort of choice to make? Is there a third way I am not considering?
Technical SEO | Banknotes
-
One site per location, or all under an umbrella site?
I am working on a project where we are re-branding lots (100+) of existing local businesses under one national brand. I am wondering what we should do with their existing websites; they are generally fairly poor and will need re-designing to match the new brand, but they may have some residual links. Should we 301 redirect each URL to the national site, e.g. nationalsite.com/localbusinessA? If so, what should I look out for? Do I need to specifically redirect any pages that have links pointing at them to the equivalent pages on the new site? Or should I give each business a new standalone website that links back to the national brand site? More than likely this would be hosted on the same server and CMS as the main site, and just the URL would remain. Do I need to make sure that any old URLs that had links pointing at them are 301'd to the new pages? Many thanks for your advice.
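For what it's worth, a minimal sketch of the 301 option, assuming the old local-business sites run on Apache (the old domain name in the comment is made up; the target path comes from the example above):

```apacheconf
# Hypothetical .htaccess on an old local-business domain (e.g. localbusinessA.co.uk),
# sending every old URL to the matching path on the national site with a 301.
RewriteEngine On
RewriteRule ^(.*)$ http://nationalsite.com/localbusinessA/$1 [R=301,L]
```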
Technical SEO | BadgerToo
-
Are aggregate sites penalised for duplicate page content?
Hi all, We're running a used car search engine (http://autouncle.dk/en/) in Denmark, Sweden and soon Germany. The site works in a conventional search engine way, with a search form and pages of search results (car adverts). The nature of car searching means that the same advert exists on a large number of different URLs (because of the many different search criteria and pagination). From my understanding this is problematic, because Google will penalize the site for having duplicate content. Since the order of search results is mixed, I assume SEOmoz cannot always identify almost-identical pages, so the problem is perhaps bigger than what SEOmoz can tell us. In your opinion, what is the best strategy to solve this? We currently use a very simple canonical solution. For the record, besides collecting car adverts, AutoUncle provides a lot of value to our large user base (including valuations on all cars). We're not just another leech AdWords site. In fact, we don't have a single banner. Thanks in advance!
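For reference, a canonical solution on a result page usually comes down to a single link element in the head; the URLs below are hypothetical, not AutoUncle's real structure:

```html
<!-- Hypothetical filtered/paginated result URL:
     http://www.example.com/used-cars?brand=audi&sort=price&page=3 -->
<!-- Canonical tag in the <head>, pointing search engines at one preferred version -->
<link rel="canonical" href="http://www.example.com/used-cars?brand=audi" />
```

With that in place, the filtered and paginated variants consolidate onto the canonical URL instead of competing with each other as duplicates.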
Technical SEO | JonasNielsen
-
How to write a robots.txt file that points to your sitemap
Good afternoon from still wet & humid Wetherby, UK... I want to write a robots.txt file that instructs the bots to index everything and gives a specific location for the sitemap. The sitemap URL is: http://business.leedscityregion.gov.uk/CMSPages/GoogleSiteMap.aspx Is this correct?
User-agent: *
Disallow:
SITEMAP: http://business.leedscityregion.gov.uk/CMSPages/GoogleSiteMap.aspx
Any insight welcome 🙂
Technical SEO | Nightwing
-
Google WMT shows sitemap.xml highest ranked for one main keyword
Hello, I am seeing my sitemap.xml show up in Google Webmaster Tools at the top for one of the main keywords for my site. This is in the "Your site on the web > Keywords" section. The URLs of my site contain this keyword, which is why I figure it showed up. I'm curious whether this should be a concern to me. I find it odd that the sitemap would show up in this way. Thanks
Technical SEO | nux
-
Ror.xml vs sitemap.xml
Hey Mozzers, I've been reading some things lately, and some say that the top search engines do not use the ror.xml sitemap but focus just on sitemap.xml. Is that true? Do you use ROR? If so, for what purpose: products, "special articles", other uses? Can sitemap.xml be sufficient for all of those? Thank you, Vadim
Technical SEO | vijayvasu