A sitemap... What's the purpose?
-
Hello everybody,
my question is really simple: what's the purpose of a sitemap?
It's to help the robots crawl your website, but if your website has a good architecture, the robots will be able to crawl your site easily anyway!
Am I wrong?
Thank you for your answers,
Jonathan
-
I highly recommend checking out the Webinar Friday Rand did on this very subject: Getting Value from XML Sitemaps, HTML Sitemaps & Feeds.
-
If you have a static site with twenty pages that doesn't get new pages added very often, then yes, a sitemap probably isn't of a whole lot of use as long as your website has good architecture.
However, if your site has 30,000 pages and gets new content added regularly, then an XML sitemap is useful to make sure that the engines know about all of your pages.
Using multiple sitemaps can be useful to help you diagnose what type of content Google is crawling best. A hypothetical example: you have a large site where you a) sell baking supplies, b) have recipes, and c) have user profiles that you want indexed. You could submit a sitemap for each area, then a master sitemap (a sitemap index) that lists each of the sub-sitemaps.
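The master sitemap described above is just a sitemap index file in the standard sitemaps.org format. A minimal sketch might look like this (the example.com URLs and file names are placeholders, not from the original post):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <sitemap> entry per content area, each pointing at a sub-sitemap -->
  <sitemap>
    <loc>https://www.example.com/sitemap-baking-supplies.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-recipes.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-users.xml</loc>
  </sitemap>
</sitemapindex>
```

You'd submit the index file in Webmaster Tools and it will discover the sub-sitemaps from there.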
In Google Webmaster Tools, you get a report that says how many pages you submitted for each sitemap, and how many of those pages are indexed. Using the above setup, you might find something like:
baking supplies has 50 URLs indexed out of 2,000 submitted
recipes has 10,000 URLs indexed out of 11,000 submitted
users has 500 URLs indexed out of 1,000 submitted
At a glance, you can tell that something is up with the products you're trying to sell and that Google isn't indexing that section very well, so you know to focus on that section. Maybe there's a bug in the code that put a noindex on most of those pages by accident.
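The at-a-glance check described above is easy to automate once you pull the submitted/indexed counts out of the report. A minimal sketch (the counts and section names are the made-up numbers from the example, and the 50% threshold is an arbitrary choice):

```python
# Hypothetical submitted/indexed counts per sitemap, as you'd read them
# from the Webmaster Tools sitemaps report (numbers from the example above).
report = {
    "baking-supplies": {"submitted": 2000, "indexed": 50},
    "recipes": {"submitted": 11000, "indexed": 10000},
    "users": {"submitted": 1000, "indexed": 500},
}

def low_coverage(report, threshold=0.5):
    """Return sections whose indexed/submitted ratio is below threshold."""
    flagged = {}
    for section, counts in report.items():
        ratio = counts["indexed"] / counts["submitted"]
        if ratio < threshold:
            flagged[section] = round(ratio, 3)
    return flagged

print(low_coverage(report))  # → {'baking-supplies': 0.025}
```

Here only the baking-supplies section (2.5% indexed) gets flagged, which tells you where to start digging for a noindex bug or a crawl problem.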
Does that help?
-
A sitemap can help not only Google but also visitors find their way through your site. It is a great way to show the hierarchy and flow of your website. As mentioned, there are a few tools on the web that can make this process pretty painless. At the end of the day, it can only help.
Hope that helps!
-
I agree with the benefits of having a sitemap on any website. Search for Google Webmaster Help on YouTube; you can find a lot of supporting tutorials.
-
Hey Jonathan
An HTML sitemap can be useful for getting your site indexed, and the XML one can also help with indexation, but there are no guarantees that pages in the XML sitemap will be indexed. I read an article on here showing the indexation benefits of a sitemap, and Google has stated that they like you to have an HTML one for users as well as for SEO. So it's one of those 1% things: it may help a little bit, and it can't hurt, but you still have to do everything else right.
Cheers
Marcus