A sitemap... What's the purpose?
-
Hello everybody,
my question is really simple: what's the purpose of a sitemap?
It's to help the robots crawl your website, but if your website has a good architecture, the robots will be able to crawl your site easily anyway!
Am I wrong?
Thank you for your answers,
Jonathan
-
I highly recommend checking out the Webinar Friday Rand did on this very subject: Getting Value from XML Sitemaps, HTML Sitemaps & Feeds.
-
If you have a static site with twenty pages that doesn't get new pages added very often, then no, a sitemap probably isn't a whole lot of use, provided your website has good architecture.
However, if your site has 30,000 pages and gets new content added regularly, then an XML sitemap is useful for making sure the engines know about all of your pages.
Using multiple sitemaps can be useful to help you diagnose what type of content Google is crawling best. A hypothetical example: you have a large site where you a) sell baking supplies, b) have recipes, and c) have user profiles that you want indexed. You could submit a sitemap for each area (then a master sitemap that lists each of the sub-sitemaps).
In Google Webmaster Tools, you get a report that says how many pages you submitted for each sitemap and how many of those pages are indexed. Using the above setup, you might find something like:
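For the hypothetical baking site above, the master sitemap would be a sitemap index file in the standard sitemaps.org format. A minimal sketch, with made-up domain and file names, might look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <sitemap> entry per content area of the site -->
  <sitemap>
    <loc>http://www.example.com/sitemap-baking-supplies.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-recipes.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-users.xml</loc>
  </sitemap>
</sitemapindex>
```

You submit the index file once, and the engines discover the sub-sitemaps from it, while Webmaster Tools still reports on each sub-sitemap separately.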
baking supplies has 50 URLs indexed out of 2,000 submitted
recipes has 10,000 URLs indexed out of 11,000 submitted
users has 500 URLs indexed out of 1,000 submitted
At a glance, you can tell that something is up with the products you're trying to sell and that Google isn't indexing that section very well, so you know to focus on it; maybe there's a bug in the code that put a noindex on most of those pages by accident.
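The at-a-glance check described above can be sketched in a few lines of Python. The counts here are the hypothetical numbers from this example, not real report data:

```python
# Hypothetical (submitted, indexed) counts per sub-sitemap, as read
# from the Webmaster Tools sitemap report for each content area.
report = {
    "baking-supplies": (2000, 50),
    "recipes": (11000, 10000),
    "users": (1000, 500),
}

def index_ratios(report):
    """Return (section, indexed/submitted ratio) pairs, worst first."""
    ratios = {name: indexed / submitted
              for name, (submitted, indexed) in report.items()}
    return sorted(ratios.items(), key=lambda kv: kv[1])

for name, ratio in index_ratios(report):
    print(f"{name}: {ratio:.1%} indexed")
```

The section at the top of the output (here, baking supplies at 2.5%) is the one to investigate first.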
Does that help?
-
A sitemap can help not only Google but also visitors find their way through your site. It is a great way to show the hierarchy and flow of your website. As mentioned, there are a few tools on the web that can make this process pretty painless. At the end of the day, it can only help.
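To show that hierarchy to human visitors, an HTML sitemap can be as simple as a page with a nested list of links. A minimal sketch, with invented URLs and section names:

```html
<!-- A bare-bones HTML sitemap page: nested lists mirror the
     site hierarchy for visitors (and give crawlers plain links). -->
<ul>
  <li><a href="/supplies/">Baking supplies</a>
    <ul>
      <li><a href="/supplies/pans/">Pans</a></li>
      <li><a href="/supplies/mixers/">Mixers</a></li>
    </ul>
  </li>
  <li><a href="/recipes/">Recipes</a></li>
  <li><a href="/community/">Community</a></li>
</ul>
```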
Hope that helps!
-
I agree that there are benefits to having a sitemap on any website. Search for "Google Webmaster Help" on YouTube; you'll find a lot of supporting tutorials.
-
Hey Jonathan
An HTML sitemap can be useful for getting your site indexed, and the XML one can also help with indexation, but there are no guarantees that pages in the XML sitemap will be indexed. I read an article on here showing the indexation benefits of a sitemap, and Google has stated that they like you to have an HTML one for users as well as for SEO. So it's one of those 1% things: it may help a little, and it can't hurt, but you still have to do everything else right.
Cheers
Marcus