A sitemap... What's the purpose?
-
Hello everybody,
my question is really simple: what's the purpose of a sitemap?
It's to help the robots crawl your website, but if your website has a good architecture, the robots will be able to crawl your site easily anyway!
Am I wrong?
Thank you for your answers,
Jonathan
-
I highly recommend checking out the Webinar Friday Rand did on this very subject: Getting Value from XML Sitemaps, HTML Sitemaps & Feeds.
-
If you have a static site with twenty pages that doesn't get new pages added very often, then yes, a sitemap probably isn't a whole lot of use as long as your website has good architecture.
However, if your site is 30,000 pages and gets new content added regularly, then an XML sitemap is useful to make sure that the engines know about all of your pages.
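As a side note, one common way to make sure the engines discover your XML sitemap (besides submitting it directly) is to reference it from your robots.txt file. The domain and path below are placeholders:

```
Sitemap: https://www.example.com/sitemap.xml
```

The Sitemap directive is independent of any User-agent block and can appear anywhere in the file.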
Using multiple sitemaps can help you diagnose which type of content Google is crawling best. A hypothetical example: you have a large site where you a) sell baking supplies, b) have recipes, and c) have user profiles that you want indexed. You could submit a sitemap for each area (plus a master sitemap index that lists each of the sub-sitemaps).
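To sketch what that master-plus-sub setup looks like (the domain and file names here are made up for the example), a sitemap index file under the sitemaps.org protocol is just a list of the child sitemaps:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-baking-supplies.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-recipes.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-users.xml</loc>
  </sitemap>
</sitemapindex>
```

Each child file then lists the actual URL entries for its section, and you submit the index file once.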
In Google Webmaster Tools, you get a report showing how many pages you submitted for each sitemap and how many of those pages are indexed. Using the above setup, you might find something like:
baking supplies has 50 URLs indexed out of 2,000 submitted
recipes has 10,000 URLs indexed out of 11,000 submitted
users has 500 URLs indexed out of 1,000 submitted
At a glance, you can tell that something is up with the products you're trying to sell: Google isn't indexing that section very well, so you know to focus on it. Maybe there's a bug in the code that put a noindex on most of those pages by accident.
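To make the diagnosis step concrete, here's a minimal sketch in Python, using the hypothetical figures from the example above, that flags any section whose indexation ratio looks unhealthy:

```python
# Hypothetical submitted/indexed counts per sitemap section
# (these figures come from the example above, not a real report).
submitted = {"baking-supplies": 2000, "recipes": 11000, "users": 1000}
indexed = {"baking-supplies": 50, "recipes": 10000, "users": 500}

def indexation_ratio(section: str) -> float:
    """Fraction of submitted URLs that the report shows as indexed."""
    return indexed[section] / submitted[section]

# Flag any section where fewer than half the submitted URLs are indexed.
flagged = [s for s in submitted if indexation_ratio(s) < 0.5]
print(flagged)  # ['baking-supplies']
```

The 0.5 threshold is arbitrary; the point is that per-sitemap reporting turns "indexing seems low" into a number you can compare across sections.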
Does that help?
-
A sitemap can help not only Google but also visitors find their way through your site. It's a great way to show the hierarchy and flow of your website. As mentioned, there are a few tools on the web that can make this process pretty painless. At the end of the day, it can only help.
Hope that helps!
-
I agree with the benefits of having a sitemap on any website. Search for Google Webmaster Help on YouTube; you'll find a lot of supporting tutorials.
-
Hey Jonathan
An HTML sitemap can be useful for getting your site indexed, and the XML one can also help with indexation, but there are no guarantees that pages in the XML sitemap will be indexed. I read an article on here showing the indexation benefits of a sitemap, and Google has stated that they like you to have an HTML one for users as well as for SEO. So it's one of those 1% things: it may help a little, and it can't hurt, but you still have to do everything else right.
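For what it's worth, an HTML sitemap is just an ordinary page of plain crawlable links mirroring the site's hierarchy. A minimal sketch (the section names and URLs here are invented):

```html
<!-- Minimal HTML sitemap page: plain links grouped by section -->
<h1>Site Map</h1>
<ul>
  <li><a href="/supplies/">Baking Supplies</a>
    <ul>
      <li><a href="/supplies/flour/">Flour</a></li>
      <li><a href="/supplies/pans/">Pans</a></li>
    </ul>
  </li>
  <li><a href="/recipes/">Recipes</a></li>
  <li><a href="/about/">About</a></li>
</ul>
```

Because it's a regular page, it serves users directly while also giving crawlers a link to every important section.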
Cheers
Marcus