What is the point of XML site maps?
-
Given how Google uses PageRank to pass link juice from one page to the next, if Google can only find a page through an XML site map, that page will have no link juice and will appear very low in search results, if at all.
The priority field in XML sitemaps also seems pretty much irrelevant to me. Google determines the priority of a page based on the number of inbound links to it. If your site is designed properly, the most important pages will have the most links.
The changefreq field could be useful if you have existing pages that are updated regularly, though it seems to me Google tends to crawl sites often enough that it isn't needed. Besides, for most of the web the significant content of an existing page doesn't change regularly; instead, new pages are added with new content.
That leaves the lastmod field as potentially useful. If Google starts each crawl of your site by grabbing the sitemap and then crawls only the pages whose lastmod date is newer than its last crawl of the site, its crawling could be much more efficient. The site map would not need to contain every single page of the site, just the ones that have changed recently.
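To illustrate, such a sitemap only has to list the recently changed pages, each with its lastmod date (the URL and date below are hypothetical, just to show the shape):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Only pages changed since Google's last crawl need entries -->
  <url>
    <loc>http://www.example.com/blog/new-post</loc>
    <lastmod>2012-09-14</lastmod>
  </url>
</urlset>
```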
From what I've seen, most site map generation tools don't do a great job with the fields other than loc. If Google can't trust the priority, changefreq, or lastmod fields, they won't put any weight on them.
It seems to me the best way to rank well in Google is by making a good, content-rich site that is easily navigable by real people (and that's just the way Google wants it).
So, what's the point of XML site maps? Does the benefit (if any) outweigh the cost of developing and maintaining them?
-
Thanks Axial,
I'm not convinced it matters much if Google crawls deep pages it wouldn't find through organic links. If the pages aren't linked to, they won't have any link juice and therefore won't rank well in SERPs.
The link about using site maps for canonical URLs says, or implies, that you should only put your most important URLs in the sitemap. The sitemap tools I've seen tend to take a kitchen-sink approach, which is needed if you are using the sitemap to try to get a deeper crawl. Plus, there's no way I can see in a sitemap to specify that page A is the canonical of page B. Google simply suggests telling them about page A (and not page B) in the hope that page A will get more weight than page B. A canonical meta tag on page B pointing to page A is obviously a much better way to deal with canonicals.
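For comparison, that canonical tag is a single element in the head of page B pointing at page A (the URLs here are hypothetical):

```xml
<!-- In the <head> of http://www.example.com/page-b -->
<link rel="canonical" href="http://www.example.com/page-a" />
```

Unlike the sitemap approach, this states the A/B relationship explicitly rather than hoping Google infers it from which URL was listed.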
Image and video site maps are potentially valuable. I am asking specifically about site maps for pages.
Specifying related content for a given URL, such as different languages, is indeed useful and not something I was aware of. But it is not applicable on most sites and not used on most site maps.
-
Your sitemap.xml will help Googlebot crawl deep pages, but it also serves other purposes, such as:
-
helping Google identify canonical pages: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139066#3
-
creating sitemaps for video, images, etc.: "you can also use Sitemaps to provide Google with metadata about specific types of content on your site, including video, images, mobile, and News. For example, a video Sitemap entry can specify the running time, category, and family-friendly status of a video; an image Sitemap entry can provide information about an image’s subject matter, type, and license." http://support.google.com/webmasters/bin/answer.py?hl=en&hlrm=fr&answer=156184
-
you can specify alternate content, such as the URL of a translated page: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=2620865
-
and more.
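As an illustration of the third point, a sitemap entry can declare a translated alternate with an xhtml:link element; note the extra xhtml namespace on the urlset (the URLs here are hypothetical):

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/english-page</loc>
    <!-- Declares the German translation of this page -->
    <xhtml:link rel="alternate" hreflang="de"
                href="http://www.example.com/deutsche-seite" />
  </url>
</urlset>
```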
Sometimes working with a sitemap is less risky and easier to maintain, especially when your CMS is restrictive. The third point is a good example. You may also prefer the centralized approach from a personal point of view.
There are good resources in Google's webmaster documentation; check them out.
Hope this helps!