What is the point of XML site maps?
-
Given how Google uses PageRank to pass link juice from one page to the next, if Google can only find a page through an XML sitemap, that page will have no link juice and will appear very low in search results, if at all.
The priority field in XML sitemaps also seems pretty much irrelevant to me. Google determines the priority of a page based on the number of inbound links to it. If your site is designed properly, the most important pages will have the most links.
The changefreq field could maybe be useful if you have existing pages that are updated regularly, though it seems to me Google tends to crawl sites often enough that it isn't needed. Plus, for most of the web, the significant content of an existing page doesn't change regularly; instead, new pages are added with new content.
This leaves the lastmod field as being potentially useful. If Google starts each crawl of your site by grabbing the sitemap and then crawls only the pages whose lastmod date is newer than its last crawl of the site, its crawling could be much more efficient. The sitemap would not need to contain every single page of the site, just the ones that have changed recently.
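For reference, here is a minimal sketch of a sitemap entry using the four fields discussed above (the URL and date are illustrative, not from any real site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- For an incremental crawl, only recently changed pages would need entries -->
  <url>
    <loc>http://www.example.com/blog/new-post</loc>
    <lastmod>2012-11-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only loc is required by the sitemaps.org protocol; lastmod, changefreq, and priority are all optional, which is part of why generation tools handle them so inconsistently.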
From what I've seen, most sitemap generation tools don't do a great job with the fields other than loc. If Google can't trust the priority, changefreq, or lastmod fields, it won't put any weight on them.
It seems to me the best way to rank well in Google is by making a good, content-rich site that is easily navigable by real people (and that's just the way Google wants it).
So, what's the point of XML site maps? Does the benefit (if any) outweigh the cost of developing and maintaining them?
-
Thanks Axial,
I'm not convinced it matters much whether Google crawls deep pages it wouldn't find through organic links. If the pages aren't linked to, they won't have any link juice and therefore won't rank well in SERPs.
The link about using sitemaps for canonical URLs says or implies you should only put your most important URLs in the sitemap. The sitemap tools I've seen tend to take a kitchen-sink approach, which is needed if you are using the sitemap to try to get a deeper crawl. Plus, there's no way (that I can see) in a sitemap to specify that page A is the canonical version of page B. Google simply suggests telling it about page A (and not page B) in the hope that page A will get more weight than page B. A canonical tag on page B pointing to page A is obviously a much better way to handle canonicals.
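To make the comparison concrete, the canonical tag being referred to is the rel="canonical" link element, which goes in the head of the duplicate page and names the preferred URL explicitly (URLs here are illustrative):

```html
<!-- In the <head> of page B (the duplicate), pointing at page A (the preferred URL) -->
<link rel="canonical" href="http://www.example.com/page-a" />
```

Unlike the sitemap approach, this states the duplicate-to-canonical relationship directly rather than hoping Google infers it from one URL's presence in the sitemap.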
Image and video site maps are potentially valuable. I am asking specifically about site maps for pages.
Specifying related content for a given URL, such as different language versions, is indeed useful and not something I was aware of. But it is not applicable on most sites and not used in most sitemaps.
-
Your sitemap.xml will help Googlebot crawl deep pages, but it also serves other purposes, such as:
-
helping Google identify canonical pages: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139066#3
-
creating sitemaps for video, images, etc.: "you can also use Sitemaps to provide Google with metadata about specific types of content on your site, including video, images, mobile, and News. For example, a video Sitemap entry can specify the running time, category, and family-friendly status of a video; an image Sitemap entry can provide information about an image’s subject matter, type, and license." http://support.google.com/webmasters/bin/answer.py?hl=en&hlrm=fr&answer=156184
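As a sketch of what that metadata looks like, a video sitemap entry nests a video:video block inside the normal url element under Google's video sitemap namespace (all URLs and values below are made up for illustration):

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.example.com/videos/some-video.html</loc>
    <!-- Metadata Google can't reliably extract from the page itself -->
    <video:video>
      <video:thumbnail_loc>http://www.example.com/thumbs/123.jpg</video:thumbnail_loc>
      <video:title>Example video</video:title>
      <video:description>Illustrative entry only.</video:description>
      <video:duration>600</video:duration>
      <video:family_friendly>yes</video:family_friendly>
    </video:video>
  </url>
</urlset>
```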
-
you can specify alternate content, such as the URL of a translated page: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=2620865
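In a sitemap, that alternate-language markup takes the form of xhtml:link elements inside each url entry; each language version lists itself and every other version (URLs below are illustrative):

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/en/page</loc>
    <!-- Each entry lists all language versions, including itself -->
    <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/en/page" />
    <xhtml:link rel="alternate" hreflang="fr" href="http://www.example.com/fr/page" />
  </url>
</urlset>
```

This is equivalent to putting rel="alternate" hreflang annotations in each page's HTML head, but keeps them all in one centralized file.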
-
and more.
Sometimes working with a sitemap is less risky and easier to maintain, especially when your CMS is limiting; the third point is a good example. You may also simply prefer the centralized approach from a personal point of view.
There are good resources in Google's webmaster documentation; check them out.
Hope this helps!