Sitemaps. When compressed do you use the .gz file format or the (untidy looking, IMHO) .xml.gz format?
-
When submitting compressed sitemaps to Google I normally use a file named sitemap.gz
A customer is banging on that his web guy says that sitemap.xml.gz is a better format.
Google spiders sitemap.gz just fine and in Webmaster Tools everything looks OK...
Interested to know other SEOmoz Pros' preferences here, and also to check I haven't made an error that is going to bite me in the ass soon!
Over to you.
-
Thanks Big Bazza... I like the 'better' vs 'accepted' reasoning. Not too confrontational
-
Generally the .xml.gz format is the one used in examples; there are a few references to it here: http://www.sitemaps.org/protocol.php#index
Most sitemap generators that create both compressed and uncompressed sitemap files name them sitemap.xml and sitemap.xml.gz respectively. It also makes it clearer what the content of the zipped file is. I don't believe it is essential, however, as you will point tools such as google.com/webmasters at your XML sitemap rather than expect them to find it of their own accord.
I always use the .xml.gz format when compressing. I would argue that (if both formats work) neither one is 'BETTER' than the other; rather, one is more ACCEPTED than the other.
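For anyone scripting this, producing the .xml.gz naming is just a matter of gzipping the file and appending .gz, so the .xml stays in the name. A minimal sketch (file names are placeholders, not anything prescribed by the protocol):

```python
import gzip
import shutil

def compress_sitemap(src: str = "sitemap.xml") -> str:
    """Gzip-compress a sitemap, keeping .xml in the name (sitemap.xml.gz)."""
    dest = src + ".gz"
    with open(src, "rb") as f_in, gzip.open(dest, "wb") as f_out:
        shutil.copyfileobj(f_in, f_out)
    return dest
```

Either name works for crawlers as long as the sitemap is referenced explicitly (robots.txt or Search Console); the .xml.gz form simply signals the compressed content type.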
Related Questions
-
Add versioning to an xml sitemap?
Is there a way to add versioning to an XML sitemap? Something like <version>x.x</version> outside of the <urlset>? I've looked at a bunch of sitemaps for various sites and don't see anyone adding versioning information, but it seems like it would be a common issue - I can't believe someone hasn't come up with some way to do it.
Intermediate & Advanced SEO | ATT_SEO
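One schema-safe workaround: the sitemap protocol defines no <version> element, so a custom tag would fail validation, but an XML comment before the root element is well-formed and ignored by parsers. A hypothetical sketch (the `sitemap-version` label is made up for illustration):

```python
def sitemap_with_version(urls, version="1.0"):
    """Build a sitemap string carrying a version marker as an XML comment,
    since the sitemap schema defines no <version> element."""
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        f"<!-- sitemap-version: {version} -->",
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    ]
    for url in urls:
        lines.append(f"  <url><loc>{url}</loc></url>")
    lines.append("</urlset>")
    return "\n".join(lines)
```

Crawlers will skip the comment entirely, so it is purely for your own tooling to read back.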
Any excellent recommendations for a sitemap.xml plugin?
Hi, I'm trying to find a sitemap generator/plugin that I can point my client to. My client is using Magento, and is one of the largest sports stores in Norway (around 20,000 products). I've heard there's one that can set the <priority> according to page views, sold units, and other relevant parameters, and that also takes care of the other elements in the sitemap.xml. Any good recommendations out there? 🙂
Intermediate & Advanced SEO | Inevo
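If no off-the-shelf plugin fits, the priority-from-page-views idea is simple to sketch. A hypothetical helper (the linear scaling and the 0.1 floor are arbitrary choices for illustration, not anything a specific Magento extension does):

```python
def priority_from_views(views: int, max_views: int) -> float:
    """Scale raw page views into the sitemap <priority> range (0.0-1.0),
    keeping a small floor so low-traffic pages are not zeroed out."""
    if max_views <= 0:
        return 0.5  # the protocol's default priority when there is no signal
    return round(max(0.1, views / max_views), 1)
```

The same shape works for sold units or any other popularity signal; only the input column changes.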
2015/2016 Sitemaps Exclusions
Hello fellow mozrs!
Intermediate & Advanced SEO | artdivision
Been working on a few Property (Real Estate for our American friends) websites recently, and two questions constantly come up as we spec the site: 1. What schema (schema.org) should the website use (throughout all pages as well as individual pages)? Did anyone find that schema actually helped with their ranking/CTR?
2. Whilst setting up the sitemaps (usually Yoast is our preferred plugin for the job), what pages would you EXCLUDE from the sitemap? Looking forward to some interesting comments.
Dan.
Canonical & rel=NOINDEX used on the same page?
I have a real estate company: www.company.com with approximately 400 agents. When an agent gets hired we allow them to pick a URL which we then register and manage. For example: www.AGENT1.com We then take this agent domain and 301 redirect it to a subdomain of our main site. For example
Intermediate & Advanced SEO | EasyStreet
Agent1.com 301's to agent1.company.com. We have each page on the agent subdomain canonicalized back to the corresponding page on www.company.com.
For example: agent1.company.com canonicalizes to www.company.com. What happened is that Google indexed many URLs on the subdomains, and it seemed like Google ignored the canonical in many cases. Although these URLs were being crawled and indexed by Google, I never noticed any of them rank in the results. My theory is that Google crawled the subdomain first, indexed the page, and then later Google crawled the main URL. At that point in time, the two pages actually looked quite different from one another, so Google did not recognize/honor the canonical. For example:
Agent1.company.com/category1 gets crawled on day 1
Company.com/category1 gets crawled 5 days later.

The content (recently listed properties for sale) on these category pages changes every day. If Google crawled the pages (both the subdomain and the main domain) on the same day, the content on the subdomain and the main domain would look identical. If the URLs are crawled on different days, the content will not match.

We had some major issues (duplicate content and site speed) on our www.company.com site that needed immediate attention. We knew we had an issue with the agent subdomains and decided to block the crawling of the subdomains in the robots.txt file until we got the main site "fixed". We have seen a small decrease in organic traffic from Google to our main site since blocking the crawling of the subdomains, whereas with Bing our traffic has dropped almost 80%.

After a couple of months, we have now got our main site mostly "fixed" and I want to figure out how to handle the subdomains in order to regain the lost organic traffic. My theory is that these subdomains have some link juice that is basically being wasted with the implementation of the robots.txt file on the subdomains. Here is my question:
If we put a ROBOTS rel=NOINDEX on all pages of the subdomains and leave the canonical (to the corresponding page of the company site) in place on each of those pages, will link juice flow to the canonical version? Basically I want the link juice from the subdomains to pass to our main site but do not want the pages to be competing for a spot in the search results with our main site. Another thought I had was to place the NOINDEX tag only on the category pages (the ones that seem to change every day) and leave it off the product (property detail) pages, which rarely ever change. Thank you in advance for any insight.
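For reference, the tag combination being proposed, noindex plus a cross-domain canonical, would look like this on each subdomain page. A hypothetical sketch ("noindex, follow" leaves links crawlable; whether Google honours both signals on the same page is exactly the open question here):

```python
def head_tags(canonical_url: str, noindex: bool = False) -> str:
    """Emit the <head> tags for a subdomain page: a canonical pointing at
    the main-domain URL, optionally with a robots noindex,follow."""
    tags = [f'<link rel="canonical" href="{canonical_url}" />']
    if noindex:
        tags.append('<meta name="robots" content="noindex, follow" />')
    return "\n".join(tags)
```

Note that this only matters once the robots.txt block is lifted: pages blocked from crawling cannot have their meta tags seen at all.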
Disavow File Submission process?
Hi Mozzers, I am working for a client that hasn't been penalized but has lots of junk SEO directories that I would like to disavow. My question is: should I try reaching out to the webmasters (if they exist) first and show proof to Google, or should I just go ahead and submit the file without any outreach? Will it still work? Thanks!
Intermediate & Advanced SEO | Ideas-Money-Art
Multiple Sitemaps Vs One Sitemap and Why 500 URLs?
I have a large website with rental listings in 14 markets; listings are added and taken off weekly if not daily. There are hundreds of listings in each market and all have their own landing page with a few pages associated. What is the best process here? I could run one sitemap and make each market's landing page 0.8 priority in the sitemap, or make 14 sitemaps for each market and then have one sitemap for the general and static pages. From there, what would be the better way to structure? Should I keep all the big main landing pages in the general static sitemap or have them be at the top of the market-segmented sitemaps? Also, I have over 5,000 URLs; what is the best way to generate a sitemap over 500 URLs? Is it necessary?
Intermediate & Advanced SEO | Dom441
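On the size question: the sitemaps.org protocol allows up to 50,000 URLs per sitemap file, so 5,000 fits comfortably in one; splitting per market is an organisational choice, handled with a sitemap index file. A minimal sketch (the URL scheme and file names are placeholders):

```python
def build_sitemap_index(urls, chunk_size=50_000, base="https://example.com/sitemaps"):
    """Split a URL list into per-file chunks (protocol limit: 50,000 URLs
    per sitemap) and return the index XML plus the list of chunks."""
    chunks = [urls[i:i + chunk_size] for i in range(0, len(urls), chunk_size)]
    entries = "\n".join(
        f"  <sitemap><loc>{base}/sitemap-{n}.xml</loc></sitemap>"
        for n in range(1, len(chunks) + 1)
    )
    index = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</sitemapindex>"
    )
    return index, chunks
```

Passing a smaller chunk_size (one chunk per market, say) gives the 14-sitemap layout; the index file is what gets submitted to the search engines either way.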
Should canonical links be included or excluded in a sitemap?
Our company is in the process of updating our sitemap. Should we include or exclude canonical links?
Intermediate & Advanced SEO | WebRiverGroup
WP File Permissions
After suffering a malware episode, I wonder if there is an optimum setting for the file permissions for a typical WordPress site? Colin
Intermediate & Advanced SEO | NileCruises
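A commonly cited WordPress baseline is 755 for directories, 644 for files, and something tighter like 600 for wp-config.php. A hedged sketch that applies those modes (the values are a widely used convention, not a guarantee against reinfection, and shared hosts may require different settings):

```python
import os

def harden_permissions(root: str) -> None:
    """Apply a common WordPress baseline: 755 for directories, 644 for
    files, and 600 for wp-config.php (convention, not a malware cure)."""
    for dirpath, dirnames, filenames in os.walk(root):
        os.chmod(dirpath, 0o755)
        for name in filenames:
            path = os.path.join(dirpath, name)
            os.chmod(path, 0o600 if name == "wp-config.php" else 0o644)
```

After a malware episode, resetting permissions should be paired with replacing core files from a clean download, since the infection usually lives in modified files rather than in the modes.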