Is having a sitemap.xml file still beneficial?
-
Hi,
I'm pretty new to SEO, and something I've noticed is that a lot of things seem to go in and out of relevance like the weather.
I was just wondering: is having a sitemap.xml file for Google's use still a good idea and beneficial?
My thinking is that my websites would get crawled faster by having one.
Cheers.
-
It's worth looking at the Webinar video on Sitemaps for this one. It's just a month old so it's completely up to date.
http://www.seomoz.org/webinars/getting-value-from-xml-sitemaps
Sitemaps aren't just a case of listing every page on your site anymore; they require a bit of thought and attention.
-
Absolutely relevant; in fact, I'd say it's essential.
A great article posted on the site already here: http://www.seomoz.org/blog/xml-sitemaps-guidelines-on-their-use
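-
To make the sitemap format concrete: the protocol is just a <urlset> of <url> entries, each with a <loc> and optionally <lastmod>, <changefreq> and <priority>. Below is a minimal sketch in Python (standard library only, placeholder URLs; a sketch rather than a definitive implementation) that writes one out. Once it's live, you can submit it in Webmaster Tools or reference it with a Sitemap: line in robots.txt.

```python
# Minimal sketch: build a small sitemap.xml with the standard library.
# The URLs below are placeholders; swap in your own pages.
import xml.etree.ElementTree as ET
from datetime import date

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls, out_path="sitemap.xml"):
    ET.register_namespace("", SITEMAP_NS)  # emit a clean default namespace
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    for loc in urls:
        url = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        ET.SubElement(url, "{%s}loc" % SITEMAP_NS).text = loc
        ET.SubElement(url, "{%s}lastmod" % SITEMAP_NS).text = date.today().isoformat()
    ET.ElementTree(urlset).write(out_path, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    build_sitemap([
        "http://www.example.com/",
        "http://www.example.com/about/",
        "http://www.example.com/blog/first-post/",
    ])
```
-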
Related Questions
-
How do I complete a reverse DNS check when doing log file analysis?
I'm doing some log file analysis and need to run a reverse DNS check to make sure I'm analysing logs from Googlebot and not any impostors. Is there a command I can use in the terminal to do this? If not, what's the best way to verify Googlebot? Thanks
Technical SEO | daniel-brooks
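For the verification question above: Google's documented check is a reverse DNS lookup on the requesting IP (the hostname should end in googlebot.com or google.com), followed by a forward lookup on that hostname to confirm it resolves back to the same IP. In a terminal, "host <ip>" and then "host <hostname>" will do it on most systems. Below is a rough Python sketch of the same two-step check using only the standard library; the commented-out IP is purely illustrative.

```python
# Rough sketch of the reverse-then-forward DNS check for Googlebot.
import socket

def is_googlebot(ip):
    try:
        hostname = socket.gethostbyaddr(ip)[0]  # reverse DNS (PTR) lookup
    except socket.herror:
        return False
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # Forward-confirm: the hostname must resolve back to the original IP.
        return ip in socket.gethostbyname_ex(hostname)[2]
    except socket.gaierror:
        return False

# Example with an IP pulled from a log line (illustrative only):
# print(is_googlebot("66.249.66.1"))
```
-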
Sitemaps:
Hello, while doing an audit we found in our sitemaps the tag that, at the time, was used to indicate that a URL was mobile. In our case the URL is the same for desktop and mobile.
Do you recommend leaving it or removing it?
Thank you!
Technical SEO | romaro
-
Sitemap errors have disappeared from my Google Webmaster tools
Hi all, a week ago I had 66 sitemap errors related to hreflang in my GWT. Now all the errors are gone, and it shows no errors. We have not done any work to fix them. I wonder if anybody has experienced the same thing, of Google suddenly changing the criteria or the way they report errors in Google Webmaster Tools. I would appreciate any insights from the community! Best regards, Peru
Technical SEO | SMVSEO
-
Does Google still support hyphenated domains for exact match or not?
Does Google still support hyphenated domains for exact match or not? For example: is www.my-key-word.com an exact match domain for "my keyword" or not?
Technical SEO | hammadrafique
-
Should each new blog post be added to Sitemap.xml
Hello everyone, I have a website that has only static content. I have recently added a blog to my website and I am wondering if I need to add each new blog post to my Sitemap.xml file. Or is there another/better way to get the blog posts indexed? Any advice is greatly appreciated!
Technical SEO | threebiz
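If the sitemap is maintained by hand rather than regenerated by the CMS, appending each new post is straightforward to script. A rough sketch with the Python standard library is below; the file path and URL are placeholders, and most blogging platforms can do this for you automatically.

```python
# Rough sketch: append a new blog post URL to an existing sitemap.xml.
# The path and URL are placeholders.
import xml.etree.ElementTree as ET
from datetime import date

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def add_url(sitemap_path, new_loc):
    ET.register_namespace("", NS)
    tree = ET.parse(sitemap_path)
    urlset = tree.getroot()
    existing = {loc.text for loc in urlset.iter("{%s}loc" % NS)}
    if new_loc not in existing:  # skip if the post is already listed
        url = ET.SubElement(urlset, "{%s}url" % NS)
        ET.SubElement(url, "{%s}loc" % NS).text = new_loc
        ET.SubElement(url, "{%s}lastmod" % NS).text = date.today().isoformat()
        tree.write(sitemap_path, encoding="utf-8", xml_declaration=True)

add_url("sitemap.xml", "http://www.example.com/blog/my-new-post/")
```
-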
Benefits to having an HTML sitemap?
We are currently migrating our site to a new CMS, and as part of this migration I'm getting push-back from my development team regarding the HTML sitemap. We have a very large news site with tens of thousands of pages. We currently have an HTML sitemap that greatly helps with distributing PageRank to article pages, but it is not geared towards the user. The dev team doesn't see the benefit of recreating the HTML sitemap, despite my assurance that we don't want to lose all these internal links, since removing thousands of links could have a negative impact on our Domain Authority. Should I give in and concede the HTML sitemap since we have an XML one? Or am I right that we don't want to get rid of it?
Technical SEO | BostonWright
-
Organic traffic still down 9 months after redesign
The Good: We redesigned our nature travel website (www.ietravel.com) in Drupal. Overall, it's a great improvement in look and usability. Also, we are ranking for more relevant search terms (the SEO was managed by an agency before, and there were a lot of junk terms in their campaigns that weren't converting).
The Bad: Organic search referrals have consistently been down 10-20% year-over-year each and every month.
The Ugly: I am trying to dig in and figure out why this is happening, and I'm at a loss. We are aggressively publishing to our blog 5 days a week, and I've built many keyword-focused landing pages. Here's what I do know in terms of things that could be problems, which I've seen in Webmaster Tools and SEOmoz tools:
1. I have a lot of files restricted by robots.txt - 1,337 of them. Many have to be that way by design because they are nodes generated by web forms (with private user data). The rest are "Dates & Rates" pages - I restricted them because for each destination they are very similar in content. Wondering now if that was a mistake. For example, http://www.ietravel.com/central-south-america/galapagos-islands/dates-rates
2. We have duplicate title tags on 462 pages.
3. The Lightbox module that was installed for our photo galleries was a disaster. I am researching a more SEO-friendly solution, but that solution is a month or more away.
4. We have 31 duplicate meta descriptions.
My question is, could these errors be THAT significantly impacting our rankings? I should note that according to Google Analytics, referral traffic and direct traffic are also down year-over-year every month since the redesign. I don't understand the referral part especially, since we took great pains to put in many 301 redirects. There are no 404s or non-indexable pages showing up in Webmaster Tools either. If anyone has any suggestions for problem areas or red flags I should investigate, please let me know. Really, any thoughts are appreciated. Best, Carlton
Technical SEO | csmithal
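One quick way to audit the robots.txt concern above is to test specific URLs against the live robots.txt with Python's standard urllib.robotparser. A rough sketch, using the Dates & Rates URL from the question as the example:

```python
# Rough sketch: check which URLs the live robots.txt blocks for Googlebot.
# Standard library only; the URLs below come from the question.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("http://www.ietravel.com/robots.txt")
rp.read()

urls_to_check = [
    "http://www.ietravel.com/central-south-america/galapagos-islands/dates-rates",
    "http://www.ietravel.com/",
]

for url in urls_to_check:
    allowed = rp.can_fetch("Googlebot", url)
    print(("allowed: " if allowed else "blocked: ") + url)
```
-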
301ed Pages Still Showing as Duplicate Content in GWMT
I thank anyone reading this for their consideration and time. We are a large site with millions of URLs for our product pages. We are also a textbook company, so by nature, our products have two separate ISBNs: a 10-digit and a 13-digit form. Thus, every one of our books has at least two pages (a 10-digit and a 13-digit ISBN page). My issue is that we have established a 301 for all the 10-digit URLs so they automatically redirect to the 13-digit page. This fix has been in place for months. However, Google still reports that it is detecting thousands of pages with duplicate title and meta tags, and it is referring to the page URLs that I already 301ed to the canonical version many months ago. Is there anything I can do to fix this issue? I don't understand what I am doing wrong. Example:
http://www.bookbyte.com/product.aspx?isbn=9780321676672
http://www.bookbyte.com/product.aspx?isbn=032167667X
As you can see, the 10-digit ISBN page 301s to the 13-digit canonical version. Google reports that it has detected duplicate title and meta tags between the two pages, and there are thousands of these duplicate pages listed. To add some further context: the ISBN is just a parameter that allows us to provide content when someone searches for a product with the 10- or 13-digit ISBN. The 13-digit version of the page is the only physical page that exists; the 10-digit is only part of the virtual URL structure of the website. This is why I cannot simply change the title and meta tags of the 10-digit pages: they only exist in the sense that the URL redirects to the 13-digit version. Also, we submit a sitemap every day of all the 13-digit pages, so Google knows exactly what our physical URL structure is. I have submitted this question to GWMT forums and received no replies.
Technical SEO | dfinn
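A quick way to sanity-check the redirect described above is to request the 10-digit URL without following redirects and confirm the status code and Location header. A rough sketch using the third-party requests library, with the two URLs from the question:

```python
# Rough sketch: confirm the ISBN-10 URL returns a 301 to the ISBN-13 page.
# Requires the third-party "requests" library; URLs come from the question.
import requests

OLD = "http://www.bookbyte.com/product.aspx?isbn=032167667X"

resp = requests.head(OLD, allow_redirects=False, timeout=10)
print(resp.status_code)              # expected: 301 (permanent redirect)
print(resp.headers.get("Location"))  # expected: the 13-digit canonical URL

if resp.status_code == 301 and "9780321676672" in resp.headers.get("Location", ""):
    print("301 is in place and points at the canonical ISBN-13 page")
```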