Broken sitemaps vs no sitemaps at all?
-
The site I am working on is enormous. We have 71 sitemap files, all linked to from a sitemap index file.
The sitemaps are not up to par with "best practices" yet, and realistically it may be another month or so until we get them cleaned up.
I'm wondering if, for the time being, we should just remove the sitemaps from Webmaster Tools altogether. They are currently "broken", and I know that sitemaps are not mandatory. Perhaps they're doing more harm than good at this point? According to Webmaster Tools, there are 8,398,082 "warnings" associated with the sitemap, many of which seem to be related to URLs being linked to that are blocked by robots.txt.
I was thinking that I could remove them and then keep a close eye on the crawl errors/index status to see if anything changes.
Is there any reason why I shouldn't remove these from Webmaster Tools until we get the sitemaps up to par with best practices?
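(For sizing up those warnings before deciding, here is a minimal sketch in Python, standard library only, that cross-checks every sitemap URL against robots.txt the way a crawler would; the example.com URLs are placeholders for your own index and robots.txt.)

import urllib.request
import urllib.robotparser
import xml.etree.ElementTree as ET

# Placeholder URLs: swap in your own robots.txt and sitemap index.
rp = urllib.robotparser.RobotFileParser("https://example.com/robots.txt")
rp.read()

def locs(url):
    # Return every <loc> value from a sitemap or sitemap index file.
    with urllib.request.urlopen(url) as resp:
        tree = ET.parse(resp)
    return [el.text.strip() for el in tree.iter("{http://www.sitemaps.org/schemas/sitemap/0.9}loc")]

blocked = 0
for sitemap_url in locs("https://example.com/sitemap_index.xml"):
    for page_url in locs(sitemap_url):
        if not rp.can_fetch("Googlebot", page_url):
            blocked += 1
print(blocked, "sitemap URLs are currently disallowed by robots.txt")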
-
I think you can remove the sitemap since it returns so many warnings.
Sitemaps don't carry much direct SEO benefit in themselves; they mainly help Google find pages that are hard to reach or not accessible through regular links.
So make sure your site has a good structure and that every page can be found by browsing the site (clicking from page to page), and you will be fine with or without a sitemap.
If you are not sure all pages are reachable, use a crawler such as Xenu's Link Sleuth to crawl your site.
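(If Link Sleuth isn't an option, here is a rough stand-in in Python, standard library only; an illustration rather than a production crawler, with no rate limiting or robots.txt handling, and a hypothetical start URL.)

from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
import urllib.request

START = "https://example.com/"  # placeholder start URL
LIMIT = 500  # safety cap for the sketch

class LinkParser(HTMLParser):
    # Collect href values from <a> tags.
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

seen, queue = {START}, deque([START])
while queue and len(seen) < LIMIT:
    url = queue.popleft()
    try:
        with urllib.request.urlopen(url) as resp:
            html = resp.read().decode("utf-8", errors="replace")
    except OSError:
        continue
    parser = LinkParser()
    parser.feed(html)
    for href in parser.links:
        link = urljoin(url, href).split("#")[0]
        # Stay on the same host and skip anything already queued.
        if urlparse(link).netloc == urlparse(START).netloc and link not in seen:
            seen.add(link)
            queue.append(link)
print(len(seen), "pages reachable by following links from", START)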
Related Questions
-
Can I add external links to my sitemap?
Hi, I'm integrating with a service that adds third-party images/videos (owned by them, hosted on their server) to my site. For instance, the service might have tons of pictures/videos of cars, and when I integrate, I can show my users these pictures/videos of cars I might be selling. But I'm wondering how to build out the sitemap: I would like to include references to these images/videos so Google knows I'm using lots of multimedia. What's the most white-hat way to do that? Can I add external links to my sitemap pointing to these images/videos hosted on a different server, or is that frowned upon? Thanks in advance.
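(For illustration, a minimal Python sketch of what such an entry could look like using Google's image sitemap extension; cross-host image locations are generally accepted, but verify against Google's current documentation. All URLs below are placeholders.)

import xml.etree.ElementTree as ET

SM = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMG = "http://www.google.com/schemas/sitemap-image/1.1"
ET.register_namespace("", SM)
ET.register_namespace("image", IMG)

# One page entry whose image:loc points at a third-party host.
urlset = ET.Element(f"{{{SM}}}urlset")
url = ET.SubElement(urlset, f"{{{SM}}}url")
ET.SubElement(url, f"{{{SM}}}loc").text = "https://www.mysite.com/cars/roadster"
image = ET.SubElement(url, f"{{{IMG}}}image")
ET.SubElement(image, f"{{{IMG}}}loc").text = "https://media.thirdparty.com/img/roadster.jpg"

ET.ElementTree(urlset).write("sitemap-images.xml", xml_declaration=True, encoding="UTF-8")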
Intermediate & Advanced SEO | SEOdub
-
Standalone Hosting Plan vs Multisite Hosting Plan for SEO?
I am looking to migrate my current site to SiteGround, so I was having a chat with the operator, who told me that if I were to sign up for a new hosting plan I would get additional SEO benefits... can anyone confirm or deny this? Also, while on the question: do certain domain/hosting providers offer better SEO/SERP rankings, and if they do, can anyone recommend any in Australia in particular? The domain is a .com.au website. Transcript:
me: I have a website that is registered at crazydomains and WordPress files hosted on my friend's server... what would I need to do to have the domain/hosting transferred to SG?
SiteGround Operator: Let me take a look at the website and I will provide you with a solution 🙂
me: thx
SiteGround Operator: In this case you have two choices: you can either host it on your current plan or create a new one just for it.
SiteGround Operator: Getting a new plan will be a better choice in terms of SEO and performance.
SiteGround Operator: But you can run it on your current GrowBig as well.
me: why will that give it better SEO?
SiteGround Operator: Because it will have its own cPanel and it will be a primary domain, instead of having it set up as an addon.
me: How does Google know what the primary or secondary domain on my hosting plan is?
SiteGround Operator: It does, as your file location will be primarydomain.com/addonslot
SiteGround Operator: Compared to primarydomain.com if you put it in its own hosting plan.
me: So I'm struggling to understand how this affects my SEO?
SiteGround Operator: SERP is based on a couple of things, one of which is domain authority (DA). This tends to be a lot harder to build up with addon domains compared to domains hosted in their own plans.
SiteGround Operator: Additionally, you will have 2 sites under a single IP address, which is not the optimal solution you want to get.
me: What would need to be done as far as transferring the WP installation/files/database etc.?
SiteGround Operator: As it's stored on a local host, you will have to upload a backup copy of your files and db to our server and we will configure it for you.
System: me has ended the chat
Intermediate & Advanced SEO | IsaCleanse
-
Proper naming convention when updating sitemaps
I have a semi-big site (500K pages) with lots of new pages being created. I also have a process that updates my sitemap with all of these pages automatically. I have 11 sitemap files and a sitemap index file. When I update my sitemaps and submit them to Google, should I keep the same names?
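(A common convention: keep the 11 filenames stable and bump <lastmod> in the index, so the sitemap URLs submitted to Google never change. A Python sketch with a placeholder domain:)

import datetime
import xml.etree.ElementTree as ET

SM = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", SM)
today = datetime.date.today().isoformat()

# Same 11 filenames every run; only <lastmod> changes.
index = ET.Element(f"{{{SM}}}sitemapindex")
for i in range(1, 12):  # sitemap-1.xml ... sitemap-11.xml
    sm = ET.SubElement(index, f"{{{SM}}}sitemap")
    ET.SubElement(sm, f"{{{SM}}}loc").text = f"https://example.com/sitemap-{i}.xml"
    ET.SubElement(sm, f"{{{SM}}}lastmod").text = today

ET.ElementTree(index).write("sitemap_index.xml", xml_declaration=True, encoding="UTF-8")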
Intermediate & Advanced SEO | jcgoodrich
-
Would it be better to Start Over vs doing a Website Migration?
Hey guys/gals, I have a question please. I have a computer repair business that does extremely well in search and is on the front page of Google for anything computer repair related. However, I am currently re-branding my company and have completely redesigned every aspect of the UI and the SEO site structure, as well as having written vastly different content and different title tags and meta descriptions for each page. So basically, when doing a migration we know that we want to keep our content, titles, headlines, and meta descriptions the same so as to not lose our page rank. Seeing that I have completely gone against the grain in all directions on a much-needed company re-branding, and everything is completely different from the old site, is it even worthwhile 301 redirecting my old URLs to the new ones that would (best) correspond with the new? In the plainest English: would I do better at ranking the new website QUICKER without doing 301 redirects from the OLD to the NEW? In an EXTREME instance like what I have done, would the domain migration IMPEDE me ranking the new site, seeing how nothing is the same? I have built a rock-solid silo site architecture on the new site, which is WordPress using the Thesis framework; the old domain is built on Joomla 1.5. Thanks fellas, Marshall
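(If the 301s do go ahead, one lightweight way to manage a large mapping is to keep an old-to-new spreadsheet and generate the server rules from it. A Python sketch assuming Apache-style Redirect directives and a hypothetical urls.csv with old_path and new_url columns:)

import csv

# Hypothetical input: urls.csv with old_path,new_url columns, e.g.
# /old-page,https://example.com/new-page
with open("urls.csv", newline="") as src, open("redirects.conf", "w") as out:
    for row in csv.DictReader(src):
        out.write(f"Redirect 301 {row['old_path']} {row['new_url']}\n")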
Intermediate & Advanced SEO | MarshallThompson
-
Tool to check XML sitemap
Hello, can anyone help me find a tool to take a closer look at an XML sitemap? Thanks in advance! PP
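(Absent a dedicated tool, a quick Python sketch that parses a sitemap, counts its URLs, and spot-checks a few status codes; the sitemap URL is a placeholder:)

import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP = "https://example.com/sitemap.xml"  # placeholder

with urllib.request.urlopen(SITEMAP) as resp:
    tree = ET.parse(resp)
urls = [el.text.strip() for el in tree.iter("{http://www.sitemaps.org/schemas/sitemap/0.9}loc")]
print(len(urls), "URLs listed")

for url in urls[:5]:  # sample only; don't hammer the server
    try:
        status = urllib.request.urlopen(url).status
    except urllib.error.HTTPError as err:
        status = err.code
    print(status, url)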
Intermediate & Advanced SEO | PedroM
-
Sitemaps: Alternate hreflang
Hi, some time ago I read that there is a limit of 50,000 URLs per sitemap file (so you need to create a sitemap index and separate files with 50,000 URLs each). [Source]. Now we are about to implement the hreflang link in the sitemap [Source], and we don't know if we have to count each alternate as a different URL. We have 21 different well-positioned domains (same name, different ccTLDs, slightly different content [varies in currencies, taxes, some labels, etc.] depending on the target country), so the number of links per URL would be high. A) Shall we count each link alternate as a separate URL, or just the original ones? For example, if we have to count the link alternates, that would leave us 2,380 pages per sitemap, each with one original URL and 20 alternate links (always staying aware of the 50MB maximum file size). B) Currently we have one sitemap per domain. Given this, shall we generate one per domain using the matching domain as the original URL? Or would it be the same if we uploaded the same sitemap to every domain? Thanks
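(To put rough numbers on option A, a Python sketch that sizes one <url> entry carrying 21 alternates and estimates how many fit in a file. The ccTLD list and hreflang values are stand-ins, and the 50,000 cap as written counts <url> entries, with alternates mainly adding file size; verify both against current sitemap documentation.)

# Real hreflang values are language-region codes like "en-au"; the ccTLDs
# below are stand-ins just to get a realistic byte count.
CCTLDS = ["com", "de", "fr", "es", "it"] + [f"t{i}" for i in range(16)]  # 21 stand-ins

def entry(path):
    links = "".join(
        f'  <xhtml:link rel="alternate" hreflang="{tld}" href="https://example.{tld}/{path}"/>\n'
        for tld in CCTLDS
    )
    return f"<url>\n  <loc>https://example.com/{path}</loc>\n{links}</url>\n"

per_entry = len(entry("some/representative/page").encode("utf-8"))
print("bytes per <url> entry:", per_entry)
print("entries per file:", min(50_000, 50 * 1024 * 1024 // per_entry))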
Intermediate & Advanced SEO | marianoSoler98
-
How important is it to set "priority" and "frequency" in sitemaps?
Has anyone ever done any testing on setting "priority" and "frequency" in their sitemaps? What was the result? Does specifying priority or frequency help quite a bit?
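(For reference, the sitemap protocol spells these tags <priority> and <changefreq>; a tiny Python sketch printing one entry with placeholder values:)

# One sitemap entry with the optional <changefreq> and <priority> hints
# ("frequency" is spelled <changefreq> in the protocol). Placeholder URL.
ENTRY = """<url>
  <loc>https://example.com/some-page</loc>
  <changefreq>weekly</changefreq>
  <priority>0.8</priority>
</url>"""
print(ENTRY)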
Intermediate & Advanced SEO | nicole.healthline
-
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place in our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc.). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back in the index by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOmoz reading this morning, I came to wonder whether that approach may now be harming us: http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo and http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions. Specifically, I'm concerned that a) we're blocking the flow of link juice, and that b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, etc., but we have yet to find 'the fix'... Thoughts? Kurus
Intermediate & Advanced SEO | kurus
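(Before deciding, it may help to test exactly which result-page variants the current rules block; a Python sketch using the standard-library parser with made-up Disallow rules and URLs. Note the stdlib parser does plain prefix matching, unlike Googlebot's wildcard support, so keep test rules wildcard-free:)

import urllib.robotparser

# Hypothetical pagination/sort restrictions of the kind described above.
RULES = """\
User-agent: *
Disallow: /results/page/
Disallow: /results/sort/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(RULES.splitlines())

for url in [
    "https://example.com/results/widgets",        # page 1: allowed
    "https://example.com/results/page/2",         # deep pagination: blocked
    "https://example.com/results/sort/price-asc", # sort variant: blocked
]:
    print(rp.can_fetch("Googlebot", url), url)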