Editing A Sitemap
-
Would there be any positive effect from editing a sitemap down to a more curated list of pages that perform, or that we hope will begin to perform, in organic search?
A site I work with has a sitemap of about 20,000 pages that is automatically generated by a Drupal plugin.
Of those pages, only about 10% really produce from organic search. There are old sections of the site that are thin, obsolete, discontinued, and/or noindexed but are still on the sitemap.
For instance, would it focus Google's crawl budget more efficiently or have some other effect?
Your thoughts? Thanks! Best... Darcy
-
Hi Darcy
Looking at what has been mentioned previously, I would agree with the train of thought that a more focused sitemap would generally be advantageous.
Andrew
-
Hi Dmitrii,
Always fun to watch Matt's Greatest Hits, in this case on the value of making things better.
I guess the "make it better or delete it" choice seems super black and white to me.
Economically, who is able to make thousands of pages dramatically better with compelling original content? So instead, the only other option is apparently radical elective surgery and massive amputation? I guess I'd choose the chemo first, and I don't really see the downside of noindex/follow plus excluding those pages from the sitemap.
Anyway, thanks again! Best... Darcy
-
"I really read the above linked post differently than Google saying 'just delete it.'"
Well, here is a video from Matt Cutts about thin content. In this particular video he's talking about websites that have already taken a hit for thin content, but in your case it's the same thing, since you're trying to prevent that hit:
https://www.youtube.com/watch?v=w3-obcXkyA4&t=322
So, there are two options he talks about: delete it or make it better. From your previous responses I understand that making it better is not an option, so there is only one option left.
As for link juice flowing through those pages: if those pages have a good amount of links and traffic and are quite popular on your website, then surely DON'T delete them, but rather make them better. However, I understood that those pages are not popular and don't get much traffic, so: option two.
-
Hi Thomas,
Thanks for the message.
To answer your question, part of the reason is preserving link juice via noindex/follow, and then there are some pages that serve a very, very narrow content purpose but have absolutely no life in search.
All things being equal, do you think a smaller, more focused sitemap is generally an advantage? At the extreme, I've seen sitemaps on other sites that include noindexed pages.
Thanks... Darcy
-
Thanks for the suggestion, Andrew.
Whether or not we set priority in the sitemap, do you think a smaller, more focused sitemap is generally an advantage?
Thanks... Darcy
-
Thomas & Dmitrii,
Thanks for the message. With all due respect, I really read the above linked post differently than Google saying "just delete it."
Also, I don't see how deleting those pages preserves whatever link juice they had, as opposed to applying "noindex, follow" and taking them out of the sitemap.
Finally, I don't necessarily equate all of Google's suggestions with "best for your search performance." I assume their suggestions mean, "it's best for Google if you..."
Thanks, again!
Best... Darcy
-
You misunderstand the meaning of that article.
"...that when you do block thin or bad content, Google prefers when you use the noindex over 404ing the page..."
They are talking about the workaround of blocking pages INSTEAD of removing them.
So, if for whatever reason you don't want to delete a page and just return a 404 status for it, that's worse than putting noindex on it. Basically, what they're saying is:
- if you have thin content, DELETE it;
- if for whatever reason you don't want to delete it, put NOINDEX on it (see the sketch below).
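For reference, the noindex option is a robots meta tag in the page's head. A minimal, generic sketch (not the site's actual Drupal markup):

```html
<!-- Keeps the page out of the index while still letting crawlers
     follow its outbound links (preserving link equity). The page
     must remain crawlable (not blocked in robots.txt) for the tag
     to be seen at all. -->
<meta name="robots" content="noindex, follow" />
```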
P.S. My suggestion still stands: delete all the bad content and, if you really want, return a 410 Gone status for the deleted content, so Google understands immediately that those pages are gone forever, not just inaccessible by mistake.
Hope this makes sense.
-
Darcy,
Whilst noindex would be a good solution, if a page has no benefit, why would you noindex it instead of deleting it?
-
Dmitrii & Thomas,
Thanks for your thoughts.
Removal would be one way to go. I note with some interest this post:
https://www.seroundtable.com/google-block-thin-content-use-noindex-over-404s-21011.html
According to that, removal would be the third choice, after making the content better and noindexing it.
With thousands of pages, making it better is not really an option.
Best... Darcy
-
Hi Darcy
I don't know about scaling the sitemap down, but you could make use of an area of the sitemap to optimise it and make the crawl more efficient.
The area in question is the priority field, which basically tells the search engines which pages on your site are the most important. The theory is that pages with a higher priority (say 1.0, the top of the protocol's 0.0 to 1.0 scale) are more likely to get indexed by the search engines than pages with a lower priority (say 0.1), although not everyone in the industry agrees.
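As an illustration, here's a minimal sketch of how those priority values appear in a sitemap file, with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- A top-performing page: highest priority -->
  <url>
    <loc>https://www.example.com/best-landing-page</loc>
    <priority>1.0</priority>
  </url>
  <!-- An old, thin page: low priority -->
  <url>
    <loc>https://www.example.com/archive/old-page</loc>
    <priority>0.1</priority>
  </url>
</urlset>
```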
-
"There are old sections of the site that are thin, obsolete, discontinued and/or noindexed that are still on the sitemap."
Why not remove these from the site?
I personally believe it'll have a positive impact. Since you're submitting this sitemap to Google, you're giving it a way of going through your whole site, so why would you point it at low-quality pages? You want to provide Google (and your users) the best possible experience: if you've got out-of-date pages, update them, and if they're no longer relevant, delete them. A user who lands on such a page would just bounce anyway, because it's not relevant anymore.
If these out-of-date pages can't be found by crawling, then it's definitely best to craft your sitemap to show only your best pages.
-
Hi there.
"Of those pages, only about 10% really produce from organic search. There are old sections of the site that are thin, obsolete, discontinued, and/or noindexed but are still on the sitemap."
Have you considered removing those pages/sections rather than altering the sitemap? It would make more sense, I think.
Related Questions
-
Spotify XML Sitemap
All, I'm working on an SEO workup for a Spotify site. It looks like they are using a sitemap that links to additional sitemap pages. One problem: none of the links are actually linked within the sitemap. This feels like a serious error. https://lubricitylabs.com/sitemap.xml Thoughts?
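For context, a sitemap that links to additional sitemap files is normally a sitemap index. A minimal sketch with placeholder URLs (not the actual lubricitylabs.com file):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each <sitemap> entry points at a child sitemap file, not at a page -->
  <sitemap>
    <loc>https://www.example.com/sitemap-products.xml</loc>
    <lastmod>2024-01-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-blog.xml</loc>
  </sitemap>
</sitemapindex>
```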
Intermediate & Advanced SEO | dmaher
-
Our parent company has included their sitemap links in our robots.txt file - will that have an impact on the way our site is crawled?
Our parent company has included their sitemap links in our robots.txt file. All of their sitemap links are on a different domain, and I'm wondering if this will have any impact on our searchability or potential rankings.
Intermediate & Advanced SEO | tsmith131
-
Sitemap into SE
Hi Moz community experts, I have a question about the sitemap-style links that appear under a result in the search engine, like here: http://i.imgur.com/gQ0JhuH.jpg. Do you know what we need to do to get the same structure, or to decide which pages we want to present in our result? We created a new page and we would like to see it in the results when a visitor searches for our branded keywords. Thanks in advance for your support.
Intermediate & Advanced SEO | johncurlee
-
Need some help/input about my Joomla sitemap created by XMap
Here is my current sitemap for my site: http://www.yakangler.com/index.php?option=com_xmap&view=xml&tmpl=component&id=1 I have some questions about its current settings. I have a component called JReviews for which Xmap produces a separate link for each review, e.g.:
- http://www.yakangler.com/fishing-kayak-review/265-2013-hobie-mirage-adventure-island (lastmod 2014-09-03T20:46:25Z, changefreq monthly, priority 0.4)
- http://www.yakangler.com/fishing-kayak-review/266-2012-wilderness-systems-tarpon-140 (lastmod 2014-06-03T15:49:00Z, changefreq monthly, priority 0.4)
- http://www.yakangler.com/fishing-kayak-review/343-wilderness-systems-tarpon-120-ultralite (lastmod 2013-11-25T06:39:05Z, changefreq monthly, priority 0.4)
Whereas my other articles are only linked by their content category, e.g.:
- http://www.yakangler.com/news (changefreq monthly, priority 0.4)
- http://www.yakangler.com/tournaments (changefreq monthly, priority 0.4)
- http://www.yakangler.com/kayak-events (changefreq monthly, priority 0.4)
- http://www.yakangler.com/spotlight (changefreq monthly, priority 0.4)
Which option is better?
Intermediate & Advanced SEO | mr_w
-
Should sitemap include https pages?
Hi guys, trying to figure out some onsite issues I've been having. Would appreciate any feedback on the following two questions:
1. My homepage (http://mysite.com) is a 301 redirect to https://mysite.com, which is under SSL. Only two pages of my site are https; the rest are http. Should my sitemap live at https://mysite.com/sitemap.xml, or should it be kept on http (even though the homepage redirects to https)?
2. Should my sitemap include the https pages (only two pages) as well as the http ones?
Thanks, G
Intermediate & Advanced SEO | G.Anderson
-
XML Sitemap Index Percentage (Large Sites)
Hi all, I'm wanting to find out from those who have experience dealing with large sites (tens or hundreds of millions of pages): what's a typical (or highest) percentage of indexed pages vs. submitted pages you've seen? This information can be found in Webmaster Tools, where Google shows you the pages submitted and indexed for each of your sitemaps. I'm trying to figure out:
- the average index % out there;
- whether there is a ceiling (i.e. it will never reach 100%);
- whether it's possible to improve the indexing percentage further.
Just to give you some background, sitemap index files (per the sitemaps.org protocol) have been implemented to improve crawl efficiency, and I'm wanting to find other ways to improve this further. I've been thinking about looking at the URL parameters to exclude, as there are hundreds (e-commerce site), to help Google improve crawl efficiency and utilise the daily crawl quota more effectively to discover pages that have not been discovered yet. However, I'm not sure yet whether this is the best path to take, or whether I'm just flogging a dead horse if there is such a ceiling, or if I'm already in the average ballpark for large sites. Any suggestions/insights would be appreciated. Thanks.
Intermediate & Advanced SEO | danng
-
Sitemaps: Alternate hreflang
Hi, some time ago I read that there is a limit of 50,000 URLs per sitemap file (so you need to create a sitemap index and separate files with 50,000 URLs each). [Source] Now we are about to implement the hreflang link in the sitemap [Source], and we don't know whether we have to count each alternate as a different URL. We have 21 different well-positioned domains (same name, different ccTLDs, slightly different content [varying currencies, taxes, some labels, etc.] depending on the target country), so the number of links per URL would be high. A) Shall we count each link alternate as a separate URL, or just the original ones? For example, if we do have to count the link alternates, that would leave us 2,380 pages per sitemap, each with one original URL and 20 alternate links (always staying aware of the 50 MB maximum file size). B) We currently have one sitemap per domain. Keeping this, shall we generate one per domain using the matching domain as the original URL? Or would it be the same if we uploaded the same sitemap to every domain? Thanks
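For context, a sitemap entry with hreflang alternates looks roughly like this. A minimal sketch with placeholder domains; each <url> element lists every language/country variant, including itself, via xhtml:link elements:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/page</loc>
    <!-- One xhtml:link per alternate, the page's own version included -->
    <xhtml:link rel="alternate" hreflang="en-us" href="https://www.example.com/page"/>
    <xhtml:link rel="alternate" hreflang="de-de" href="https://www.example.de/page"/>
    <xhtml:link rel="alternate" hreflang="fr-fr" href="https://www.example.fr/page"/>
  </url>
</urlset>
```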
Intermediate & Advanced SEO | marianoSoler98
-
Sitemaps and subdomains
At the beginning of our life cycle, we were just a WordPress blog. However, we just launched a product built in Ruby. Because we did not have time to put together an open-source Ruby CMS platform, we left the blog in WordPress and the app in Rails. Thus our web app is at http://www.thesquarefoot.com and our blog is at http://blog.thesquarefoot.com. We set up redirects such that if a URL does not exist at www.thesquarefoot.com, it automatically forwards to blog.thesquarefoot.com. What is the best way to handle sitemaps? Create one for blog.thesquarefoot.com and one for http://www.thesquarefoot.com and submit them separately? We had landing pages like http://www.thesquarefoot.com/houston in WordPress, which ranked well for "find Houston commercial real estate," and these have been replaced with landing pages in Ruby, so that URL works well. The URL that was ranking well for this term is now at blog.thesquarefoot.com/houston/? Should I delete this page? I am worried that if I do, we will lose the ranking, since that was the actual page ranking, not the new one. Until we are able to create an open-source Ruby CMS and move everything over to a subdirectory so everything lives in one place, I would love any advice on how to mitigate damage and not confuse Google. Thanks
Intermediate & Advanced SEO | TheSquareFoot