Editing A Sitemap
-
Would there be any positive effect from editing a sitemap down to a more curated list of pages that perform, or that we hope will begin to perform, in organic search?
A site I work with has a sitemap with about 20,000 pages that is automatically created out of a Drupal plugin.
Of those pages, only about 10% really produce out of search. There are old sections of the site that are thin, obsolete, discontinued and/or noindexed that are still on the sitemap.
For instance, would it focus Google's crawl budget more efficiently or have some other effect?
Your thoughts? Thanks! Best... Darcy
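(For anyone wanting to experiment: here's a minimal sketch of trimming an auto-generated sitemap down to a curated URL list. The "performing pages" set would come from your own analytics; all URLs and names here are hypothetical.)

```python
# Sketch: filter an auto-generated sitemap down to a curated set of URLs.
# keep_urls would be built from your analytics (the ~10% that perform).
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def filter_sitemap(sitemap_xml, keep_urls):
    """Return a new sitemap XML string containing only the URLs in keep_urls."""
    ET.register_namespace("", NS)  # keep the default sitemap namespace on output
    root = ET.fromstring(sitemap_xml)
    for url_el in list(root):  # copy the list so we can remove while iterating
        loc = url_el.find(f"{{{NS}}}loc")
        if loc is None or loc.text.strip() not in keep_urls:
            root.remove(url_el)
    return ET.tostring(root, encoding="unicode")
```

You'd run this over the Drupal-generated file and submit the filtered result instead.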
-
Hi Darcy
Looking at what has been mentioned previously I would agree with the train of thought that a more focussed sitemap would generally be advantageous.
Andrew
-
Hi Dmitrii,
Always fun to watch Matt's Greatest Hits, in this example the value of making things better.
I guess the "make it better or delete it" choice seems super black and white to me.
Economically, who is able to make thousands of pages dramatically better with compelling original content? So, instead, the only other option is apparently radical elective surgery and massive amputation? I guess I'd choose the chemo first and don't really see what the downside is for noindex/follow and exclude from the sitemap.
Anyway, thanks again! Best... Darcy
-
"I really read the above linked post differently than Google saying 'just delete it.'"
Well, here is a video from Matt Cutts about thin content. In this particular video he's talking about websites that have already taken a hit for thin content, but in your case it's the same, since you're trying to prevent it:
https://www.youtube.com/watch?v=w3-obcXkyA4&t=322
So, there are two options he talks about: delete it or make it better. From your previous responses I understand that making it better is not an option, so there is only one option left.
As for link juice through those pages: if those pages have a good amount of links and traffic and are quite popular on your website, then surely DON'T delete them, but rather make them better. However, I understood that those pages are not popular and don't have much traffic, so, option two.
-
Hi Thomas,
Thanks for the message.
To answer your question, part of the reason is link juice via a noindex/follow and then there are some pages that serve a very very narrow content purpose, but have absolutely no life in search.
All things being equal, do you think a smaller, more focused sitemap is generally an advantage? At the extreme, on other sites I've seen sitemaps with noindexed pages on them.
Thanks... Darcy
-
Thanks for the suggestion, Andrew.
Whether or not we set priority in the sitemap, do you think a smaller, more focused sitemap is generally an advantage?
Thanks... Darcy
-
Thomas & Dmitrii,
Thanks for the message. With all due respect, I really read the above linked post differently than Google saying "just delete it."
Also, I don't see how deleting it preserves whatever link juice those pages had, as opposed to a "noindex, follow" and taking them out of the sitemap.
Finally, I don't necessarily equate all of Google's suggestions as synonymous with a "for best effect in search." I assume their suggestions mean, "it's best for Google if you..."
Thanks, again!
Best... Darcy
-
You misunderstand the meaning of that article.
"...that when you do block thin or bad content, Google prefers when you use the noindex over 404ing the page..."
They are talking about the workaround of blocking pages INSTEAD of removing them.
So, if for whatever reason you don't want to delete a page and you just put a 404 status on it, that's worse than putting noindex on it. Basically, what they're saying is:
- if you have thin content, DELETE it;
- if for whatever reason you don't want to delete it, put NOINDEX on it.
P.S. My suggestion still stays the same: delete all bad content and, if you really want, return a 410 Gone status for that deleted content so Google understands immediately that those pages are deleted forever, not inaccessible by mistake or something.
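(For the record, both options above can be done at the server level. A hypothetical nginx sketch; the paths are assumptions, not real site sections:)

```nginx
# Return 410 Gone for a deleted thin-content section, so crawlers
# know the removal is deliberate and permanent:
location ^~ /old-catalog/ {
    return 410;
}

# ...or keep the pages live but out of the index while still letting
# link equity flow, by sending a noindex, follow header:
location ^~ /legacy-articles/ {
    add_header X-Robots-Tag "noindex, follow";
}
```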
Hope this makes sense.
-
Darcy,
Whilst noindex would be a good solution, if the page has no benefit why would you noindex instead of deleting it?
-
Dmitrii & Thomas,
Thanks for your thoughts.
Removal would be one way to go. I note with some interest this post:
https://www.seroundtable.com/google-block-thin-content-use-noindex-over-404s-21011.html
According to that, removal would be the third thing after making it better and noindexing.
With thousands of pages, making it better is not really an option.
Best... Darcy
-
Hi Darcy
I don't know about scaling the sitemap down, but you could make use of an area of the sitemap to optimise it and make the crawl more efficient.
The area in question is the priority field, which basically tells the search engines which pages on your site are the most important. The theory is that pages with a higher priority (say 100%) are more likely to get indexed by the search engines than pages with a lower priority of, say, 10%, although not everyone in the industry agrees.
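(For reference, priority is set per URL in the sitemap itself, on a 0.0 to 1.0 scale. A minimal sketch with hypothetical URLs:)

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/key-landing-page</loc>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/archive/old-page</loc>
    <priority>0.1</priority>
  </url>
</urlset>
```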
-
"There are old sections of the site that are thin, obsolete, discontinued and/or noindexed that are still on the sitemap."
Why not remove these from the site?
I personally believe it'll have a positive impact. Since you're submitting this sitemap to Google, you're giving it a way of going through your whole site, so why would you point it at low-quality pages? You want to provide Google (and your users) the best possible experience, so if you've got out-of-date pages, update them; if they're no longer relevant, delete them. A user who lands on one of those pages would just bounce because it's not relevant anymore.
If these out of date pages can't be found by crawling, then 100% it's best to craft your sitemap to show the best pages.
-
Hi there.
"Of those pages, only about 10% really produce out of search. There are old sections of the site that are thin, obsolete, discontinued and/or noindexed that are still on the sitemap."
Have you considered removing those pages/sections, rather than altering the sitemap? It would make more sense I think.