Editing A Sitemap
-
Would there be any positive effect from editing a sitemap down to a more curated list of pages that perform, or that we hope will begin to perform, in organic search?
A site I work with has a sitemap of about 20,000 pages that is automatically generated by a Drupal plugin.
Of those pages, only about 10% really produce traffic from organic search. There are old sections of the site that are thin, obsolete, discontinued and/or noindexed that are still on the sitemap.
For instance, would it focus Google's crawl budget more efficiently or have some other effect?
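To make the idea concrete, here is a rough sketch of what I mean by curating (it's not Drupal-specific, and the input and output file names are just placeholders): take a hand-picked list of the URLs that actually perform and write only those into a standalone sitemap.

# Rough sketch: build a hand-curated sitemap from a plain-text list of
# performing URLs. "performing_urls.txt" and the output name are placeholders.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def write_curated_sitemap(url_file, out_file):
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    with open(url_file) as fh:
        for line in fh:
            loc = line.strip()
            if not loc:
                continue
            url_el = ET.SubElement(urlset, "url")
            ET.SubElement(url_el, "loc").text = loc
    ET.ElementTree(urlset).write(out_file, encoding="utf-8", xml_declaration=True)

write_curated_sitemap("performing_urls.txt", "sitemap-curated.xml")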
Your thoughts? Thanks! Best... Darcy
-
Hi Darcy
Looking at what has been mentioned previously, I would agree with the train of thought that a more focussed sitemap would generally be advantageous.
Andrew
-
Hi Dmitrii,
Always fun to watch Matt's Greatest Hits, in this example the value of making things better.
I guess the "make it better or delete it" choice seems super black and white to me.
Economically, who is able to make thousands of pages dramatically better with compelling original content? So, instead, the only other option is apparently radical elective surgery and massive amputation? I guess I'd choose the chemo first, and I don't really see what the downside is of noindex/follow plus excluding those pages from the sitemap.
Anyway, thanks again! Best... Darcy
-
"I really read the above linked post differently than Google saying 'just delete it.'"
Well, here is a video from Matt Cutts about thin content. In this particular video he's talking about websites that have already taken a hit for thin content, but in your case it's the same idea, since you're trying to prevent that:
https://www.youtube.com/watch?v=w3-obcXkyA4&t=322
So, there are two options he talks about: delete it or make it better. From your previous responses I understand that making it better is not an option, so there is only one option left.
As for link juice through those pages: if those pages have a good amount of links and traffic and are quite popular on your website, then surely DON'T delete them, but rather make them better. However, I understood that those pages are not popular and don't get much traffic, so, option two.
-
Hi Thomas,
Thanks for the message.
To answer your question, part of the reason is preserving link juice via a noindex/follow, and then there are some pages that serve a very, very narrow content purpose but have absolutely no life in search.
All things being equal, do you think a smaller, more focused sitemap is generally an advantage? At the extreme, on other sites I've seen sitemaps with noindexed pages still on them.
Thanks... Darcy
-
Thanks for the suggestion, Andrew.
Whether or not priority is set in a sitemap, do you think a smaller, more focused sitemap is generally an advantage?
Thanks... Darcy
-
Thomas & Dmitrii,
Thanks for the message. With all due respect, I really read the above linked post differently than Google saying "just delete it."
Also, I don't see how deleting it preserves whatever link juice those pages had, as opposed to a "noindex, follow" and taking them out of the sitemap.
Finally, I don't necessarily read every Google suggestion as "do this for best effect in search." I assume their suggestions mean, "it's best for Google if you..."
Thanks, again!
Best... Darcy
-
You misunderstand the meaning of that article.
"...that when you do block thin or bad content, Google prefers when you use the noindex over 404ing the page..."
They are talking about the workaround of blocking pages INSTEAD of removing them.
So, if for whatever reason you don't want to delete a page outright and you just serve a 404 status for it, that's worse than putting noindex on it. Basically, what they're saying is:
- if you have thin content, DELETE it;
- if for whatever reason you don't want to delete it, put NOINDEX on it.
P.S. My suggestion still stays the same. Delete all bad content and, if you really want, return a 410 Gone status for that deleted content so Google understands immediately that those pages are deleted forever, not inaccessible by mistake or something.
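If you want to gauge the scale of the problem first, a rough Python sketch like the one below can help (it assumes the third-party "requests" library is installed, the sitemap path is a placeholder, and the noindex check is deliberately crude): it fetches every URL listed in the sitemap and prints the status code plus whether the page mentions "noindex", so you can see how many listed pages are already dead or blocked.

# Rough audit sketch; assumes the third-party "requests" library is installed
# and that "sitemap.xml" is a local copy of the sitemap (placeholder path).
import xml.etree.ElementTree as ET
import requests

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def audit_sitemap(sitemap_path):
    tree = ET.parse(sitemap_path)
    for loc in tree.findall(".//sm:loc", NS):
        url = loc.text.strip()
        resp = requests.get(url, timeout=10)
        # Crude check: flags any page whose HTML mentions "noindex" anywhere.
        noindex = "noindex" in resp.text.lower()
        print(resp.status_code, "noindex" if noindex else "indexable", url)

audit_sitemap("sitemap.xml")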
Hope this makes sense.
-
Darcy,
Whilst noindex would be a good solution, if the page has no benefit, why would you noindex it instead of deleting it?
-
Dmitrii & Thomas,
Thanks for your thoughts.
Removal would be one way to go. I note with some interest this post:
https://www.seroundtable.com/google-block-thin-content-use-noindex-over-404s-21011.html
According to that, removal would be the third thing after making it better and noindexing.
With thousands of pages, making it better is not really an option.
Best... Darcy
-
Hi Darcy
I don't know about scaling the sitemap down, but you could make use of one element of the sitemap to optimise it and make the crawl more efficient.
The element in question is the priority value, which basically tells the search engines which pages on your site are the most important. The theory is that pages with a higher priority (say 1.0) are more likely to get indexed by the search engines than pages with a lower priority of, say, 0.1, although not everyone in the industry agrees.
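A minimal sketch of what setting per-URL priority looks like when generating the file (the URLs and the priority numbers below are made-up examples):

# Minimal sketch: write a tiny sitemap with per-URL priority values (0.0 to 1.0).
# The URLs and the priority numbers are made-up examples.
import xml.etree.ElementTree as ET

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, priority in [("https://example.com/", "1.0"),
                      ("https://example.com/key-category/", "0.8"),
                      ("https://example.com/old-page/", "0.1")]:
    url_el = ET.SubElement(urlset, "url")
    ET.SubElement(url_el, "loc").text = loc
    ET.SubElement(url_el, "priority").text = priority
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)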
-
"There are old sections of the site that are thin, obsolete, discontinued and/or noindexed that are still on the sitemap."
Why not remove these from the site?
I personally believe that it'll have a positive impact. By submitting this sitemap to Google you're giving it a way of going through your whole site, so why would you point it at low-quality pages? You want to provide Google (and your users) the best possible experience, so if you've got out-of-date pages, update them, or if they're no longer relevant, delete them; a user who lands on one of those pages would just bounce anyway because it's not relevant anymore.
If these out of date pages can't be found by crawling, then 100% it's best to craft your sitemap to show the best pages.
-
Hi there.
"Of those pages, only about 10% really produce traffic from organic search. There are old sections of the site that are thin, obsolete, discontinued and/or noindexed that are still on the sitemap."
Have you considered removing those pages/sections, rather than altering the sitemap? It would make more sense I think.