Include or exclude noindex URLs in sitemap?
-
We just added noindex tags to our pages with thin content.
Should we include or exclude those URLs from our sitemap.xml file? I've read conflicting recommendations.
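For reference, here is a rough sketch of what that kind of tag looks like on a page, plus the equivalent X-Robots-Tag response header; the Flask-style route is purely illustrative, not our actual stack:

```python
# Illustrative Flask sketch: two equivalent ways to mark a thin page as noindex.
# The route, template, and app are placeholders, not our actual implementation.
from flask import Flask, make_response

app = Flask(__name__)

THIN_PAGE_HTML = """<!doctype html>
<html>
  <head>
    <!-- the robots meta tag added to thin-content pages -->
    <meta name="robots" content="noindex, follow">
    <title>Thin product page</title>
  </head>
  <body>Sparse content here.</body>
</html>"""

@app.route("/thin-product")
def thin_product():
    response = make_response(THIN_PAGE_HTML)
    # the same noindex directive can also be sent as an HTTP header instead
    response.headers["X-Robots-Tag"] = "noindex"
    return response
```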
-
Hi vcj and the rest of you guys
I would be very interested in learning what strategy you actually went ahead with, and the results. I have a similar issue as a result of pruning, and removing noindex pages from the sitemap makes perfect sense to me. We set noindex, follow on several thousand pages without product descriptions/thin content, and we have set things up so that when we add new descriptions and updated on-page elements, the noindex is automatically reversed. That sounds perfect; however, hardly any of the pages to date (3,000-4,000) are indexed, so I'm looking for a feasible solution for exactly the same reasons as you.
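To make the auto-reversal concrete, the rule is conceptually something like the sketch below (the Product fields and the word-count threshold are made-up placeholders, not our actual setup):

```python
# Rough sketch of the auto-reversal rule described above.
# The Product fields and the 50-word threshold are illustrative placeholders.
from dataclasses import dataclass

@dataclass
class Product:
    name: str
    description: str

def robots_meta(product: Product, min_words: int = 50) -> str:
    """Return the robots directive for a product page.

    Pages stay "noindex, follow" until a real description is added;
    once the description passes the threshold, the directive flips
    back to "index, follow" automatically.
    """
    word_count = len(product.description.split())
    if word_count < min_words:
        return "noindex, follow"
    return "index, follow"

# Example: a page gains a description and the noindex is reversed.
print(robots_meta(Product("Widget", "")))                                    # noindex, follow
print(robots_meta(Product("Widget", "A unique write-up with real detail. " * 20)))  # index, follow
```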
Our metrics and optimization are comparable to, or better than, a lot of the competition's, yet our rankings are mediocre, so we're looking to improve on this.
It would be good to hear your views
Cheers
-
I'm aware that Google will get to them sooner or later.
The recommendation from Gary Illyes (from Google), as mentioned in this post, was the reason for my asking the question. Not trying to outsmart Google, just trying to work within their guidelines in the most efficient way possible.
-
Just to put things into perspective,
if these URLs are already indexed and you have added "noindex" to those pages, sooner or later Google will re-crawl them and they will be removed. You may want them out of the index ASAP for some reason, but it won't really change anything, because Google will not deindex your noindex pages just because they are in your sitemap.xml.
Google deindexes a page only when it is time to re-crawl it. Google never recommends putting noindex URLs in sitemaps, and won't suggest it in their guidelines on blocking search indexing. Google also states the following:
"Google will completely drop the page from search results, even if other pages link to it. If the content is currently in our index, we will remove it after the next time we crawl it. (To expedite removal, use the Remove URLs tool in Google Webmaster Tools.)"
But hey, every SEO has their own take. Some try to outsmart Google, some don't.
Good luck
-
That opens up other potential restrictions to getting this done quickly and easily. I wouldn't consider it best practice to create what is essentially a spam page full of internal links, and Googlebot will likely not crawl all 4,000 links if you put them all there. So now you'd be talking about making maybe 20 or so thin, spammy-looking pages of 200+ internal links each to hopefully fix the issue.
The quick, easy-sounding options are often not the best ones. Considering you're doing all of this in an attempt to fix issues that arose from an algorithmic penalty, I'd suggest following best practices for making these changes. It might not be easy, but it lessens the chance that a quick fix becomes the cause of, or a contributor to, a future penalty.
So if Fetch As won't work for you (considering lack of manpower to manually fetch 4000 pages), the sitemap.xml option might be the better choice for you.
-
Thanks, Mike.
What are your thoughts on creating a page with links to all of the pages we've Noindexed, doing a Fetch As and submitting that URL and its linked pages? Do you think Google would dislike that?
-
You could technically add them to the sitemap.xml in the hopes that this will get them noticed faster, but the sitemap is commonly used for the things you want Google to crawl and index. Plus, placing them in the sitemap does not guarantee Google is going to get around to crawling your change or those specific pages. Technically speaking, doing nothing and just waiting is equally valid: Google will recrawl your site at some point, and the sitemap.xml only helps if Google is crawling you to see it. Fetch As makes Google see your page as it is now, which is like forcing part of a crawl. So technically Fetch As will be the more reliable, quicker choice, though it will be more labor-intensive. If you don't have the man-hours for a project like that at the moment, then waiting or using the sitemap could work for you. Google even suggests using Fetch As for URLs you want them to see that you have blocked with meta tags: https://support.google.com/webmasters/answer/93710?hl=en&ref_topic=4598466
-
There are too many pages to do that (unless we created a page with links to all of the Noindexed pages, then asked Google to crawl that page and everything linked from it, though that seems like it might be a bad approach). It's an ecommerce website and we Noindexed nearly 4,000 pages that had thin or duplicate content (manufacturer descriptions, no description on the brand page, etc.) and had no organic traffic in the past 90 days.
This site was hit by Panda in September 2014 and isn't ranking for things it should be – pages with better backlink profiles, higher DA/PA, better content, etc. than our competitors. Our thought is that we're not ranking because of a penalty against thin/duplicate content. So we decided to Noindex these pages, improve the content on products that are selling and getting traffic, then work on improving the pages we've Noindexed before switching them back to Index.
Basically following recommendations from this article: https://moz.com/blog/pruning-your-ecommerce-site
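For anyone curious, the selection logic was roughly along these lines (a simplified sketch — the field names, data source, and thresholds are placeholders, not our actual code):

```python
# Simplified sketch of the pruning criteria described above.
# Field names and the 90-day window are illustrative, not our actual implementation.
from dataclasses import dataclass

@dataclass
class PageStats:
    url: str
    description: str
    is_manufacturer_copy: bool       # description copied verbatim from the manufacturer
    organic_visits_last_90_days: int

def should_noindex(page: PageStats) -> bool:
    """Flag a page for noindex, follow under the pruning rules:
    thin or duplicate content AND no organic traffic in the last 90 days."""
    thin_or_duplicate = (not page.description.strip()) or page.is_manufacturer_copy
    no_traffic = page.organic_visits_last_90_days == 0
    return thin_or_duplicate and no_traffic

pages = [
    PageStats("/brand/acme", "", False, 0),                       # thin, no traffic -> noindex
    PageStats("/product/widget", "Unique write-up", False, 42),   # healthy -> keep indexed
]
to_prune = [p.url for p in pages if should_noindex(p)]
print(to_prune)  # ['/brand/acme']
```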
-
If the pages are in the index and you've recently added a NoIndex tag with the express purpose of getting them removed from the index, you may be better served doing crawl requests in Search Console for the pages in question.
-
Thanks for your response!
I did some more digging. This seems to contradict your suggestion:
https://twitter.com/methode/status/653980524264878080
If the goal is to have these pages removed from the index, and having them in the sitemap means they'll be picked up sooner by Google's crawler, then it seems to make sense that they should be included until they're removed from the index.
Am I misinterpreting this?
-
Hi
The reason you submit a sitemap to a search engine is to ease and aid the crawling process for the pages you want indexed. It speeds up crawling and helps the search engine discover pages that have no internal links pointing to them, etc.
A "noindex" tag does the opposite.
So no, you should not include noindex pages inside your sitemap files.
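If it helps, here is a bare-bones sketch of that filtering step when generating the file (the `pages` list and its fields are placeholders for however you track your URLs and their robots directives):

```python
# Bare-bones sketch: build sitemap.xml only from URLs that are indexable,
# i.e. that do not carry a "noindex" directive.
# The `pages` list is a placeholder for however you track your site's URLs.
from xml.sax.saxutils import escape

pages = [
    {"url": "https://www.example.com/", "robots": "index, follow"},
    {"url": "https://www.example.com/thin-page", "robots": "noindex, follow"},
]

def build_sitemap(pages):
    # keep only indexable pages (you could extend this to skip non-200 URLs too)
    indexable = [p for p in pages if "noindex" not in p["robots"]]
    urls = "\n".join(
        f"  <url><loc>{escape(p['url'])}</loc></url>" for p in indexable
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{urls}\n"
        "</urlset>"
    )

# Only https://www.example.com/ ends up in the generated file.
print(build_sitemap(pages))
```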
In general, you should also avoid including pages that don't return a 200 status code.
Good luck