"No Index, No Follow" or No Index, Follow" for URLs with Thin Content?
-
Greetings MOZ community:
If I have a site with about 200 thin-content pages that I want Google to remove from its index, should I set them to "No Index, No Follow" or to "No Index, Follow"?
My SEO firm has advised me to set them to "No Index, Follow", but in a recent Moz help forum post someone suggested "No Index, No Follow". That poster said that telling Google the content should not be indexed but the links should be followed was inconsistent and could get me into trouble. This makes a lot of sense.
Which is the proper form?
As background, I think I have recently been hit with a Panda 4.0 penalty for thin content. I have several hundred URLs with fewer than 50 words each and want them de-indexed. My site is a commercial real estate site, and the listings apparently have too little content.
Thanks, Alan
-
Personally I think it's madness to "no follow" any internal links. When you "no follow" you are throwing link juice out the window. The days of link sculpting (the practice of "no following" some links on a page so more juice flows through the other "follow" links) are long gone, yet I still see it being attempted all over the place.
-
I agree with this one. In most cases there are still relevant links or main navigation on the page, so it's valuable to have bots follow those links.
-
I personally would follow them. There is no issue in having a page with thin content followed; it will not hurt anything.
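For anyone comparing the two options, they are implemented as meta robots tags in the head of each thin page. A minimal, illustrative sketch only (note that "follow" is the default, so "noindex" on its own behaves like "noindex, follow"):

<!-- "No Index, Follow": drop the page from the index, but keep crawling and passing equity through its links -->
<meta name="robots" content="noindex, follow">

<!-- "No Index, No Follow": drop the page from the index AND tell crawlers not to follow its links -->
<meta name="robots" content="noindex, nofollow">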
Related Questions
-
Will URLs With Existing 301 Redirects Be as Powerful as New URLs in SERPs?
Most products on our site have redirects to them from years of switching platforms and trying to get a great, optimised URL for SEO purposes. My question is this: if a product URL has a lot of redirects (301s), would it be more beneficial to create a duplicated version of the product and start fresh with a new URL? I am not on here trying to gain backlinks, but my site is tn nursery dot net (proof:)
Intermediate & Advanced SEO | | tammysons
I need some quality help figuring out what to do.
Tammy
-
"Avoid Too Many Internal Links" when you have a mega menu
Using the on-page grader and while further investigating internal linking, I'm concerned that, because the ecommerce website has a very link-heavy mega menu, the rule of 100 may be cutting into the contextual links we're creating. Clearly we don't want to no-follow our entire menu. Should we consider no-indexing the third level? For example, short sleeve shirts here: Clothing > Shirts > Short Sleeve Shirts. What about other pages we don't care to index anyway, such as the login page, the cart, or the search button? Any thoughts appreciated.
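For clarity, no-indexing a page and no-following a link are separate controls. A rough illustrative sketch (the URL below is a made-up placeholder):

<!-- On the third-level category page itself: keep it out of the index but let its links be crawled -->
<meta name="robots" content="noindex, follow">

<!-- rel="nofollow" on an individual menu link applies only to that link; as noted above, no-following internal navigation throws link juice away -->
<a href="/clothing/shirts/short-sleeve-shirts" rel="nofollow">Short Sleeve Shirts</a>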
Intermediate & Advanced SEO | | Ant-Scarborough0 -
Does redirecting from a "bad" domain "infect" the new domain?
Hi all, So a complicated question that requires a little background. I bought unseenjapan.com to serve as a legitimate news site about a year ago. Social media and content growth has been good. Unfortunately, one thing I didn't realize when I bought this domain was that it used to be a porn site. I've managed to muck out some of the damage already - primarily, I got major vendors like McAfee and OpenDNS to remove the "porn" categorization, which has unblocked the site at most schools and locations with public wifi. The sticky bit, however, is Google. Google has the domain filtered under SafeSearch, which means we're losing - and will continue to lose - a ton of organic traffic. I'm trying to figure out how to deal with this and appeal the decision. Unfortunately, Google's Reconsideration Request form currently doesn't work unless your site has an existing manual action against it (mine does not). I've also heard such requests, even if I did figure out how to make them, often just get ignored for months on end. Now, I have a back-up plan. I've registered unseen-japan.com, and I could just move my domain over to the new domain if I can't get this issue resolved. It would allow me to be on a domain with a clean history while not having to change my brand. But if I do that, and I set up 301 redirects from the former domain, will it simply cause the new domain to be perceived as an "adult" domain by Google? I.e., will the former URL's bad reputation carry over to the new one? I haven't made a decision one way or the other yet, so any insights are appreciated.
Intermediate & Advanced SEO | | gaiaslastlaugh0 -
I need help on how best to do a complicated site migration: replacing certain pages with all new content and tools while keeping the same URLs. The rest just need to disappear safely. Somehow.
I'm completely rebranding a website but keeping the same domain. All content will be replaced and it will use a different theme and mostly new plugins. I've been building the new site as a different site in Dev mode on WPEngine. This means it currently has a made-up domain that needs to replace the current site. I know I need to somehow redirect the content from the old version of the site, but I'm never going to use that content again. (I could transfer it to be a Dev site for the current domain and automatically replace it with the click of a button - just as another option.) What's the best way to replace blahblah.com with a completely new blahblah.com if I'm not using any of the old content? There are only about 4 URLs, such as blahblah.com/contact, that will remain the same - with all content replaced. There are about 100 URLs that will no longer be in use or have any part of them ever used again. Can this be done safely?
Intermediate & Advanced SEO | | brickbatmove1 -
Syntax: 'canonical' vs "canonical" (apostrophes or quotes) - does it matter?
I have been working on a site, and all the tools I've used (Screaming Frog & Moz Bar) recognize the canonical, but does Google? This is the only site I've worked on that uses apostrophes: <link rel='canonical' href='https://www.example.com'/> It's apostrophes vs quotes. Could this error in syntax be causing the canonical not to be recognized? <link rel="canonical" href="https://www.example.com"/>
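For reference, HTML allows attribute values in either single or double quotes, so both of the forms below are syntactically valid; example.com stands in for the real URL:

<!-- Double quotes (the more common convention) -->
<link rel="canonical" href="https://www.example.com/" />
<!-- Single quotes (apostrophes) are equally valid HTML -->
<link rel='canonical' href='https://www.example.com/' />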
Intermediate & Advanced SEO | | ccox10 -
Why are "noindex" pages access denied errors in GWT and should I worry about it?
GWT calls pages that have "noindex, follow" tags "access denied errors." How is it an "error" to say, "hey, don't include these in your index, but go ahead and crawl them." These pages are thin content/duplicate content/overly templated pages I inherited and the noindex, follow tags are an effort to not crap up Google's view of this site. The reason I ask is that GWT's detection of a rash of these access restricted errors coincides with a drop in organic traffic. Of course, coincidence is not necessarily cause. Should I worry about it and do something or not? Thanks... Darcy
Intermediate & Advanced SEO | | 945010 -
Blog Content In different language not indexed - HELP PLEASE!
I have an ecommerce site in English and a blog that is in the Malay language. We started the blog 3 weeks ago with about 20-30 articles written. The ecommerce site uses the Magento CMS and the blog is WordPress. URL Structure: Ecommerce: www.example.com Blog: www.example.com/blog Blog category: www.example.com/blog/category/ However, Google is indexing all pages, including the blog category, but not the individual posts that are in Malay. What could be the issue here? PLEASE help me!
Intermediate & Advanced SEO | | WayneRooney0 -
Google's Stance on "Hidden" Content
Hi, I'm aware Google doesn't care if you have helpful content you can hide/unhide by user interaction. I am also aware that Google frowns upon hiding content from the user for SEO purposes. We're not considering anything similar to this. The issue is, we will be displaying only a part of our content to the user at a time. We'll load 3 results on each page initially. These first 3 results are static, meaning on each initial page load/refresh, the same 3 results will display. However, we'll have a "Show Next 3" button which replaces the initial results with the next 3 results. This content will be preloaded in the source code so Google will know about it. I feel like Google shouldn't have an issue with this since we're allowing the user action to cycle through all results. But I'm curious, is it an issue that the user action does NOT allow them to see all results on the page at once? I am leaning towards no, this doesn't matter, but would like some input if possible. Thanks a lot!
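As a rough illustration of the pattern described (class names, IDs, and result text below are entirely made up): all results are present in the initial HTML, only three are visible at a time, and the button swaps which three are shown.

<div id="results">
  <!-- First three results shown on initial load -->
  <div class="result">Result 1</div>
  <div class="result">Result 2</div>
  <div class="result">Result 3</div>
  <!-- Remaining results preloaded in the source but hidden until requested -->
  <div class="result" hidden>Result 4</div>
  <div class="result" hidden>Result 5</div>
  <div class="result" hidden>Result 6</div>
</div>
<button type="button" id="show-next">Show Next 3</button>

Because the hidden results are in the source, crawlers can see them even though a visitor only sees three at once.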
Intermediate & Advanced SEO | | kirmeliux0