"Noindex, follow" for thin pages?
-
Hey there Mozzers,
I have a question regarding thin pages. Unfortunately, we have thin pages, almost empty to be honest. My idea is to ask the dev team to apply "noindex, follow" to these pages. What do you think?
Has someone faced this situation before?
Will appreciate your input!
-
+1 to EGOL and Gianluca. We need more information about those pages.
In any case, if we are talking about thin but quality content that is neither duplicated nor written purely for SEO, I would not use noindex on it.
If we are talking about empty or almost-empty pages, it may be better to use noindex, or perhaps better still to delete those pages and 301-redirect them.
I would also reduce the internal linking to them, or move those internal links lower on the page or to places with less visibility. Just that.
Greetings!
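If the delete-and-301 route fits, the server-side rule is simple. A hypothetical Apache .htaccess sketch (the paths are placeholders, not from the original question):

```apache
# Permanently redirect removed thin pages to the closest
# relevant live page. All paths here are hypothetical.
Redirect 301 /thin-page-1 /category/widgets
Redirect 301 /thin-page-2 /
```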
-
EGOL is right to ask for more information, also for one precise reason: on some websites, a "thin page" may be the best thing that site can offer a visitor, because that page answers exactly what the user needs from it.
That is why Googlers so often say that thin content per se is not a problem.
It is a problem if it is due to some technical issue or to bad on-page SEO (e.g., a page with a photo but no caption or written description of the photo).
So, to better answer your question, we need to know more about the nature of those thin pages you are talking about.
P.S.: using "noindex, follow" is no longer recommended by Googlers. In fact, a few months ago John Mueller said that if Google sees a page with noindex, follow for a long time, it will start treating the "follow" as a nofollow, so the original reason for using it won't be satisfied.
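For clarity, the directive being discussed is the robots meta tag placed in the page head:

```html
<!-- The "noindex, follow" directive under discussion -->
<meta name="robots" content="noindex, follow">
```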
-
If you want good responses to this question, then post more about these pages (current content, how many, current traffic, current rankings, recent problems, purpose of pages, etc.) and more about your site (current content, how many, current traffic, current rankings, recent problems, etc.).
Questions with little information are often ignored by people who might know a lot about the subject: they don't want to guess, they don't want to think and write about every possible case, and they don't want to put effort into a question when the poster didn't put much effort into explaining it.
Also, who are you? Owner? Employee? SEO? Are you the guy who put these pages up and didn't put any content on them? The guy who paid for the skinny content that is currently up there and needs to have input on yanking them down or paying for proper content?
Related Questions
-
How do I "undo" or remove a Google Search Console change of address?
I have a client that set a change of address in Google Search Console where they informed Google that their preferred domain was a subdomain, and now they want Google to also consider their base domain (without the change of address). How do I get the change of address in Google search console removed?
Technical SEO | KatherineWatierOng
Leveraging "Powered by" and link spam
Hi all, For reference: The SaaS guide to leveraging the "Powered By" tactic. My product is an embeddable widget that customers place on their websites (see example referenced in link above). A lot of my customers have great domain authority (big brands, .gov's etc). I would like to use a "Powered By" link on my widgets to create high quality backlinks. My question is: if I have identical link text (on potentially hundreds) of widgets, will this look like link spam to Google? If so, would setting the link text randomly on each widget to one of a few different phrases (to create some variation) avoid this? Hope this makes sense, thanks in advance.
Technical SEO | NoorHammad
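On the variation idea: a minimal client-side sketch of picking one of a few "Powered By" phrases deterministically per host, so each embed keeps a stable anchor text across page loads. The phrase list and function name are hypothetical, and this says nothing about whether Google actually treats the variation differently from identical anchor text:

```javascript
// Hypothetical sketch: vary the "Powered By" anchor text per widget
// install so hundreds of embeds don't all carry identical link text.
const PHRASES = [
  "Powered by ExampleWidget",
  "Widget by ExampleWidget",
  "Built with ExampleWidget",
];

// Deterministic pick keyed on the customer's hostname, so each embed
// keeps the same phrase on every page load.
function pickPhrase(hostname) {
  let hash = 0;
  for (const ch of hostname) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple unsigned rolling hash
  }
  return PHRASES[hash % PHRASES.length];
}

console.log(pickPhrase("bigbrand.gov"));
```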
If I want clean up my URLs and take the "www.site.com/page.html" and make it "www.site.com/page" do I need a redirect?
If I want clean up my URLs and take the "www.site.com/page.html" and make it "www.site.com/page" do I need a redirect? If this scenario requires a 301 redirect no matter what, I might as well update the URL to be a little more keyword rich for the page while I'm at it. However, since these pages are ranking well I'd rather not lose any authority in the process and keep the URL just stripped of the ".html" (if that's possible). Thanks for you help! [edited for formatting]
Technical SEO | Booj
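If the .html is stripped, the 301 plus internal mapping might look like the following Apache sketch (assumes mod_rewrite is enabled; a common pattern rather than the poster's actual setup, so test on a staging copy first):

```apache
RewriteEngine On

# Externally: 301 any direct request for /page.html to /page
RewriteCond %{THE_REQUEST} \s/([^?\s]+)\.html[?\s]
RewriteRule ^ /%1 [R=301,L]

# Internally: serve /page from the existing page.html file
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.+)$ $1.html [L]
```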
"Extremely high number of URLs" warning for robots.txt blocked pages
I have a section of my site that is exclusively for tracking redirects for paid ads. All URLs under this path do a 302 redirect through our ad tracking system: http://www.mysite.com/trackingredirect/blue-widgets?ad_id=1234567 --302--> http://www.mysite.com/blue-widgets This path of the site is blocked by our robots.txt, and none of the pages show up for a site: search. User-agent: * Disallow: /trackingredirect However, I keep receiving messages in Google Webmaster Tools about an "extremely high number of URLs", and the URLs listed are in my redirect directory, which is ostensibly not indexed. If not by robots.txt, how can I keep Googlebot from wasting crawl time on these millions of /trackingredirect/ links?
Technical SEO | EhrenReilly
Why use noindex, follow vs rel next/prev
Look at what www.shutterstock.com/cat-26p3-Abstract.html does with its search results page 3 for 'Abstract' (the same goes for pages 2-N in the paginated series): <meta name="robots" content="NOINDEX, FOLLOW">. Why is this a better alternative than using rel=next/prev, per Google's official statement on pagination (http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663744), which doesn't even mention this as an option? Any ideas? Does this improve the odds of the first page in the paginated series ranking for the target term? There can't be a 'view all' page because there are simply too many items. Jeff
Technical SEO | jrjames83
Container Page/Content Page Duplicate Content
My client has a container page on their website (they are using SiteFinity, so it is called a "group page") in which individual pages appear and can be scrolled through. When links are followed, they first lead to the group page URL, on which the first content page is shown. However, when navigating through the content pages, the URL changes. When navigating BACK to the first content page, the URL is that of the content page, but it appears to indexers as a duplicate of the group page, that is, of the URL that appeared when first linking to the group page. The client updates this regularly, so I need to find a solution that will allow them to add more pages, the new one always becoming the top page, without requiring extra coding. For instance, I had considered integrating rel=next and rel=prev, but they aren't going to keep that up to date.
Technical SEO | SpokeHQ
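One commonly suggested alternative for this situation, assuming the group-page URL is the one that should be indexed, is a canonical tag on the duplicate content-page URL. The URLs below are placeholders, not the client's actual pages:

```html
<!-- On the content-page URL that duplicates the group page -->
<link rel="canonical" href="https://example.com/group-page">
```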
'No Follow' and 'Do Follow' links when using WordPress plugins
Hi all, I hope someone can help me out with the following question regarding 'nofollow' and 'dofollow' links in combination with WordPress plugins. Some plugins that deal with links, i.e. link masking or SEO plugins, give you the option to 'nofollow' links. Can someone speak from experience as to whether this actually works? It's really quite stupid, but it only occurred to me, when using the Firefox add-on 'NoDoFollow' as well as looking at the SEOmoz link profile, that 95% of my links are actually marked as FOLLOW, while the opposite should be the case. For example, I mark about 90% of outgoing links as nofollow within a link masking plugin. Well, why would WordPress plugins give you the option to mark links as nofollow in the first place when they do in fact appear as follow to search engines and SEOmoz? Is this a WordPress thing or what? Maybe they are in fact nofollow, and the information supplied by SEO tools comes from basic HTML structure analysis. I don't know... This really got me worried. Hope someone can shed some light. All the best and many thanks for your answers!
Technical SEO | Hermski
Rel=canonical vs. noindex,follow for paginated pages
I'm working on a real estate site that has multiple listing pages, e.g. http://www.hhcrealestate.com/manhattan-beach-mls-real-estate-listings. I'm trying to get the main results page to rank for that particular geo-keyword, i.e. "manhattan beach homes for sale". I want to make sure all of the individual listings on the paginated pages (2, 3, 4, etc.) still get indexed. Is it better to add rel=canonical to all of the paginated pages, i.e. manhattan-beach-mls-real-estate-listings-2, manhattan-beach-mls-real-estate-listings-3, manhattan-beach-mls-real-estate-listings-4, etc., or is it better to add noindex, follow to those pages?
Technical SEO | fthead91