Robots.txt vs noindex
-
I recently started working on a site that has thousands of member pages that are currently robots.txt'd out.
Most pages of the site have 1 to 6 links to these member pages, accumulating into what I regard as something of a link juice cul-de-sac.
The pages themselves have little to no unique content or other relevant search play, and for other reasons we still want them kept out of search.
Wouldn't it be better to "noindex, follow" these pages and remove the robots.txt block from this URL type? At least that way Google could crawl these pages and pass the link juice on to still other pages instead of flushing it into a black hole.
BTW, the site is also currently dealing with a hit from last month's Panda 4.0 update.
Thanks! Best... Darcy
-
If you add the meta "noindex, follow" tag, it will keep the pages out of the SERPs but allow PageRank to flow through them to other pages.
See this interview with Matt Cutts for more info: http://www.stonetemple.com/articles/interview-matt-cutts.shtml
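Concretely, the switch is two edits: lift the robots.txt Disallow on the member-page path, and add the robots meta tag to each member page. A minimal sketch, assuming a hypothetical /members/ path:

```html
<!-- In the <head> of each member page (the /members/ path is a placeholder): -->
<meta name="robots" content="noindex, follow">
```

Crucially, the tag only works if Googlebot can actually crawl the page, so the robots.txt block for these URLs has to be removed first. A page that is both blocked in robots.txt and tagged noindex never has its noindex seen.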
-
Hi Saijo,
Thanks for the response. Do you think that would yield the benefit I'm looking for of recapturing that lost link juice?
Do you think there'd be any downside to the switcheroo from robots.txt to noindex, follow?
Best... Darcy
-
Since you said "the pages themselves have little to no unique content or other relevant search play and for other reasons still want them kept out of search," I would use meta robots "noindex, follow".
-
Hi Lesley,
Thanks for the thoughts. I don't see this as a real option for a number of reasons, including but not limited to the fact that there are 50,000 profiles, most with very little information. The members of this site are 95% busy professionals who aren't trying to advance their careers via their profiles. So there'd be some privacy concerns and the potential for tens of thousands of low-content, highly templated pages. Not really a search dream come true!
Also, converting it into a system where different levels of profile completeness are acknowledged would not really resonate with this community nor would it be near the top of our engineering priorities.
What I really want to get clear on is how best to keep them search-invisible while not losing link value into a robots.txt'd black hole. Really just looking for confirmation that, with those goals, "noindex, follow" plus removal from robots.txt is the way to go. I'm pretty sure it is, but would like to hear more about that.
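Once the robots.txt block is lifted, it's worth spot-checking that member pages actually carry the intended directive. A minimal sketch using only the Python standard library (the sample markup is made up for illustration):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content of any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", "").lower())

def is_noindex_follow(html):
    """True if the page is noindexed but its links can still be followed."""
    parser = RobotsMetaParser()
    parser.feed(html)
    tokens = {t.strip() for d in parser.directives for t in d.split(",")}
    return "noindex" in tokens and "nofollow" not in tokens

page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(is_noindex_follow(page))  # True
```

Running something like this over a sample of the 50,000 profile URLs (after fetching them) would confirm the rollout took effect everywhere the template is used.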
Thanks... Darcy
-
I think what I am going to say is going to sound like it is going against the grain, but it really isn't. I have noticed that in some places, if you want an active community, you reward your members. Look at how Moz does their forum: they don't noindex the pages, but once you hit a certain point they pseudo-drop the nofollow off of your profile link (it could be argued whether they really do). The point is to reward your members who are active.
I would set up an automatic noindex tag in the header keyed to the user's post count. Then you can noindex all of the spammers and have prominent members shown in search. If it were me, that is how I would do it. I have a PA of 49 on my profile in one forum I frequent; I have seen the stats, and it is regularly an entry page to the forum. Another member has a PA of 64 on a 93 domain, and his is used a lot more than mine for entry as well. Think of it this way: if someone is Googling my name, the second result is Moz's forum (http://screencast.com/t/jIx7a4hcWV). Second search results still get a lot of clicks.
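The conditional approach described above, where low-activity profiles are noindexed and established members' profiles are left indexable, can be sketched as a simple template helper. The threshold and function name here are hypothetical, not from any particular forum platform:

```python
def profile_robots_meta(post_count, threshold=25):
    """Return the robots meta tag for a member profile page.

    Profiles below the activity threshold are kept out of the index but
    still pass link equity ("noindex, follow"); active members' profiles
    are left indexable as a reward for participation.
    """
    if post_count < threshold:
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'

print(profile_robots_meta(3))    # low-activity profile: noindex, follow
print(profile_robots_meta(120))  # established member: index, follow
```

The header template would call this with the member's post count when rendering each profile page.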
Related Questions
-
Multiple Sitemaps Vs One Sitemap and Why 500 URLs?
I have a large website with rental listings in 14 markets; listings are added and taken off weekly if not daily. There are hundreds of listings in each market, and all have their own landing page with a few pages associated. What is the best process here? I could run one sitemap and make each market's landing page .8 priority in the sitemap, or make 14 sitemaps for each market and then have one sitemap for the general and static pages. From there, what would be the better way to structure? Should I keep all the big main landing pages in the general static sitemap, or have them be at the top of the market-segmented sitemaps? Also, I have over 5,000 URLs; what is the best way to generate a sitemap with over 500 URLs? Is it necessary?
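One common pattern for the multi-sitemap option described above is a sitemap index file that points at one sitemap per market plus one for the general/static pages (the filenames here are hypothetical). Note that the sitemaps.org protocol allows up to 50,000 URLs per sitemap file, so there is no 500-URL ceiling to work around:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://example.com/sitemap-static.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap-market-1.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap-market-2.xml</loc></sitemap>
  <!-- ...one sitemap per market, regenerated as listings change -->
</sitemapindex>
```

Segmenting this way also makes it easier to see per-market indexation stats in Google Webmaster Tools.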
Intermediate & Advanced SEO | Dom4410
-
Wordpress - Dynamic pages vs static pages
Hi, Our site has over 48,000 indexed links, with a good mix of pages, posts, and dynamic pages. For the purposes of SEO and the recent talk of "fresh content," would it be better to keep dynamic pages as they are or manually create static pages/subpages? The one noticeable downside with dynamic pages is that they aren't picked up by any sitemap plugins; you need to manually create a separate sitemap just for these dynamic links. Any thoughts?
Intermediate & Advanced SEO | danialniazi1
-
Sitemap contains Meta NOINDEX pages - Good or bad?
Hi, Our sitemap is created by our e-commerce software, Magento. We are probably going to make a lot of products Meta NOINDEX for the moment, until all the content has been corrected on them, but by default, as they are enabled, they will appear in the sitemap. So, the question is: should pages that are Meta NOINDEX be listed in a sitemap? Does it matter? Thanks!
Intermediate & Advanced SEO | bjs20100
-
Will blocking urls in robots.txt void out any backlink benefits? - I'll explain...
Ok... So I add tracking parameters to some of my social media campaigns but block those parameters via robots.txt. This helps avoid duplicate content issues (Yes, I do also have correct canonical tags added)... but my question is -- Does this cause me to miss out on any backlink magic coming my way from these articles, posts or links? Example url: www.mysite.com/subject/?tracking-info-goes-here-1234 Canonical tag is: www.mysite.com/subject/ I'm blocking anything with "?tracking-info-goes-here" via robots.txt The url with the tracking info of course IS NOT indexed in Google but IT IS indexed without the tracking parameters. What are your thoughts? Should I nix the robots.txt stuff since I already have the canonical tag in place? Do you think I'm getting the backlink "juice" from all the links with the tracking parameter? What would you do? Why? Are you sure? 🙂
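The two mechanisms in the question above interact in an important way: a robots.txt Disallow prevents Google from crawling the URL at all, so Google never sees the canonical tag on it and any link equity pointing at the blocked URL is stranded. A sketch of the current setup, using the example pattern from the question:

```
# robots.txt -- blocks crawling of the tracking URLs entirely:
User-agent: *
Disallow: /*?tracking-info-goes-here

<!-- On the page itself -- only seen if the URL is crawlable: -->
<link rel="canonical" href="https://www.mysite.com/subject/">
```

Dropping the robots.txt rule and relying on the canonical alone would let the parameterized URLs be crawled and consolidated into the clean URL, links and all.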
Intermediate & Advanced SEO | AubieJon0
-
To noindex or not to noindex
Our website lets users test whether any given URL or keyword is censored in China. For each URL and keyword that a user looks up, a page is created, such as https://en.greatfire.org/facebook.com and https://zh.greatfire.org/keyword/freenet. From a search engine's perspective, all these pages look very similar. For this reason we have implemented a noindex function based on certain rules: basically, only highly ranked websites are allowed to be indexed; all other URLs are tagged as noindex (for example https://en.greatfire.org/www.imdb.com). However, we are not sure that this is a good strategy, and so are asking: what should a website with a lot of similar content do?
1. Don't noindex anything; let Google decide what's worth indexing and what's not.
2. Noindex most content, but allow some popular pages to be indexed. This is our current approach. If you recommend this one, we would like to know what we can do to improve it.
3. Noindex all the similar content; in our case, only let overview pages, blog posts, etc. with unique content be indexed.
Another factor in our case is that our website is multilingual. All pages are available (and equally indexed) in Chinese and English. Should that affect our strategy? References: https://zh.greatfire.org, https://en.greatfire.org, https://www.google.com/search?q=site%3Agreatfire.org
Intermediate & Advanced SEO | GreatFire.org
-
Canonical vs noindex for blog tags
Our blog started to use tags, and I know this is bad for Panda, but our product team wants to use them for user experience. Should we canonicalize these tags to the original blog URL or noindex them?
Intermediate & Advanced SEO | nicole.healthline
-
Directory VS Article Directory
Which got hit harder in the Penguin update? I was looking at SEER Interactive's backlink profile (the SEO company that didn't rank for its main keyword phrases) and noticed a pretty big trend in why it might not rank for its domain name. "SEER" was in a majority of the anchor text, much of it coming from directories. I'm guessing they were affected because they matched the exact-match-domain link profile rule. I'm not an expert programmer, but if I was playing "Google Programmer" I would think the algo update went something like: If ((exact match domain) & (certain % anchor text == domain) & (certain % of anchor text == partial domain + services/company)) { tank the rankings } So back to the question: do you think that this update had a lot to do with directories, article directories, or neither? Are article directories still a legit way to get links? (not ezine)
Intermediate & Advanced SEO | imageworks-2612900
-
NOINDEX listing pages: Page 2, Page 3... etc?
Would it be beneficial to NOINDEX category listing pages except for the first page? For example, this site: http://flyawaysimulation.com/downloads/101/fsx-missions/ has lots of pages such as Page 2, Page 3, Page 4, etc.: http://www.google.com/search?q=site%3Aflyawaysimulation.com+fsx+missions Would there be any SEO benefit to NOINDEX on these pages? Of course, FOLLOW is the default, so links would still be followed and juice applied. Your thoughts and suggestions are much appreciated.
Intermediate & Advanced SEO | Peter2640