When you add 10,000 pages that have no real intention of ranking in the SERPs, should you use "noindex,follow" or disallow the whole directory through robots.txt? What is your opinion?
-
I just want a second opinion
The customer doesn't want to lose any internal link value by diluting it across such a large number of internal links. What would you do?
-
Hi Jeff,
Thanks for your answer. Please take a look at my reply to Federico above.
-
Hi Federico,
In this case it's an affiliate website and the 10,000 pages are all product pages. They're all generated from data feeds, so it's duplicate content.
We don't want to index this that's for sure.
So noindex,follow or disallow the whole directory or both...
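To be concrete, the two options would look roughly like this (assuming, hypothetically, that the feed pages all live under a /products/ directory). The robots.txt version:
User-agent: *
Disallow: /products/
And the meta robots version, in the head of each product page:
<meta name="robots" content="noindex, follow">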
We have our own opinion about this but I want to hear what others are thinking about this
Thanks in advance!
-
Yep, I agree with belt and suspenders.
-
Wesley - I do agree with Federico.
That said, if they really don't want those pages indexed, use the belt-and-suspender method (if you wear both a belt and suspenders, chances are greater that your pants won't fall down).
I'd put a robots.txt file in place to disallow the directory, and also noindex/nofollow each of the pages, too.
That way, if someone working on the site later changes the pages back to followed or indexable, you're still covered. Likewise, if someone blows away the robots.txt file.
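If editing 10,000 feed-generated templates isn't practical, the same signal can also be sent at the directory level. A rough sketch, assuming an Apache server with mod_headers enabled and, hypothetically, that the pages live under /products/:
# .htaccess placed in the /products/ directory
# Sends the X-Robots-Tag response header, the HTTP equivalent of the meta robots tag
Header set X-Robots-Tag "noindex, follow"
Swap the directive for "noindex, nofollow" if that's the combination you settle on.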
Just my $0.02, but hope it helps…
-- Jeff -
What do they have? 10,000 pages of uninteresting content? A robots meta tag with noindex,follow will do to leave them out of the engines. But to decide, you really need to know what those pages contain. 10,000 isn't a few, and if there's valuable content worth sharing, a page could earn a link; if you disallow it through robots.txt, it won't even flow PageRank.
It all comes down to what those pages are for...
Related Questions
-
Can adding "noindex" help with quality penalizations?
Hello Moz fellows, I have another question about content quality and Panda-related penalization. I was wondering: if I have an entire section of my site that has been penalized due to thin content, can adding "noindex,follow" to all pages belonging to that section help de-penalize the rest of the site in the short term, while we work to improve those penalized pages (which is going to take a long time)? Can that be considered a short-term solution to improve the overall site's standing in Google's index while we improve those pages and then, once they're ready, remove the "noindex" tag? I am eager to know your thoughts on this possible strategy. Thank you in advance to everyone!
Intermediate & Advanced SEO | | fablau0 -
Possible to Improve Domain Authority By Improving Content on Low Page Rank Pages?
My site's domain authority is only 23. The home page has a page authority of 32. My site consists of about 400 pages. The topic of the site is commercial real estate (I am a real estate broker). A number of the sites we compete against have a domain authority of 30-40. Would our overall domain authority improve if we rewrote the content for the several hundred pages that have the lowest page authority (say 12-15)? Is the overall domain authority derived from an average of the page authority of each page on a domain? Alternatively, could we increase domain authority by setting the pages with the lowest page authority to "noindex"? By the way, our domain is www.nyc-officespace-leader.com Thanks, Alan
Intermediate & Advanced SEO | | Kingalan10 -
Internal Duplicate Pages causing dip in rankings
Hi Guys, I need help understanding whether having duplicate pages on your site pushes you down in the rankings. All our product pages are getting indexed by Google with different parameters (filters, affiliate id, utm_source, etc.), so we end up with 10-15 duplicates of each product page. I observe a dip in rankings whenever Google starts indexing these duplicates, but when I asked John Mueller and others on the Google team about it, they said that if you set up canonicals you don't have to worry about having different URLs for the same page. Even so, we are either not ranking on Google or, when we do rank, we drop from page 1 to page 2 or sometimes page 3. Example - http://goo.gl/G5p3X5 Any suggestions?
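For reference, the canonical setup being discussed is a single link element in the head of every parameterized variant, pointing at the clean product URL (the URL below is only a placeholder):
<link rel="canonical" href="https://www.example.com/product-name/">
Every filter, affiliate and utm_source variation of that page would carry the same tag.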
Intermediate & Advanced SEO | | Webmaster_SEO0 -
Should I NOFOLLOW my "Add To Cart" buttons?
Hello and Merry Christmas! Should I NOFOLLOW my "Add To Cart" buttons? My e-commerce site has hundreds of products. Content-wise, there is no real value to the reader on the cart page (besides some testimonials and a few "why buy here" sentences), so it is not a page you'd want or expect to find in the SERPs. Also, with hundreds of links pointing to this page, it would be "stronger" than other, more important pages, which doesn't make sense. Last but not least, if the bots have limited time on my site, why keep sending them to an unimportant page? This is why I am leaning toward nofollowing the "Add To Cart" buttons and am looking for reinforcement. Thanks
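For clarity, the change being weighed is roughly this on each product template (the cart URL and SKU below are placeholders, not the site's real paths):
<a href="/cart/add?sku=12345" rel="nofollow">Add to cart</a>
A meta robots noindex on the cart page itself would also keep it out of the index regardless of how the buttons are marked up.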
Intermediate & Advanced SEO | | BeytzNet0 -
"Authorship is not working for this webpage" Can a company G+ page be both Publisher AND Author?
When using the Google Structured Data Testing Tool I get a message saying "Authorship Testing Result - Authorship is not working for this webpage." Here are the results of the data for the page http://www.webjobz.com/jobs/
Email verification has not established authorship for this webpage.
Email address on the webjobz.com domain has been verified on this profile: Yes
Public contributor-to link from Google+ profile to webjobz.com: Yes
Automatically detected author name on webpage: Not found
Publisher markup is verified for this page. Linked Google+ page: https://plus.google.com/106894524985345373271
Question - Can this company Google+ account "Webjobz" be both the publisher AND the author? Can I use https://plus.google.com/106894524985345373271 as the author of this and all other pages on our site?
Intermediate & Advanced SEO | Webjobz
-
How can we improve rankings for category pages
Hi Everyone, I have a dog breeder site I'm working on and I was wondering if I could get some tips and ideas on things to do to help the "category" pages rank better in search engines. Let's say I have an "xyz" breed category page which lists all dog breeders who offer that particular breed, in this case "xyz". I have certain breeder profile listings which rank higher for the terms that the category page should be ranking for, so I'm guessing Google thinks those breeder profile pages are more relevant for those terms, especially when they're well optimized. I know thin content may be my problem here, but one of our competitors dominates the rankings for relevant keywords with no content on their category pages. What do you all suggest?
Intermediate & Advanced SEO | | rsanchez0 -
Create different pages with keyword variations VS. Add keyword variations in 1 page
For searches involving keywords like "lessons", "courses" and "classes", I frequently see pages in the top rankings which do not contain the search term in the title tag, despite these terms being quite competitive. It seems that when searching for "classes", Google detects that pages about "courses" may be just as relevant. What do you recommend? Option 1: creating 10 pages optimized for 10 different keyword variations, each with a significant amount of unique content. Or option 2: one page, with the 10 keyword variations dropped throughout the body and headlines. Assume the keywords are all synonyms and the website already has high domain authority in the niche. Thanks
Intermediate & Advanced SEO | | lcourse0 -
On page report card for the keyword "computers"
I was looking at which websites rank in the TOP 3 for the keyword "computers"... I noticed that Wikipedia is first and then there are Dell and Apple... I then ran an on-page report card and noticed that Wikipedia has a grade A (which is great), whereas Apple has an F (which sucks!!), yet they still rank up there. My question is: why is Apple ranking for the keyword "computers" with no title, no URL, no H1, no body, no B/Strong usage of the term, when Wikipedia has all of that and the term "computers" occurs 290 times on its page? Is it due to the fact that Apple has millions of external links, and is that enough to rank even with an "irrelevant" page? By the way, I have noticed the same with other keywords such as "bicycle": Wikipedia ranks 1st and then sites like www.trekbikes.com are up there, but they shouldn't be based on their homepage "optimization". I know there are other factors, but I am just trying to figure out why such sites (like Apple or Trek Bikes) rank up there. Thank you,
Intermediate & Advanced SEO | | seoanalytics0