When you add 10,000 pages that have no real intention of ranking in the SERPs, should you use "noindex,follow" or disallow the whole directory through robots.txt? What is your opinion?
-
I just want a second opinion
The customer doesn't want to lose any internal link value by vaporizing it through a large number of internal links. What would you do?
-
Hi Jeff,
Thanks for your answer. Please take a look at my reply to Federico above.
-
Hi Federico,
In this case it's an affiliate website, and the 10,000 pages are all product pages. The content all comes from data feeds, so it's duplicate content.
We don't want these pages indexed, that's for sure.
So: noindex,follow, disallow the whole directory, or both?
We have our own opinion on this, but I want to hear what others think.
Thanks in advance!
-
Yep, I agree with belt and suspenders.
-
Wesley - I do agree with Federico.
That said, if they really don't want those pages indexed, use the belt-and-suspenders method (if you wear both a belt and suspenders, chances are greater that your pants won't fall down).
I'd use a robots.txt file to disallow crawling of the directory, and also add a noindex meta tag to each of the pages.
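As a sketch of the robots.txt half of that setup (the directory name /products/ is just a placeholder for wherever the feed-driven pages live):

```
# robots.txt at the site root — tells compliant crawlers
# not to fetch anything under the feed-driven directory
User-agent: *
Disallow: /products/
```

One caveat worth keeping in mind when layering the two methods: once a directory is disallowed in robots.txt, crawlers can no longer fetch those pages at all, so they will never see any meta tag placed on them.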
That way, if someone working on the site later changes the pages back to indexable, you're still covered. Likewise, if someone blows away the robots.txt file.
Just my $0.02, but I hope it helps…
-- Jeff -
What do they have? 10,000 pages of uninteresting content? A noindex,follow robots meta tag will do to keep them out of the engines. But to decide, you really need to know what's on those pages. 10,000 isn't a few, and if there's valuable content worth sharing, a page could earn a link — and if you disallow it through robots.txt, that link won't even flow PageRank.
It all comes down to what those pages are for...?
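The noindex,follow variant described here is a single meta tag in the `<head>` of each page — a minimal sketch:

```html
<meta name="robots" content="noindex,follow">
```

This still lets crawlers fetch the page and pass link equity through its internal links, while keeping the page itself out of the index.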