When you add 10,000 pages that have no real intention of ranking in the SERPs, should you use "noindex,follow" or disallow the whole directory through robots.txt? What is your opinion?
-
I just want a second opinion.
The customer doesn't want to lose any internal link value by evaporating it across a large number of internal links. What would you do?
-
Hi Jeff,
Thanks for your answer. Please take a look at my reply to Federico above.
-
Hi Federico,
In this case it's an affiliate website, and the 10,000 pages are all product pages. They all come from data feeds, so it's duplicate content.
We don't want these pages indexed, that's for sure.
So: noindex,follow, disallow the whole directory, or both?
We have our own opinion on this, but I want to hear what others think.
Thanks in advance!
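For reference, the two mechanisms being weighed look roughly like this (the /products/ directory name is a placeholder for wherever the feed pages actually live):

```
# robots.txt at the site root -- blocks crawling of the whole directory
User-agent: *
Disallow: /products/
```

```html
<!-- meta robots tag in the <head> of each feed page: keeps the page
     out of the index while still letting internal link equity flow -->
<meta name="robots" content="noindex,follow">
```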
-
Yep, I agree with belt and suspenders.
-
Wesley - I do agree with Federico.
That said, if they really don't want those pages indexed, use the belt-and-suspender method (if you wear both a belt and suspenders, chances are greater that your pants won't fall down).
I'd put a rule in the robots.txt file to disallow crawling of the directory, and also noindex/nofollow each of the pages, too.
That way, if someone working on the site later changes the pages back to followed, you're still covered. Likewise if someone blows away the robots.txt file.
Just my $0.02, but hope it helps…
-- Jeff -
What do they have? 10,000 pages of uninteresting content? A robots meta tag with noindex,follow will do to keep them out of the engines. But to decide, you really need to know what's on those pages. 10,000 isn't a few, and if there's valuable content worth sharing, a page could earn a link; if you disallow that page through robots.txt, it won't even flow PageRank.
It all comes down to what those pages are for...
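Federico's point hinges on a subtlety worth spelling out: a crawler that obeys a robots.txt Disallow never fetches the blocked page at all, so any noindex,follow tag on it is never seen and its links never pass anything. A minimal sketch using Python's standard urllib.robotparser (the /products/ path and example.com URLs are hypothetical):

```python
# Sketch: a robots.txt Disallow means the crawler never fetches the page,
# so a noindex,follow meta tag on it is never even read.
# The /products/ path and example.com URLs are hypothetical.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /products/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The feed page is blocked: no crawl, so no meta tag read, no PageRank flow.
print(rp.can_fetch("*", "https://example.com/products/widget-1.html"))  # False

# A normal page is still crawlable.
print(rp.can_fetch("*", "https://example.com/about.html"))  # True
```

This is why the belt-and-suspenders combination is debatable: once the disallow is in place, the meta tag on those pages becomes invisible to compliant crawlers.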
Related Questions
-
Page not being ranked properly
Hi, I'm wondering if someone could possibly shed some light on why some of our pages are not ranking properly on Google. For example, this page: https://www.mypetzilla.co.uk/dog-breeds. For the keyword "Dog Breeds" we can't be found, and we are absolutely baffled as to why. Could it be that we are listing all 100-and-something dog breeds on one page? Should we introduce pagination, or load more as the user scrolls down? This page has been up for at least 4 years. Any suggestion or advice would be much appreciated. Many thanks.
Intermediate & Advanced SEO | Mypetzilla
-
Totally lost ranking for a targeted page - and don't understand why
I am trying to rank the page https://www.vitari.no/regnskapssystem/visma-net/ for the keyword visma.net on google.no. At first it went really well, and I was on the first page, ranking at number 8. Then it fell to the second page. But then it totally disappeared. If I go to google.no now and search for visma.net, it's not on any of the pages; I have looked through them all. I have used Moz and other SEO tools to analyze it and tried to understand what has happened, but I simply can't, and I can't get the rankings back. I've been fighting with this for a long time. However, other pages on the site are ranking for the same keyword. If anyone has an answer that leads to me solving this, they will be my hero of the day!
Intermediate & Advanced SEO | contenting
-
72KB of CSS directly in the page header (not in an external CSS file), done for faster "above the fold" loading. Any problem with this?
To optimize for Google's PageSpeed, our developer has moved 72KB of CSS directly into the page header (not an external CSS file). This way the above-the-fold loading time was reduced. But could this affect indexing of the page, or have any other negative side effects on rankings? I ran a quick test and Google's cache seems to have our full pages cached, but could it somehow hurt our rankings, or cause Google to index fewer of our pages? (We already have a problem with Google ignoring about 30% of the pages in our sitemap.)
Intermediate & Advanced SEO | lcourse
-
2 pages ranking for the same term
Hi everyone, I have had two pages ranking on page two of Google for the same term for a while now. I have tried dedicating one page to it, but as the other has a URL containing the search term, Google seems to be ranking both. How can I better tell Google which one to rank, without deindexing either of the pages? I imagine that if it ranked only one page I would get a higher result rather than two weaker ones. On-site work has been done, and so have links to the homepage, but the inner page still ranks too, as it has the search term in its URL. Would a canonical tag be worth it here? The page is getting some traffic of its own for other terms, however, so I am reluctant to do that. Any help much appreciated.
Intermediate & Advanced SEO | tdigital
-
Rel="canonical" and rel="alternate" both necessary?
We are fighting some duplicate content issues across multiple domains. We have a few Magento stores with different country codes, for example domain.com and domain.ca, where domain.com is the "main" domain. We have set up the rel="alternate" codes. The question is, do we need to add custom rel="canonical" tags on domain.ca that point to domain.com? For example, for domain.ca/product.html to point to domain.com/product.html. Also, how far does rel="canonical" follow? For example, if we have:
domain.ca/sub/product.html canonical to domain.com/sub/product.html
then
domain.com/sub/product.html canonical to domain.com/product.html
Intermediate & Advanced SEO | AlliedComputer
-
Can a home page penalty cause a drop in rankings for all pages?
All my main keywords have dropped out of the SERPs. Could it be that the home page (the strongest page) has been devalued, and therefore the 'link juice' that used to spread throughout the site is no longer doing so? Would this cause all other pages to drop? I just can't understand how all my pages have lost rankings. The site is still indexed, so there's no problem there.
Intermediate & Advanced SEO | SamCUK
-
How to remove "Results 1 - 20 of 47" from Google SERP Snippet
We are trying to optimise our SERP snippet in Google to increase CTR, but we have this horrid "Results 1 - 20 of 47" in the description. We feel this gets in the way of the message, so we wish to remove it, but how? Any ideas, apart from removing the paging from the page?
Intermediate & Advanced SEO | speedyseo
-
Was moving up in SERPs, then got stuck on page 2
Hi, I was continuously acquiring quality backlinks, and my site was moving up in the Google SERPs for 3 main keywords. Within a few weeks I was on pages 2 and 3 for these three keywords, but after reaching there I got stuck at exactly the same pages and positions despite no change in my link-building strategy/pattern. I have even increased the number and quality of links that I acquire per day, but I am still stuck at the same positions. The website is 10 months old and related to a software niche. I update this website once a week. For one keyword I am stuck at position 1 of page two (you can well imagine the frustration!). My question is: what do I need to do to get out of this "SERP lock"?
Intermediate & Advanced SEO | RightDirection