Third party pages
-
Suppose you are using a third party tool such as an affiliate program. Typically, all the files are organized under one subdirectory. In addition, you may have little or no ability to modify any of the files in terms of SEO. Would you recommend hiding the entire subdirectory with a noindex?
Best,
Christopher -
I am unclear on what you are asking. The only thing I can share based on your question is to noindex any content you don't wish Google to index or otherwise have concerns about.
-
If I noindex all links from my pages to the main page of the affiliate program, and Google can still see all the other pages within the affiliate subdirectory, is that OK? In other words, if I put a noindex firewall between my pages and the third party pages, is it OK that the third party pages exist in a subdirectory on my site?
Best,
Christopher -
I would recommend using the noindex meta tag over blocking pages with the robots.txt file. Both are commonly used to keep pages out of the search results, but the noindex tag is the superior solution for several reasons:
-
The noindex tag allows a crawler to see the rest of the page, and your PageRank continues its natural flow throughout your site.
-
There are times when a robots.txt file is deleted or modified in error, and then you have a big headache on your hands. If a meta tag is erased, it affects only a single page, so the damage is far more manageable.
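As a concrete sketch of the meta tag recommended above (the exact `content` value is an assumption about intent, not quoted from the thread):

```html
<!-- Placed in the <head> of any page you want kept out of the index.
     "noindex, follow" asks search engines not to index this page,
     while still following its links, so PageRank keeps flowing. -->
<meta name="robots" content="noindex, follow">
```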
-
I should have been more clear. The affiliate code that we licensed tracks the referrals from our affiliates and pays commissions for conversions to sales. We installed the package in a subdirectory. While the package does allow some customization, the code is not written to be modified. A number of pages have the same titles.
Per your suggestion, I added a Disallow in the robots.txt to keep the bots away from that subdirectory.
Best,
Christopher -
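The Disallow rule just described can be sanity-checked with Python's standard-library robots.txt parser. The `/affiliates/` directory name and the URLs below are hypothetical examples, not taken from the thread:

```python
# Verify that a robots.txt Disallow rule keeps compliant crawlers
# out of an affiliate subdirectory, using Python's stdlib parser.
from urllib import robotparser

# Hypothetical robots.txt contents blocking the affiliate subdirectory.
rules = """\
User-agent: *
Disallow: /affiliates/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Pages under /affiliates/ are blocked; the rest of the site is not.
print(rp.can_fetch("*", "https://example.com/affiliates/track.php"))  # False
print(rp.can_fetch("*", "https://example.com/products/"))             # True
```

Note that a Disallow rule only blocks crawling; a blocked URL can still appear in search results if other sites link to it, which is part of why the noindex tag was recommended instead.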
You should have the ability to completely control any and all content which appears on your site. An affiliate program may provide images, suggested text, and may have certain requirements such as always including certain phrases. That is fine, but it should not dictate entire blocks of content.
If you are sharing any content, affiliate or otherwise, which is not under your control or is duplicated on other sites, then yes, I would recommend the noindex tag being applied to the content.
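When the third-party package's files cannot be edited to add a meta tag, one common alternative (an assumption on my part, not mentioned in the thread) is to send the equivalent directive as an HTTP response header from the server, for example via an Apache .htaccess file placed in the affiliate subdirectory:

```apache
# .htaccess in the affiliate subdirectory: send a noindex directive
# for every file served from here, without modifying the files.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, follow"
</IfModule>
```

This achieves the same effect as the meta tag for pages whose HTML you cannot touch.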