NOINDEX,NOFOLLOW - Any SEO benefit to these pages?
-
Hi
I could use some advice on a site architecture decision. I am developing something akin to an affiliate scheme for my business. However, it is not quite as simple as a standard affiliate setup, because the products sold through "affiliates" will be slightly different; as a result, I intend to run the site from a subdomain of my main domain.
I am intending to NOINDEX,NOFOLLOW the subdomain site because it will contain huge amounts of duplication from my main site (it is really a subset of the main site with some slightly different functionality in places). I don't really want or need this subdomain site indexed, hence my decision to NOINDEX,NOFOLLOW it.
However, given that I will hopefully have lots of people linking into the subdomain, I am hoping to come up with some sort of arrangement that means my main domain derives some benefit from those links. They are, after all, votes for my business, so they feel like "good links". I am assuming here that a direct link into my NOINDEX,NOFOLLOW subdomain is going to provide ZERO benefit to my main domain. Happy to be corrected!
The best I can come up with is to have a "landing page" on my main domain which links into parts of my main domain and then provides a link through to the subdomain site. However, this feels like a bad experience from the user's point of view (i.e. land on a page and then have to click to get to the real action) and feels a bit spammy, because I don't really have a good reason for this page other than linking!
Equally, I could NOINDEX,FOLLOW the homepage of the affiliate site and link back to the main domain from there. However, this would be far less beneficial, I guess, because the subdomain homepage would have many more outgoing links than I envisaged for my "landing page" idea above. It also looks a bit spammy (i.e. why follow the homepage and nofollow everything else?)!
The trouble, I guess, is that whatever I do feels a bit spammy. I suppose this is because IT IS spammy! Has anyone got any good ideas for how I could set up an arrangement like the one described above and derive benefit for my main domain without it looking (or being) spammy? I just hate to think of all of those links being wasted (in an SEO sense).
Thanks
Gary
-
Ha, brilliant. Take care!
-
Would you believe me if I told you I had a brother called Derek? Most people don't, but it is sadly TRUE! He was named before the show; my parents aren't that cruel.
I think we might have just excluded any US readers of this thread!
Have a good day!
-
Ha, no worries, glad it helped. I find I have to let things percolate or just draw them on a big whiteboard. Often the answer is pretty simple; it's just all too easy to box yourself into an 'I must do it this way' way of thinking when, with a little flexibility, things are easily solved.
So, you're a Trotter. Any relation to Derek?
-
Thanks, Marcus. Today I learnt that I think best in the shower...
Sometimes it just helps to get the question out there for others to comment on. Your responses obviously got me thinking!
Thanks to all for your input!
-
Gary, I think that is spot on; we are teaching you how to suck eggs!
-
After a long, hot shower, I just thought this one up. How about...
1.) I give each affiliate/trade user a URL with an affiliate ID in the URL, e.g. ?ID=123, which points at my website's homepage.
2.) If a user lands on my site with a URL containing an affiliate ID, the homepage is served up with links that take the user onto the affiliate subdomain site (the homepage that gets served up will be slightly different from the standard homepage). If the user navigates anywhere from this page, they end up surfing the subdomain pages (all of which will be NOINDEX,NOFOLLOW).
3.) The "homepage" that gets displayed to the user is always INDEX,FOLLOW and has a rel=canonical tag pointing at the homepage itself.
I "think" that this way my main domain gets the benefit of the links and users always get the version of the site they are looking for, without any extra "spammy" landing pages. Anyone see any problems with this?
-
Is that really right? If that were the case, presumably Google needs an index of NOINDEX pages, or am I way off?
If "site A" gets a link from a NOINDEX page, then Google must have some sort of record of that page's "WORTH" (for want of a better word) in order to attribute some value to "site A". That page's "WORTH" is derived from the pages that link to it.
That suggests to me (and my poor befuddled brain) that NOINDEX means the page is still indexed; it just doesn't show in search results?
-
Thanks for your response. I don't think I can realistically go down the route of rewriting the entire site (the products change every year, so it would not be a one-off cost by any means) - I would not get a return on that investment/time. I suppose, if I thought I might, then why bother making it an affiliate site?
I agree with "a bit risky or just wouldn't work" - that's why I am asking because I didn't much like my own ideas!
Thanks again.
-
Well, it's true: in SEO, you really can learn something every day, even after ten-plus years.
I did not think the link value would pass through if the page was not indexed; almost like a cup with a hole, it would not hold the benefit.
-
I agree with Marcus about doing the writing. In the meantime, I would go with NOINDEX, FOLLOW. The FOLLOW will allow PageRank to flow through these pages into the other pages of your site. If you use NOFOLLOW, that PageRank is lost.
I have some syndicated content and some thin content on one of my sites that are set to NOINDEX, FOLLOW.
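If it helps, here is a rough sketch of the distinction; the page types and helper function are made-up examples, not anything specific to your site:

```python
# Rough sketch of the two directives (page types and helper are hypothetical examples).
NOINDEX_FOLLOW = '<meta name="robots" content="noindex,follow">'      # out of the index, links still crawled
NOINDEX_NOFOLLOW = '<meta name="robots" content="noindex,nofollow">'  # out of the index, links pass nothing

def robots_tag(page_type: str) -> str:
    """Pick a robots meta tag for a (hypothetical) page type."""
    if page_type in {"syndicated", "thin", "affiliate-subdomain"}:
        # De-indexed, but PageRank can still flow through the page's links.
        return NOINDEX_FOLLOW
    return '<meta name="robots" content="index,follow">'
```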
-
In my mind, the best solution here would be to write (or have written) unique content for the subdomain site so you could have it indexed, get increased exposure via search, and win backlinks from those pages into the main site.
Anything else would be either a bit risky or just wouldn't work.
Hope that helps
Marcus