Secretly back-linking from whitelabel product
-
Let's say a company (provider.com) offers a whitelabel solution that lets each client host all of the content on their own domain (product.client.com), with no branding from the content provider.
Now let's say that client.com is a site with a lot of authority, and to promote the launch of product.client.com, they put a lot of links from their main site to the subdomain. This can be very valuable link juice, and provider.com would like to take advantage of it. The problem is that client.com wouldn't like it if provider.com put links on their whitelabel site.
Suppose the following:
All pages on product.client.com start to have a rel="canonical" link to themselves with an extra GET parameter (e.g. product.client.com/page.html -> product.client.com/page.html?show_extra_link=true)
When the page is visited with the extra GET parameter "show_extra_link", a link appears in the footer that points to provider.com
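The proposed scheme can be sketched as a small page-rendering function (a purely illustrative sketch; the function and URL names are assumptions, not from any real codebase):

```python
def render_page(path, query_params):
    """Render a whitelabel page that canonicalizes to a variant of
    itself carrying an extra GET parameter; only that variant shows
    the footer link to the provider."""
    canonical = f"https://product.client.com{path}?show_extra_link=true"
    head = f'<link rel="canonical" href="{canonical}">'

    footer = ""
    if query_params.get("show_extra_link") == "true":
        # The provider link only appears on the canonical variant.
        footer = '<a href="https://provider.com">provider.com</a>'

    return f"<head>{head}</head><body>...{footer}</body>"
```

So the plain page carries only the canonical tag, while the parameterized variant additionally renders the hidden footer link.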
My question is: would this have the same effect for provider.com as placing the link directly on the non-canonical versions of the pages on the whitelabel site?
-
I'm with Alan - in theory, the canonical would pass the link-juice to the version with the link, but you're not only misleading the client - you're one step away from cloaking the link. You could actually get your own clients penalized for this, and that seems very short-sighted.
Add the NOINDEX on top of this, and I'd be willing to bet that the value of these links would be very low. Even if the client approved followed white-label pages with footer links, for example, we're seeing those types of links get devalued - they're just too easy to get. Now, you add these links all at once, NOINDEX the page, and canonical to a weird variant, and you've painted a very suspicious picture for Google. It might work for a while, but you're taking a significant risk for potentially a very small gain.
-
I would say the canonical.
If the pages are not indexed but followed, then they would have no value themselves unless they had incoming links. If they do have incoming links, then yes, they will pass link juice, but only from the canonical, I would think, based on what I said above about a canonical being much like a 301.
-
Hi Alan,
All of the pages on the subdomain have a robots meta with noindex, follow on them. The pages are only used for data collection (forms), and the clients do not want their pages showing up in Google, which is why extracting link juice shouldn't be a problem. As such, the canonical URL need not be indexed.
From what I understand, if a page has duplicate content and specifies a rel=canonical URL, the inbound link juice effectively gets siphoned into the original content page. What I'm wondering is: which page does Google use for the purpose of propagating outbound link juice?
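The "siphoning" intuition above can be expressed as a toy model (this is purely illustrative of the idea being discussed, not a claim about how Google actually computes anything):

```python
def consolidate_inbound_links(inbound_counts, canonical_map):
    """Toy model: credit every inbound link pointing at a duplicate
    URL to that URL's declared canonical instead."""
    credited = {}
    for target, count in inbound_counts.items():
        # A URL with no canonical declared keeps its own credit.
        canonical = canonical_map.get(target, target)
        credited[canonical] = credited.get(canonical, 0) + count
    return credited
```

In the scenario from the question, all links pointing at page.html would be credited to page.html?show_extra_link=true; the open question is whether outbound links are evaluated from that same canonical version.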
-
With rel=prev/next, the content of every page is attributed to page 1, in which case the link would be part of the content. But with a canonical I am not sure.
If you go by comments from Matt Cutts and Bing's Duane Forrester, canonicals are the same as a 301 except they do not physically move the viewer to the canonical page. So in the case of a canonical, the content would not be merged; only the content on the canonical page would be indexed, and the links from other versions would be redirected. So the link on the show_extra_link version of the page would not be indexed.
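The comparison above can be made concrete with a sketch of the two responses at the HTTP level (illustrative pseudo-handlers, not any real framework; the HTTP Link header form of rel=canonical is defined in RFC 6596):

```python
def respond_with_301(target_url):
    """A 301 physically moves the visitor: the browser follows
    the Location header and the original page serves no body."""
    return 301, {"Location": target_url}, ""

def respond_with_canonical(canonical_url, body):
    """A canonical leaves the visitor on the page; only search
    engines are hinted, here via the Link header (a <link> tag
    in the HTML head works the same way)."""
    return 200, {"Link": f'<{canonical_url}>; rel="canonical"'}, body
```

The visitor-facing difference is exactly what Alan describes: the 301 serves an empty redirect, while the canonical serves the full page and merely signals which URL should be indexed.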
As for the morality of this, I would not do it. You are not being honest with the client, and you would be caught out sooner or later when the URL was seen in the index (if it was indexed).