Affiliate Link is Trumping Homepage - URL parameter handling?
-
An odd and slightly scary thing happened today: we saw an affiliate-string version of our homepage ranking number one for our brand, along with the normal full set of sitelinks.
We have done the following:
1. Added this to our robots.txt:

User-agent: *
Disallow: /*?

2. Reinserted a canonical on the homepage (we had removed this when we implemented hreflang, as we had read the two interfered with each other). We had gone without a canonical for a long time with no issues. Could this be anything to do with the algo update, perhaps?!
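For reference, the canonical we reinserted is just a link element in the homepage head, something like this (with example.com standing in for our actual domain):

```html
<head>
  <link rel="canonical" href="https://www.example.com/" />
</head>
```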
The third thing we're reviewing, and the one I'm slightly confused about, is URL Parameter Handling in GWT. As advised with regard to affiliate strings, for the question "Does this parameter change page content seen by the user?" we have selected "No", which means Google should crawl one representative URL. But isn't it the case that we don't want them crawling or indexing ANY affiliate URLs? You can tell Googlebot not to crawl URLs containing a particular parameter, but only if you select "Yes: the parameter changes the page content." Shouldn't Google be able to tell an affiliate URL from the original and not index it? I read a quote from Matt Cutts which suggested this (along with putting rel="nofollow" on affiliate links, just in case).
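For reference, nofollowing an affiliate link just means adding the rel attribute on the anchor itself; a made-up example (the "aff" parameter name here is hypothetical, it would be whatever your programme actually appends):

```html
<a href="https://www.example.com/?aff=123" rel="nofollow">Example Store</a>
```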
Any advice in this area would be appreciated. Thanks.
-
I'm glad to hear you've been sorted out, Lawrence Neal. I find it interesting that the other Lawrence saw something similar, and I'll ask around to see if it was a glitch that other people have noticed too.
For anyone reading this wondering what Mr. Neal was referring to in regard to the rel=canonical / hreflang conflict, there's a good writeup of it over at Dejanseo.com, and Gianluca Fiorelli mentions it in his comment on Dr. Pete's rel=canonical uber-post here on Moz.
-
Luckily it's disappeared today, which leads me to believe it was a Google-side algo error that was swiftly corrected (I doubt anything we've done would have been reflected in the SERPs so quickly).
-
Let's say your site is using PHP.
Your system no doubt picks up the parameter with PHP's $_GET and stores it as a session variable.
That is likely all that needs to happen before the page is 301 redirected.
The best thing to do is create a test page on your site with the code mentioned above and try it:
have the page redirect to the homepage and see if the affiliate code is stored.
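To sketch what that might look like (assuming the affiliate parameter is called "aff" — swap in whatever your tracking actually uses):

```php
<?php
// Minimal sketch: capture the affiliate code, then 301 to the clean URL.
// The "aff" parameter name is an assumption; substitute your real one.
session_start();

if (isset($_GET['aff'])) {
    // Keep the code in the session so affiliate tracking survives the redirect.
    $_SESSION['affiliate_code'] = $_GET['aff'];

    // Strip the query string and issue a permanent redirect to the clean URL,
    // so only that URL can accumulate links and get indexed.
    $cleanUrl = strtok($_SERVER['REQUEST_URI'], '?');
    header('Location: ' . $cleanUrl, true, 301);
    exit;
}
```

Load the test page with ?aff=something appended, follow the redirect, and check the session to confirm the code was stored.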
-
I don't know if this has anything to do with the algo update, but at least you're not the only one. I saw a competitor ranking with a second version of their homepage, with utm parameters appended.
Luckily the page with the utm parameters disappeared from the SERPs this morning. They were actually ranking first with the normal version and second with the parameterised version. This was on some pretty competitive keywords and lasted almost three days.
-
Thanks for your reply, Gary. I'm not entirely sure how our (far-reaching and lucrative) affiliate tracking/logging works, but I would have thought 301ing all the affiliate links to the original page would sabotage it, no?!
The canonical will certainly work, but we've only reinstated it on the homepage, as we have 6 other sites with hreflang alternates in place and the canonical seems to interfere with their function.
-
Hmm... it seems like Google is getting some strong link signals that this is the popular page to arrive at.
The canonical tag on the homepage is the right way to go.
You could 301 redirect any customer that lands on your site with an affiliate code in the URL. This would be a very simple bit of code; you could even put it in an include at the top of each page. This way those parameterised pages never even exist as far as Google is concerned, and you keep all the link juice.
One other thing might be to put a noindex on any page that has an affiliate parameter, but you would lose the link juice.
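That would just mean emitting a standard robots meta tag in the head whenever the affiliate parameter is present, e.g.:

```html
<meta name="robots" content="noindex, follow">
```

("noindex, follow" drops the page from the index while still letting Google crawl through its links, though as noted, links pointing at the parameterised URL itself would stop passing value.)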