Panda and affiliate networks
-
Last year we were hit with a Panda penalty and have been unable to recover from this. We removed all duplicate content from our site and SEOMoz confirms this.
I have spoken to a new SEO company and they have identified that the cause may be our content being duplicated off-site by affiliates who use our product feed, such as Shopwiki and Thefind.
Is this a possibility, and if so, I presume the easiest thing to do is stop these affiliates from using our descriptions.
Neil
-
Hi,
I ended the affiliate programmes with Shopwiki, Digidrop and one or two other similar affiliates. We saw a quick but small bounce back for some product-related terms, but it has taken a while for the affiliates to remove our data, so their listings are still dropping out of Google. It seems that for some reason the affiliates are being credited as the owners of the content and are therefore listing higher than us.
Do you think this is because the feeds are uploaded every night and Shopwiki etc. are being spidered more quickly than our site? One solution may be a week's delay before new products are put onto our affiliate feeds.
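That delay could be handled in the feed export itself rather than by hand. A minimal sketch, assuming each feed row carries a `date_added` field (a hypothetical field name, which would need to match however your actual feed export stores the listing date):

```python
from datetime import date, timedelta

def delayed_feed(products, delay_days=7, today=None):
    """Return only products old enough to appear in the affiliate feed,
    so your own site gets crawled first. `products` is assumed to be a
    list of dicts with a `date_added` date field (hypothetical name)."""
    today = today or date.today()
    cutoff = today - timedelta(days=delay_days)
    return [p for p in products if p["date_added"] <= cutoff]

# Example: only the product added more than a week ago makes the feed
feed = [
    {"sku": "GAITER-01", "date_added": date(2012, 1, 1)},
    {"sku": "GRIP-13", "date_added": date(2012, 1, 9)},
]
print([p["sku"] for p in delayed_feed(feed, today=date(2012, 1, 10))])
```

The idea is simply that a product page has a week to be indexed on your own domain before any affiliate copy of the description exists to compete with it.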
Here is an example of where the affiliates are all listing above the link for http://www.gear-zone.co.uk
Please let me know what you think and how you are getting on with the issues you are experiencing.
-
Hello
Did you get any further with this? We are in a similar position with Shopwiki and would love to know how you got on with this issue.
Thanks
-
All the content is written by us. The reason I chose that item is that it is basically an own-brand item. Here's another example:
"Two-layer Gore-Tex uppers and Ballistics cloth bottoms make these Black Diamond Front Point Gaiters ideal for glacial travel and ice climbing.
The Front Point Gaiters provide warmth and waterproofing, and are tough as nails in repelling crampon spikes. Secured underfoot with a burly strap, these gaiters are easy to put on or remove using a velcro closure system and shock cord."
As you can see all the sites listed are affiliate shopping sites.
-
As the affiliates are through networks they have affiliate tracking code so there is no link building benefit to us.
Our URL is www.gear-zone.co.uk, and if you Google this text you can see we do not appear first:
"Get a grip when the weather does its worst with these excellent snow and ice grips.
Each sole is equipped with 13 self-tapping, replaceable carbon steel screw spikes, providing superb traction in the ice and snow."
It relates to this page, so as you can see there is definitely a problem.
-
Do you have canonical tags? Is it possible to add a cross-site canonical to any of the affiliate pages? At the very least they should link directly back to the relevant product page on your own site.
-
That's an awful lot of work, as our catalogue has over 4,000 items, and to be honest the affiliates are not contributing as much as they have in the past.
Is there some clever code to force Google to show we are the primary source?
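There is no tag that directly forces Google to treat one site as the primary source, but the cross-site canonical suggested earlier does not have to be added by hand for 4,000 items: the tags can be generated in bulk from the same catalogue data that drives the feed. A rough sketch, assuming a catalogue export of (affiliate page slug, original product URL) pairs (a hypothetical format, not any particular network's):

```python
def canonical_tag(product_url):
    """Build the cross-domain rel=canonical tag an affiliate page would
    need in its <head> to credit the original product page."""
    return f'<link rel="canonical" href="{product_url}" />'

# Hypothetical catalogue rows: (affiliate page slug, original URL)
catalogue = [
    ("front-point-gaiters", "http://www.gear-zone.co.uk/black-diamond-front-point-gaiters"),
    ("snow-ice-grips", "http://www.gear-zone.co.uk/snow-and-ice-grips"),
]
for slug, url in catalogue:
    print(slug, canonical_tag(url))
```

The affiliates would still have to agree to place the generated tag on their pages, which is the real sticking point; if they won't, a plain link back to the original product page is the fallback mentioned above.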