An Unfair Content-Related Penalty :(
-
Hi Guys,
Google.com.au
Website: http://partysuppliesnow.com.au/
We had a massive drop in search queries in WMT around the 11th of September this year. I investigated, and it seemed as though there were no algorithm updates around that time.
Our site is only receiving branded search traffic now, and after investigating I am led to believe that Google has mistakenly caught our website in the Panda algorithm. There are no manual penalties applied to this site, as confirmed by WMT.
Our product descriptions are pretty much all unique, but I have noticed that when typing a portion of text from these pages into Google search using quotation marks, the shopping affiliate sites we use are displayed first and our page is nowhere to be seen, or last in the results. This leads me to believe that Google thinks we have scraped the content from these sites, when in actual fact they have taken it from us. We also have G+ Authorship set up.
Typing a product's full name into Google (I tried a handful), our site is not in the top 100, or at times even the top 200. I think this further confirms that we are penalised.
We would really appreciate some opinions on this. Any suggested course of action would be great. We don't particularly want to invest in writing the content again.
From our point of view it looks like Google is stopping our site from ranking because it's getting mixed up about who the originator of our content is.
Thanks and really appreciate it.
-
Hey Jarrod,
I'm afraid there isn't anything you can actually do to tell Google you are the original author of your content, other than the tips Remus mentioned.
However, there is a service you can use to help identify sites that are duplicating your content. It's called Copysentry, and it automatically scans the web to check for content duplication. You could use this, in conjunction with DMCA takedown requests (as mentioned in Remus's post), to help defend against this in future.
-
Hi guys,
Thank you all for your kind advice. We have planned to rewrite our content (product descriptions). This time we will write two types of descriptions: one for our site and one for our affiliates (who promote our products). We hope Google won't mix them up this time.
As we are going to write the content again, I am still afraid it could be stolen again. So, is there a way we could tell Google that we are the originator of this new content?
If there isn't any solution, I think we would lose our rankings again, right? I don't want to lose our efforts again. So, can you suggest any concrete solution?
Thanks again, guys.
Jarrod -
"Our product descriptions are pretty much all unique, but I have noticed that when typing a portion of text from these pages into Google search using quotation marks, the shopping affiliate sites we use are displayed first and our page is nowhere to be seen, or last in the results."
I saw the same thing. There is your problem.
"This leads me to believe that Google thinks we have scraped the content from these sites, when in actual fact they have taken it from us. We also have G+ Authorship set up."
Although Google says they are "pretty good" at attributing content to its original creator, the truth is that they suck at it.
Lots of people have this problem. Guard your content so it doesn't get out to affiliates and shopping engines. That means strongly enforced rules for your affiliates, and blocking other crawlers from your site while still letting Google in - see the rough robots.txt sketch below.
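As a minimal sketch only (the whitelist below is an assumption - you'd want to allow every legitimate engine you actually care about, and keep in mind that well-behaved bots obey robots.txt while outright scrapers often ignore it):

```
# Hypothetical robots.txt: allow the major search engines, block everything else.
# User-agent names beyond Googlebot are assumptions; check each bot's own documentation.
User-agent: Googlebot
Disallow:

User-agent: Bingbot
Disallow:

User-agent: *
Disallow: /
```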
-
In addition, going forward you should always ensure you have two sets of content: one set you use on your own site, and another set that you supply to affiliate sites and any other sites you supply products to.
I know this isn't much help now, but it's something you should do in the future to prevent such issues.
-
Hi Jarrod,
You are in a very complicated situation. I hope you can find a solution.
This video posted by Matt Cutts a while ago might help you with a few additional tips:
How can I make sure that Google knows my content is original?
- DMCA request: http://www.google.com/dmca.html
- Google News source attribution metatags: link here (a rough example of what these look like is below)
- Or even file a spam report, as Matt Cutts suggests.
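For reference, and assuming the metatag names haven't changed since Google announced them, the Google News attribution metatags look roughly like this (the URL is just a placeholder for the original article):

```html
<!-- Hypothetical example: point syndicated copies back at the original article -->
<meta name="original-source" content="http://www.example.com/original-article.html">
<meta name="syndication-source" content="http://www.example.com/original-article.html">
```

As far as I know these were announced for Google News publishers, so they may not do much for a product catalogue, but they're cheap to add where they fit.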
-
Hi Jarrod,
The first thing I noticed is that a lot of pages on your site don't contain a rel=canonical tag. For example, this one: http://www.partysuppliesnow.com.au/view-products/96/LED-Furniture
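As a sketch, a self-referencing canonical in the <head> of that page would look something like this (using the example URL above; each page should point at its own preferred URL):

```html
<link rel="canonical" href="http://www.partysuppliesnow.com.au/view-products/96/LED-Furniture" />
```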
We know that Google is not particularly good at identifying the original source of a piece of content. So, you can report the sites that scraped your content to Google (https://www.google.com/webmasters/tools/spamreport?hl=en). That will let Google know about the issue and hopefully get the penalty lifted off your site and the scraping sites penalized instead.
Another issue could be the Authorship setup on product pages; that can be considered Authorship abuse. Generally, you don't want to link a Google+ profile to a site's homepage, product pages, and other generic pages - see the markup example below for what to look for in your templates.
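For reference, the Authorship markup you'd want to strip out of product and category templates typically looks something like this (the profile ID here is a placeholder):

```html
<!-- Hypothetical example: rel=author link to a Google+ profile (remove from generic/product templates) -->
<link rel="author" href="https://plus.google.com/112233445566778899000" />
```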
I've had some experience with Panda, and I can say no-indexing is very effective in fighting it. If you know about a significant number of low-quality pages on your site - pages you wouldn't want to land on as a searcher - you should add a meta noindex tag in the <head> section of those pages (a sketch is below). It takes some time to get out of the Panda box.
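A minimal sketch of that tag, assuming you still want Google to follow the links on those pages:

```html
<meta name="robots" content="noindex, follow" />
```

Swap "follow" for "nofollow" if you don't want link equity passed on from them either.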
Regards,
Rohit