How does Google recognize original content?
-
Well, we wrote our own product descriptions for 99% of the products we have. They are all descriptive and have at least 4 bullet points highlighting the product's best features for shoppers who don't read the full description. So instead of using the manufacturer's description, we spent $$$$ working with a copywriter, and we still do the same thing whenever we add a new product to the website.
However, since we use a product datafeed and send it to Amazon and Google, they use our product descriptions too. I always wait a couple of days for Google to crawl our product pages before I send recently added products to Amazon or Google. I believe that if Google crawls our product page first, we will be the owner of the content? Am I right? If not, I believe Amazon is taking advantage of my original content.
I am asking because we are a relatively new ecommerce store (online since Feb 1st). While we didn't have a lot of organic traffic to begin with, I see that our organic traffic dropped about 50% in April; it seems we were affected by the latest Google update. We never bought a link or did black-hat link building. Actually, we didn't do any link-building activity until last month. So did Google decide that we have shallow or duplicated content and drop our rankings? I see that our organic traffic has been improving very, very slowly since then, but basically the gain is only about 5%-10% of our current daily traffic.
What do you guys think? Do you think all our original content effort is going to waste?
-
Some believe that Google takes the code of your website into consideration. This would imply that duplicate content only applies to creating multiple blogs, all coded the same with the same text, a tactic many people used with automated software.
In my view this is just a rumor. From personal experience, movie news blogs and websites tend to churn out identical news stories, including pictures, video, and text, and I have not seen any of these sites being held back in their rankings.
-
Thanks.
About ten years ago I sold a lot of stuff on Amazon. Things were going well; I was the only person selling a nice selection of items. Then Amazon started to sell the same items, and sold them at such a low price that there was no way for me to make a profit. Impossible. It was just like working really, really hard for someone who would become an almost impossible-to-beat competitor and dominate your SERPs for the next decade.
-
(offers napkin to EGOL to wipe up coffee spittle)
-
Excellent points by EGOL.
Amazon and Walmart are two-edged swords that cut one way (at you). I understand why businesses go that route, but it is very difficult to win. Sometimes someone does, though:
A friend of mine took over the US arm of a German toy distributor about 15 years ago, and they created a very cool doll. Everyone at the German company and everyone on the US marketing team screamed that they had to take it to Walmart. She politely refused and said, let Walmart come to me. She then went all over hawking the doll and ended up on HSN (I think that was the original big TV sales channel). About a year in, everyone wanted these dolls, and Walmart did not have them.
When Walmart called, she named the price. She did not have to kiss anyone's... They were pleased to do the kissing.
One of my favorite stories of all time.
-
Well, it sounds like I am screwed, since we have been sending our feeds to Amazon for the last 7 months. I am going to update the feed and remove the descriptions from the Amazon feed, but I don't know if it will help at all. By the way, I am talking about Amazon ads, not selling on Amazon. However, if Amazon doesn't have a product in their database, they basically use your description to create a product page that says the product is available on an external website.
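A rough sketch of what that feed update could look like, assuming a simple list-of-rows feed with a description field (the field names and product data here are made up for illustration, not the real feed format):

```python
def strip_descriptions(rows):
    """Return a copy of the feed with the description field blanked out.

    The hand-written copy stays in the feed sent to Google (where your own
    pages compete), while the Amazon copy of the feed gets empty
    descriptions so Amazon cannot republish the original text on its
    auto-generated product pages.
    """
    stripped = []
    for row in rows:
        cleaned = dict(row)  # copy so the original feed is untouched
        cleaned["description"] = ""
        stripped.append(cleaned)
    return stripped

# Hypothetical feed rows, not real product data
feed = [
    {"sku": "A100", "title": "Walnut Chess Set", "description": "Original copy..."},
    {"sku": "A101", "title": "Marble Chess Set", "description": "More original copy..."},
]

amazon_feed = strip_descriptions(feed)  # descriptions removed
google_feed = feed                      # descriptions kept
```

Whether Amazon's ad feed accepts empty description fields is a separate question worth checking against their feed spec.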
-
However, since we use a product datafeed and send it to Amazon and Google, they use our product descriptions too.
*spits coffee*
Whoa! I would not do that. I would remove or replace those descriptions on Amazon if at all possible.
When you sell on Amazon, any content, any image, any anything that you put on their site will be used against you. And, if you strike gold there then Amazon will quickly become your competitor.
This is exactly why I don't sell on Amazon. They solicit me a couple of times a year to sell my stuff on their site. No way. I did that in the past, and my work benefited Amazon more than it benefited me, and it benefited my competitors too.
I always wait a couple of days for Google to crawl our product pages before I send recently added products to Amazon or Google. I believe that if Google crawls our product page first, we will be the owner of the content? Am I right? If not, I believe Amazon is taking advantage of my original content.
This is not true. I don't care who says it is true, I am going to argue. No way. I'll argue with anybody about this, even the big names at Google. They do a horrible job of attributing the first publisher. Horrible. Horrible.
I have published a lot of content given to me by others, and other people have stolen my content. I can tell you with assurance that the powerful publisher usually wins... and if a LOT of people have grabbed your content, you can lose even to a ton of weak sites.
Google does not honor the first publisher. They honor powerful publishers, like Amazon. Giving Amazon content that you are going to publish on your own website is feeding the snake!
So did Google decide that we have shallow or duplicated content and drop our rankings?
If your content is on Amazon, they are probably taking your traffic. Go out and look at the SERPs.
-
Serkie
Given that these are product descriptions that apply only to you selling the products (even if it is through Amazon/Google), I think there are a couple of ways you can go. One would be to add author markup if that is possible; I don't know how many products you are dealing with or what type of eCommerce or other platform you may be using.
Second, within your actual text, you could state authorship and place a link back to your site (likely at the very end of the description).
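As a minimal sketch of that second suggestion, the authorship statement and link could be appended automatically when the feed is built. The store name and URL below are placeholders, not real values from this thread:

```python
STORE_NAME = "Example Store"           # placeholder
STORE_URL = "https://www.example.com"  # placeholder

def add_attribution(description, product_url=STORE_URL):
    """Append an authorship statement and a link back to the original page."""
    attribution = f"Description written by {STORE_NAME}: {product_url}"
    return f"{description.rstrip()} {attribution}"

tagged = add_attribution("A very cool doll with hand-painted details.")
```

Ideally the link would point at the specific product page rather than the homepage, which is why `product_url` is a parameter here.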
Last would be registering a copyright (no, not just a circle with a c in it as most do, but the real thing); it can be fairly inexpensive. Depending on how you package it for the copyright office, we find it can run about a dollar a page. That would give you ownership should you ever have an issue with someone using your descriptions without authorization (obviously excluding Amazon and Google, since you give the content to them).
A final note: when you started rewriting the descriptions, my guess is you wrote, changed, rewrote, etc. In the event you ever have to defend yourself or prove you are the actual owner, the documents showing how you arrived at the final version are invaluable in court.
I don't know if this is what you were looking for, but I hope something here will help.
Best
-
For our ecommerce sites we always make sure to have original content in our product feeds as well as on our pages. That way the listings generated from our feeds don't poach traffic from our own sites, and we cover a broader range of search terms as well as more avenues through which we can be reached.
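One naive way to keep the feed copy distinct from the on-page copy, assuming the page descriptions already exist, is to derive a shorter feed blurb instead of shipping the full description. In practice a separately written blurb (as described above) beats this mechanical truncation, but it illustrates the idea:

```python
def feed_summary(page_description, max_sentences=2):
    """Build a shortened feed description from the full page copy."""
    sentences = [s.strip() for s in page_description.split(".") if s.strip()]
    return ". ".join(sentences[:max_sentences]) + "."

# Hypothetical page copy, not a real product description
page_copy = "Hand-finished walnut board. Weighted pieces. Felt base. Gift box included."
feed_copy = feed_summary(page_copy)
```

Note this simple sentence split breaks on abbreviations and decimals; it is only a sketch of the keep-them-different principle.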
-
Google typically looks at who published the content first, as well as the authority of the sites that host it. You could be running into problems because Amazon has much more authority.