How does Google recognize original content?
-
Well, we wrote our own product descriptions for 99% of the products we carry. They are all descriptive, and each has at least 4 bullet points highlighting the product's best features so shoppers don't have to read the whole description. So instead of using manufacturer descriptions, we spent $$$$ working with a copywriter, and we still do the same thing whenever we add a new product to the website.
However, since we send a product datafeed to Amazon and Google, they use our product descriptions too. I always wait a couple of days for Google to crawl our product pages before I send recently added products to Amazon or Google. I believe that if Google crawls our product page first, we will be credited as the owner of the content? Am I right? If not, I believe Amazon is taking advantage of my original content.
I am asking because we are a relatively new ecommerce store (online since Feb 1st). While we didn't have a lot of organic traffic to begin with, our organic traffic dropped about 50% in April, and it seems it was affected by the latest Google update. We never bought a link or did black-hat link building; actually, we didn't do any link building at all until last month. So did Google decide we have shallow or duplicated content and drop our rankings? I see that our organic traffic has been improving very, very slowly since then, but the gains are only about 5%-10% of our current daily traffic.
What do you guys think? Do you think all our original content effort has gone to waste?
-
Some believe that Google takes the code of your website into consideration. That would imply duplicate content mainly applies to creating multiple blogs all coded the same with the same text, a tactic many people used with automated software.
In my view that is just a rumor. From personal experience, movie news blogs and websites tend to churn out identical news stories, including pictures, video, and text, and I have not seen any of those sites held back in their rankings.
-
Thanks.
About ten years ago I sold a lot of stuff on Amazon. Things were going well; I was the only person selling a nice selection of items. Then Amazon started to sell the same items, at such a low price there was no way for me to make a profit. Impossible. It was like working really, really hard for someone who would then become an almost impossible-to-beat competitor and dominate your SERPs for the next decade.
-
(offers napkin to EGOL to wipe up coffee spittle)
-
Excellent points by EGOL.
Amazon and Walmart are two-edged swords that cut one way (toward you). I understand why businesses go that route, but it is very difficult to win. Sometimes someone does, though:
About 15 years ago a friend of mine took over the US arm of a German toy distributor, and they created a very cool doll. Everyone at the German company and all of the US marketing team screamed that they had to take it to Walmart. She politely refused and said, let Walmart come to me. She then went all over hawking the doll and ended up on HSN (I think that was the original big TV sales channel). About a year in, everyone wanted these dolls, and Walmart did not have them.
When Walmart called, she named the price - she did not have to kiss someone's... They were pleased to do the kissing.
One of my favorite stories of all time.
-
Well, sounds like I am screwed, since we have been sending our feeds to Amazon for the last 7 months. I am going to update the feed and remove the descriptions from the Amazon feed, but I don't know if it will help at all. By the way, I am talking about Amazon ads, not selling on Amazon. However, if Amazon doesn't have a product in their database, they basically use your description to create a product page and say the product is available on an external website.
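Blanking the descriptions in an existing feed can be scripted rather than done by hand. A minimal sketch in Python, assuming a CSV feed with a `description` column (the column names and `strip_descriptions` helper here are illustrative, not any platform's actual API):

```python
import csv
import io

def strip_descriptions(feed_csv, replacement=""):
    """Return a copy of a CSV product feed with the description column blanked.

    Assumes a column literally named 'description'; adjust to match your feed.
    """
    reader = csv.DictReader(io.StringIO(feed_csv))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        row["description"] = replacement  # drop the original copy from the feed
        writer.writerow(row)
    return out.getvalue()

feed = "sku,title,description\r\nA1,Blue Widget,Our original 200-word copy...\r\n"
print(strip_descriptions(feed))
```

The same loop could instead swap in a short, feed-only description so the full original copy lives only on your own product pages.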
-
However since we are using a product datafeed and send it to amazon and google, they use our product descriptions too.
* spits coffee *
Whoa! I would not do that. I would remove or replace those descriptions on Amazon if at all possible.
When you sell on Amazon, any content, any image, any anything that you put on their site will be used against you. And, if you strike gold there then Amazon will quickly become your competitor.
This is exactly why I don't sell on Amazon. They solicit me a couple of times a year to sell my stuff on their site. No way. I did that in the past, and my work benefited Amazon more than it benefited me, and it benefited my competitors too.
I always wait couple of days until google crawl our product pages before i send recently added products to amazon or google. I believe if google crawls our product page first, we will be the owner of the content? Am i right? If not i believe amazon is taking advantage of my original content.
This is not true. I don't care who says it is true; I am going to argue. No way. I'll argue with anybody about this, even the big names at Google. They do a horrible job of attributing the first publisher. Horrible. Horrible.
I have published a lot of content given to me by others, and other people have stolen my content. I can tell you with assurance that the powerful publisher often wins... and if a LOT of people have grabbed your content, you can lose even to a ton of weak sites.
Google does not honor first publisher. They honor powerful publishers - like Amazon. Giving content to Amazon that you are going to publish on your website is feeding the snake!
So google thought that we have a shallow or duplicated content and dropped our rankings?
If your content is on Amazon, they are probably taking your traffic. Go out and look at the SERPs.
-
Serkie
Given that these are product descriptions, and that they apply only to you selling the products (even if it is through Amazon/Google), I think there are a couple of ways you can go. One would be to add author markup if that is possible; I don't know how many products you are dealing with or what type of eCommerce or other platform you may be using.
Second, within your actual text, you could state authorship and place a link back to your site (likely at the very end of the description).
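That in-text attribution can be appended automatically when the feed is generated. A toy sketch in Python (the store name, URL, and `add_attribution` helper are all placeholders for illustration):

```python
def add_attribution(description, store_name, store_url):
    """Append an authorship line with a link back to the original store."""
    credit = f"Description written by {store_name} ({store_url})."
    return f"{description.rstrip()} {credit}"

print(add_attribution("A sturdy oak desk with a hand-rubbed finish.",
                      "Example Furniture Co.", "https://www.example.com"))
```

Whether Amazon or Google preserve such a line (or the link) in their own rendering is a separate question; this only guarantees the attribution travels with the text you hand over.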
Last, if you register a copyright (no, not just the circle-c symbol most people use, but the real thing), it can be fairly inexpensive. Depending on how you package it for the copyright office, we find it can run about a dollar a page. That would give you ownership should you ever have an issue with someone using your descriptions without authorization (obviously you are giving them to Amazon and Google).
A final note: when you started rewriting the descriptions, my guess is you wrote, changed, rewrote, etc. In the event you ever had to defend yourself or prove in court that you are the actual owner, the documents showing how you arrived at the final version are invaluable.
I don't know if this is what you were looking for, but I hope something here will help.
Best
-
For our ecommerce sites we always make sure the content in our product feeds is original and distinct from the content on our pages. That way the feed listings don't poach rankings from our own pages, and we cover a broader range of search terms and more avenues to be reached through.
-
Google typically looks at who published content first, as well as the authority of the sites that host it. You could be running into problems because Amazon has much more authority.