How does Google recognize original content?
-
Well, we wrote our own product descriptions for 99% of the products we carry. Each one is descriptive and has at least four bullet points highlighting the product's best features, so shoppers can get the key points without reading the whole description. Instead of using the manufacturer's descriptions, we spent $$$$ working with a copywriter, and we still do the same thing whenever we add a new product to the website.
However, since we use a product data feed and send it to Amazon and Google, they use our product descriptions too. I always wait a couple of days for Google to crawl our product pages before I send recently added products to Amazon or Google. I believe that if Google crawls our product page first, we will be treated as the owner of the content. Am I right? If not, I believe Amazon is taking advantage of my original content.
I'm asking because we are a relatively new ecommerce store (online since February 1st). We didn't have much organic traffic to begin with, but I saw it drop by about 50% in April, which seems to be tied to the latest Google update. We never bought a link or did black-hat link building; in fact, we didn't do any link building at all until last month. So did Google decide we have shallow or duplicated content and drop our rankings? Our organic traffic has been improving very, very slowly since then, but it still only amounts to about 5-10% of our current daily traffic.
What do you guys think? Is all our original content effort going to waste?
-
Some believe that Google takes the code of your website into consideration. That would imply duplicate content is mainly a problem when someone creates multiple blogs built on the same code with the same text, a tactic many people used with automated software.
That is just a rumor, though. From personal experience, movie news blogs and websites tend to churn out identical news stories, including the same pictures, video, and text, and I have not seen any of those sites held back in their rankings.
-
Thanks.
About ten years ago I sold a lot of stuff on Amazon. Things were going well; I was the only person selling a nice selection of items. Then Amazon started to sell the same items, at prices so low there was no way for me to make a profit. Impossible. It was like working really, really hard for someone who would then become an almost impossible-to-beat competitor and dominate your SERPs for the next decade.
-
(offers napkin to EGOL to wipe up coffee spittle)
-
Excellent points by EGOL.
Amazon and Walmart are two-edged swords that cut only one way: toward you. I understand why businesses go that route, but it is very difficult to win. Sometimes someone does, though:
About 15 years ago, a friend of mine took over the US arm of a German toy distributor, and they created a very cool doll. Everyone at the German company and everyone on the US marketing team insisted they had to take it to Walmart. She politely refused and said, "Let Walmart come to me." She then went all over hawking the doll and ended up on HSN (I think that was the original big TV sales channel). About a year in, everyone wanted these dolls, and Walmart did not have them.
When Walmart called, she named the price - she did not have to kiss someone's... They were pleased to do the kissing.
One of my favorite stories of all time.
-
Well, it sounds like I'm screwed, since we have been sending our feeds to Amazon for the last seven months. I am going to update the feed and remove the descriptions from the Amazon feed, but I don't know if it will help me at all. By the way, I am talking about Amazon ads, not selling on Amazon. However, if Amazon doesn't already have a product in their database, they basically use your description to create a product page that says the product is available on an external website.
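For reference, here is the kind of pre-processing step I'm thinking of adding in front of the feed export. This is only a rough sketch with made-up file and column names (description, bullet_point_1, ...), not Amazon's actual feed format, so it would need adapting to whatever the platform really generates:

```python
# Rough sketch only: copy a flat CSV product feed while dropping the
# hand-written copy, so that text stays exclusive to our own product pages.
# File names and column names are hypothetical.
import csv

FIELDS_TO_STRIP = {"description", "bullet_point_1", "bullet_point_2"}

def strip_copy(in_path: str, out_path: str) -> None:
    with open(in_path, newline="", encoding="utf-8") as src, \
         open(out_path, "w", newline="", encoding="utf-8") as dst:
        reader = csv.DictReader(src)
        kept = [name for name in reader.fieldnames if name not in FIELDS_TO_STRIP]
        writer = csv.DictWriter(dst, fieldnames=kept)
        writer.writeheader()
        for row in reader:
            writer.writerow({name: row[name] for name in kept})

if __name__ == "__main__":
    strip_copy("products_full.csv", "products_amazon_feed.csv")
```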
-
"However, since we use a product data feed and send it to Amazon and Google, they use our product descriptions too."
*spits coffee*
Whoa! I would not do that. I would remove or replace those descriptions on Amazon if at all possible.
When you sell on Amazon, any content, any image, any anything that you put on their site will be used against you. And if you strike gold there, Amazon will quickly become your competitor.
This is exactly why I don't sell on Amazon. They solicit me a couple of times a year to sell my stuff on their site. No way. I did that in the past, and my work benefited Amazon more than it benefited me, and it benefited my competitors too.
"I always wait a couple of days for Google to crawl our product pages before I send recently added products to Amazon or Google. I believe that if Google crawls our product page first, we will be treated as the owner of the content. Am I right? If not, I believe Amazon is taking advantage of my original content."
This is not true. I don't care who says it is, I am going to argue. No way. I'll argue with anybody about this, even the big names at Google. They do a horrible job of attributing the first publisher. Horrible. Horrible.
I have published a lot of content given to me by others, and other people have stolen my content. I can tell you with assurance that the powerful site often wins... and if a LOT of people have grabbed your content, you can even lose to a ton of weak sites.
Google does not honor the first publisher. They honor powerful publishers, like Amazon. Giving Amazon content that you are going to publish on your own website is feeding the snake!
"So did Google decide we have shallow or duplicated content and drop our rankings?"
If your content is on Amazon, they are probably taking your traffic. Go out and look at the SERPs.
-
Serkie,
Given that these are product descriptions that apply only to you selling the products (even if that is through Amazon or Google), I think there are a couple of ways you can go. One would be to add author markup if that is possible; I don't know how many products you are dealing with or what type of eCommerce or other platform you may be using.
Second, within the actual text, you could state authorship and place a link back to your site, most likely at the very end of the description.
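To make that concrete, here is a rough sketch of appending an attribution line when the feed copy is generated. The store name and URL are placeholders, and some channels strip or disallow links inside descriptions, so each channel's content rules would need checking first:

```python
# Rough sketch only: append an authorship note and a link back to the source
# product page at the end of a description. Store name and URL pattern are
# hypothetical placeholders.
ATTRIBUTION = " Description written by Example Store; original listing at {url}"

def with_attribution(description: str, product_url: str) -> str:
    """Return the description with an authorship note appended."""
    return description.rstrip() + ATTRIBUTION.format(url=product_url)

print(with_attribution(
    "Hand-finished oak chess set with weighted, felted pieces.",
    "https://www.example.com/products/oak-chess-set",
))
```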
Last, you could register a copyright (no, not just a circle with a c in it, as most do, but the real thing). It can be fairly inexpensive: depending on how you package it for the copyright office, we find it can run about a dollar a page. That would give you documented ownership should you ever have an issue with someone using your descriptions without authorization (obviously you are giving them to Amazon and Google).
A final note: when you started rewriting the descriptions, my guess is that you wrote, changed, and rewrote them. If you ever have to defend yourself or prove in court that you are the actual owner, the documents showing how you arrived at the final versions are invaluable.
I don't know if this is what you were looking for, but I hope something here will help.
Best
-
For our ecommerce sites, we always make sure the product feeds contain original content that is distinct from what is on our pages. That way the feed listings don't poach from our own pages, we cover a broader range of search terms, and we have more avenues through which customers can reach us.
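One way to picture that setup (just a sketch; the field names are hypothetical and not tied to any particular platform) is to store two separately written descriptions per product and have the feed builder use only the feed copy:

```python
# Rough sketch only: each product carries two separately written descriptions,
# and the feed builder never touches the on-site copy. Field names are hypothetical.
from dataclasses import dataclass

@dataclass
class Product:
    sku: str
    page_description: str  # long-form copy that lives only on our site
    feed_description: str  # distinct, shorter copy written for feed channels

def build_feed_row(product: Product) -> dict:
    """Build a feed row that deliberately excludes the on-site copy."""
    return {"sku": product.sku, "description": product.feed_description}

example = Product(
    sku="CHESS-OAK-01",
    page_description="Hand-finished oak chess set with weighted, felted pieces...",
    feed_description="Classic oak chess set; weighted pieces with felt bases.",
)
print(build_feed_row(example))
```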
-
Google typically looks at who published the content first, as well as the authority of the sites hosting it. You could be running into problems because Amazon has much more authority.