Duplicate Content on Event Pages
-
My client runs a fairly popular event-listings service and, in the hope of gathering more events, they opened the platform up so users can add their own. This works really well for them and brings in far more listings. The major problem I'm finding is that many event coordinators and site owners simply copy and paste the copy from their own websites, duplicating a lot of content. We have editors' picks that contain a lot of unique content, but the duplicate content worries me. It hasn't hurt our rankings (the site has a PageRank of 7), but I'm wondering if this is something we should address. We don't have the manpower to eliminate all the duplication, but if we cut it down, would we gain a significant advantage over other sites posting the same events?
-
A penalty is something Google has to remove manually, and you will be able to see it in Webmaster Tools. A devaluation is when the algorithm adjusts you and lowers you as a result: each thing Google doesn't like counts as points against you, but you can make changes quickly and see your results return. Does that make sense?
-
We decided it was worth a large investment, as we would own the content ourselves and not have to worry in the future about anyone claiming ownership of it as Google gets stricter. So we rewrote half a million words!
-
Also, could you fully explain the difference between a devaluation and a penalty?
-
Do you mind if I ask how much of the content you rewrote? My main fear is the amount of work this would take, since a lot of content goes up on the site daily. When you rewrote your office-space listings, did you keep the same amount of content, or did you rewrite them with less?
-
This is a Panda issue.
Google has said many times about affiliate sites that reuse the same content that if they do a better job than the original site, it will rank them. So it's not all bad when you look at it from that point of view.
However, Google loves unique content and will do its best to rank the sites with unique content first. I have a business in the office-space industry, and a few years back we used to aggregate office-space listings that were shared among 30+ sites. The display of these listings differed for many searches, but the content was the same as on all the other sites. This slowly put us into a Panda devaluation (there is no Panda penalty).
After rewriting them with our clients, we saw a significant change once the content had been re-crawled.
So it can have a great effect. If Google starts to see that large parts of your site are duplicate content, it will start to question your authority in your industry.
Could you offer an incentive to your customers to write something unique? You could also inform your users not to copy and paste their own content onto your site, as this could affect them negatively in Google.
If you are an authority, could you tell users that listings must be unique to be accepted? Or, if it's a paid service, offer an add-on for a few bucks where you write a professional description? That might become a nice additional income stream.
Just a few ideas.
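Since you mentioned not having the manpower to eliminate all the duplication, one way to triage is to flag the worst offenders automatically and only rewrite those. Here is a minimal sketch of that idea using word-shingle (n-gram) overlap; the listing data, shingle size, and threshold are all hypothetical assumptions, not anything from this thread, so tune them to your own data:

```python
# Hypothetical sketch: flag user-submitted event descriptions that look
# copy-pasted from one another. Uses 5-word "shingles" and Jaccard similarity.
SHINGLE_SIZE = 5   # assumed value; smaller catches more, with more noise
THRESHOLD = 0.5    # assumed value; flag pairs sharing over half their shingles

def shingles(text: str, k: int = SHINGLE_SIZE) -> set:
    """Return the set of k-word shingles in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity: shared shingles over total distinct shingles."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def flag_duplicates(descriptions: dict, threshold: float = THRESHOLD) -> list:
    """Compare every pair of listings; return (id, id, score) pairs above the threshold."""
    ids = list(descriptions)
    sets = {i: shingles(descriptions[i]) for i in ids}
    flagged = []
    for x in range(len(ids)):
        for y in range(x + 1, len(ids)):
            score = jaccard(sets[ids[x]], sets[ids[y]])
            if score >= threshold:
                flagged.append((ids[x], ids[y], round(score, 2)))
    return flagged

# Example listings (made up for illustration)
listings = {
    "event-1": "Join us for a night of live jazz at the Riverside Hall with food and drinks",
    "event-2": "Join us for a night of live jazz at the Riverside Hall with food and drinks included",
    "event-3": "Annual charity fun run through the city park starting at nine in the morning",
}
print(flag_duplicates(listings))
```

The pairwise loop is fine for a nightly report over recent submissions; for a very large catalogue you would swap in MinHash or a similar locality-sensitive scheme rather than comparing every pair.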