Duplicate Content on Event Pages
-
My client runs a pretty popular event listings service and, in the hope of gathering more events, they opened up the platform so users can add events themselves. This works really well for them and they're able to garner a lot more events this way. The major problem I'm finding is that many event coordinators and site owners take the copy from their own website and paste it in, duplicating a lot of the content. We have editor picks that contain a lot of unique content, but the duplicate content scares me. It hasn't hurt our rankings (we have a PageRank of 7), but I'm wondering if this is something we should address. We don't have the manpower to eliminate all the duplication, but if we cut it down, would we gain a significant advantage over other sites posting the same event?
-
A penalty is a manual action that Google has to apply and later remove, and you will be able to see it in Webmaster Tools. A devaluation is when the algorithm adjusts you downward: each thing Google doesn't like counts against you, but you can make changes quickly and see your results return. Does that make sense?
-
We decided it was worth a large investment, as we would own the content ourselves and wouldn't have to worry in the future about anyone claiming ownership of it as Google gets stricter. So we rewrote half a million words!
-
Also, could you fully explain the difference between a devaluation and a penalty?
-
Do you mind if I ask how much of the content you rewrote? My main fear is the amount of work this would take, since a lot of content goes up on the site daily. When you rewrote your office space listings, did you keep the same amount of content or rewrite them with less?
-
This is a Panda issue.
Google has said many times about affiliate sites that use the same content that if they do a better job than the original site, it will rank them. So it's not all bad when you look at it from that point of view.
However, Google loves unique content and will do its best to rank the sites that have it first. I have a business in the office space industry, and a few years back we used to aggregate office space listings that were shared amongst 30+ sites. The display of these listings would differ for many searches, but the content was the same as on all the other sites. This slowly put us into a Panda devaluation (there is no Panda penalty).
After rewriting them with our clients, we saw a significant change once the content had been re-crawled.
So it can have a big effect. If Google starts to see that large parts of your site are duplicate content, it will start to question the authority you have in your industry.
Could you offer an incentive to your customers to write something unique? You could also inform your users not to copy and paste content from their own sites, as this could affect them negatively in Google.
If you are an authority, could you tell users that listings must be unique to be accepted? Or, if it's a paid service, offer an add-on for a few bucks where you write a professional description? That might become a nice additional income stream.
Just a few ideas
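One more practical angle, since rewriting everything at once isn't realistic: a rough similarity check can show which listings are the closest copies of their source pages, so any rewriting effort goes where it matters most. Below is a minimal Python sketch of the idea, with entirely hypothetical listing data.

```python
# Rough sketch: score how much of each user-submitted event description is
# copied from the organiser's own page, so the most duplicated listings can
# be rewritten (or sent to an editor) first. All data here is hypothetical.
from difflib import SequenceMatcher

def duplication_ratio(listing_text: str, source_text: str) -> float:
    """Return a 0-1 similarity score between two blocks of copy."""
    return SequenceMatcher(None, listing_text.lower(), source_text.lower()).ratio()

listings = [
    # (event id, copy pasted into the platform, copy on the organiser's site)
    ("event-101", "Join us for the annual spring food festival in the park...",
                  "Join us for the annual spring food festival in the park..."),
    ("event-102", "A short hand-written summary of the jazz night...",
                  "Completely different copy on the organiser's own page..."),
]

# Sort the most duplicated listings to the top of the rewrite queue.
scored = sorted(
    ((duplication_ratio(ours, theirs), event_id) for event_id, ours, theirs in listings),
    reverse=True,
)
for score, event_id in scored:
    print(f"{event_id}: roughly {score:.0%} duplicated")
```

Anything scoring close to 100% is a straight copy-and-paste and would be the first candidate for the incentive or paid-description ideas above.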
Related Questions
-
Should I redirect or add content to 47 pages?
We have an insurance agency website with 47 pages that have duplicate/low-content warnings. What's the best way to handle this? Am I right in thinking I have two options: either add new content or redirect the page? Thanks in advance 🙂
On-Page Optimization | laurentjb
-
Google ranking content for phrases that don't exist on-page
I am experiencing an issue with negative keywords, but the "negative" keyword in question isn't truly negative and is required within the content – the problem is that Google is ranking pages for inaccurate phrases that don't exist on the page.

To explain, this product page (as one of many examples) - https://www.scamblermusic.com/albums/royalty-free-rock-music/ - is optimised for "Royalty free rock music" and it gets a Moz grade of 100. "Royalty free" is the most accurate description of the music (I optimised for "royalty free" instead of "royalty-free" with a hyphen because of improved search volume), and there is just one reference to the term "copyrighted" towards the foot of the page – this term is relevant because I need to make the point that the music is licensed, not sold, and the licensee pays for the right to use the music but does not own it (as it remains copyrighted). It turns out, however, that I appear to need to treat "copyrighted" almost as a negative term, because Google isn't accurately ranking the content.

Despite excellent optimisation for "Royalty free rock music" and only a single reference to "copyrighted" within the copy, I am seeing this page (and other album genres) wrongly rank for the following search terms:

"free rock music"
"Copyright free rock music"
"Uncopyrighted rock music"
"Non copyrighted rock music"

I understand that pages might rank for "free rock music" because it is part of the "Royalty free rock music" optimisation; what I can't get my head around is why the page (and similar product pages) are ranking for "Copyright free", "Uncopyrighted music" and "Non copyrighted music". "Uncopyrighted" and "Non copyrighted" don't exist anywhere within the copy or source code – why would Google consider it helpful to rank a page for a search term that doesn't exist as a complete phrase within the content? By the same logic the page should also wrongly rank for "Skylark rock music" or "Pretzel rock music", as the words "Skylark" and "Pretzel" also feature just once within the content and should therefore generate completely inaccurate results too.

To me this demonstrates just how poor Google is when it comes to understanding relevant content and optimisation – it's taking part of an optimised term, combining it with just one other single-use word, and then inappropriately ranking the page for that completely made-up phrase. It's one thing to misinterpret one reference to the term "copyrighted" and something else entirely to rank a page for completely made-up terms such as "Uncopyrighted" and "Non copyrighted". It almost makes me think that I've got a better chance of accurately ranking content if I buy a goat, shove a cigar up its backside, and sacrifice it in the name of the great god Google! Any advice (about wrongly attributed negative keywords, not goat sacrifice) would be most welcome.

On-Page Optimization | JCN-SBWD
-
Landing page separate from product page
Hello there, I have a WordPress website with the WooCommerce plugin. I have 4 landing pages that describe my products, and at the end of each page I have a CTA to my product page. Is this bad for SEO? My website: https://relationadviser.ir
On-Page Optimization | Aaron.be
-
Duplicating words in the page title OK?
I'm finding a site with lots of duplicated words in the title tags, something I have always avoided doing in the past. Is there any penalty for having a word repeated twice in the title, or indeed is there a benefit from having it twice? I'm assuming not.

For example: Marketing Services in Milton Keynes | Our Services | TFA
https://www.t-f-a.co.uk/services

The word "Services" is repeated twice; in my opinion this is of no benefit at all, and the title would be better rewritten to remove the duplication.

On-Page Optimization | Donsimong
-
Duplicate Content with ?Page IDs in WordPress
Hi there, I'm trying to figure out the best way to solve a duplicate content problem I have due to the page IDs that WordPress automatically assigns to pages. I know that in order to resolve this I have to use canonical URLs, but the problem is that I can't figure out the URL structure. Moz is showing me thousands of duplicate content errors that are mostly related to page IDs. For example, for one page whose URL should only ever appear in its clean permalink form on my site, Moz is telling me there are 50 duplicate content errors. The page ID for this page is 82, so the duplicate content errors appear against that page ID, and so on for 47 more pages. The problem repeats itself with other pages as well. My permalinks are set to "Post Name", so I know that's not an issue. What can I do to resolve this? How can I use canonical URLs to solve this problem? Any help will be greatly appreciated.
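Assuming the duplicates are the ?page_id= versions of pages that already have clean permalinks, one way to sanity-check a fix is to confirm that each variant declares the clean URL as its canonical (WordPress normally outputs this tag for singular pages via wp_head). A minimal Python sketch follows; example.com and the clean path are placeholders, and page ID 82 is the one mentioned in the question.

```python
# Minimal sketch: confirm that a ?page_id= duplicate declares the clean
# permalink as its canonical URL. example.com and the clean path below are
# placeholders, not the real site.
import re
import urllib.request

def canonical_of(url):
    """Fetch a page and return the href of its rel="canonical" link, if any."""
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    # Naive pattern: assumes rel= appears before href= inside the link tag.
    match = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', html)
    return match.group(1) if match else None

clean_url = "https://example.com/sample-page/"
duplicate_url = "https://example.com/?page_id=82"

if canonical_of(duplicate_url) == clean_url:
    print("OK: the ?page_id variant points crawlers at the clean permalink.")
else:
    print("Missing or wrong canonical: the variant can be reported as a duplicate.")
```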
On-Page Optimization | SpaMedica
-
Duplicate content penalty
When Moz crawls my site, it reports twice the pages I really have and says I am being penalized for duplicate content. I know that years ago I had my old domain resolve over to my new domain. It's the only thing that makes sense as the source of the duplicate content, but would search engines really penalize me for that? It is technically only on one site. My business took a significant sales hit starting early July 2013, and I know Google did an algorithm update around then that had SEO aspects. I need to resolve the problem so I can stay in business.
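If the old domain still resolves and serves the same pages instead of 301-redirecting, that alone would explain the doubled page count. A minimal Python sketch of how to check what the old domain actually returns; the domain names are placeholders.

```python
# Quick check: does the old domain 301-redirect to the new one, or is it
# still serving the same content on both? Domain names are placeholders.
import urllib.error
import urllib.request

class NoFollow(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # don't follow redirects; we want to inspect the response

opener = urllib.request.build_opener(NoFollow())

old_url = "http://old-domain.example/some-page/"
try:
    response = opener.open(old_url)
    # A 200 here means the old domain serves the page itself: a duplicate copy.
    print(f"{old_url} returned {response.status} with no redirect.")
except urllib.error.HTTPError as err:
    location = err.headers.get("Location", "")
    if err.code == 301 and "new-domain.example" in location:
        print(f"Good: {old_url} permanently (301) redirects to {location}")
    else:
        print(f"Check this: {err.code} redirect to {location or 'no Location header'}")
```

A clean, page-to-page 301 from every old URL to its counterpart on the new domain removes the duplicate version from a search engine's point of view; this is usually configured at the server or host level rather than in application code.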
On-Page Optimization | cheaptubes
-
What's the best practice for handling duplicate content of product descriptions with a drop-shipper?
We write our own product descriptions for the merchandise we sell on our website. However, we also work with drop-shippers, and some of them simply take our content and post it on their site (same photos, exact ad copy, etc.). I'm concerned that we'll lose the value of our content because Google will consider it duplicated. We don't want the value of our content undermined. What's the best practice for avoiding any problems with Google? Thanks, Adam
On-Page Optimization | Adam-Perlman
-
Would it be bad to change the canonical URL to the most recent page that has duplicate content, or should we just 301 redirect to the new page?
Is it bad to change the canonical URL in the tag, meaning does it lose its stats? If we add a new page that may have duplicate content, but we want that page to be indexed over the older pages, should we just change the canonical target or redirect from the original canonical page? Thanks so much! -Amy
On-Page Optimization | MeghanPrudencio