Duplicate content issue
-
Hello! We have a lot of duplicate content issues on our website. Most of the pages flagged are dictionary pages (about 1,200 of them). They're not exact duplicates; each page contains a different word with a translation, a picture and an audio pronunciation (example: http://anglu24.lt/zodynas/a-suitcase-lagaminas). What's the best way to solve this? We probably shouldn't disallow the dictionary pages in robots.txt, right?
Thanks!
-
No problem!
-
Thanks for the help!
-
Adding nofollow to the links that point to the dictionary pages will keep search engines from following those links, but since the pages are already in the index (and you don't want to change that), you're still facing the duplicate content issue.
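For reference, a nofollowed internal link would look roughly like this. This is only a sketch: the anchor text is made up, and the URL is just the example page from the question.
<!-- sketch only: anchor text is assumed, URL is the example page mentioned above -->
<a href="http://anglu24.lt/zodynas/a-suitcase-lagaminas" rel="nofollow">a suitcase - lagaminas</a>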
I know adding content to these pages is a huge project to take on, but it seems to be your only option. Perhaps you could split the work between a few people and have each person update one page per day; that way it doesn't turn into a major time-suck.
-
Got it. We actually get plenty of organic entrances to these pages, so rel=canonical is not an option here.
One more thing: does it make sense to add nofollow to the internal links that point to the main dictionary page (http://anglu24.lt/zodynas)? What are the downsides of that? Or might the negative effect be similar to rel=canonical in our case?
-
You can do that, but first check Google Analytics to see how many organic entrances these dictionary pages get. If a lot of people enter your site that way, rel=canonical is going to hurt your traffic numbers significantly. For example, if you add a canonical tag to this page (http://anglu24.lt/zodynas/a-suitcase-lagaminas) that points elsewhere, the suitcase page will be dropped from the index.
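To illustrate, such a canonical tag would sit in the head of the suitcase page and look roughly like this. This is only a sketch, and the target URL (the main dictionary page) is an assumption; as noted above, adding it would eventually remove the suitcase page from the index.
<!-- sketch only: target URL is assumed; this tells Google to treat the target as the page to index -->
<link rel="canonical" href="http://anglu24.lt/zodynas" />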
-
Thanks for the suggestion. Adding more content would be the perfect way to deal with this. The downside for us is that we unfortunately don't have the resources at the moment to make upgrades like that to 1,000+ pages.
What about using rel=canonical? Is it possible to choose one dictionary page as the original and tell Google that all the others are similar, thus avoiding possible penalties? How would this work?
-
The ideal solution would be to create more unique content on these pages. You're getting duplicate content errors because more than 90% of the source code on the dictionary pages is a match: once you account for the header, the footer and the rest of the template code, the pages are essentially the same everywhere. The dictionary pages are very thin on content, so there isn't enough to differentiate them. If you can, build out the content more.
Here are a few ways you might add more content to each dictionary page (a rough markup sketch follows the list):
- Include a sentence (or two) as an in-context example of each word
- Gamify it by writing a short paragraph where the translated word is left blank and the user has to choose it from a set of answers
- Add phonetics showing how to pronounce each word
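As a rough idea of what that extra content could look like in the page template, here is a sketch only: the example sentence, the placement of the phonetics and the class names are made up for illustration, and the Lithuanian translation of the sentence is left as a placeholder.
<!-- sketch only: class names and example sentence are invented for illustration -->
<div class="entry">
  <h1>a suitcase - lagaminas</h1>
  <p class="phonetics">/ˈsuːtkeɪs/</p>
  <p class="example">I packed my suitcase the night before the trip. [Lithuanian translation of the sentence]</p>
</div>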