Duplicate content issue
-
Hello! We have a lot of duplicate content issues on our website. Most of the pages being flagged are dictionary pages (about 1,200 of them). They're not exact duplicates: each one contains a different word with a translation, a picture, and an audio pronunciation (example: http://anglu24.lt/zodynas/a-suitcase-lagaminas). What's the best way to solve this? We probably shouldn't disallow the dictionary pages in robots.txt, right?
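For reference, the robots.txt rule we'd be considering (assuming all of the dictionary pages live under /zodynas/) would look like this:

```
User-agent: *
Disallow: /zodynas/
```

But as far as we understand, blocking crawling this way wouldn't remove pages that are already indexed, which is why we're hesitant.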
Thanks!
-
No problem!
-
Thanks for the help!
-
Adding nofollow to links that point to the dictionary pages will discourage search engines from following those links, but since the pages are already in the index (and you don't want to change that), you're still facing the duplicate content issue.
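To be concrete, nofollow is just an attribute on the anchor tag of the linking page. A sketch of what that would look like on one of your internal links (the anchor text here is hypothetical):

```html
<!-- Internal link to a dictionary page, with nofollow added -->
<a href="http://anglu24.lt/zodynas/a-suitcase-lagaminas" rel="nofollow">a suitcase - lagaminas</a>
```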
I know it's a huge project to take on to add content to these pages, but it seems as though it's your only option. Perhaps you could split the project up between a few people and each update one page per day. That way it doesn't turn into a major time-suck.
-
Got it. We actually have plenty of organic entrances to these pages. So rel=canonical is not an option here.
And one more thing. Does it make sense to add nofollow to the internal links pointing to the main dictionary page (http://anglu24.lt/zodynas)? What are the downsides of that? Or might the negative effect be similar to rel=canonical in our case?
-
You can do that, but you should check Google Analytics to see how many organic entrances you get to these dictionary pages first. If a lot of people enter your site that way, rel=canonical is going to hurt your traffic numbers significantly. For example, when you add a canonical tag to this page (http://anglu24.lt/zodynas/a-suitcase-lagaminas) that points elsewhere, the suitcase page is going to get dropped from the index.
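For clarity, the canonical tag would sit in the `<head>` of each dictionary page and point at whichever page you chose as the "original." A sketch (the target URL here is just an example; it would be whatever page you pick):

```html
<!-- In the <head> of http://anglu24.lt/zodynas/a-suitcase-lagaminas -->
<link rel="canonical" href="http://anglu24.lt/zodynas" />
```

That tag is precisely what tells Google to consolidate the page into the target and drop the suitcase page from the index, which is why it's risky when the page earns its own organic entrances.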
-
Thanks for the suggestion. Adding more content is the perfect way to deal with this. The downside for us is that we unfortunately don't have the resources at this time to make such upgrades to 1,000+ pages.
What about using rel=canonical? Is it possible to choose one dictionary page to be the original, and tell Google that all the others are similar, thus avoiding possible penalties? How would this work?
-
The ideal situation would be to create more unique content on these pages. You're getting duplicate errors because more than 90% of the source code on the dictionary pages is a match. When you consider the header and footer, and the other code for the template, it's the same everywhere. The dictionary pages are very thin on content, so it's not enough to differentiate. If you can, build out the content more.
Here are a few ways you might add more content to each dictionary page:
- Include a sentence or two showing each word used in context
- Gamify it by writing a short paragraph where the translated word is left blank and the user has to choose from a set of answers
- Add the phonetics for how to pronounce each word
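As a rough sketch, the extra content on a page like the suitcase one might be marked up along these lines (all example text, class names, and the phonetic spelling are hypothetical illustrations, not a prescription):

```html
<!-- Hypothetical enriched content block for the a-suitcase-lagaminas page -->
<p class="example-sentence">She packed her <strong>suitcase</strong> the night before the flight.</p>
<p class="phonetics">Pronunciation: /&#712;su&#720;tke&#618;s/</p>
```

Even a couple of unique elements like these per page shifts the unique-to-template content ratio enough to matter.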