How do I fix this type of duplicate page content problem?
-
Sample URLs with this Duplicate Page Content
http://rogerelkindlaw.com/index.html (Internal Links: 30, External Links: 0, Page Authority: 26, Linking Root Domains: 1)
http://www.rogerelkindlaw.com/index.html (Internal Links: 30, External Links: 0, Page Authority: 20, Linking Root Domains: 1)
http://www.rogerelkindlaw.com/
As you can see, there are three duplicate pages:
http://rogerelkindlaw.com/index.html
http://www.rogerelkindlaw.com/index.html
http://www.rogerelkindlaw.com/
What would be the best and most efficient way to fix this problem, and how can I prevent it from happening?
Thank you.
-
Hi Brian
What Brent says is key, although some web developers will tell you it can't be done - don't be put off by them.
Also (but not instead of the above), in Google Webmaster Tools you can select which version of the URL is always displayed.
Hope this helps
PH
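One common way to consolidate the three URLs is a server-side 301 redirect. Below is a minimal .htaccess sketch; it assumes an Apache server with mod_rewrite enabled and that http://www.rogerelkindlaw.com/ is the version you want to keep (that choice is an assumption, not something stated in the thread):

RewriteEngine On

# Collapse /index.html onto the root URL
RewriteRule ^index\.html$ http://www.rogerelkindlaw.com/ [R=301,L]

# Send any non-www request to the www host
RewriteCond %{HTTP_HOST} ^rogerelkindlaw\.com$ [NC]
RewriteRule ^(.*)$ http://www.rogerelkindlaw.com/$1 [R=301,L]

With the index.html rule listed first, http://rogerelkindlaw.com/index.html resolves to the preferred URL in a single hop rather than chaining two redirects.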
-
Canonicalization - a very simple way to clean up a lot of duplicate content issues.
Here is a great article on how to take care of these:
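To make that concrete, here is a minimal sketch of the canonical tag, again assuming http://www.rogerelkindlaw.com/ is the preferred URL; it would sit in the <head> of every duplicate version of the homepage:

<link rel="canonical" href="http://www.rogerelkindlaw.com/" />

Unlike a 301, visitors can still reach the duplicate URLs; the tag simply asks search engines to index and credit only the preferred version.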
Related Questions
-
Duplicate content and 404 errors
I apologize in advance, but I am an SEO novice and my understanding of code is very limited. Moz has issued a lot (several hundred) of duplicate content and 404 error flags on the ecommerce site my company takes care of. For the duplicate content, some of the pages it says are duplicates don't even seem similar to me. Additionally, a lot of them are static pages where we embed images of the size charts that we use as popups on item pages. It says these issues are high priority, but how bad is this? Is this just an issue because, if a page has similar content, the engine spider won't know which one to index? Also, what is the best way to handle these URLs bringing back 404 errors? I should probably have a developer look at these issues, but I wanted to ask the extremely knowledgeable Moz community before I do 🙂
Technical SEO | AliMac260
-
Who gets punished for duplicate content?
What happens if two domains have duplicate content? Do both domains get punished for it, or just one? If so, which one?
Technical SEO | Tobii-Dynavox0
-
Duplicated content on subcategory pages: how do I fix it?
Hello Everybody,
I manage an e-commerce website and we have a duplicated content issue for subcategories. The scenario is like this:
/category1/subcategory1
/category2/subcategory1
/category3/subcategory1
A single subcategory can fit multiple categories, so we have three different URLs for the same subcategory with the same content (except for the navigation links). What are the best practices to avoid this issue? Thank you!
Technical SEO | uMoR
-
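For a structure like the one described above, a common pattern is to choose one category path as the preferred URL for the subcategory and have the other paths point at it with a canonical tag. A minimal sketch using hypothetical example.com URLs (the real paths aren't shown in the question):

<!-- In the <head> of /category2/subcategory1 and /category3/subcategory1 -->
<link rel="canonical" href="http://www.example.com/category1/subcategory1" />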
Duplicate content due to csref
Hi, when I go through my page I can see that a lot of my csref codes result in duplicate content when SEOmoz runs its analysis of my pages. Of course I get important knowledge through my csref codes, but I'm quite uncertain about how much they affect my SEO results. Does anyone have any insights into this? Should I be more cautious about using csref codes, or don't they create problems that are big enough for me to worry about?
Technical SEO | Petersen110
-
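Assuming the csref code is appended as a query-string parameter (the exact format isn't shown in the question), the usual low-risk fix is the same canonical idea: every tagged version of a page declares the clean URL as its canonical, for example:

<!-- Served on a hypothetical tagged URL such as http://www.example.com/page?csref=newsletter -->
<link rel="canonical" href="http://www.example.com/page" />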
According to 1 of my PRO campaigns - I have 250+ pages with Duplicate Content - Could my empty 'tag' pages be to blame?
Like I said, one of my Moz reports is showing 250+ pages with duplicate content. Should I just delete the tag pages? Is that worth my time? How do I alert SEOmoz that the changes have been made, so that they show up in my next report?
Technical SEO | TylerAbernethy0
-
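If the tag pages are worth keeping for visitors but add nothing for search, one alternative to deleting them is a robots meta tag in the <head> of each tag page; a minimal sketch:

<meta name="robots" content="noindex, follow" />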
Are all duplicate pages bad?
I just got my first Crawl Report for my forum and it said I have almost 9,000 duplicate pages. When I looked at a sample of them, though, I saw that many of them were "reply" links. By this I mean the crawler followed the "reply" button for a topic, but since it was not a member it was just brought to the login/register screen. Since all the topics would bring you to the same login page, I'm assuming it counted all these "reply" links as duplicates. Should I just ignore these, or is there some way to fix it? Thanks in advance.
Technical SEO | Xee0
-
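One way forum owners keep crawlers out of reply/login URLs like these is a robots.txt rule. A sketch with hypothetical paths, since the forum's actual URL structure isn't shown in the question:

User-agent: *
# Hypothetical paths - adjust to match the forum software's reply and login URLs
Disallow: /login
Disallow: /register
Disallow: /*?action=reply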
Question about content on ecommerce pages.
A long time ago we hired an SEO company to do SEO on our website, and one of the things they did was write long text on the category pages of our products. Example here: http://www.theprinterdepo.com/refurbished-printers/wide-format-laser-refurbished-printers Now my marketing person is asking whether it's possible to put the text below the items. Technically I will find out how to do it, but from your SEO experience, is it good or bad? What if we shorten those texts to one paragraph only? Thanks
Technical SEO | levalencia10
-
The Bible and Duplicate Content
We have our complete set of scriptures online, including the Bible at http://lds.org/scriptures. Users can browse to any of the volumes of scriptures. We've improved the user experience by allowing users to link to specific verses in context, which will scroll to and highlight the linked verse. However, this creates a significant amount of duplicate content. For example, these links: http://lds.org/scriptures/nt/james/1.5 http://lds.org/scriptures/nt/james/1.5-10 http://lds.org/scriptures/nt/james/1 All of those will link to the same chapter in the book of James, yet the first two will highlight verse 5 and verses 5-10 respectively. This is a good user experience because in other sections of our site and on blogs throughout the world, webmasters link to specific verses so the reader can see the verse in context of the rest of the chapter. Another Bible site has separate HTML pages for each verse individually and tends to outrank us because of this (and possibly some other reasons) for long-tail chapter/verse queries. However, our tests indicated that the current version is preferred by users. We have a sitemap ready to publish which includes a URL for every chapter/verse. We hope this will improve indexing of some of the more popular verses. However, Googlebot is going to see some duplicate content as it crawls that sitemap! So the question is: is the sitemap a good idea, realizing that we can't revert to including each chapter/verse on its own unique page? We are also going to recommend that we create unique titles for each of the verses and pass a portion of the text from the verse into the meta description. Will this perhaps be enough to satisfy Googlebot that the pages are in fact unique? They certainly are from a user perspective. Thanks all for taking the time!
Technical SEO | LDS-SEO0
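To make the sitemap part concrete, here is a minimal sketch of the kind of entries described, using the James URLs from the question (optional tags such as <lastmod> are omitted):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://lds.org/scriptures/nt/james/1</loc></url>
  <url><loc>http://lds.org/scriptures/nt/james/1.5</loc></url>
  <url><loc>http://lds.org/scriptures/nt/james/1.5-10</loc></url>
</urlset>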