How to Solve a Duplicate Page Content Issue?
-
I have set up a campaign in the SEOmoz tools for my website, and the crawl report shows 89 duplicate page content issues.
Please take a look at the duplicate page content issue.
I am quite confused about how to resolve it. Can anyone suggest the best way to fix it?
-
No problem, glad to help!
Best of luck!
-
Oh great, that makes sense. Now I understand what is wrong with my site. Even if I look for those duplicate pages in the admin, I may not be able to find them, right? So I just need to set up 301 (permanent) redirects with the help of .htaccess. It's clear now. Thanks a lot for your prompt reply and for the quick discussion of my issue.
-
Are you talking about an HTML sitemap or an XML sitemap?
If you mean the HTML sitemap, you are right, and I plan to make it live very soon with a proper structure.
If you mean the XML sitemap, I have already created it, split into two parts:
http://www.vistastores.com/main_sitemap.xml
http://www.vistastores.com/products_sitemap.xml
So, what do you think about it?
And a 301 redirect is the final step after the duplicate pages are removed, right?
-
The tool you used to find these pages seems to work fine ;).
It's simply a crawl, so use the SEOmoz data, since it is a crawl of your site.
Set up a 301 redirect for every URL on the duplicate content list; you don't need to delete the pages (they can't really be deleted anyway, since they are dynamically created).
/ G
-
OK, then we have that cleared up.
If you are already using URL rewriting, there shouldn't be any duplicated content of the kind the list shows.
- But since there is a problem, I would check that you are not using a sitemap that dynamically crawls the site and creates these URLs for the pages.
If that's not the issue:
- Then we come back to doing the 301s via .htaccess.
/ Gustav
-
In addition you may want to remove "zero products" pages from index and not link to them as they are not good for users or search engines.
Yes, you are right. I want to remove the zero-product pages as well as all the pages that were created by human error.
As I mentioned above, the following page is no longer reachable by buyers, because not a single page on the website links to it:
http://www.vistastores.com/126/cookwares.html
But the SEOmoz crawler still detects it, and when I look at the export in Excel I am confused, because the page still works and shows all the products that appear on the original page.
Users cannot navigate to that duplicate page, but Google's crawler can reach it and will flag it as duplication.
I know a little about duplication and assume that Google will treat both pages as duplicates. That is exactly what I want to resolve.
-
I don't think this issue is about URL rewriting; I am quite sure of that. As I said, the duplicate URLs are not linked anywhere on the website, so a visitor browsing from page to page will never find a duplicate page.
URL rewriting will fix the URL structure, but what about the duplication that already exists on my own website, where both pages compete for the same keyword?
-
Hi again!
After reading your follow-up question, I have a better solution for you.
Instead of going canonical, which takes a lot of manual labour, or doing 301s, both of which are temporary fixes rather than a cure:
The best approach, and what I always recommend to our clients, is to work with URL rewriting. It takes care of this problem completely, although it requires some coding to implement.
If you can handle this yourself and only need hints and guides, read on:
- I assume you are on an Apache server?
- If so, read this: http://httpd.apache.org/docs/2.0/misc/rewriteguide.html
Or:
- Contact the admin of your webshop and ask them to implement a URL rewrite based on the URL structure you want to use.
- Good structure: root/productcategory/Product
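Purely as an illustration of the idea (the internal script name and parameter names below are hypothetical; your webshop will have its own), a rewrite along those lines in .htaccess could look roughly like this:

    RewriteEngine On
    # Serve clean /product-category/product URLs from the webshop's internal script
    # (index.php and its parameters are assumptions for this sketch only)
    RewriteRule ^([a-z0-9-]+)/([a-z0-9-]+)/?$ index.php?category=$1&product=$2 [L,QSA]

The point is that every product ends up with exactly one public URL, regardless of the navigational path used to reach it.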
Hope this helps!
-
I am quite confused about the 301 redirects. There are a lot of duplicate pages, generated either by wrong categorisation or by creating a new identical category instead of editing the old one.
I want to remove all the duplicate pages from my website and set up 301 redirects.
Here is one example:
Original Page:
http://www.vistastores.com/125_126/kitchen-and-dining/cookwares.html
Duplicate Page:
http://www.vistastores.com/126/cookwares.html
The duplicate pages were created by an admin issue or by human error. No one will find or land on a duplicate page while browsing the website.
Now I want to find and remove all the pages that carry duplicate content.
So, should I delete those pages from the website and 301 them to the associated page, or to the home page?
-
Best practice in your case would be to implement URL canonicalisation (rel="canonical").
Watch this: http://www.google.com/support/webmasters/bin/answer.py?answer=139394 (contains explanation and examples)
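For example, using the duplicate/original pair mentioned earlier in this thread, the duplicate page would carry a tag like this in its <head>, pointing at the preferred URL (a sketch; in practice you would generate it from your templates rather than by hand):

    <link rel="canonical" href="http://www.vistastores.com/125_126/kitchen-and-dining/cookwares.html" />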
In addition, try to prevent page duplication from happening in the first place, though this may need to be done at a programming level.
It seems that strings such as "6_129_130" are generated from the category and navigational path. For example, if a user browses via Outdoor versus Home Decor and arrives on the same page, the URL will carry a different number.
In addition you may want to remove "zero products" pages from index and not link to them as they are not good for users or search engines.
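One simple way to keep such pages out of the index (a sketch; how you add it depends on your webshop templates) is to output a robots meta tag only on category pages that currently have zero products:

    <meta name="robots" content="noindex, follow" />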
-
Hi!
Good question, and one I come across often.
I would say this is not a major issue for you in the SERPs, but if you want to fix it and competition is fierce in your field, you should of course do it.
Step one: Check your index at Google with a simple site:yourdomain.com search and see whether any of the duplicated content is actually in the index (the chance is slim).
Step two: Look at your webshop CMS; there may be a function for redirecting dynamic pages to a static page (usually there is).
Step three: If it's not possible to do the 301s from the webshop admin, use the .htaccess file and implement 301s for the URLs you have in the list.
Other comments:
If you would like to clean up these URLs, work with 301s and decide which structure is most important (base it on the current index).
There are several ways to do a 301, but in this case, since you already have the list, the easiest way is to put the 301 redirect list in your .htaccess file.
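As a minimal sketch, using the duplicate/original pair from earlier in the thread (you would add one line per duplicate URL on your list):

    # 301 (permanent) redirect from the duplicate URL to the original page
    Redirect 301 /126/cookwares.html http://www.vistastores.com/125_126/kitchen-and-dining/cookwares.html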
This approach could of course hurt you if the duplicated pages are somehow necessary for the webshop, so check the webshop first to make sure you don't break anything critical by handling it in .htaccess.
Best regards!
/ Gustav