How to Solve a Duplicate Page Content Issue?
-
I have created a campaign for my website in the SEOmoz tools, and the crawl report flags 89 duplicate page content issues.
Please take a look at the duplicate page content issue.
I am not sure how to resolve it. Can anyone suggest the best way to fix this?
-
No probs, glad to help!
Best of luck!
-
Great, that makes sense. Now I understand what is wrong with my site. Even if I look for those duplicate pages in the admin, I probably won't be able to find them, right? So I just need to set up 301 (permanent) redirects with the help of .htaccess. It's clear now. Thanks a lot for the prompt reply and the quick discussion of my issue.
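For example, I assume a single rule like this in the .htaccess file would handle the duplicate cookwares URL I gave as an example (just a sketch, I still need to test it):

    # Permanently redirect the orphaned duplicate URL to the original category page
    Redirect 301 /126/cookwares.html http://www.vistastores.com/125_126/kitchen-and-dining/cookwares.html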
-
Are you talking about the HTML sitemap or the XML sitemap?
If you mean the HTML sitemap, then you are right, and I plan to make it live very soon with a proper structure.
If you mean the XML sitemap, I have created it in two parts:
http://www.vistastores.com/main_sitemap.xml
http://www.vistastores.com/products_sitemap.xml
So, what do you think about it?
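In case it helps, I assume the two files could also be referenced from a single sitemap index file at the root, roughly like this (standard sitemaps.org format, just a sketch):

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://www.vistastores.com/main_sitemap.xml</loc>
      </sitemap>
      <sitemap>
        <loc>http://www.vistastores.com/products_sitemap.xml</loc>
      </sitemap>
    </sitemapindex>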
And 301 redirection is the final step after removing the duplicate pages, right?
-
The tool you used to find these pages seems to work fine ;).
It's simply a crawl, so trust the SEOmoz data; it is a crawl of your own site.
Set up a 301 redirect for every URL on the duplicated-content list. You don't need to delete the pages (they can't really be deleted anyway, since they are dynamically created).
/ G
-
OK! Then we have that cleared up.
If you are already using URL rewriting, there shouldn't be any duplicated content of the kind the list shows.
- But since there clearly is a problem, I would first check that you aren't using a sitemap that is generated dynamically and creates these extra URLs for the pages.
If that's not the issue:
- Then we come back to doing the 301s via .htaccess.
/ Gustav
-
You wrote: "You may also want to remove the 'zero products' pages from the index and not link to them, as they are not good for users or search engines."
Yes, you are right. I want to remove the zero-product pages, as well as all the pages that were created by human error.
As I mentioned above, the following page is not reachable by buyers on the website. Buyers can no longer get to it, because no page on the site links to it any more:
http://www.vistastores.com/126/cookwares.html
But the SEOmoz crawler still detects it, and when I looked at the crawl export in Excel I was quite confused, because the page still loads and shows all the products that appear on the original page.
Users can't navigate to that duplicate page, but Google's crawler can still reach it and will see it as duplication.
I know a little about duplication and assume Google will treat both pages as duplicates. That is the problem I want to resolve.
-
I don't think this issue is about URL rewriting; I'm quite sure of that. As I said, the duplicate URLs are not linked from anywhere on the website. A visitor who browses the site page by page will never come across a duplicate page.
URL rewriting would fix the URL structure, but what about the duplication itself, where two pages on my own website compete for the same keyword?
-
Hi again!
After reading your follow-up question, I think there is a better solution for you.
Going canonical takes a lot of manual labour, and 301s, like canonical tags, are really just a temporary fix for the symptom rather than the cure.
The best approach, and the one I always recommend to our clients, is to work with URL rewriting. It takes care of this problem completely, although it does require some coding to implement.
If you can handle this yourself and only need hints and guides, read on:
- I assume you are on an Apache server?
- If so, read this: http://httpd.apache.org/docs/2.0/misc/rewriteguide.html
Or:
- Contact the admin of your webshop and ask them to set up URL rewriting based on the URL structure you want to use.
- A good structure: root/product-category/product (a rough sketch follows below).
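To give a rough idea, a rewrite rule in .htaccess could look something like the sketch below. The target script (index.php) and its parameter names are only assumptions; the real pattern depends entirely on how your webshop serves its pages.

    RewriteEngine On
    # Sketch only: map a clean /product-category/product.html URL to a hypothetical internal handler
    RewriteRule ^([a-z0-9-]+)/([a-z0-9-]+)\.html$ index.php?category=$1&product=$2 [L,QSA]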
Hope this helped you
-
I am quite confused about the 301 redirects. There are a lot of duplicate pages, generated by wrong categorisation or by creating a new, identical category instead of editing the old one.
I want to remove all the duplicate pages from my website and set up 301 redirects.
Here is one example:
Original Page:
http://www.vistastores.com/125_126/kitchen-and-dining/cookwares.html
Duplicate Page:
http://www.vistastores.com/126/cookwares.html
The duplicate pages were created through admin mistakes or human error. No one will find or land on a duplicate page while browsing the website.
Now I want to find and remove every page that carries duplicate content.
So does it matter whether I delete those pages from the website, and should the 301 point to the associated original page or to the home page?
-
Best practice in your case would be to implement URL canonicalisation (rel="canonical").
Watch this: http://www.google.com/support/webmasters/bin/answer.py?answer=139394 (contains explanation and examples)
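For instance, the duplicate version of a category page would carry a tag like this in its <head>, pointing at the one URL you want indexed (a sketch using the cookwares URLs from this thread; use whichever version you decide to keep):

    <link rel="canonical" href="http://www.vistastores.com/125_126/kitchen-and-dining/cookwares.html" />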
In addition to this, try to prevent the page duplication from happening in the first place, though this may need to be done at the programming level.
It seems that a string such as "6_129_130" is generated from the category and navigational path: for example, if a user browses in from Outdoor versus Home Decor and arrives at the same page, the URL will carry a different number.
You may also want to remove the "zero products" pages from the index and not link to them, as they are not good for users or search engines.
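One straightforward way to keep those "zero products" pages out of the index, assuming you can edit their template, is a robots meta tag in the page head:

    <meta name="robots" content="noindex, follow" />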
-
Hi!
Good question, and one I come across often.
I would say this is not a real issue for you in the SERPs, but if you want to fix it, and competition in your field is fierce, then of course you should.
Step one: check your index at Google with a simple site:yourdomain.com search and see whether any of the duplicated content is actually indexed (chances are slim).
Step two: look at your webshop CMS; there may be a built-in function for redirecting dynamic pages to a static page (there usually is).
Step three: if the 301s can't be set up from the webshop admin, use the .htaccess file and implement the 301s for the URLs you have in the list.
Other comments:
If you would like to clean up these URLs, work with 301s and decide which URL structure is the most important (base it on the current index).
There are several ways to do a 301, but in this case, since you already have the list, the easiest way is to put the 301 redirect rules in the .htaccess file.
This could of course backfire if the duplicated pages are somehow needed by the webshop, so check the webshop first to make certain you don't break anything critical by redirecting via .htaccess.
Best regards!
/ Gustav