Error msg 'Duplicate Page Content', how to fix?
-
Hey guys,
I'm new to SEO and I'm getting the following error message: 'Duplicate Page Content'. Of course I know what it means, but my question is: how do you delete the old pages that have duplicate content?
I used to run my website on Joomla! but have since moved to Shopify. I can see that the duplicated content is still from the old Joomla! site, and I would like to learn how to delete it (or what the best practice is in this situation).
Any advice would be very helpful!
Cheers,
Peter
-
I don't think the problem is https vs http here.
Maybe you could share one of the errors with me?
Fredrik
-
Hi Fredrik,
Thanks for the follow-up! I got the error message from my SEOmoz dashboard. I just got it a couple of days ago, and the first crawl has been completed. I can also see a similar error message in Google Webmaster Tools. The old pages are on an older SSL-secured site, but the domain is still the same.
For example, my old site was https://www.kellanapparel.com. Now I don't use the SSL certificate, and the website is hosted on a new platform (Shopify) but with the same domain minus the 's' (so now it's http://www.kellanapparel.com). Maybe the old SSL certificate is still valid, and because the old content is associated with it, it's still triggering the error? Would this make a difference?
Cheers,
Peter
-
Hi Peter
When you say you got the message 'Duplicate Page Content', is this from Google Webmaster Tools?
You mention an old page. Is it still active under a different domain?
To avoid the duplicate content error you need to make sure that there is only ONE URL for each piece of content. There are many ways of achieving this; here are just a few:
Canonical tags
Try to use canonical tags wherever you can. They can be implemented on both versions; the original version should have a tag pointing to itself. More info on exactly how to do this here:
http://www.seomoz.org/learn-seo/canonicalization
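For illustration, a minimal sketch of what such a tag looks like, using Peter's domain with a made-up path:

    <!-- Placed in the <head> of every version of the page; -->
    <!-- href points to the one URL you want indexed. -->
    <link rel="canonical" href="http://www.kellanapparel.com/example-page" />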
Remove old site from index
Make sure the old system is no longer indexed. Do this either with robots.txt or by adding a noindex, nofollow meta tag.
More on robots.txt here:
http://www.seomoz.org/learn-seo/robotstxt
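As a rough sketch, either of these would do it (the /old-site/ path is hypothetical; use whatever path the old Joomla! pages actually live under):

    # robots.txt - stop crawlers from visiting the old section
    User-agent: *
    Disallow: /old-site/

Or, in the head of each old page:

    <!-- tell search engines not to index this page or follow its links -->
    <meta name="robots" content="noindex, nofollow" />

One caveat: if a page is blocked in robots.txt, crawlers never see a noindex tag on it, so pick one approach per page.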
301 redirects
Use 301 redirects to point every old page to its new equivalent. You can also use them to make non-www pages always resolve to www, or the other way around. If you have old pages on a different system, make sure these are not online; if they are still active on a different domain, 301 redirect them to the new pages.
More on this here:
http://www.seomoz.org/learn-seo/redirection
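For example, assuming the old pages still sit on an Apache server you control, a minimal .htaccess sketch (the URLs here are illustrative, not Peter's actual pages):

    # 301 redirect a single old Joomla! URL to its new home
    Redirect 301 /old-page.html http://www.kellanapparel.com/new-page

    # Force non-www requests onto the www version of the domain
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^kellanapparel\.com$ [NC]
    RewriteRule ^(.*)$ http://www.kellanapparel.com/$1 [R=301,L]

On a hosted platform like Shopify you can't edit the server config, but it has a built-in URL redirects tool that does the same job.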
You can find more info on how to avoid this error in this post:
http://www.seomoz.org/learn-seo/duplicate-content
Good luck and hope these tips can help you get started.
Fredrik
Related Questions
-
Duplicate content due to numerous sub-category-level pages
We have a healthcare website which lists doctors by their medical speciality, with a paginated series to list hundreds of doctors. The algorithm: a search for a dentist in the Newark locality of New York returns results filled with dentists from Newark, followed by a list of dentists in locations near Newark. So all localities under a city share the same set of doctors, jumbled and distributed across multiple pages based on nearness to the locality. When we don't have any dentists in Newark, we populate results for nearby localities and create a page. The issue: when the number of dentists in New York is <11, all locality x dentist pages have jumbled-up results pointing to the same 10 doctors. The issue is even more severe when we have only 1-3 dentists in the city; every locality page is then exactly the same as the city-level page. We have about 2.5 million pages with this scenario. City-level page: https://www.example.com/new-york/dentist (5 dentists). Locality-level pages: https://www.example.com/new-york/dentist/clifton, https://www.example.com/new-york/dentist/newark (each contains the same 5 dentists as the New York city-level page, in jumbled-up or the same order). What do you think we should do in such a case? We have discussed putting a noindex on locality-level pages, or applying a canonical pointing from locality level to city level, but we are still not 100% sure.
Technical SEO | | ozil0 -
What to do about removing pages for the 'offseason' (i.e. the same URLs will be brought back in 6-7 months)?
I manage a site for an event that runs annually, and now that the event has concluded we would like to remove some of the pages (schedule, event info, TV schedule, etc.) that won't be relevant again until next year's event. That said, if we simply remove those pages from the web, I'm afraid that we'll lose out on valuable backlinks that already exist, and when those pages return they will have the same URLs as before. Is there a best course of action here? Should I redirect the removed pages to the homepage for the time being using a 302? Is there any risk there if the 'temporary' period is ~7 months? Thanks in advance.
Technical SEO | | KTY550 -
Duplicate pages on WordPress
I am doing SEO on a site running on WP, and all pages and categories are duplicated under domain.com/site/. However, when it got crawled I saw that all domain.com/ pages have a rel=canonical pointing to the main page (does that mean something?). The thing is, I will fix the permalink structure, and I think WP automatically redirects when it is changed from /?page_id= to /%category%/%postname%/ or /%postname%/. Is there something I'm missing? The second problem is a forum. After a crawl it found over 5k errors and over 5k warnings: duplicate page content; duplicate page title; overly-dynamic URLs; missing meta descriptions; title element too long. All of these come from domain.com/forum/ (fortunately, there are no domain.com/site/forum duplicates). What could be an easy solution to this?
Technical SEO | | OVJ0 -
How to protect against duplicate content?
I just discovered that my company's 'dev website' (which mirrors our actual website and is where we stage new content before publishing it to the actual website) is being indexed by Google. My first thought is that I should add a rel=canonical tag pointing to the actual website, so that Google knows the duplicate content from the dev site is to be ignored. Is that the right move? Are there other things I should do? Thanks!
Technical SEO | | williammarlow0 -
After fixing a duplicate pages problem, keyword rankings have fallen off a cliff!
We have recently signed up to SEOmoz and found that our site had over 2,500 duplicated pages. We reported it to the web designer, who found links on the website to an old prototype version of the site and ran a SQL job to get rid of them. Doing this got rid of 90% of them. However, this morning Moz did another crawl of our website and our keyword rankings have fallen off a cliff, in particular an important one that we were at position 1 for. We are now on the fifth page. Can anyone shed any light on it? Will this be temporary? Thanks Stuart
Technical SEO | | Stuart260 -
How can I see the pages that cause duplicate content?
SEOmoz PRO is giving me duplicate content errors. However, I don't see how I can get a list of the pages that are duplicates of the one shown. If I don't know which pages/URLs cause the issue, I can't really fix it. The only way would be placing canonical tags, but that's not always the best solution. Is there a way to see the actual duplicate pages?
Technical SEO | | 5MMedia0 -
Magento and Duplicate content
I have been working with Magento over the last few weeks and I am becoming increasingly frustrated with the way it is set up. If you go to a product page and remove the subfolders one by one, you can reach the same product page at several URLs, causing duplicate content. All Magento sites seem to have this weakness. Take this site as an example, because I know it is built on Magento: http://www.gio-goi.com/men/clothing/tees/throve-t-short.html?cid=756 As you remove the 'tees', then the 'clothing' and 'men' subfolders, you can still reach the product page. My first question is how big an issue this is, and my second is whether anyone has ideas on how to solve it. Also, I was wondering how Google treats question marks in URLs. Should you try to avoid them unless you are filtering? Thanks
Technical SEO | | gregster10001 -
Complex duplicate content question
We run a network of three local websites covering three places in close proximity. Each site has a lot of unique content (mainly news), but there is a business directory that is shared across all three sites. My plan is for the search engines to only index the directory businesses that are actually located in the place each site is focused on, i.e. listing pages for businesses in Alderley Edge are only indexed on alderleyedge.com and businesses in Prestbury only get indexed on prestbury.com, but all businesses have a listing page on each site. What would be the most effective way to do this? I have been using rel=canonical, but Google does not always seem to honour it. Would using meta noindex tags where appropriate be the way to go? Or would changing the URL structure to include the place name and using robots.txt be a better option? As an aside, my current URL structure is along the lines of: http://dev.alderleyedge.com/directory/listing/138/the-grill-on-the-edge Would changing this have any SEO benefit? Thanks Martin
Technical SEO | | mreeves0