Duplicate Content issue on pages with Authority and decent SERP results
-
Hi, I'm not sure what the best thing to do here is.
I've got quite a few duplicate page errors in my campaign. I must admit the pages were originally built just to rank for a keyword variation.
e.g. Main page keyword is [Widget in City] the "duplicate" page is [Black Widget in City]
I guess the normal route for dealing with duplicate pages is to add a canonical tag or do a 301 redirect, yes? Well, these pages have some Page Authority and are ranking quite well for their exact keywords, so what do I do?
-
Just thought I would update the results on this.
After taking PoolGuy's advice I 301'd the duplicate pages to their main pages, i.e.
[Black Widget in City] ->301-> [Widget in City]
Now almost every single main page has moved up, on average 12 places, to #1 on Google!
The duplicate pages are still showing in the SERPs but appear to be dropping, and I guess they will eventually disappear.
So I'm now a staunch promoter of dealing with duplicate content!
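For anyone wanting to do the same, here is a minimal sketch of the kind of 301 rules described above, assuming an Apache server with mod_alias enabled; the URLs are hypothetical stand-ins for the [Black Widget in City] → [Widget in City] pattern:

```apache
# Hypothetical .htaccess rules: permanently (301) redirect each
# color-variation page to its main page so rankings consolidate.
Redirect 301 /black-widget-in-city /widget-in-city
Redirect 301 /red-widget-in-city /widget-in-city
```

On nginx or IIS the syntax differs, but the idea is the same: one permanent redirect per duplicate URL, pointing at the page you want to keep.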
-
I would predict you will continue to rank well for your duplicate pages. I further predict that, because of Panda, they will drag down the rankings of the rest of your site. If you can add good unique content to 50 different colors of a product, then by all means, make 50 pages. But if you can't generate the content necessary to establish each individual page as "quality", then I would not go that route at all: remove them and 301 redirect all those pages into the main page. If you think they still hold value for users, then keep them up and just use a canonical tag.
-
Hi There
From what you've described, it does sound like duplicate content is a potential issue for your site; it's great that you're addressing it.
A canonical tag, as you've suggested, is a logical and sensible course of action. Correctly implemented, it will certainly help.
One thought: what would provide the best user experience for your visitors? Have a think about whether these 'additional' pages actually add any value for your visitors. I suspect you could include every color of a given type of widget on a single page for that type, rather than having a separate page for each color.
In that scenario, consider 301 redirecting those additional/duplicate pages to the appropriate main widget page; that would almost certainly provide a cleaner, better user experience. Obviously you need to take your business objectives and target audience goals into account when making such a decision.
So really, it's Canonicals or 301s, depending on what best matches your business objectives and the user experience.
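If you go the canonical route, a minimal sketch (with a hypothetical URL) is a single tag in the `<head>` of each duplicate/variation page, pointing at the main page you want search engines to index:

```html
<!-- In the <head> of the variation page, e.g. /black-widget-in-city -->
<link rel="canonical" href="https://www.example.com/widget-in-city" />
```

Unlike a 301, the variation page stays accessible to visitors; the canonical tag is a hint to search engines about which URL should receive the ranking signals.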
Hope that helps,
Regards
Simon