I tried to apply duplicate content best practices, but my rankings dropped!
-
Hey,
An audit of a client's site revealed that, due to their shopping cart, all of their product pages were being duplicated.
http://www.domain.com.au/digital-inverter-generator-3300w/
and
http://www.domain.com.au/shop/digital-inverter-generator-3300w/
The easiest solution was to just block all /shop/ pages in robots.txt via Google Webmaster Tools (redirects were not an easy option).
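For reference, a robots.txt block like this (assuming the standard syntax was used; the exact file isn't shown here) keeps crawlers out of every /shop/ URL:

  # Assumed rule: blocks crawling of all /shop/ pages
  User-agent: *
  Disallow: /shop/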
This was about 3 months ago, and in months 1 and 2 we undertook some great marketing (soft social bookmarking, updating the page content, Flickr profiles with product images, product manuals onto SlideShare, etc.).
Rankings went up and so did traffic.
In month 3, the changes in robots.txt finally took effect, and rankings have decreased quite steadily over the last 3 weeks.
I'm so tempted to take off the robots.txt restriction on the duplicate content... I know I shouldn't, but it was working so well without it.
Ideas, suggestions?
-
Agreed with Alan (deeper in the comments) - you may have cut off links to these pages or internal link-juice flow. It would be much better to either 301-redirect the "/shop" pages or use the canonical tag on those pages. In Apache, the 301 is going to be a lot easier - if "/shop/product" always goes to "/product" you can set up a rewrite rule in .htaccess and you don't even need to modify the site code (which site-wide canonical tags would require).
The minor loss from the 301s should be much less than the problems that may have been created with robots.txt. As Alan said, definitely re-point your internal links to the canonical (non-shop) version.
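A minimal sketch of that rewrite rule, assuming every /shop/<slug> URL has an identical non-shop twin (adjust the pattern to the cart's real URL structure):

  # .htaccess sketch: 301 every /shop/ URL to its non-shop equivalent.
  # Assumes mod_rewrite is enabled and the mapping is one-to-one.
  RewriteEngine On
  RewriteRule ^shop/(.+)$ /$1 [R=301,L]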
-
No, I mean internal links.
If you do a 301, your internal links pointing to the /shop/ URLs will still work, but they will lose a little link juice when they are 301-redirected. You should point them at the final destination: the non-shop URL.
-
No, it was a very new site. There were only 10 links, all to the root.
Thanks for your help.
-
Yes, I would 301 then.
Are there any links pointing to the /shop/ version, including internal links? If so, I would fix them as well; since 301s leak a little link juice, you should create links that go directly to the destination page where you can.
-
The difference between blocking something in robots.txt and a 301?
Yes: the duplicates are actively created.
When products are added to the cart plugin, it automatically creates the /shop/product pages. These pages were horrible for SEO, and because they were generated automatically they could not easily be edited (the plugin developers clearly had no SEO understanding).
My client's developer created a new WP post for every product and added a shortcode calling the product from the plugin. This created the duplicate. As this is a WordPress post, the SEO was far more adaptable.
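For illustration, each post body is essentially just a shortcode along these lines ([product ...] is a hypothetical name here; the real tag depends on the cart plugin):

  [product id="123"]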
-
I don't understand the difference. Is this the reason behind the duplicates?
-
No, it's Apache.
Would you guys say it's best to just 301-redirect all /shop/product URLs to /product, then unblock them in robots.txt?
-
I work with Microsoft technologies; I don't work with CMSs like WP.
On IIS I would use an outbound URL rewrite rule to insert the meta tag. You can do this without touching the website code.
Are you by any chance hosting on IIS, or are you on Apache?
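I don't have your config, but the idea is an outbound rule along these lines (a sketch only; the rule and precondition names are mine, and you'd adjust the URL pattern to your site):

  <!-- web.config sketch (assumes the IIS URL Rewrite module is installed):
       injects a noindex,follow meta tag into /shop/ pages by rewriting
       the closing </head> tag in the HTML response. -->
  <system.webServer>
    <rewrite>
      <outboundRules>
        <rule name="NoindexShopPages" preCondition="IsHtml">
          <match filterByTags="None" pattern="&lt;/head&gt;" />
          <conditions>
            <add input="{URL}" pattern="^/shop/" />
          </conditions>
          <action type="Rewrite" value="&lt;meta name=&quot;robots&quot; content=&quot;noindex, follow&quot; /&gt;&lt;/head&gt;" />
        </rule>
        <preConditions>
          <preCondition name="IsHtml">
            <add input="{RESPONSE_CONTENT_TYPE}" pattern="^text/html" />
          </preCondition>
        </preConditions>
      </outboundRules>
    </rewrite>
  </system.webServer>
  <!-- Note: outbound rules match the uncompressed response, so HTTP
       compression may need to be disabled for the rule to fire. -->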
-
The shopping cart WP plugin creates the /shop/product pages automatically. I have very little control over them.
Instead, the developer has created post pages and inserted the product shortcodes (this gives the /product effect). I have far more control over these pages, and as such they are far better for SEO.
Do you know of a way I can noindex/follow all /shop pages in robots.txt?
-
I have been telling others not to do what you have done; it is better to use "noindex, follow" tags instead.
Link juice flows through links into pages blocked by robots.txt and is never seen again. If you use the noindex, follow meta tag, you allow the link juice to flow both in and out.
The best idea is not to have the duplicates at all. Failing that, you should use a canonical tag, and if that is not possible, use the noindex, follow tag.
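Using the URLs from your example, the tags on the /shop/ duplicate's page would look like this:

  <!-- Preferred: point Google at the canonical version -->
  <link rel="canonical" href="http://www.domain.com.au/digital-inverter-generator-3300w/" />

  <!-- Fallback if a canonical tag isn't possible: stay out of the
       index but let link juice keep flowing -->
  <meta name="robots" content="noindex, follow" />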