I tried to apply duplicate content best practices, but my rankings dropped!
-
Hey,
An audit of a client's site revealed that due to their shopping cart, all their product pages were being duplicated.
http://www.domain.com.au/digital-inverter-generator-3300w/
and
http://www.domain.com.au/shop/digital-inverter-generator-3300w/
The easiest solution was to block all /shop/ pages via robots.txt in Google Webmaster Tools (redirects were not an easy option).
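For reference, a robots.txt block of the /shop/ path would look something like this (the exact directives used are my assumption; the thread only says the pages were blocked):

```
# Keep crawlers out of the duplicate cart-generated pages
User-agent: *
Disallow: /shop/
```

Note that this stops crawling entirely, which matters for the link-juice discussion later in the thread.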
This was about 3 months ago, and in months 1 and 2 we undertook some great marketing (soft social bookmarking, updating the page content, Flickr profiles with product images, product manuals on SlideShare, etc.).
Rankings went up and so did traffic.
In month 3, the robots.txt changes finally took effect, and rankings have decreased quite steadily over the last 3 weeks.
I'm so tempted to lift the robots.txt restriction on the duplicate content. I know I shouldn't, but it was working so well without it.
Ideas, suggestions?
-
Agreed with Alan (deeper in the comments) - you may have cut off links to these pages or internal link-juice flow. It would be much better to either 301-redirect the "/shop" pages or use the canonical tag on those pages. In Apache, the 301 is going to be a lot easier - if "/shop/product" always goes to "/product" you can set up a rewrite rule in .htaccess and you don't even need to modify the site code (which site-wide canonical tags would require).
The minor loss from the 301s should be much less than the problems that may have been created with Robots.txt. As Alan said, definitely re-point your internal links to the canonical (non-shop) version.
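A minimal .htaccess sketch of that rewrite rule, assuming the /shop/ URLs map one-to-one onto the canonical URLs (the pattern is illustrative; adjust it to the site's actual structure):

```apache
# Permanently redirect any /shop/... URL to the same path without /shop/
# e.g. /shop/digital-inverter-generator-3300w/ -> /digital-inverter-generator-3300w/
RewriteEngine On
RewriteRule ^shop/(.*)$ /$1 [R=301,L]
```

Because this lives in .htaccess, it applies site-wide without touching the WordPress code, which is exactly why it is easier here than per-page canonical tags.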
-
No, I mean internal links.
If you do a 301, your internal links pointing to /shop/ URLs will still work, but they will lose a little link juice when they are 301-redirected. You should point them to the final destination: the non-shop URL.
-
No, it was a very new site. There were only 10 links, all to the root.
Thanks for your help.
-
Yes, I would 301 then.
Are there any links pointing to the /shop/ version, including internal links? I would fix those as well; since 301s leak link juice, you should create links that go directly to the destination page where you can.
-
The difference between blocking something in robots.txt and a 301?
The duplicates are actively created.
When products are added to the cart plugin, it automatically creates the /shop/product page. These pages were horrible for SEO, and as they were automatically created they could not be edited easily (the plugin developers clearly had no SEO understanding).
My client's developer created a new WP post for every product and added a shortcode calling the product from the plugin. This created the duplicate. As this is a WordPress post, the SEO was far more adaptable.
-
I don't understand the difference. Is this the reason behind the duplicates?
-
No, it's Apache.
Would you guys say it's best to 301-redirect all /shop/product pages to /product, then unblock them in robots.txt?
-
I work with Microsoft technologies; I don't work with CMSs like WP.
On IIS I would use an outbound URL rewrite rule to insert the meta tag. You can do this without touching the website code.
Are you by any chance hosting on IIS, or are you on Apache?
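A web.config sketch of that IIS approach, using the URL Rewrite module's outbound rules to inject a robots meta tag into /shop/ responses. The rule and precondition names are mine, and the exact matching syntax may need adjusting for a given IIS setup:

```xml
<!-- Sketch: rewrite the closing </head> tag of /shop/ pages to
     inject a noindex,follow robots meta tag before it.
     Rule/precondition names are illustrative, not from the thread. -->
<system.webServer>
  <rewrite>
    <outboundRules>
      <rule name="NoindexShopPages" preCondition="IsShopPage">
        <match filterByTags="None" pattern="&lt;/head&gt;" />
        <action type="Rewrite"
                value="&lt;meta name=&quot;robots&quot; content=&quot;noindex, follow&quot; /&gt;&lt;/head&gt;" />
      </rule>
      <preConditions>
        <preCondition name="IsShopPage">
          <add input="{URL}" pattern="^/shop/" />
        </preCondition>
      </preConditions>
    </outboundRules>
  </rewrite>
</system.webServer>
```

The appeal, as noted above, is that the tag is added at the server layer, so the CMS and its plugins never need to be modified.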
-
The shopping cart WP plugin creates the /shop/product pages automatically, so I have very little control over them.
Instead, the developer has created post pages and inserted the product shortcodes (this gives the /product effect). I have far more control over these pages, and as such they are far better for SEO.
Do you know of a way I can noindex,follow all the /shop pages via robots.txt?
-
I have been telling others not to do what you have done; it is better to use "noindex, follow" tags instead.
Link juice flows into pages blocked by robots.txt through links pointing to them, never to be seen again. If you use the noindex,follow meta tag, you allow the link juice to flow in and out.
The best idea is not to have the duplicates at all. After that, you should use a canonical tag, and if that is not possible, use the noindex,follow tag.
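Concretely, the tag goes in the head of each /shop/ page. Note that robots.txt cannot carry a noindex directive; the /shop/ paths would also have to be removed from robots.txt so crawlers can actually fetch the pages and see the tag:

```html
<!-- Keeps the page out of the index while letting crawlers follow its
     links, so link juice flows through rather than dead-ending.
     Googlebot must be able to crawl the page for this to be honored,
     so the robots.txt Disallow on /shop/ has to be lifted first. -->
<meta name="robots" content="noindex, follow" />
```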