New Magento store with lots of duplicate content
-
Hi,
I am fairly new to SEOmoz and would really appreciate some pointers.
I have just received the results of my first crawl and I have over 1800 pages of duplicate content.
This is an example:
Cream poplin slim fit mens business shirt from Jermyn Street Shirt Co.
http://jsshirts.com.au/aylesbury-cream-poplin-slim-fit-shirt.html
Under the heading "Other URLs" it shows a figure of 10.
My questions are: what does this figure of 10 represent, and what do I need to do to remove the duplicate content?
Thanks
-
Magento 1.4 and above has native support for canonical tags, but it has to be enabled. Go to System > Configuration > Catalog > Search Engine Optimization and enable "Use Canonical Link Meta Tag For Categories" and "Use Canonical Link Meta Tag For Products".
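Once those two settings are enabled, Magento adds a canonical link element to the head of each product and category page, so parameterised or duplicate versions of a URL all point search engines back to the one clean version. As a rough sketch of what the rendered tag looks like (using the product URL from the question - the exact markup can vary by Magento version and theme):
<link rel="canonical" href="http://jsshirts.com.au/aylesbury-cream-poplin-slim-fit-shirt.html" />
Search engines should then consolidate the duplicate URLs into that single canonical page instead of reporting each variation as separate content.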
-
Yeah, canonicals are the key.
We use this extension on our Magento shop, which lets us control the canonical URLs properly as well as generate some good Google-friendly sitemaps:
http://www.mageworx.com/seo-suite-pro-magento-extension.html
Hope this helps,
James
-
Hi Jason,
Unfortunately the campaign link doesn't work, but typically the duplicate content issues in Magento arise around category pages (when refining search criteria) and product pages (when selecting different options: collar size, shirt material, colour, etc.).
For example, all of these URLs contain the same title and meta description and essentially display the same textual content:
- http://jsshirts.com.au/cufflinks.html?cufflink_material=92
- http://jsshirts.com.au/cufflinks.html?cufflink_material=94&dir=desc&order=position
- http://jsshirts.com.au/cufflinks.html?cufflink_material=92&dir=desc&order=position
- http://jsshirts.com.au/cufflinks.html?cufflink_material=94&cufflink_range=88&dir=desc&order=position
- http://jsshirts.com.au/cufflinks.html?cufflink_material=94&dir=desc&mode=list&order=position
- etc etc
A canonical tag will help attribute a single URL as the content originator.
Other options could be to control the refinements via jQuery or to nofollow those links - it all depends on a more thorough investigation, of course. A rough sketch of the canonical and nofollow approaches is below.
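To make that concrete, here is a rough sketch of what both options could look like for the cufflinks category (the canonical target and the anchor text are purely illustrative - the exact URLs depend on how the store is configured):
<!-- in the head of any filtered variation, e.g. /cufflinks.html?cufflink_material=92&dir=desc&order=position -->
<link rel="canonical" href="http://jsshirts.com.au/cufflinks.html" />
<!-- and, optionally, nofollow on the refinement links themselves -->
<a href="/cufflinks.html?cufflink_material=92" rel="nofollow">Filter by material</a>
With the canonical in place, all of those parameterised URLs should consolidate to the main category page in the index.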
Related Questions
-
Duplicate Content
I have a client based in the UK, and one of their distributors based in the UAE has copied the content for their own website. Will this affect my client's rankings because of duplicate content?
Content Development | CreativeCow
-
Knowledge base for SEO, announcing new articles on blog (dupe content)
Hi all, I'm thinking of creating a knowledge base with all the frequently asked questions in my company. This could be a great link-bait source but also a nice ranking opportunity, I think. But sometimes new articles are so topical that I also want to blog them. Can I, for example, double-post them (or post a big excerpt) on the blog and canonicalise it to the KB article? Will links to the blog have equal value to the KB links? And will this work?
Content Development | mdkay
-
Can I post my MailChimp articles on my blog without getting hit for duplicate content?
I would like to post my newsletters on my blog, but am afraid of duplicate content since you can click a link on the MailChimp email blast to view the Newsletter online. Is this considered dup content?
Content Development | RoxBrock
-
New Link Tracker from My Blog Guest
Hi, I've just come across this and thought I would share it. It looks good from the outside, but I haven't signed up yet - I'm going to in the next hour or so. Anyway, have a look: http://tracker.myblogguest.com/
Content Development | activitysuper
-
Possible to recover from Thin/Duplicate content penalties?
Hi all, first post here so sorry if this is in the wrong section. I'm after a little advice, if I may, from more experienced SEOers than myself with regards to writing off domains or keeping them.
To cut a long story short, I do a lot of affiliate marketing. Back in the day (until the past 6 months or so) you could just take a merchant's datafeed and, with some SEO, outrank them for individual products. However, since the Google Panda update this hasn't worked as well and it's now much harder to do - which is better for the end user.
The issue I have is that I got lazy and tried to see if I could still get some datafeeds to rank with only duplicate content. The sites ranked very well at first but died massively after a couple of weeks. They went from 0 to 300 hits a day in a matter of 24 hours and back to 2 hits a day. The sites now don't rank for anything, which is obviously because they are duplicate content.
The question I have is: are these domains dead, or can they be saved? I'm not talking about the duplicate content but about the domains themselves. I used about 10 domains to test things, ranging from DA 35 to DA 45 - one of the tests being whether a domain with reasonable DA can rank with duplicate content. Seeing as the test didn't work, I want to use the domains for proper sites with proper unique content; however, so far, although the new unique content is getting indexed, it is suffering from the same ranking penalties the duplicate (and now deleted) pages had. Is it worth trying to use these domains? Will Google finally remove the penalty once they notice the bad content is no longer on the site, or are the domains very much dead? Many thanks
Content Development | ccgale
-
Best way to resolve duplicate content issue?
Not sure what to do about this - I have a client who has a ton of pages (around 1200) which are all city-specific pages, for long-tail search. These are all written with paragraphs in a format such as: Order to [City] today. So every page has essentially the same content. The site only has 1562 pages in total, so with 1200 of them being city-specific same-content pages, that can't be good.
However, the problem is that these pages still rank very well (usually position 1 or 2) for the terms they're targeting, and bring in enough traffic and revenue to justify their purpose.
We also have country-specific pages, and these all have unique content rather than the scripted content on the city pages. So for example, for Italy we might have:
- Italy page (unique content)
- Rome (duplicate content)
- Milan (duplicate content)
- Venice (duplicate content)
- etc. (duplicate content)
For a low-traffic country (Austria), we tried to 301 the city pages to the country page, but that only resulted in a drop in search results for the city keywords, from (usually) position 1 to more like page 3 or 4 - so quite a drop.
So, without writing 1200 pages' worth of unique content, what would your advice be?
Content Development | TME_Digital
-
Archive older, low ranked content to help new content in Panda 2.2?
After watching the Whiteboard Friday re: Panda 2.2, it got me thinking about old content. One of the sites that I work with generates 3-10 new articles a day (movie reviews, interviews, guides, event previews, etc.) and has been doing so since 2005. They now have almost 10k articles, 7k of which are indexed.
The quality of the content varies, and much of it is dated (movies, events); much of the older content gets 0-5 pageviews a month, having been published in the days BEFORE the site was using Google News and social tools to spread the word (and earn backlinks). Note that those older articles also, of course, tend to have 100% bounce and small/zero TOS.
Is this hurting the site? With 75-100 articles a month being published, I want to make sure they get maximum exposure. I'm also concerned that crawlers get sucked into the site chasing down old BS content, and that is hurting it as well.
What should I do with this content? Should I unpublish unpopular, dated content and get it off the internet? Or do I leave it on but NOINDEX it so Google won't crawl it?
Content Development | EricPacifico