What's the best way to solve this site's duplicate content issues?
-
Hi,
The site is www.expressgolf.co.uk and is an e-commerce website with lots of categories and brands.
I'm trying to achieve a single unique URL for each category/brand page to avoid duplicate content and to get the correct URLs indexed.
Currently it looks like this...
Main URL
http://www.expressgolf.co.uk/shop/clothing/galvin-green
Different Versions
http://www.expressgolf.co.uk/shop/clothing/galvin-green/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/1
http://www.expressgolf.co.uk/shop/clothing/galvin-green/2
http://www.expressgolf.co.uk/shop/clothing/galvin-green/3
http://www.expressgolf.co.uk/shop/clothing/galvin-green/4
http://www.expressgolf.co.uk/shop/clothing/galvin-green/all
http://www.expressgolf.co.uk/shop/clothing/galvin-green/1/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/2/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/3/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/4/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/all/
Firstly, what is the best course of action to make all versions point to the main URL and keep them from being indexed: a canonical tag, noindex, or blocking them in robots.txt?
Secondly, do I just need to 301 the trailing-slash (/) versions of all URLs to the non-slash URLs?
I'm sure this question has been answered but I was having trouble coming to a solution for this one site.
Cheers,
Paul
-
Greetings Champion!
Canonical linking is the best way to go!
For your conundrum with the different Galvin Green versions, I would find out which URL version is dominant or has the most link juice and structure the redirects to that link. For instance, let's say http://www.expressgolf.co.uk/shop/clothing/galvin-green/ is the dominant link; I would have the copy URLs pointing back to that one like so:
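A minimal sketch of that canonical setup, assuming the plain brand URL is chosen as the preferred version (pick whichever version actually has the most links pointing at it):

```html
<!-- Placed in the <head> of every duplicate version
     (e.g. /shop/clothing/galvin-green/2 or /galvin-green/all/).
     Tells search engines to consolidate signals onto the one
     preferred URL; the chosen URL here is an assumption. -->
<link rel="canonical" href="http://www.expressgolf.co.uk/shop/clothing/galvin-green" />
```

The preferred page itself can carry a self-referencing canonical so any parameter or slash variations of it fold back in as well.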
I manage an e-commerce site as well and had the same issue with categories. I basically did the same thing for each level of my site: categories with URLs ending in /1, /2, /3 and so on I would redirect to the first page to make that URL stronger.
Think about what noindexing or nofollowing would actually do: you would lose so much link power that you could otherwise harness. With redirects you can focus that power in a more effective manner.
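For the trailing-slash half of the question, a hedged sketch of the 301 in Apache mod_rewrite, assuming the site runs on Apache and the non-slash versions are the preferred ones:

```apache
# Hypothetical .htaccess sketch (assumes Apache with mod_rewrite enabled).
# 301-redirects any trailing-slash URL to its non-slash twin, e.g.
#   /shop/clothing/galvin-green/  ->  /shop/clothing/galvin-green
RewriteEngine On
# Skip real directories so legitimate directory URLs aren't broken
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)/$ /$1 [R=301,L]
```

If the site runs on Nginx or IIS instead, the equivalent rule goes in that server's config; the principle (one permanent redirect rule rather than per-page redirects) is the same.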
Also, I was snooping around and clicked Golf Clubs, which took me to this URL:
expressgolf.co.uk/shop/clubs
That made me realize you have just "clubs" at the end. I would use "golf-clubs" instead, because when you do a keyword search for "clubs", bars come up. Putting the full keyword in the URL can greatly strengthen it. Just a tip if you want to use it or not, friend ^.^
Good luck on your quest for Page 1!
Justin Smith
-
It really sounds like canonical is what you need here. Matt Cutts has a video explaining how it works.
-
Hi,
I would say canonical, or rel="next" and rel="prev":
http://googlewebmastercentral.blogspot.com/2011/09/pagination-with-relnext-and-relprev.html
Those would probably be your best bets. A 301 would solve it from a search engine perspective, but it's definitely not good for users wanting to see the different paginated versions of the category in question.
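A sketch of the pagination markup from that blog post, applied to the Galvin Green pages (treating the non-slash URLs as the preferred form is an assumption):

```html
<!-- In the <head> of page 2 (/shop/clothing/galvin-green/2):
     marks this page as part of a paginated series so search
     engines treat the series as one logical set. -->
<link rel="prev" href="http://www.expressgolf.co.uk/shop/clothing/galvin-green" />
<link rel="next" href="http://www.expressgolf.co.uk/shop/clothing/galvin-green/3" />
<!-- Each page also canonicals to its own non-slash URL, which
     folds the trailing-slash duplicate of that page into it. -->
<link rel="canonical" href="http://www.expressgolf.co.uk/shop/clothing/galvin-green/2" />
```

Page 1 would carry only rel="next", the last page only rel="prev", and the /all version (if kept) gets a plain self-referencing canonical.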
Hope this helps!
w00t!