Up to my you-know-what in duplicate content
-
Working on a forum site that has multiple versions of its URL indexed. The WWW version ranks between 3rd and 5th in the Google results for the domain keyword. All versions of the forum have the same PR, but the non-WWW version has 3,400 pages indexed in Google while the WWW version has 2,100. Even worse, there's a completely separate domain (PR4) that carries the forum as a subdomain, with 2,700 pages indexed in Google.
The duplicate content gets completely overwhelming to think about when it comes to the PR4 domain, so I'll just ask what you think I should do with the forum. Should I get rid of the subdomain version and occasionally link between the two obviously related sites, or get rid of the highly targeted keyword domain? Also, which is better: having the targeted keyword on the front page of Google with only 2,100 indexed pages, or lower rankings with 3,400 indexed pages?
Thanks.
-
You've pretty much confirmed my suspicions. I can set the redirects up myself; it's just been about 5 years since I've done any SEO work. What I meant was: should I use mod_rewrite or "Redirect 301 /oldurl /newurl"? I've forgotten a lot of stuff I used to do with ease. My own sites were always started off right and weren't as bad as the one I'm working on now, so I'm in unfamiliar territory. Thanks for your advice, I appreciate it.
-
I want to make sure you are getting the proper advice. Can you post the URLs here, or PM them to me if you'd like to keep them private? Once I see the problem firsthand, I can reply with the answer here for you. I am pretty sure my advice above is the way to go, but it doesn't hurt to double check!
You need to choose ONE domain to use going forward. I don't care which one it is, but choose one. From my perspective, it makes sense to choose the one with the better rankings.
After that, you 301 redirect all versions of the URLs to the proper URL (which would be the WWW version if it were my choice).
Yes, mod_rewrite is one server-side redirect method you can choose. Make sure whoever sets the rules up knows what he is doing: a ton of server-side redirects can increase load times and cause site-speed issues if it's not done properly. Don't be afraid of doing it, just make sure you know what you are doing, especially since you're dealing with thousands of URLs.
You want to use permanent 301 redirects, yes.
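For reference, here's a minimal .htaccess sketch of the two forms you mentioned, with placeholder paths and domain; either one issues the same permanent 301:

    # mod_alias form -- simplest for one-to-one mappings
    Redirect 301 /oldurl http://www.example.com/newurl

    # mod_rewrite form -- better once you need patterns or conditions
    RewriteEngine On
    RewriteRule ^oldurl$ http://www.example.com/newurl [R=301,L]

For a handful of fixed URLs, "Redirect 301" is easier to maintain; mod_rewrite earns its keep once you need host-level or pattern-based rules.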
-
Thanks, I appreciate the advice. So you don't think having two separate domains pointing (or redirecting) to each other occasionally will hurt anything? I already have 1,000+ URLs I need to redirect on the completely separate domain.com. As for the keyworddomain.com forum, I don't think I need many redirects: one from separate.domain.com to keyworddomain.com, and then one there from non-WWW to WWW, should fix all the broken URLs, right? When you say 301, do you mean "Redirect 301" or mod_rewrite? Thanks for the help.
-
First, I would choose which version you want to use going forward. You have three versions: subdomain, non-WWW, and WWW. Don't use the subdomain; that is a given. I personally like using WWW over non-WWW, though there are reasons to go the other way. Given this scenario, it makes sense to use the WWW version. I know the non-WWW version has more pages indexed, but pages indexed doesn't mean much in the grand scheme of things. Since WWW has the good rankings and is more identifiable to a user, I would choose that. Of course, if you choose non-WWW, my advice below remains the same.
Now that you have chosen which version you want to use going forward, you need to do a few things:
-
Implement a server-side 301 redirect in .htaccess from non-WWW to WWW (or vice versa if you so choose), and make sure it's permanent. Going forward, this fixes your non-WWW vs. WWW issue.
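A minimal sketch of that rule, assuming keyworddomain.com stands in for the real domain:

    RewriteEngine On
    # Permanently send any non-WWW request to the same path on the WWW host
    RewriteCond %{HTTP_HOST} ^keyworddomain\.com$ [NC]
    RewriteRule ^(.*)$ http://www.keyworddomain.com/$1 [R=301,L]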
-
Next, you need to redirect every indexed non-WWW page and URL to its WWW version. This is not easy, especially with thousands of pages, but it must be done to preserve the PR and link juice so that as much as possible passes through. I recommend checking whether there's a plugin or extension for your forum software that can aid you in this effort, or hiring a programmer to build one. It's actually not that complex; I've done it before in a similar situation, and it does work. If you need more advice on that, PM me.
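If no plugin exists, one possible approach (a sketch, assuming Apache and hypothetical file paths) is a RewriteMap: a plain-text lookup table of old-path to new-URL pairs. Note that RewriteMap must be declared in the server or vhost config, not in .htaccess, and URLs with query strings need extra handling that I'll leave aside here:

    # legacy-map.txt holds one "old-path new-url" pair per line,
    # keys include the leading slash, e.g.:
    #   /old-thread-123   http://www.keyworddomain.com/forum/new-thread-123
    RewriteEngine On
    RewriteMap legacy "txt:/etc/apache2/legacy-map.txt"
    # Redirect only when the lookup returns a full URL; unmapped paths fall through
    RewriteCond "${legacy:%{REQUEST_URI}}" "^https?://"
    RewriteRule .* "${legacy:%{REQUEST_URI}}" [R=301,L]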
-
Take care of the subdomain by setting up a permanent redirect to the main WWW version for anyone who lands there, and also set up redirects for existing subdomain pages/URLs that carry PR/rank/link juice.
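A sketch of the blanket rule, placed in the subdomain's own .htaccess and using the placeholder hostnames from this thread:

    RewriteEngine On
    # Forward every request, path included, to the same location on the chosen domain
    RewriteCond %{HTTP_HOST} ^separate\.domain\.com$ [NC]
    RewriteRule ^(.*)$ http://www.keyworddomain.com/$1 [R=301,L]

If the subdomain's paths don't map one-to-one onto the main forum's, the RewriteMap approach above can handle the exceptions.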
-
From there, make sure you're utilizing sitemaps properly; that can greatly increase your indexing rate and volume.
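For completeness, a bare-bones sitemap entry; the URL is a placeholder, and there would be one url block per canonical WWW page:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.keyworddomain.com/forum/some-thread</loc>
      </url>
    </urlset>

Reference it with a "Sitemap: http://www.keyworddomain.com/sitemap.xml" line in robots.txt and submit it in Webmaster Tools.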
I hope these help. If you need anything further, please don't hesitate to PM me or post here.
Good luck!
-
Related Questions
-
Duplicate Content/Similar Pages
Hello, I'm working on our site and I'm running into an issue with duplicate content. Our company manufactures heavy-duty mobile lifts. We have two main lifts, which are the same except for capacity. We want to keep the format similar, and the owner of the company wants each lift to have its own dedicated page. Obviously, since the layout is the same and the content is similar, I'm getting the duplicate content issue. We also have a section for our accessories and a section for our parts, each with individual pages per accessory/part. Again, those pages are laid out in a similar fashion to keep things cohesive, and the content is different yet similar: different terminology, part numbers, stock numbers, etc., but similar overall wording. What can I do to combat these issues? I think our ratings are dropping due to the duplicate content.
Technical SEO | slecinc
-
How do I avoid this issue of duplicate content with Google?
I have an ecommerce website which sells a product that has many different variations based on a vehicle's make, model, and year. Currently, we sell this product on one page, "www.cargoliner.com/products.php?did=10001", and we show a modal to sort through each make, model, and year. This is important because we have different prices/configurations for each combination. For example, for the Jeep Wrangler and Jeep Cherokee, we might have different products:

Ultimate Pet Liner - Jeep Wrangler 2011-2013 - $350
Ultimate Pet Liner - Jeep Wrangler 2014-2015 - $350
Ultimate Pet Liner - Jeep Cherokee 2011-2015 - $400

Although the typical consumer might think we have one product (the Ultimate Pet Liner), we look at these as many different products, each with a different configuration and different variants. We do NOT have unique content for each make, model, and year; we have the same content and images for each. When the customer selects their make, model, and year, we just search and replace the text. For example, when a customer selects 2015 Jeep Wrangler from the modal, the page keeps the same URL (www.cargoliner.com/products.php?did=10001) but the product title will say "2015 Jeep Wrangler".

Here's my problem: we want all of these individual products to have their own unique URLs (cargoliner.com/products/2015-jeep-wrangler) so we can reference them in emails to customers, and ideally we'd start creating unique content for them. Our only problem is that there will be hundreds of them, and they don't have unique content other than the swapped-in product title and the change of variants. Also, we don't want our URL www.cargoliner.com/products.php?did=10001 to lose its link juice.

Here's my question(s): My assumption is that I should keep my URL www.cargoliner.com/products.php?did=10001 and be able to sort through the products on that page. Then I should go ahead and make individual URLs for each of these products (i.e. cargoliner.com/products/2015-jeep-wrangler) but add a "noindex, nofollow" tag to each page. Is this what I should do? How reliable is a "noindex, nofollow" on a webpage? Does Google still index it? Am I at risk for duplicate content penalties? Thanks!
Technical SEO | kirbyfike
-
Does adding a noindex tag reduce duplicate content?
I've been working under the assumption for some time that if I have two (or more) pages which are very similar, I can add a noindex tag to the pages I don't need, and that will reduce duplicate content. As far as I know, this removes the tagged pages from Google's index and stops any potential issues with duplicate content. It's the second part of that assumption that I'm now questioning. Despite pages having the noindex tag, they continue to appear in Google Search Console as duplicate content, soft 404, etc. That is, new pages that I know to have the noindex tag are appearing regularly. My thoughts on this so far are that Google can still crawl these pages (although it won't index them), so it shows them in GSC due to a crude issue-flagging process. I mainly want to know: a) Is the actual Google algorithm sophisticated enough to ignore these pages even though GSC doesn't? b) How do I explain this to a client?
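For reference, the tag in question looks like this in a page's head section (a hypothetical example; "follow" keeps the page's outgoing links crawlable even though the page itself stays out of the index):

    <head>
      <!-- Keep this page out of Google's index, but let link equity flow onward -->
      <meta name="robots" content="noindex, follow">
    </head>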
Technical SEO | ChrisJFoster
-
Duplicate content. Wordpress and Website
Hi all, will Google punish me for having duplicate blog posts on both my website's blog and WordPress? Thanks
Technical SEO | Mike.NW
-
Cross domain shared/duplicate content
Hi, I am working on two websites which share some of the same content and we can't use 301s to solve the problem; would you recommend using canonical tags? Thanks!
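For reference, a cross-domain canonical is the standard link element placed on the duplicate page, pointing at the preferred site's copy; the domains here are placeholders:

    <head>
      <!-- On the duplicate copy at site-b, point search engines at the original -->
      <link rel="canonical" href="http://www.site-a.com/shared-page.html">
    </head>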
Technical SEO | J_Sinclair
-
Link Structure & Duplicate Content
I am struggling with how I should handle the link structure on my site. Right now most of my pages follow this pattern: Home -> Department -> Service Group -> Content Page. For example:

Home -> IT Solutions -> IT Support & Managed Services -> IT Support
Home -> IT Solutions -> IT Support & Managed Services -> Managed Services
Home -> IT Solutions -> IT Support & Managed Services -> Help Desk Services
Home -> IT Solutions -> Virtualization & Data Center Solutions -> Virtualization
Home -> IT Solutions -> Virtualization & Data Center Solutions -> Data Center Solutions

This structure lines up with our business and makes logical sense, but I am not sure how to handle the department and service-group pages. Right now, clicking them just brings you to a page with a small snippet for each of the links below; the real content is on the content pages. What I am worried about is that the snippets on those pages are just a paragraph or two taken from the content pages. Will this hurt me and get treated as duplicate content? What is the best practice for dealing with this? Those department/service-group pages have some good content on them, but it's just parts of other pages. Am I okay doing this, given there are no direct duplicates of other pages, just parts of a few pages? Any help on this would be great. Thanks in advance.
Technical SEO | ZiaTG
-
Multiple URLs in CMS - duplicate content issue?
About a month ago, we finally ported our site over to a content management system called Umbraco. Overall it's okay, and certainly better than what we had before (i.e. nothing; just static pages). However, I did discover a problem with the URL management within the system. We had a number of pages that existed as follows: sparkenergy.com/state/name. They now live within certain folders, like so: sparkenergy.com/about-us/service-map/name.

We had an aliasing system set up whereby you could call the URL basically whatever you wanted, which allowed us to retain the old URL structure. However, we have found that the alias does not override the new URL; it just adds another way of reaching a page. That means the same pages can open under at least two different URLs, such as http://www.sparkenergy.com/state/texas and http://www.sparkenergy.com/about-us/service-map/texas. I've tried pointing to the aliased URL from other parts of the site with the rel canonical tag, without success.

How much of a problem is this with respect to duplicate content? Should we bite the bullet, remove the aliased URLs, and do 301s to the new folder structure?
Technical SEO | ufmedia
-
Duplicate Content from Google URL Builder
Hello to the SEOmoz community! I am new to SEOmoz, to SEO implementation, and to the community, and I recently set up a campaign on one of the sites I manage. I was surprised at the amount of duplicate content that showed up as errors, and when I took a deeper look, the majority of the errors were caused by pages on the root domain that I had put through the Google Analytics URL Builder. After this, I went into Webmaster Tools and changed the parameter handling to ignore all of the tags the URL Builder adds to the end of the domain. SEOmoz recently recrawled my site, and the errors caused by the URL Builder are still being shown as duplicates. Any suggestions on what to do?
Technical SEO | joshuaopinion