Up to my you-know-what in duplicate content
-
Working on a forum site that has multiple versions of the URL indexed. The WWW version ranks in the top 3 to 5 of the Google results for the domain keyword. All versions of the forum have the same PR, but the non-WWW version has 3,400 pages indexed in Google, and the WWW version has 2,100. Even worse, there's a completely separate domain (PR4) that has the forum as a subdomain, with 2,700 pages indexed in Google.
The dupe content gets completely overwhelming to think about when it comes to the PR4 domain, so I'll just ask what you think I should do with the forum. Should I get rid of the subdomain version and occasionally link between the two obviously related sites, or get rid of the highly targeted keyword domain? Also, which is better: having the targeted keyword domain on the front page of Google with only 2,100 indexed pages, or having lower rankings with 3,400 indexed pages?
Thanks.
-
You've pretty much confirmed my suspicions. I can set the redirects up myself; it's just been about 5 years since I've done any SEO work. What I meant was: should I use mod_rewrite or "Redirect 301 /oldurl /newurl"? I've forgotten a lot of the stuff I used to do with ease. My own sites were always started off right and weren't as bad as the one I'm working on now, so I'm in unfamiliar territory. Thanks for your advice, I appreciate it.
-
I want to make sure that you are getting the proper advice. Can you provide me the URLs here, or PM them to me to keep them private? Once I see the problem firsthand, I can reply with the answer here for you. I am pretty sure my advice above is the way to go, but it doesn't hurt to double check!
You need to choose ONE domain to use going forward. I don't care which one it is, but choose one. It makes sense to choose the one with the better rankings, at least from my perspective.
After that, you 301 redirect all versions of the URLs to the proper URL (which would be WWW if it were my choice).
Yes, mod_rewrite is one server-side redirect method you can use. Make sure whoever sets the redirects up knows what they are doing. A ton of server-side redirects can increase load times and hurt site speed if they're not done properly. Don't be afraid of doing it, just make sure you know what you're doing, especially since you're dealing with thousands of URLs.
You want to use permanent 301 redirects, yes.
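For reference, both forms you mentioned can produce a permanent 301; which one fits depends on whether you need pattern matching. A minimal .htaccess sketch (the paths and hostname are placeholders, not your actual URLs):

```apache
# mod_alias form: a simple one-to-one mapping (old path -> new absolute URL)
Redirect 301 /old-thread.html http://www.example.com/new-thread.html

# mod_rewrite form: the same redirect, useful once you need patterns
RewriteEngine On
RewriteRule ^old-thread\.html$ http://www.example.com/new-thread.html [R=301,L]
```

Plain `Redirect 301` is fine for a handful of one-off mappings; mod_rewrite earns its keep when one rule has to cover many URLs.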
-
Thanks, I appreciate the advice. So you don't think having 2 separate domains pointing (or redirecting) to each other occasionally will hurt anything? I already have 1,000+ URLs I need to redirect on the completely separate domain.com. As for the keyworddomain.com forum, I don't think I need many redirects: just one from separate.domain.com to keyworddomain.com, and then one there from non-WWW to WWW should fix all the broken URLs, right? When you say 301, do you mean "Redirect 301" or mod_rewrite? Thanks for the help.
-
I would first choose which version you want to use going forward. You have three versions: subdomain, non-WWW, and WWW. Don't use the subdomain; that's a given. I personally like using WWW instead of non-WWW, though there are reasons to prefer non-WWW. Given this scenario, it makes sense to use the WWW version. I know the non-WWW version has more pages indexed, but pages indexed doesn't mean much in the grand scheme of things. Given that WWW has good rankings and is more identifiable to a user, I would choose that. Of course, if you choose non-WWW, my advice below remains the same.
Now that you have chosen what version you want to use going forward, you need to do a few things:
-
Implement a server-side 301 redirect in .htaccess from non-WWW to WWW (or vice versa if you so choose), and make sure it's permanent. Going forward, this fixes your non-WWW vs. WWW issue.
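A common .htaccess sketch for that canonical-host redirect, assuming Apache with mod_rewrite enabled (example.com is a placeholder for the real domain):

```apache
RewriteEngine On
# Catch any request whose host is missing the www prefix
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
# 301 it to the same path on the www host
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Because the rule carries the requested path through in `$1`, this one rule covers every page on the site, not just the home page.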
-
Next, you need to redirect all non-WWW indexed pages and URLs to their WWW versions. This is not easy, especially with thousands of pages, but it must be done to preserve the PR and link juice so as much of it as possible passes through. I recommend seeing whether there's a plugin or extension for whatever forum software you use that can aid in this effort, or hiring a programmer to build one. It's actually not that complex; I've done it before in a similar situation and it does work. If you need more advice on that, PM me.
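If the old and new paths match one-to-one, the host rule above already covers them; a mapping table is only needed when individual URLs changed shape. One way to handle thousands of mappings without thousands of rules is a RewriteMap, sketched here under the assumption that you have access to the main server/vhost config (RewriteMap cannot be declared in .htaccess), with placeholder paths:

```apache
# Server/vhost config, not .htaccess
RewriteEngine On
RewriteMap threadmap txt:/etc/apache2/thread-redirects.txt

# thread-redirects.txt holds "old-path new-url" pairs, one per line, e.g.:
#   /forum/old-thread-123  http://www.example.com/forum/thread-123

# 301 any request whose path has an entry in the map
RewriteCond ${threadmap:%{REQUEST_URI}} !=""
RewriteRule .* ${threadmap:%{REQUEST_URI}} [R=301,L]
```

The map file itself can be generated from the forum database, which is the kind of job the plugin or programmer mentioned above would handle.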
-
You need to take care of the subdomain by setting up a permanent redirect to the main WWW version for anyone who visits the subdomain, and also set up redirects for existing subdomain pages/URLs that have PR/rank/link juice.
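A sketch for that subdomain redirect, assuming the subdomain is served by Apache and its old paths map directly onto the forum's new paths (hostnames are placeholders based on the ones mentioned in this thread):

```apache
# In the subdomain's vhost or .htaccess: send everything to the
# matching path on the chosen WWW domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^forum\.separate\.domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.keyworddomain.com/$1 [R=301,L]
```

If the subdomain's URLs do not map one-to-one onto the new forum's URLs, you'd combine this with a mapping table as described in the previous step.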
-
From there, make sure you're utilizing sitemaps properly; that can greatly increase your indexing rate and volume.
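A minimal XML sitemap sketch, listing only canonical WWW URLs (the URL and date are placeholders); submit the finished file through Google's webmaster tools:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List only the canonical WWW URLs, never the non-WWW or subdomain versions -->
  <url>
    <loc>http://www.keyworddomain.com/forum/thread-123</loc>
    <lastmod>2012-01-15</lastmod>
  </url>
</urlset>
```

Most forum platforms have a plugin that generates and updates this automatically, which is preferable to maintaining it by hand for thousands of threads.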
I hope these help. If you need anything further, please don't hesitate to PM me or post here.
Good luck!