SEOmoz Internal Duplicate Content & Possible Coding Issues
-
SEOmoz Community!
I have a relatively complicated SEO issue that has me pretty stumped...
First and foremost, I'd appreciate any suggestions that you all may have. I'll be the first to admit that I am not an SEO expert (though I am trying to be). Most of my expertise is with PPC. But that's beside the point.
Now, the issues I am having:
- I have two sites: http://www.federalautoloan.com/Default.aspx and http://www.federalmortgageservices.com/Default.aspx
A lot of our SEO efforts thus far have worked well for Federal Auto Loan, and we are seeing positive impacts from them. However, we recently did a server transfer (which may or may not be related), and since that time a significant number of INTERNAL duplicate content pages have appeared in the SEOmoz crawler. The number is around 20+ for both Federal Auto Loan and Federal Mortgage Services (see attachments).
I've tried to include as much as I can via the attachments. What you will see is all of the content pages (articles) with duplicate content issues, along with a screen capture of the articles being listed as duplicates for these pages:
-
Car Financing How It Works
-
A Home Loan is Possible with Bad Credit
(Please let me know if you could use more examples)
At first I assumed it was simply an issue with SEOmoz; however, I am now worried it is impacting my sites. (I wasn't worried originally, because Federal Auto Loan has great Quality Scores and is climbing in organic presence daily.) That said, we recently launched Federal Mortgage Services for PPC, and its Quality Scores are relatively poor. In fact, we are not even ranking (scratch that, not even showing that we have content) for "mortgage refinance", even though we have unique, good, original content built specifically around "mortgage refinance" keywords.
All things considered, Federal Mortgage Services should be tighter in the SEO department than Federal Auto Loan... but it is clearly not!
I could really use some significant help here...
- Both of our sites have a number of access points:
http://www.federalautoloan.com/Default.aspx and http://www.federalmortgageservices.com/Default.aspx are both the designated home pages, and I have rel=canonical tags stating as much.
However, my sites can also be reached via the following:
http://www.federalautoloan.com
http://www.federalautoloan.com/default.aspx
http://www.federalmortgageservices.com
http://www.federalmortgageservices.com/default.aspx
Should I incorporate code that redirects traffic as well? Or is it fine with just the canonical tags?
I apologize for such a long post, but I wanted to include as much as possible up-front. If you have any further questions... I'll be happy to include more details.
Thank you all in advance for the help! I greatly appreciate it!
-
Hey Cyrus,
Thank you very much for the detailed response!
-
Hi Colt,
Looks to me like you're getting duplicate content errors because your page templates are so large that they're tripping the SEOmoz duplicate content filter, which goes off if more than 95% of the code is similar between two pages.
For example, take a look at these two URLs:
http://www.federalautoloan.com/Why-Shopping-for-an-Auto-Loan-is-Good.aspx
http://www.federalautoloan.com/Regarding-Dealer-Financing.aspx
With the gazillion links at the bottom of the two pages, the pages have 98% similar code. (You can check it yourself with a duplicate content tool.) The good news is that the TEXT content similarity is less than 40%.
1. Google is more sophisticated than Moz, but it would be a good idea to remove some of those links and put them into categories. If you could get these 100 or so links down to 20, that would be closer to ideal.
2. Just a recommendation: most of your text is in a scroll box. I'd reformat the pages so that all the text is visible without the box. Not sure if this is hurting you or not, but it seems contrary to good user experience, so I'd be inclined to think Google doesn't look too favorably on it.
3. Noticed you blocked a lot of files in your robots.txt, including your CSS. Unless you have a very specific reason for keeping Google out of these files, I'd let them be crawled, as Google uses CSS to render your page and see what content is above and below the fold. (See the robots.txt sketch below.)
4. Best practice is to redirect non-www to www versions of your site (or vice versa). If you can't do this, a canonical tag will do just as well. But how about also redirecting the /Default.aspx URLs to the versions WITHOUT it? That would look cleaner in search results. (See the web.config sketch below.)
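On point 3, here's a minimal robots.txt sketch of what I mean. I don't know your actual directory names, so the /css/ and /scripts/ paths below are hypothetical placeholders; the idea is simply to stop disallowing the files Google needs to render the page:

    User-agent: *
    # Let crawlers fetch the stylesheets and scripts used to render pages
    # (placeholder paths; use whichever directories you currently block)
    Allow: /css/
    Allow: /scripts/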
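And on point 4, here's a minimal web.config sketch of those redirects, assuming IIS 7 or later with Microsoft's URL Rewrite module installed. The rule names are mine, and you'd repeat the same idea for the mortgage site:

    <system.webServer>
      <rewrite>
        <rules>
          <!-- 301 the bare domain to the www host -->
          <rule name="NonWwwToWww" stopProcessing="true">
            <match url="(.*)" />
            <conditions>
              <add input="{HTTP_HOST}" pattern="^federalautoloan\.com$" />
            </conditions>
            <action type="Redirect" url="http://www.federalautoloan.com/{R:1}" redirectType="Permanent" />
          </rule>
          <!-- 301 any casing of /Default.aspx back to the bare root -->
          <rule name="DefaultToRoot" stopProcessing="true">
            <match url="^default\.aspx$" ignoreCase="true" />
            <action type="Redirect" url="/" redirectType="Permanent" />
          </rule>
        </rules>
      </rewrite>
    </system.webServer>

With those two rules in place, every access point you listed collapses to a single URL, and the canonical tags become a backstop rather than your only line of defense.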
Hope this helps. Best of luck with your SEO!
-
Hello Colt.
You have the joys of a Microsoft webserver.
That means all of these URLs work - and many more:
http://www.federalmortgageservices.com/A-Home-Loan-Is-Possible-with-Bad-Credit.aspx
http://www.federalmortgageservices.com/A-Home-Loan-Is-Possible-with-Bad-CRedit.aspx
http://www.federalmortgageservices.com/A-Home-Loan-Is-Possible-with-Bad-CREdit.aspx
http://www.federalmortgageservices.com/A-Home-Loan-Is-Possible-with-Bad-CREDit.aspx
It does exactly the same thing if you remove the www, which means duplicate content issues everywhere.
Also, if you do this:
http://www.federalmortgageservices.com/A-Home-Loan-Is-Possible-with-Bad-Credit.aspxZ
- the server returns a 200 response code and serves up the front page.
I couldn't find any way to make your server return a 404.
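If it helps, one common fix on IIS (again, a sketch that assumes the URL Rewrite module is installed) is to 301 every URL containing an uppercase letter to a single lowercase version, which collapses all of those casing variants into one URL:

    <rewrite>
      <rules>
        <!-- 301 any URL containing an uppercase letter to its lowercase form -->
        <rule name="ToLowerCase" stopProcessing="true">
          <match url=".*[A-Z].*" ignoreCase="false" />
          <action type="Redirect" url="{ToLower:{R:0}}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>

Note this would also rewrite your existing mixed-case URLs sitewide, so the gentler alternative is a rel=canonical tag on every page pointing at the one true casing. The missing 404 is a separate problem: if a catch-all or custom error page is serving the front page for unknown URLs, that page needs to send an explicit 404 status (in ASP.NET, Response.StatusCode = 404). Otherwise every typo becomes another live duplicate of the home page, a classic soft 404.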
-
As an aside: please feel free to send any other SEO suggestions you may have my way! I am doing my best to learn the SEO trade, and ANY advice is appreciated.