SEOmoz Internal Duplicate Content & Possible Coding Issues
-
SEOmoz Community!
I have a relatively complicated SEO issue that has me pretty stumped...
First and foremost, I'd appreciate any suggestions that you all may have. I'll be the first to admit that I am not an SEO expert (though I am trying to be). Most of my expertise is with PPC. But that's beside the point.
Now, the issues I am having:
- I have two sites: http://www.federalautoloan.com/Default.aspx and http://www.federalmortgageservices.com/Default.aspx
A lot of our SEO efforts thus far have worked well for Federal Auto Loan... and we are seeing positive impacts from them. However, we recently did a server transfer (which may or may not be related)... and since that time a significant number of INTERNAL duplicate content pages have appeared in the SEOmoz crawler. The number is around 20+ for both Federal Auto Loan and Federal Mortgage Services (see attachments).
I've tried to include as much as I can via the attachments. What you will see is all of the content pages (articles) with duplicate content issues, along with a screen capture of the articles being listed as duplicates for the pages:
-
Car Financing How It Works
-
A Home Loan is Possible with Bad Credit
(Please let me know if you could use more examples)
At first I assumed it was simply an issue with SEOmoz... however, I am now worried it is impacting my sites (I wasn't originally because Federal Auto Loan has great quality scores and is climbing in organic presence daily). That being said, we recently launched Federal Mortgage Services for PPC... and my quality scores are relatively poor. In fact, we are not even ranking (scratch that, not even showing that we have content) for "mortgage refinance" even though we have content (unique, good, and original content) specifically around "mortgage refinance" keywords.
All things considered, Federal Mortgage Services should be tighter in the SEO department than Federal Auto Loan... but it is clearly not!
I could really use some significant help here...
- Both of our sites have a number of access points:
http://www.federalautoloan.com/Default.aspx and http://www.federalmortgageservices.com/Default.aspx are both the designated home pages. And I have rel=canonical tags stating such.
However, my sites can also be reached via the following:
http://www.federalautoloan.com
http://www.federalautoloan.com/default.aspx
http://www.federalmortgageservices.com
http://www.federalmortgageservices.com/default.aspx
Should I incorporate code that 301-redirects traffic as well? Or is it fine with just the rel=canonical tags?
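For reference, here is the kind of rule I've been considering, though I haven't tested it. This is only a sketch assuming IIS 7+ with the URL Rewrite module installed; the rule names are just for illustration:

```xml
<!-- web.config fragment (sketch): 301 non-www to www, and strip Default.aspx -->
<system.webServer>
  <rewrite>
    <rules>
      <!-- Redirect http://federalautoloan.com/... to http://www.federalautoloan.com/... -->
      <rule name="NonWwwToWww" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTP_HOST}" pattern="^federalautoloan\.com$" />
        </conditions>
        <action type="Redirect" url="http://www.federalautoloan.com/{R:1}" redirectType="Permanent" />
      </rule>
      <!-- Redirect Default.aspx (any casing; IIS pattern matching is case-insensitive by default) to / -->
      <rule name="StripDefaultAspx" stopProcessing="true">
        <match url="^(.*/)?Default\.aspx$" />
        <action type="Redirect" url="http://www.federalautoloan.com/{R:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```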
I apologize for such a long post, but I wanted to include as much as possible up-front. If you have any further questions... I'll be happy to include more details.
Thank you all in advance for the help! I greatly appreciate it!
-
Hey Cyrus,
Thank you very much for the detailed response!
-
Hi Colt,
Looks to me like you're getting duplicate content errors because your page templates are so large that they trip the SEOmoz duplicate content filter, which fires when more than 95% of the code is similar between two pages.
For example, take a look at these 2 URLs.
http://www.federalautoloan.com/Why-Shopping-for-an-Auto-Loan-is-Good.aspx
http://www.federalautoloan.com/Regarding-Dealer-Financing.aspx
With the gazillions of links at the bottom, the two pages have 98% similar code. (You can check it out yourself with this duplicate content tool.) The good news is the TEXT content similarity is less than 40%.
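If you want to sanity-check that ratio yourself, here's a rough Python sketch using difflib. Moz's exact algorithm isn't public, so treat the numbers as illustrative only; the page strings below are made-up stand-ins for your real source:

```python
import difflib

def similarity(a: str, b: str) -> float:
    """Return a 0-1 similarity ratio between two strings of page source."""
    return difflib.SequenceMatcher(None, a, b).ratio()

# Two hypothetical pages sharing a large link footer: mostly identical
# markup, different article text -- like the two URLs above.
shared_footer = "<footer>" + "<a href='/x'>link</a>" * 100 + "</footer>"
page_a = ("<html><body><p>Why shopping for an auto loan is good...</p>"
          + shared_footer + "</body></html>")
page_b = ("<html><body><p>Regarding dealer financing...</p>"
          + shared_footer + "</body></html>")

print(round(similarity(page_a, page_b), 2))  # very high -- the footer dominates
```

Run the same function on just the extracted article text and the ratio drops sharply, which is the 98% code vs. under-40% text gap described above.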
1. Google is more sophisticated than Moz, but it would still be a good idea to remove some of those links and group them into categories. If you could get these 100 or so links down to 20, that would be closer to ideal.
2. Just a recommendation: most of your text is in a scroll box. I'd reformat your page so that all the text is visible without the box. I'm not sure if this is hurting you or not, but it seems contrary to best user experience, so I'd be inclined to think Google doesn't look too favorably on it.
3. I noticed you've blocked a lot of files in your robots.txt, including your CSS. Unless you have a very specific reason for keeping Google out of these files, I'd let them be crawled, as Google uses CSS to render your page and see what content is above and below the fold.
4. Best practice is to redirect non-www to www versions of your site (or vice versa). If you can't do this, a canonical tag will do just as well. But how about also redirecting everything to the version WITHOUT the /Default.aspx? That would look cleaner in search results.
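On point 3, the robots.txt could look something like this. This is a sketch only: the Disallow path is a hypothetical example, and note that Allow and the $ wildcard are Google extensions rather than part of the original robots.txt standard:

```
User-agent: *
# Let crawlers fetch stylesheets so Google can render the page
Allow: /*.css$
# Keep genuinely private areas blocked (hypothetical path)
Disallow: /admin/
```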
Hope this helps. Best of luck with your SEO!
-
Hello Colt.
You have the joys of a Microsoft webserver.
That means all of these URLs work - and many more:
http://www.federalmortgageservices.com/A-Home-Loan-Is-Possible-with-Bad-Credit.aspx
http://www.federalmortgageservices.com/A-Home-Loan-Is-Possible-with-Bad-CRedit.aspx
http://www.federalmortgageservices.com/A-Home-Loan-Is-Possible-with-Bad-CREdit.aspx
http://www.federalmortgageservices.com/A-Home-Loan-Is-Possible-with-Bad-CREDit.aspx
It does exactly the same thing if you remove the www.
- duplicate content issues.
Also, if you do this:
http://www.federalmortgageservices.com/A-Home-Loan-Is-Possible-with-Bad-Credit.aspxZ
- the server returns a 200 response code and serves up the front page.
I couldn't find a way to make your server return a 404.
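To see how those casing variants collapse, and as a hint at the fix, here's a small Python sketch that lowercases the host and path to build one canonical key per page. (Server-side, the equivalent fix would be a 301 to the lowercase URL, or at least a rel=canonical pointing at one casing, plus a real 404 for unknown paths.)

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_key(url: str) -> str:
    """Lowercase scheme, host, and path so casing variants collapse to one key."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(),
                       parts.path.lower(), parts.query, parts.fragment))

variants = [
    "http://www.federalmortgageservices.com/A-Home-Loan-Is-Possible-with-Bad-Credit.aspx",
    "http://www.federalmortgageservices.com/A-Home-Loan-Is-Possible-with-Bad-CRedit.aspx",
    "http://www.federalmortgageservices.com/A-Home-Loan-Is-Possible-with-Bad-CREDit.aspx",
]
keys = {canonical_key(u) for u in variants}
print(len(keys))  # -> 1, all casing variants collapse to a single canonical URL
```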
-
As an aside: please feel free to extend any other SEO suggestions you may have my way! I am doing my best to learn the SEO trade... and ANY advice is appreciated.