SEOmoz Internal Duplicate Content & Possible Coding Issues
-
SEOmoz Community!
I have a relatively complicated SEO issue that has me pretty stumped...
First and foremost, I'd appreciate any suggestions that you all may have. I'll be the first to admit that I am not an SEO expert (though I am trying to be). Most of my expertise is with PPC. But that's beside the point.
Now, the issues I am having:
- I have two sites: http://www.federalautoloan.com/Default.aspx and http://www.federalmortgageservices.com/Default.aspx
A lot of our SEO efforts thus far have done well for Federal Auto Loan... and we are seeing positive impacts from them. However, we recently did a server transfer (which may or may not be related)... and since then a significant number of INTERNAL duplicate content pages have appeared in the SEOmoz crawl. The number is around 20+ for both Federal Auto Loan and Federal Mortgage Services (see attachments).
I've tried to include as much as I can via the attachments. What you will see is all of the content pages (articles) with duplicate content issues, along with a screen capture of the articles being listed as duplicates of these pages:
-
Car Financing How It Works
-
A Home Loan is Possible with Bad Credit
(Please let me know if you could use more examples)
At first I assumed it was simply an issue with the SEOmoz crawler... however, I am now worried it is impacting my sites. (I wasn't worried originally, because Federal Auto Loan has great quality scores and is climbing in organic presence daily.) That said, we recently launched Federal Mortgage Services for PPC... and its quality scores are relatively poor. In fact, we are not even ranking (scratch that, Google doesn't even show that we have content) for "mortgage refinance", even though we have unique, good, original content built specifically around "mortgage refinance" keywords.
All things considered, Federal Mortgage Services should be tighter in the SEO department than Federal Auto Loan... but it is clearly not!
I could really use some significant help here...
- Both of our sites have a number of access points:
http://www.federalautoloan.com/Default.aspx and http://www.federalmortgageservices.com/Default.aspx are both the designated home pages. And I have rel=canonical tags stating such.
However, my sites can also be reached via the following:
http://www.federalautoloan.com
http://www.federalautoloan.com/default.aspx
http://www.federalmortgageservices.com
http://www.federalmortgageservices.com/default.aspx
Should I incorporate code that "redirects" traffic as well? Or is it fine with just the rel=canonical tags?
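For context, the canonical tag in the head of each home page variation looks roughly like this (a sketch of my current setup; the exact markup on the live pages may differ slightly):

    <link rel="canonical" href="http://www.federalautoloan.com/Default.aspx" />

The Federal Mortgage Services pages carry the equivalent tag pointing at http://www.federalmortgageservices.com/Default.aspx.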
I apologize for such a long post, but I wanted to include as much as possible up-front. If you have any further questions... I'll be happy to include more details.
Thank you all in advance for the help! I greatly appreciate it!
-
Hey Cyrus,
Thank you very much for the detailed response!
-
Hi Colt,
Looks to me like you're getting duplicate content errors because your page templates are so large that they are tripping the SEOmoz duplicate content filter, which goes off if more than 95% of the code is similar between two pages.
For example, take a look at these 2 URLs.
http://www.federalautoloan.com/Why-Shopping-for-an-Auto-Loan-is-Good.aspx
http://www.federalautoloan.com/Regarding-Dealer-Financing.aspx
With the gazillions of links at the bottom of the two pages, the pages have 98% similar code. (You can check it yourself with this duplicate content tool.) The good news is that the TEXT content similarity is less than 40%.
1. Google is more sophisticated than Moz, but it would be a good idea to remove some of those links and group them into categories. If you could get those 100 or so links down to 20, that would be closer to ideal.
2. Just a recommendation: most of your text is in a scroll box. I'd reformat your pages so that all the text is visible without the box. Not sure if this is hurting you or not, but it seems contrary to good user experience, so I'd be inclined to think Google doesn't look too favorably on it.
3. I noticed you blocked a lot of files in your robots.txt, including your CSS. Unless you have a very specific reason for keeping Google out of these files, I'd let them be crawled, as Google uses CSS to render your page and work out what content is above and below the fold (see the robots.txt sketch after this list).
4. Best practice is to redirect the non-www version of your site to the www version (or vice versa). If you can't do this, a canonical tag will do just as well. And how about also redirecting everything to the version WITHOUT the /Default.aspx? That would look cleaner in search results (a rewrite-rule sketch follows this list).
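On point 3, the minimal robots.txt cleanup is simply to delete the Disallow lines that cover stylesheets and scripts. The folder names below are only an illustration, since I don't know which paths your file actually blocks; keep whatever genuine blocks you need:

    User-agent: *
    # Remove (or never add) blocks on rendering resources such as:
    # Disallow: /css/
    # Disallow: /scripts/
    # Keep blocks only where you have a specific reason, for example:
    Disallow: /admin/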
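On point 4, since both sites run on IIS/ASP.NET, the IIS URL Rewrite module (assuming it is installed on your server) can handle both 301s in web.config. This is only a sketch for the auto loan domain; swap in the correct host name for each site and test on a staging copy before going live:

    <system.webServer>
      <rewrite>
        <rules>
          <!-- 301 the non-www host to the www host -->
          <rule name="Add www" stopProcessing="true">
            <match url="(.*)" />
            <conditions>
              <add input="{HTTP_HOST}" pattern="^federalautoloan\.com$" />
            </conditions>
            <action type="Redirect" url="http://www.federalautoloan.com/{R:1}" redirectType="Permanent" />
          </rule>
          <!-- 301 /Default.aspx (any casing) to the bare root -->
          <rule name="Strip Default.aspx" stopProcessing="true">
            <match url="^default\.aspx$" ignoreCase="true" />
            <action type="Redirect" url="http://www.federalautoloan.com/" redirectType="Permanent" />
          </rule>
        </rules>
      </rewrite>
    </system.webServer>

If you go this route, the bare root becomes the canonical home page, so update the rel=canonical tags to point at the root rather than /Default.aspx.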
Hope this helps. Best of luck with your SEO!
-
Hello Colt.
You have the joys of a Microsoft webserver.
That means all of these URLs work - and many more:
http://www.federalmortgageservices.com/A-Home-Loan-Is-Possible-with-Bad-Credit.aspx
http://www.federalmortgageservices.com/A-Home-Loan-Is-Possible-with-Bad-CRedit.aspx
http://www.federalmortgageservices.com/A-Home-Loan-Is-Possible-with-Bad-CREdit.aspx
http://www.federalmortgageservices.com/A-Home-Loan-Is-Possible-with-Bad-CREDit.aspx
It does exactly the same thing if you remove the www.
- duplicate content issues.
Also, if you do this:
http://www.federalmortgageservices.com/A-Home-Loan-Is-Possible-with-Bad-Credit.aspxZ
- the server returns a 200 response code and serves up the front page.
I couldn't find a way to make your server give me a 404.
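If you want to collapse those case variations at the server level, the IIS URL Rewrite module can 301 any URL containing an uppercase letter to its lowercase form. This rule is only a sketch (it goes inside the rules section of web.config) and assumes you are happy for the all-lowercase version to become the canonical one; it will also catch image and script paths with capitals, so test it carefully. A self-referencing rel=canonical tag on each article is a gentler alternative:

    <rule name="Redirect uppercase URLs to lowercase" stopProcessing="true">
      <match url="(.*[A-Z].*)" ignoreCase="false" />
      <action type="Redirect" url="{ToLower:{R:1}}" redirectType="Permanent" />
    </rule>

Separately, the 200-for-anything behavior usually means a catch-all route or a custom error page is swallowing bad URLs; whatever serves the front page for unknown URLs should be returning a 404 status instead.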
-
As an aside: please feel free to send any other SEO suggestions you may have my way! I am doing my best to learn the SEO trade... and ANY advice is appreciated.