Local Search | Website Issue with Duplicate Content (97 pages)
-
Hi SEOmoz community. I have a unique situation: I'm evaluating a website that is trying to optimize better for local search by targeting 97 surrounding towns in his geographical area. What makes it unique is that he is ranking on the 1st and 2nd pages of the SERPs for his targeted keywords despite having duplicate content on 97 pages of the site - the search engines are still ranking it. I ran the website's URL through SEOmoz's Crawl Test Tool, and it confirmed duplicate content on 97 pages and too many links (97) per page.
Summary: The website has 97 near-duplicate pages, one for each town; each individual page lists all 97 surrounding towns, and each town name links to another duplicate page.
Question: I expect that eventually the search engines will stop indexing these pages, and I'm not sure of the best way to resolve the problem - any advice?
-
Thank you Miriam.
-
Thanks Miriam!
-
Hi Todd, I'm endorsing Kevin's response as a best answer on this, but also want to add that it will be easier on the client if he makes a plan now to begin to improve the content of key pages rather than scramble to do so after rankings suddenly fall off. Local rankings are in a constant state of flux...drops can happen so swiftly. An ounce of prevention is worth a pound of cure. I would identify the 10 most important cities and write unique content for them, then move on to the 10 next-most important and so on. Do it in a way the client can afford, at a velocity you can manage.
-
Good morning Kevin - most of the individual pages receive little traffic. Thank you for your advice and feedback.
-
Hi Daniel - thank you for your response and advice!
-
Hi Todd,
How much traffic is each of those pages getting? Chances are, if you look at them, over 50% of them are getting little if any traffic. As you know, ranking on the first page in local search really doesn't mean much. You need to be in the top 3 (or 3-5 if Maps is displaying results).
My advice would be to help the client focus on the best areas (based on traffic, demographics, distance, etc.) and the ones that are currently driving traffic, then create unique content for each of those pages. This could also bring down the "too many links per page" warning.
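If it helps, here's a minimal sketch of one way to pull that prioritized list together, assuming you can export landing-page traffic to a CSV - the filename, column names, and the "/areas/" URL pattern below are just placeholders, not anything from the client's actual setup:

```python
import csv

# Hypothetical analytics export: one row per landing page with a sessions count.
# The filename, column names, and "/areas/" URL pattern are assumptions.
with open("landing_pages.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Keep only the town/area pages and sort by traffic, highest first.
town_pages = [r for r in rows if "/areas/" in r["page"]]
town_pages.sort(key=lambda r: int(r["sessions"]), reverse=True)

# The top 10 are the first candidates for unique content.
for r in town_pages[:10]:
    print(r["sessions"].rjust(8), r["page"])
```

Start with those and work down the list as the budget allows.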
I did this with one of my clients and their rankings improved to #1 and #2 for the top 10 areas that were driving 90% of their traffic. If they want to continue targeting all 97, each page should have unique content. Their rankings will definitely improve if done right.
Anyway, I know it's a balancing act between the best strategy and what the client's budget will allow, so in the end you have to make the best decision you can.
Cheers,
Kevin
-
I myself have done this for many clients. I have used a generic paragraph with near-duplicate content on over 3,000 pages for one client, and it has been going strong for many years. I have also tested websites with nearly 100% duplicate body text - identical except for the title, description, H1, and image alts - and they are ranking well with no problems.
I would advise the client of the risk of having duplicate content. You could use Textbroker to write some content for each page at around $5 each, just to be safe and to feel comfortable moving forward with SEO.
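If you want to quantify how close the town pages really are before spending on rewrites, here's a rough sketch using only Python's standard library - the URLs are hypothetical and you'd swap in the client's real town pages:

```python
import re
import urllib.request
from difflib import SequenceMatcher

# Hypothetical town page URLs - replace with the client's real ones.
urls = [
    "http://www.example.com/areas/springfield",
    "http://www.example.com/areas/shelbyville",
    "http://www.example.com/areas/ogdenville",
]

def body_text(url):
    """Fetch a page and crudely strip scripts, styles, and tags to approximate visible text."""
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    html = re.sub(r"(?is)<(script|style).*?</\1>", " ", html)
    text = re.sub(r"<[^>]+>", " ", html)
    return re.sub(r"\s+", " ", text).strip()

# Compare each page against the first one; a ratio near 1.0 means near-duplicate copy.
base = body_text(urls[0])
for url in urls[1:]:
    ratio = SequenceMatcher(None, base, body_text(url)).ratio()
    print(f"{ratio:.2f}  {url}")
```

Pages that score close to 1.0 are the ones worth rewriting first.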
Most of my clients have come to me from other SEOs, and I'm always wondering what will drop off when I optimize something, because the previous work was clearly black/grey hat. The good thing is they already know the value of SEO, and most of the time they agree to pay to fix the old issues before moving forward.