Local Search | Website Issue with Duplicate Content (97 pages)
-
Hi SEOmoz community. I have an unusual situation: I'm evaluating a website whose owner is trying to optimize for local search by targeting the 97 towns surrounding his geographic location. What makes it unusual is that the site ranks on the first and second pages of the SERPs for its targeted keywords even though it has duplicate content across 97 of its pages, and the search engines are still ranking it. I ran the site's URL through SEOmoz's Crawl Test tool, and it confirmed duplicate content on 97 pages and too many links (97) per page.
Summary: The website has 97 near-identical pages, one for each town. Every page lists all 97 surrounding towns, and each town name links to another of the duplicate pages.
Question: I expect the search engines will eventually stop indexing these pages, and I'm not sure of the best way to resolve the problem. Any advice?
-
Thank you, Miriam.
-
Thanks, Miriam!
-
Hi Todd, I'm endorsing Kevin's response as the best answer here, but I also want to add that it will be easier on the client if he makes a plan now to start improving the content of key pages rather than scrambling to do so after rankings suddenly fall off. Local rankings are in a constant state of flux; drops can happen swiftly. An ounce of prevention is worth a pound of cure. I would identify the 10 most important cities and write unique content for them, then move on to the next 10 most important, and so on. Do it in a way the client can afford, at a velocity you can manage.
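To make that cadence concrete, here is a tiny Python sketch of the batching (the town names are invented, and the ranking is whatever priority order the client settles on):

```python
# Hypothetical sketch: release unique content in batches of 10 towns,
# working down a list already ranked by importance to the client.

ranked_towns = ["Springfield", "Shelbyville", "Ogdenville", "North Haverbrook"]  # ... 97 in practice

batch_size = 10
batches = [ranked_towns[i:i + batch_size]
           for i in range(0, len(ranked_towns), batch_size)]

for month, batch in enumerate(batches, start=1):
    print(f"Month {month}: write unique pages for {', '.join(batch)}")
```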
-
Good morning, Kevin - most of the individual pages receive little traffic. Thank you for your advice and feedback.
-
Hi Daniel - thank you for the response and advice!
-
Hi Todd,
How much traffic is each of those pages getting? Chances are, if you look, more than half of them are getting little if any traffic. As you know, ranking on the first page in local search really doesn't mean much; you need to be in the top 3 (or top 3-5 when Maps results are displayed).
My advice would be to help the client focus on the best areas (based on traffic, demographics, distance, etc.) and on the pages that are currently driving traffic, then create unique content for each of those pages. That could also reduce the too-many-links-per-page signal.
I did this with one of my clients, and their rankings improved to #1 and #2 for the top 10 areas that were driving 90% of their traffic. If your client wants to keep targeting all 97 towns, each page should have unique content; rankings will definitely improve if it's done right.
Anyway, I know it's a balancing act between the best strategy and what the client's budget will allow, so in the end you have to make the best decision you can.
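If it helps, here is a rough Python sketch of that triage step: keep the pages that account for most of the organic traffic on the rewrite-first list and queue everything else. The URLs, visit counts, and the 80% cutoff are all hypothetical, so swap in a real analytics export.

```python
# Hypothetical sketch: split the 97 town pages into "rewrite now" and
# "later" buckets based on how much traffic each one actually drives.

def triage(pages, coverage=0.80):
    """Return (priority, later): the fewest pages that account for
    `coverage` of total visits, and everything else."""
    ranked = sorted(pages.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(pages.values())
    priority, running = [], 0
    for url, visits in ranked:
        if running >= coverage * total:
            break
        priority.append(url)
        running += visits
    later = [url for url, _ in ranked if url not in priority]
    return priority, later

# Invented analytics export: landing page -> monthly organic visits.
pages = {
    "/service-area/springfield": 420,
    "/service-area/shelbyville": 310,
    "/service-area/ogdenville": 12,
    "/service-area/north-haverbrook": 3,
    # ... the other 93 town pages
}

priority, later = triage(pages)
print("Write unique content first:", priority)
print("Queue for later:", later)
```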
Cheers,
Kevin
-
I've done this myself for many clients. For one client I used a generic paragraph with near-duplicate content on over 3,000 pages, and it has been going strong for many years. I have also tested websites with nearly 100% duplicate body text, with the exception of the title, description, H1, and image alt attributes, and they are ranking well with no problems.
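Just to illustrate the pattern (business name, towns, and copy below are all invented), the per-page variation can be as simple as templating the head elements and H1 while the body stays shared:

```python
# Hypothetical sketch of the "unique head, shared body" pattern: only
# the title, meta description, and H1 change from town page to town page.

HEAD_TEMPLATE = (
    "<title>Plumbing Services in {town}, MA | Acme Plumbing</title>\n"
    '<meta name="description" content="Licensed plumbers serving {town} '
    'and the surrounding area. Same-day service, free estimates.">'
)
H1_TEMPLATE = "<h1>Plumbing Services in {town}, MA</h1>"

towns = ["Springfield", "Shelbyville", "Ogdenville"]  # ... up to 97

for town in towns:
    print(HEAD_TEMPLATE.format(town=town))
    print(H1_TEMPLATE.format(town=town))
    print()  # shared body content would follow here
```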
That said, I would advise the client of the risk of keeping duplicate content. You could use Textbroker to write unique content for each page at around $5 apiece, just to be safe and to feel comfortable moving forward with the SEO work.
Most of my clients have come to me from other SEOs, and I'm always wondering what will drop off when I optimize something, because the previous work was clearly black or grey hat. The good thing is that they already know the value of SEO and, most of the time, will pay to fix the old issues before moving forward.