Local Search | Website Issue with Duplicate Content (97 pages)
-
Hi SEOmoz community. I have a unique situation: I'm evaluating a website that is trying to optimize for local search, targeting 97 surrounding towns in the owner's geographic area. What is unique about it is that the site is ranking on the 1st and 2nd pages of the SERPs for its targeted keywords even though it has duplicate content across 97 pages, and the search engines are still ranking it. I ran the website's URL through SEOmoz's Crawl Test Tool, which confirmed duplicate content on 97 pages and too many links (97) per page.
Summary: The site has 97 duplicate pages, one for each town; every page lists all 97 surrounding towns, and each town name links to another of the duplicate pages.
Question: I expect the search engines will eventually stop indexing the site, and I'm not sure of the best way to resolve this problem. Any advice?
-
Thank you Miriam.
-
Hi Todd, I'm endorsing Kevin's response as a best answer on this, but also want to add that it will be easier on the client if he makes a plan now to begin to improve the content of key pages rather than scramble to do so after rankings suddenly fall off. Local rankings are in a constant state of flux...drops can happen so swiftly. An ounce of prevention is worth a pound of cure. I would identify the 10 most important cities and write unique content for them, then move on to the 10 next-most important and so on. Do it in a way the client can afford, at a velocity you can manage.
-
Good morning Kevin - most of the individual pages receive little traffic. Thank you for your advice and feedback.
-
Hi Daniel - thank you for the response and advice!
-
Hi Todd,
How much traffic is each of those pages getting? Chances are, if you look at them, over 50% are getting little if any traffic. As you know, ranking on the first page in local search really doesn't mean much. You need to be in the top 3 (or top 3-5 if Maps results are displayed).
My advice would be to help the client focus on the best areas (based on traffic, demographics, distance, etc.) and the ones that are currently driving traffic, then create unique content for each of those pages. This could also bring down the too-many-links-per-page signal.
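As a rough illustration of that prioritization step, here's a minimal sketch, assuming a hypothetical CSV export of landing-page traffic from your analytics tool (the file name and the `page`/`sessions` columns are made up for this example):

```python
import csv

# Hypothetical analytics export: one row per town page,
# e.g. "/service-areas/springfield,412"
with open("town_page_traffic.csv", newline="") as f:
    rows = [(r["page"], int(r["sessions"])) for r in csv.DictReader(f)]

# Highest-traffic town pages first
rows.sort(key=lambda r: r[1], reverse=True)

# The top 10 become the first batch to get unique, hand-written content
for page, sessions in rows[:10]:
    print(f"{sessions:>6}  {page}")
```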
I did this with one of my clients, and their rank improved to #1 and #2 for the top 10 areas that were driving 90% of their traffic. If they want to continue targeting all 97, each page should have unique content. Their rankings will definitely improve if this is done right.
Anyway, I know it's a balancing act between the best strategy and what the client's budget will allow, so in the end you have to make the best decision you can.
Cheers,
Kevin
-
I have done this myself for many clients. For one client I used a generic paragraph of near-duplicate content on over 3,000 pages, and it has been going strong for many years. I have also tested websites with nearly 100% duplicate body text, with the exception of the title, meta description, H1, and image alts, and they are ranking well with no problems.
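To picture what that setup looks like, here's a minimal, hypothetical sketch of such a templated town page (the business, copy, and file layout are invented for illustration): every page shares the same body text, and only the title, meta description, H1, and image alt vary per town.

```python
# Every town page shares SHARED_BODY; only the title, meta description,
# H1 and image alt text are unique per town.
TEMPLATE = """<html>
<head>
  <title>Plumbing Services in {town}</title>
  <meta name="description" content="Licensed plumber serving {town} and nearby towns.">
</head>
<body>
  <h1>Plumber in {town}</h1>
  <img src="/img/van.jpg" alt="Our service van in {town}">
  <p>{body}</p>
</body>
</html>"""

SHARED_BODY = "We offer 24/7 emergency service across the region..."  # identical on every page

for town in ["Springfield", "Chicopee", "Holyoke"]:  # ...and so on, up to all 97
    with open(f"{town.lower()}.html", "w") as f:
        f.write(TEMPLATE.format(town=town, body=SHARED_BODY))
```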
Still, I would advise the client of the risk of carrying duplicate content. You could use Textbroker to write unique content for each page at around $5 apiece, just to be safe and to feel comfortable moving forward with SEO.
Most of my clients have come to me from other SEOs, and I'm always wondering what will drop off when I optimize something, because the previous work was clearly black/grey hat. The good thing is that they already know the value of SEO and, most of the time, agree to pay to fix the old issues before moving forward.
Related Questions
-
Duplicate content on Product pages for different product variations.
I have multiple colors of the same product, each with its own page, and as a result I'm getting duplicate content warnings. I want to keep these as separate products with their own pages so that the color can be easily identified when browsing the category page. Any suggestions?
Technical SEO | bobjohn1
-
Duplicate Content
Hello guys! After fixing the rel tag on similar pages on the site, I thought the duplicate content issues were resolved. I checked HTML Improvements in GWT and, instead of going down as I expected, the count went up. The duplicate issues affect identical product pages that differ from each other by just one detail, say length or colour. Since the duplication is in the meta descriptions, I could write different meta tags, and I did so for some products, but it had no effect and they still show as duplicates. What could the problem be? Cheers
Technical SEO | PremioOscar
-
Duplicate Content Issue
My issue with duplicate content is this: there are two versions of my website showing up, http://www.example.com/ and http://example.com/. What are the best practices for fixing this? Thanks!
Technical SEO | OOMDODigital
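For reference, the standard fix here is a sitewide 301 redirect to whichever hostname you standardize on, plus setting the preferred domain in Google Webmaster Tools. Most sites do this in Apache or nginx config; as a minimal sketch, here is the same idea in a hypothetical Python/Flask app:

```python
from flask import Flask, redirect, request

app = Flask(__name__)
CANONICAL_HOST = "www.example.com"  # whichever version you standardize on

@app.before_request
def force_canonical_host():
    # 301 requests on the bare domain over to the www version,
    # preserving the path and query string.
    if request.host != CANONICAL_HOST:
        return redirect(
            request.url.replace(f"//{request.host}/", f"//{CANONICAL_HOST}/", 1),
            code=301,
        )
```
-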
UserVoice and Duplicate Page Content
Hello All, I'm having an issue where my UserVoice account is creating duplicate page content (image attached). Any ideas on how to resolve the problem? One solution we're looking into is moving the UserVoice content inside the app so it won't get crawled, but that's all we have for now. Thank you very much for your time; any insight would be helpful. Sincerely,
Jon Birdsong, SalesLoft
Technical SEO | JonnyBird1
-
We are still seeing duplicate content on SEOmoz even though we have marked those pages as "noindex, follow." Any ideas why?
We have many pages on our website that have been set to "noindex, follow." However, SEOmoz is still flagging them as duplicate content. Why is that?
Technical SEO | cmaseattle
-
Duplicate Page Issue
Dear All, I am facing a frustrating duplicate page issue. My whole site runs on dynamic scripts and all of the URLs were dynamic, so I asked my programmer to make the URLs user-friendly using URL Rewrite, but he converted the .aspx pages to .htm, and the whole mess began. Now we have three different URLs for a single page, such as: http://www.site.com/CityTour.aspx?nodeid=4&type=4&id=47&order=0&pagesize=4&pagenum=4&val=Multi-Day+City+Tours http://www.tsite.com/CityTour.aspx?nodeid=4&type=4&id=47&order=0&pagesize=4&pagenum=4&val=multi-day-city-tours http://www.site.com/city-tour/multi-day-city-tours/page4-0.htm I think my programmer messed up the URL Rewrite in ASP.NET (Nginx), or didn't use it at all. How do I overcome this problem? Should I add a canonical tag to both dynamic URLs pointing to page4-0.htm? Will it help? Thanks!
Technical SEO | DigitalJungle
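For reference, a rel=canonical on the two dynamic URLs pointing at the .htm version is the standard stopgap while the rewrites get fixed (a 301 from the .aspx URLs would be stronger). Here is a minimal, hypothetical sketch of a script to confirm the tag is in place on each variant, using the URLs from the question:

```python
# Fetch each known URL variant and check that its rel=canonical
# tag points at the clean .htm version.
import re
import urllib.request

CANONICAL = "http://www.site.com/city-tour/multi-day-city-tours/page4-0.htm"
VARIANTS = [
    "http://www.site.com/CityTour.aspx?nodeid=4&type=4&id=47&order=0&pagesize=4&pagenum=4&val=Multi-Day+City+Tours",
    CANONICAL,
]

# Simplistic pattern; assumes rel appears before href inside the tag
CANONICAL_RE = re.compile(
    r'<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']', re.I
)

for url in VARIANTS:
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    m = CANONICAL_RE.search(html)
    print("OK" if (m and m.group(1) == CANONICAL) else "MISSING/WRONG", url)
```
-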
How can I have pages with media that changes and avoid duplicate content when the text stays the same?
I want to have a page that describes a specific property and/or product. The top part of the page has media options such as video and photos, while the bottom includes the description. I know I can set up the media in tabs separated by JavaScript, so everything resides on one page and there are no duplicate content issues. Example: http://www.worldclassproperties.com/properties/Woodside But what if I need the photos and the videos to have separate URLs so I can link to them individually? For example, for a real estate site blog, I may want to send visitors to the home tour; I don't want to link them to the version of the page with the photos, because I want them to arrive on the video portion. Example: http://www.worldclassproperties.com/properties/Woodside?video=1 Is there any way to get around the duplicate content problem that would result from repeating the product/property description? I do not have the resources in the budget to write two unique descriptions for every page.
Technical SEO | WebsightDesign
-
Htm vs. aspx page extensions & duplicate content
We have a client whose site is fairly new, so there isn't much in the way of SEO results yet. In their content management system they have implemented friendly URLs and changed the extensions from .aspx to .htm. Now the .htm pages are all indexed in Google, but when I run a campaign report in SEOmoz it shows every page duplicated, with both an .htm and an .aspx version. Should we do 301 redirects from the .aspx pages to the .htm pages? Or would we be safe removing the .htm pages and letting Google reindex the site with the .aspx extensions? Does Google have any preference about page extensions, as long as the URLs include keywords?
Technical SEO | IvieDigital