New Client Wants to Keep Duplicate Content Targeting Different Cities
-
We've got a new client whose website has about 300 pages that are identical except for the city being targeted. So far the site has not been affected by Penguin or Panda updates, and the client wants to keep the pages because they bring in a lot of traffic for those cities.
We are concerned about duplicate content penalties; do you think we should get rid of these pages or keep them?
-
This is a tough situation. I tend to agree with Ricky - these are exactly the kinds of pages that have been hit by Panda, and there's real risk. In the old days, the biggest risk was that the pages would just stop getting traffic. Now, the impact could hit the rest of the site as well, and it's a lot more dangerous.
The problem is that it's working for now, and you're asking them to give up traffic in the short term to avoid losing it in the long term. Again, I think the long-term risk is serious (and it's not that easy to recover from), but the short-term pain to the client is very real.
What's the scope of the 300 pages compared to the rest of the site (are we talking a 400 page site or a 40,000 page site)? How many of these city pages are getting real traffic? My best alternative solution is to pin down the 10-20% of the city pages getting most of the traffic, temporarily NOINDEX the rest, and then beef up those well-trafficked city pages with unique content (so, maybe you're talking about 30 pages). Then, build out from there.
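To make that cut concrete, here's a rough sketch in Python for ranking the city pages by organic traffic from an analytics export and splitting them into "keep and beef up" versus "temporarily noindex" buckets. The file name, column names and 15% threshold are all assumptions - adjust them to whatever your export and situation actually look like.

```python
import csv

KEEP_SHARE = 0.15  # keep roughly the top 10-20% of city pages indexed

# Assumes an analytics export with one row per city landing page;
# "landing_page" and "organic_sessions" are hypothetical column names.
with open("city_pages_traffic.csv", newline="") as f:
    pages = [(row["landing_page"], int(row["organic_sessions"]))
             for row in csv.DictReader(f)]

# Sort by organic traffic, highest first, and split at the keep threshold.
pages.sort(key=lambda p: p[1], reverse=True)
cutoff = max(1, int(len(pages) * KEEP_SHARE))
keep, noindex = pages[:cutoff], pages[cutoff:]

print(f"Keep indexed and rewrite with unique content ({len(keep)} pages):")
for url, sessions in keep:
    print(f"  {sessions:>7}  {url}")

print(f"\nTemporarily noindex ({len(noindex)} pages).")
```

The pages in the noindex bucket can carry a meta robots noindex tag (or an X-Robots-Tag header) until you've built out unique content for them and are ready to bring them back.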
Give these pages real value - it's not only good for SEO, but it will probably improve conversion, too. The other problem with pages that just swap out a city is that they're often low quality - they may draw traffic in, but then have high bounce rates and low conversion. If you can show that you can improve the value, even with some traffic loss, it's easier to win this fight.
-
Do your analytics show city-specific search terms actually landing on those city-specific pages, or are they going to the home page (or to the canonical version of the duplicated page)?
If it is the latter, then you should certainly work those city-specific keyword terms into the single canonical version of the content in some creative fashion.
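If you want a quick way to answer that question from data, here's a minimal sketch that tallies whether city-modified queries land on the city pages or the home page, assuming a query/landing-page export (from Search Console or similar). The file name, column names, home-page URL and example city terms are all placeholders.

```python
import csv
from collections import Counter

HOME_PAGE = "https://www.example.com/"        # placeholder for the client's home page
CITY_TERMS = ("chicago", "denver", "austin")  # placeholder city modifiers to look for

landings = Counter()
# Assumes an export with hypothetical "query" and "landing_page" columns.
with open("search_console_queries.csv", newline="") as f:
    for row in csv.DictReader(f):
        query, page = row["query"].lower(), row["landing_page"]
        if any(city in query for city in CITY_TERMS):
            is_home = page.rstrip("/") == HOME_PAGE.rstrip("/")
            landings["home page" if is_home else "city page"] += 1

# Shows how often city-modified queries land on a city page vs. the home page.
print(landings)
```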
Regardless, you should still remove the duplicate content, preferably sooner rather than later, because these are certainly low-value pages!
-
I agree with Ricky - I would slowly make all those pages unique in some way. I still find it beneficial to rank separate city pages as long as they have strong content. Otherwise, Google will eventually sift through the site and flag those pages as spam.
-
It seems to me that Google would see all of that duplicate content and simply rank one page as the canonical version. If they are seeing organic traffic and rankings for multiple pages, I am not sure how long that will last. From what I understand, it would be best to start the slow process of making the content on each page somewhat unique.
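One way to see what Google is being told, at least, is to audit what each of the ~300 city pages declares as its canonical. A quick sketch, assuming the URLs are listed in a plain text file (the file name is hypothetical, and the regex is deliberately naive - a real audit would use a proper HTML parser):

```python
import re
import urllib.request

def get_canonical(url):
    """Fetch a page and return the href of its <link rel="canonical"> tag, if present."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
    # Naive pattern: assumes rel appears before href, as it usually does.
    match = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']',
                      html, re.IGNORECASE)
    return match.group(1) if match else None

# city_page_urls.txt is a hypothetical file with one city page URL per line.
with open("city_page_urls.txt") as f:
    for url in (line.strip() for line in f if line.strip()):
        canonical = get_canonical(url)
        status = "self" if canonical == url else (canonical or "no canonical tag")
        print(f"{url} -> {status}")
```

If every near-identical page canonicals to itself, Google is free to pick its own canonical, which is exactly the situation described above.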
Related Questions
-
How to deal with this duplicate content
Hello, our websites offer prayer times in the US and UK. The problem is that we have nearby towns where the prayer times are the same, so the pages (e.g. https://prayer-times.us/prayer-times-lake-michigan-12258-en and https://prayer-times.us/prayer-times-lake-12147-en) are duplicates. The same issue applies to this page: https://prayer-time.uk/prayer-times-wallsend-411-en. How can we solve this problem?
On-Page Optimization | Zakirou
-
How to break the news to a client?
I have been hired to work with a dating website to help with some SEO. The website itself is in a lot of trouble from an SEO standpoint, and I have been working on a plan with them for how to fix their problems. They are currently averaging 1.6k unique users a month and want to get to 50k in 3 months and 200k in 6 months. This seems entirely unreasonable without spending a lot of money on SEM/social/other marketing avenues.
On-Page Optimization | HashtagHustler
-
Acquired Old, Bad Content Site That Ranks Great. Redirect to Content on My Site?
Hello. My company acquired another website. This website is very old and the content within is decent at best, but it still manages to rank very well for valuable phrases. Currently, we're leaving the entire site active on its own for its brand, but I'd like to at least redirect some of the content back to our main website. I can't justify spending the time to create improved content on that site and not on our main site, though. What would be the best practice here?
1. Cross-domain canonical - and build the new content on our main website?
2. 301 redirect the old article to the new location containing the better article.
3. Leave the content where it is - you won't be able to transfer the ranking across domains.
Thanks for your input.
On-Page Optimization | Blenny
-
Form Only Pages Considered No Content/Duplicate Pages
We have a lot of WordPress sites with pages that contain only a form. The header, sidebar and footer content is the same as what's on other pages throughout the site. Each form page has a unique page title, meta description, form title and questions, but the form title, description and questions add up to probably less than 100 words. Are these form pages negatively affecting the rankings of our landing pages, or being viewed as duplicate or no-content pages?
On-Page Optimization | projectassistant
-
Boat broker - issues with duplicate content and indexing search results
Hello, I have read a lot about optimising product pages and not indexing search results or category pages, as ideally a person should be directed straight to a product page. I am interested in how best to approach a site that is listing second-hand products for sale - essentially a marketplace of second-hand goods (in my case, www.boatshed.com - international boat brokers). For example, we currently have 5 Colvic Sailer 26 boats for sale across the world - that is 5 boats of the same make and model but differing years, locations, sellers and prices. My concern is with search results and 'category' pages. Unlike typical e-commerce sites, when someone searches for a 'Colvic Sailer 26 for sale' I want them to go to a search-results-style page, as it is more useful for them to see a list of boats than one random one that Google decides is most important (or possibly one it can match by location). Currently we have 3 different URL types that show search-results-style pages (i.e. paginated lists of boats that include name, image and short description):
manufacturer URLs, e.g. http://www.boatshed.com/colvic-manufacturer-145.html
category URLs, e.g. barges: http://www.boatshed.com/barges-category-55.html
and normal search results, e.g. dosearch.php?form_boattype_textbox=&....
I have noindexed the search results pages, but our category and manufacturer URLs show up in search results, and ultimately these are pages I want people to land on. I am, however, getting duplicate content warnings in Moz. Most boats are in several categories, and all will come up on one manufacturer page and one manufacturer-and-model page. Both sets of URLs are, in my opinion, needed; lots of users search for exact makes/models and lots of users just search for the type of boat, e.g. 'barge for sale', so both sets of landing pages are useful. Any suggestions or thoughts greatly appreciated. Thanks, Ben
On-Page Optimization | pbscreative
-
Static content vs. dynamically changing content - what is best?
We have collected a lot of reviews and we want to use them on our category pages. We are going to be updating the top 6 reviews per category every 4 days, and there will be another page to see all of the reviews. Is there any advantage to having the reviews static for 1 or 2 weeks vs. having unique new ones pulled from the database every time the page is refreshed? We know there is an advantage if we keep them on the page forever for long tail; however, we have created a new page with all of the reviews that they can go to.
On-Page Optimization | DoRM
-
Duplicate content
Hi everybody, I have been thrown into an SEO project for a website with a duplicate content problem because of a version with and a version without 'www'. The strange thing is that the version with 'www' has more than 10 times more backlinks but is not in the organic index. Here are my questions: 1. Should I go on using the "without www" version as the primary resource? 2. Which kind of redirect is best for passing most of the link juice? Thanks in advance, Sebastian
On-Page Optimization | Naturalmente
-
Duplicate content issues with product pages 1, 2, 3 and so on
Hi, we have this products page, for example, as a landing page: http://www.redwrappings.com.au/australian-made/gift-ideas, and then we have the links to pages 2, 3, 4 and so on:
http://www.redwrappings.com.au/products.php?c=australian-made&p=2
http://www.redwrappings.com.au/products.php?c=australian-made&p=3
In SEOmoz, these are recognized as duplicate page content. What would be the best way to solve this problem? One easy way I can think of is to nominate the first landing page as the 'master' page (http://www.redwrappings.com.au/australian-made/gift-ideas) and add canonical meta links on pages 2, 3 and so on. Any other suggestions? Thanks 🙂
On-Page Optimization | Essentia