Can I use duplicate content in different US cities without hurting SEO?
-
So, I have major concerns with this plan.
My company has hundreds of facilities located all over the country, and each facility has its own website. We have a third-party company working to build a content strategy for us. What they came up with is to create a bank of content specific to each service line; if/when any facility offers that service, they upload the content for that service line to that facility's website. So in theory, you might have 10-12 websites, all in different cities, carrying the same content for a service.
They claim, "Google is smart. It knows the content is all from the same company, and because it's in different local markets, it will still rank."
My contention is that duplicate content is duplicate content, and unless we localize it, Google is going to prioritize one page and give the rest very little exposure in the rankings, no matter where you are. I could be wrong, but I want to be sure we aren't shooting ourselves in the foot with this strategy, because it is a major undertaking and too important to go off in the wrong direction.
SEO Experts, your help is genuinely appreciated!
-
Yes, unfortunately, what the agency is suggesting is a recipe straight out of the duplicate-content cookbook. They are offering a shortcut around the actual work that should go into publishing hundreds of websites.
Has the business ever considered consolidating everything into a single brand with a single site? That way, they could have one page for each city and one page for each service, redirect all the old sites to the new one, and never again worry about creating content for hundreds of microsites.
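If the business ever went that route, the redirects themselves are the easy part. A minimal sketch, assuming an Apache server and hypothetical domains (springfield.example-health.com as an old facility site, www.example-health.com as the new consolidated one):

# .htaccess at the root of the old facility site (Apache mod_alias assumed)
# Sends every path on the old domain to the matching path under the new
# city section, preserving as much link equity as possible.
Redirect 301 / https://www.example-health.com/locations/springfield/

Old URLs without a one-to-one match on the new site can be handled with more specific Redirect or RedirectMatch rules placed above that catch-all, since mod_alias applies the first rule that matches.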
-
Thanks for the response, Roman. I totally agree with you on this, but if you canonical all but one of the pages, those pages will be dropped from the Google index, right? Google will almost always display the original version regardless of location, so those pages will not reap any organic traffic. That pretty much puts us back at the starting point.
To simplify the question: if we use duplicate content on several sites in various locations, is there any way we can get all of those pages to rank in their respective markets?
-
While not appealing, you should rewrite all the content to be 100% unique. For boilerplate pages such as the privacy policy, terms of service, etc., you can noindex them to reduce duplication. Otherwise, your options are limited. I realize that the products/services will be similar in nature, but writing each description in a different way reduces the significantly similar content.
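For those boilerplate pages, the noindex is just a robots meta tag in each page's head; a minimal sketch (note the page must stay crawlable, i.e. not blocked in robots.txt, or Google will never see the tag):

<!-- In the <head> of a privacy policy, terms of service, or similar page -->
<!-- "follow" keeps the page's links crawlable even though the page itself is dropped from the index -->
<meta name="robots" content="noindex, follow">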
Alternatively, you can use a cross-domain canonical tag; this tells Google that the content is intentionally duplicated and identifies the URL you want treated as the original.
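As a sketch, with hypothetical domains standing in for the real ones, every duplicate copy of a service page would carry a canonical tag pointing at the one version you want indexed:

<!-- In the <head> of the duplicate at https://springfield.example-health.com/services/mri/ -->
<!-- Asks Google to consolidate indexing and ranking signals into the version on the other domain -->
<link rel="canonical" href="https://www.example-health.com/services/mri/">

Note that this concedes the point raised above: only the canonical version will rank, so the other cities' copies still won't show up independently in their own markets.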
Here are a few articles about that:
https://moz.com/learn/seo/duplicate-content
https://blog.kissmetrics.com/myths-about-duplicate-content/
https://yoast.com/rel-canonical/
http://webmasters.stackexchange.com/questions/56326/canonical-urls-with-multiple-domains
Next, focus on building local links to the individual city pages to further differentiate the cities and the intent. Also, add schema.org LocalBusiness markup to each version of the URLs (see the sketch below). And, again, I will say this is not an ideal situation; the best-case scenario would be to put all of that content on ONE domain, with different location pages in a subdirectory format.
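On the schema.org point, here is a minimal LocalBusiness sketch in JSON-LD, with placeholder business details; each city's page would carry its own name, address, and phone number:

<!-- Per-location structured data; all values below are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Facility of Springfield",
  "url": "https://springfield.example-health.com/",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701",
    "addressCountry": "US"
  }
}
</script>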