Can I use duplicate content in different US cities without hurting SEO?
-
I have major concerns with the plan described below.
My company has hundreds of facilities located all over the country, and each facility has its own website. We have a third-party company building a content strategy for us. What they came up with is a bank of content specific to each service line: if/when a facility offers a given service, they upload the content for that service line to that facility's website. So in theory, you might have 10-12 websites, all in different cities, with the same content for a service.
They claim, "Google is smart; it knows the content is all from the same company, and because it's in different local markets, it will still rank."
My contention is that duplicate content is duplicate content, and unless we "localize" it, Google is going to prioritize one page and the rest will get very little exposure in the rankings, no matter where you are. I could be wrong, but I want to be sure we aren't shooting ourselves in the foot with this strategy, because it is a major undertaking and too important to go off in the wrong direction.
SEO Experts, your help is genuinely appreciated!
-
Yes, unfortunately, what the agency is suggesting is like a recipe from a duplicate content cookbook. They are trying to offer a shortcut around the actual work that should go into publishing hundreds of websites.
Has the business ever considered consolidating everything into a single brand with a single site? That way, you could have one page for each city and one page for each service, redirect all the old sites to the new one, and never worry again about creating content for hundreds of microsites.
-
Thanks for the response, Roman. I totally agree with you on this, but if you canonicalize all but one of the pages, those pages will be dropped from the Google index, right? Google will almost always display the original version regardless of location, and therefore those pages will not reap any organic traffic. That pretty much puts us back at the starting point.
I guess, to simplify the question: if we use duplicate content on several sites in various locations, is there any way we can get all those pages to rank in their respective markets?
-
While not appealing, the real fix is to rewrite all the content so each page is 100% unique. For boilerplate pages like the privacy policy, terms of service, etc., you can noindex them to reduce duplication. Otherwise, your options are limited. I realize that the products/services will be similar in nature, but writing them up in a different way for each site reduces the significantly similar content.
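As a minimal sketch, the noindex directive for those boilerplate pages is a single meta tag (the page paths here are just illustrations):

  <!-- In the <head> of pages like /privacy-policy or /terms -->
  <meta name="robots" content="noindex, follow">

Using "noindex, follow" keeps the page out of the index while still letting crawlers follow the links on it.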
Alternatively, you can use a cross-domain canonical tag; this tells Google that the content is intentionally duplicated and points to the preferred URL.
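For example, assuming facility-a.example.com hosts the version of the service page you want to rank (the domains and path here are hypothetical), each duplicate page would declare it in its <head>:

  <!-- On facility-b.example.com/services/rehab (the duplicate copy) -->
  <link rel="canonical" href="https://facility-a.example.com/services/rehab">

Keep in mind the trade-off raised above: Google will typically consolidate ranking signals into the canonical URL, so the duplicate pages themselves are unlikely to rank in their own markets.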
Here are a few articles about that:
https://moz.com/learn/seo/duplicate-content
https://blog.kissmetrics.com/myths-about-duplicate-content/
https://yoast.com/rel-canonical/
http://webmasters.stackexchange.com/questions/56326/canonical-urls-with-multiple-domains
Next, focus on building local links to the individual city pages to further differentiate the cities and the intent. Also, use schema.org "LocalBusiness" markup on each version of the URLs. And again, I will say this is not an ideal situation; the best-case scenario would be to put that content on ONE domain, just with different location pages in a subdirectory format.
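A minimal JSON-LD sketch of that LocalBusiness markup, with placeholder business details, would sit in the <head> of each city's site:

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Company - Springfield",
    "url": "https://springfield.example.com/",
    "telephone": "+1-555-555-0100",
    "address": {
      "@type": "PostalAddress",
      "streetAddress": "123 Main St",
      "addressLocality": "Springfield",
      "addressRegion": "IL",
      "postalCode": "62701",
      "addressCountry": "US"
    }
  }
  </script>

Because the address and contact details differ on every site, this markup is one of the few signals that genuinely distinguishes otherwise similar pages by market.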
Related Questions
-
Internal Duplicate Content - Classifieds (Panda)
I've been wondering for a while now how Google treats internal duplicate content within classified sites. It's quite a big issue, with customers creating their ads twice, presumably to avoid the price of renewing, or perhaps to put themselves back at the top of the results. Out of 10,000 pages crawled and tested, 250 (2.5%) were duplicate adverts. The same applies to the search results pages, where the site structure allows the same advert(s) to appear under several unique URLs: on one page we have already filtered down to 1 result, yet the left-hand-side filters all return that same 1 advert. Tools like Siteliner and Moz Analytics highlight these as urgent, high-priority issues, but I've always been sceptical. On a large scale, would this count as Panda food in your opinion, or does Google understand that the nature of classifieds is different, and treat it as such? Appreciate thoughts. Thanks.
Intermediate & Advanced SEO | Sayers
-
Duplicate Content For Product Alternative listing
Hi, I have a tricky one here. Cloudswave is a directory of products, and we are launching new pages called "Alternatives to Product X". Such a page displays 10 products that are an alternative to Product X (Page A). Now say you want the alternatives to a similar product within the same industry, Product Y (Page B): you will have 10 product alternatives, but this page will be almost identical to Page A, as the products are similar and in the same industry. Maybe one or two products will differ between the two listings. Even if the SEO tags are different, aren't those two pages considered duplicate content? What are your suggestions to avoid this problem? Thank you, guys.
Intermediate & Advanced SEO | RSedrati
-
Is a different location in page title, h1 title, and meta description enough to avoid Duplicate Content concern?
I have a dynamic website which will have location-based internal pages with a <title> tag, <h1> title, and meta description that will include the subregion of a city. Each page will also have an "info" section describing the generic product/service offered, which will also include the name of the subregion. The specific product/service content will be dynamic, but in some cases will be almost identical, i.e. subregion A may sometimes have the same specific content as subregion B. Will the difference of just the location in each of the above tags be enough to avoid a duplicate content concern?
Intermediate & Advanced SEO | couponguy
-
How can we improve the seo on our site?
Hello everyone. I have been reading through this site for a while and have tried to put together everything I have learned so far. Would any of you mind looking at our site and providing pointers on areas we can still improve, or things I completely missed? I appreciate any feedback you can give! Our site is faithology.com. Thanks again! Brandon
Intermediate & Advanced SEO | BMPIRE
-
Duplicate Content/ Indexing Question
I have a real estate WordPress site that uses an IDX provider to add real estate listings to my site. A new page is created as a new property comes to market, and the page is deleted when the property is sold. I like the functionality of the service, but it creates a significant number of 404s, and I'm also concerned about duplicate content, because anyone else using the same service here in Las Vegas will have thousands of the exact same property pages that I do. Any thoughts on this? And is there a way to have the search engines index only the core 20 pages of my site and ignore future property pages? Your advice is greatly appreciated. See this link for an example: http://www.mylvcondosales.com/mandarin-las-vegas/
Intermediate & Advanced SEO | AnthonyLasVegas
-
Virtual Domains and Duplicate Content
So I work for an organization that uses virtual domains. Basically, we have all our sites on one domain, and these sites can also be shown at a different URL. Example: sub.agencysite.com/store and sub.brandsite.com/store. The problem comes up often when we move a site to a brand's URL versus hosting it on our URL: we end up with duplicate content. For god knows what reason, I currently cannot get my dev team to implement 301s, but they will implement 302s. (Don't ask.) I am also unable to change the robots.txt file for our site. They say that if we allowed people to go in and change this stuff, it would be too messy and somebody would accidentally block a site that was not supposed to be blocked on our domain. (We are apparently incapable toddlers.) Now I have an old site, sub.agencysite.com/store, ranking for my terms while the new site is not showing up. So I am left with this question: if I want to get the new site ranking, what is the best methodology? I am thinking of doing a 1:1 mapping of all pages, setting up 302 redirects from the old to the new, and making the canonical tags on the old pages point to the new ones. My only question is how Google will actually view this setup. On one hand I am saying, "Hey, Googs, this is just a temp thing," and on the other I am saying, "Hey, Googs, give all the weight to this page, got it? Grazie!" So with my limited abilities, can anybody provide me a best-case scenario?
Intermediate & Advanced SEO | DRSearchEngOpt
-
Blog Duplicate Content
Hi, I have a blog, and like most blogs I have various search options (subject matter, author, archive, etc.) which produce the same content via different URLs. Should I implement the rel=canonical tag AND the meta robots tag (noindex, follow) on every page of duplicate blog content, or simply choose one or the other? What's best practice? Thanks, Mozzers! Luke
Intermediate & Advanced SEO | McTaggart
-
Serving different content based on IP location
I have a city-centric website. For the sake of simplicity, say I only have two cities: City A and City B. Depending on a user's IP address, they will get either City A or City B. Users can change their location through JavaScript on pages, but there is no cross-linking between cities. By this, I mean that unless you can read or execute JavaScript, there is no way to get from City A to City B. My concern is this: Googlebot comes to my site, and we serve them City A. How does City B get discovered if Googlebot doesn't execute the JavaScript? We have an XML sitemap plus plenty of backlinks to City B. Is this sufficient? Should I provide a static link to City B (and vice versa) on the homepage for crawling purposes?
Intermediate & Advanced SEO | ChatterBlock