Duplicate Content on Website with Multiple Locations
-
Hi there,
I've spent hours reading posts on duplicate content and googling this but I'm still not sure what to do.
We created a site with two WP installs for a company with two different locations: the landing page is website.com, which links to WP install 1 (website.com/city1) and WP install 2 (website.com/city2). They specifically wanted two separate sites so each could be managed by staff at its own location. However, some of the pages have the same content (i.e., services, policies, etc.), so all of those are showing duplicate content errors. All pages have different city-specific URLs and meta descriptions, but that clearly doesn't help.
We can't redirect the "duplicate" pages because that would take the user to the other city's site. Is there anything we can do? Is this going to significantly damage rankings?
Thanks kindly for any help you can provide.
-
Hi,
It doesn't redirect the user, no. It simply tells Google which URL you prefer to have indexed. That said, I still don't believe this is the best option, since you want BOTH cities to rank. Does it affect rankings? Yes, because you are telling Google that Page A is more valuable than the duplicated Page B, leaving Page B as the less important page for ranking purposes.
So, again, having two versions of the site (one per city) isn't beneficial. Both should live under one domain (a single WP installation) with a "Locations" page for each city, to reduce the self-competition.
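For illustration, a consolidated structure under a single WP install might look something like this (the slugs are just hypothetical placeholders, not anything from your actual site):
website.com/ - shared landing page
website.com/locations/city1/ - City 1 location page (address, hours, staff, local content)
website.com/locations/city2/ - City 2 location page
website.com/services/ - one shared services page
website.com/policies/ - one shared policies page
That way there is only one indexable version of each shared page, and the two city pages compete for their own local queries rather than against each other.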
-
Thank you for your help, Tammy. I read through your links when you first replied and they helped a lot.
Pardon my lack of knowledge here, but I just want to make sure I understand correctly: if I go the rel=canonical route for the "duplicate" pages, it won't actually redirect the user, it will just tell Google where the "original" page lives and that it's duplicated on purpose, correct? Does that then hurt the rankings for the site that isn't treated as the "original" content?
Thanks again for your help.
-
While not appealing, you should rewrite all the content to be 100% unique. For pages like the privacy policy, terms of service, etc., you can noindex them to reduce duplication. Otherwise, your options are limited. I realize that the products/services will be similar in nature, but writing them up in a different way for each city will reduce the significantly similar content.
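As a rough sketch, a noindexed policy page would carry a meta robots tag like this in its <head> (in WordPress this is usually set through an SEO plugin rather than edited by hand):

```html
<!-- On the privacy policy / TOS pages of both city installs:
     keep the page available to visitors, but ask search engines not to index it -->
<meta name="robots" content="noindex, follow">
```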
Alternatively, you can use a cross-domain canonical tag; this tells Google that the content is intentionally duplicated and points it to the version you want indexed (there's a small example after the links below).
Here are a few articles about that:
https://yoast.com/rel-canonical/
http://webmasters.stackexchange.com/questions/56326/canonical-urls-with-multiple-domains
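To give a concrete, hypothetical example of the canonical tag: if you decide the City 1 services page is the version you want indexed, the City 2 services page would point at it like this, while the City 1 page self-canonicalizes (the URLs are just placeholders based on the structure you described):

```html
<!-- In the <head> of website.com/city2/services/ -->
<link rel="canonical" href="https://website.com/city1/services/">

<!-- In the <head> of website.com/city1/services/ (self-referencing) -->
<link rel="canonical" href="https://website.com/city1/services/">
```

Visitors still see the City 2 page as normal; only search engines are told which version to consolidate ranking signals on.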
Next, focus on building local links to the individual city pages to further differentiate the cities and their intent. Also, add schema.org LocalBusiness markup to each version of the URLs. And again, I will say this is not an ideal situation; the best-case scenario would be to put all the content on ONE domain with different location pages in a subdirectory format.
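A minimal LocalBusiness markup sketch for one of the city pages could look like the following (the business details are invented placeholders; use each location's real name, address, and phone number):

```html
<!-- JSON-LD placed on website.com/city1/, describing that location only -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Company - City 1",
  "url": "https://website.com/city1/",
  "telephone": "+1-555-555-0101",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "City 1",
    "addressRegion": "ST",
    "postalCode": "00000",
    "addressCountry": "US"
  }
}
</script>
```

The same markup, with the other location's details, goes on the city2 pages, which helps Google tie each set of pages to its own physical location.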
Related Questions
-
Location National vs. Country - Different Ranking
Why is the ranking for a keyword different for the locations National and, for example, Austria ... both should bring the ranking for the whole country, but often they are different?
-
Content suggestions
Hello, I am looking at the content suggestions in the page optimisation section and was wondering why the system says that some pages cover certain topics and some don't. Is it based on finding "noun phrases" that co-occur (twice the same one), or "noun phrases" that are used in the same context (even though they are different)? Thank you,
-
Why do my Moz duplicate content results show me pages with no noticeably similar content?
Sometimes the "Pages with Duplicate Content" results under Content Issues show pages that, from what I'm able to see or otherwise test, have no duplicate content, save for the same navigation that exists on all of my pages. For example, a recent issue said that the following pages had duplicate content:
https://freezerworks.com/index.php/html/slider-overlay
https://freezerworks.com/index.php/ufaqs/what-do-i-get-with-my-purchase-of-freezerworks
https://freezerworks.com/index.php/videos/fda-and-freezerworks-2
https://freezerworks.com/index.php/lims-testing-module
Even a side-by-side of the page source in a text comparison tool shows nothing but navigation and scripts used in every page. Am I not seeing something?
-
Is there a way to export all your crawl errors for multiple Moz campaigns at once?
We're looking for a simple way to export all crawl errors for our Moz campaigns. More than likely we could use the API, but we were wondering if there was any functionality already built into Moz for exporting all crawl errors.
-
How can I find duplicate pages from a Moz Crawl?
We have many duplicate pages that show up on the Moz Crawl, and we're trying to fix these, but it's very difficult because I can't see a way to isolate the code where the duplicate is found. For instance, http://experiencemission.org/immersion/ is one of our main pages, and the crawl shows one duplicate of http://experiencemission.org/immersion. It appears that one of our staff manually edited the source code in one of our pages but forgot the trailing slash. This would be an easy fix, but the problem is that this page is linked to internally on our website 2423 times, so it's next to impossible to find the code that is incorrect. We have many other pages with this same basic problem. We know we have duplicates, but it's next to impossible to isolate them. So my question is this: when viewing the Moz Crawl data, is there any way to see where a specific duplicate page link is located on our website? Thanks for any and all help!
-
Cannot crawl website with redirect installed on subdomain url
Hi! I want to crawl this website: http://www.car-moderne.ch. I tried, and got back a crawl for just that one URL (not for all the pages of the website). The single-line CSV says that the status of http://www.car-moderne.ch is 200, but in fact it is a 301 redirect to http://www.car-moderne.ch/fr, where the live home page is (the Moz bar sees the 301, not the 200 that the single-line crawl reports). How can I proceed in this case (a 301 redirect being installed on the subdomain URL) to still get a full-fledged CSV with all the broken links, duplicate content, etc.? Thank you for your help! Pascal Hämmerli
-
Duplicate page content
The Moz crawler identifies pages as duplicate content which are not the same. The pages http://www.aignerart.com/abstracts-oil-painting/cicli-colora.html and http://www.aignerart.com/abstracts-oil-painting/murs-de-la-ville.html are marked as duplicates, but they are different paintings. Any ideas?
-
Duplicate page content
Hi guys, the feedback from my campaign suggests I have too much duplicate page content. I've had a look at the CSV file, but it doesn't seem to be abundantly clear which pages on my site have the duplicate content. Can anyone tell me which columns I need to refer to on the sheet to ascertain this information? Also, if the content is only slightly different, will Google still consider it to be duplicate? I look forward to hearing from you.