How To Handle Duplicate Content Regarding A Corp With Multiple Sites and Locations?
-
I have a client that has 800 locations; 50 of them are mine. The corporation has a standard website for each location, and the only thing different on each one is the location info. The majority of the content is the same across every location's website.
What can be done to minimize the impact/penalty of having "duplicate or near duplicate" content on their sites? Assuming corporate won't allow the pages to be altered.
-
Cool, keep me posted.
-
Thanks again and let me know if you have any other ideas... I will keep you posted on what happens...
-
Thank you for the info... Some of that is similar to what I was thinking. I feel that corporate is pretty stiff but I will have to try to make a case with them.
-
It is time to speak to the corporation and tell them to get some professional advice. If they have 800 sites that are all essentially the same, then at best most of them will fail to get indexed, and at worst the entire network could face some kind of penalty or de-listing (are these sites linked together?).
What can you do?
Well, if it were me, I would want to do one of the following:
- Substantially rewrite the content across all of the sites and noindex any pages that can't be rewritten
- Not use these corporate sites and build my own. If the corporate site can't be taken down, at least noindex it.
- Noindex the duplicate pages and build your own unique content on the site to attract traffic
- Create one unique, global site that lists all locations in a directory-type affair - maybe even give each location its own portal or subdomain with a blog and location-specific content
- Create unique social portals on Facebook / Google+ and try to make those the main target for search users
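For the noindex options above, a minimal sketch of what the directive looks like in practice (the snippet below is illustrative, not taken from the client's sites):

```html
<!-- In the <head> of each duplicate page: keep links crawlable but keep the page out of the index -->
<meta name="robots" content="noindex, follow">
```

If corporate won't allow the HTML to be touched, the same effect can be achieved at the server level with an `X-Robots-Tag: noindex` HTTP response header, assuming you have access to the server configuration.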
As ever, without a link I have to add the caveat that it's tough to give specific, laser-targeted advice for sites we have not seen, but certainly this is problematic and will undermine any SEO efforts you make on the sites.
In a nutshell - you need to resolve this, or give up on these sites as a way of generating search traffic.
Hope that helps!
Marcus -
Well, similar content certainly can hurt, and pages that are more than 40-50% similar are even worse (that's a rule of thumb, not an official Google threshold). One caution: robots.txt blocks crawling, not indexing - a disallowed URL can still end up in Google's index, so a noindex directive is the safer tool for keeping duplicate pages out.
The content has to be changed. Contact and About pages are the same on many websites, which seems to be fine with Google, as it can't be avoided and Google understands that. But having the home page and other important pages be similar too is not good. Will be good to see what others have to say here.
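The 40-50% figure above is a rough heuristic, but you can estimate how similar two location pages are yourself. A minimal sketch using Python's standard library (the page texts below are made-up placeholders, not real client content):

```python
from difflib import SequenceMatcher

def similarity(text_a: str, text_b: str) -> float:
    """Rough 0-1 similarity between two pages' visible text, compared word by word."""
    return SequenceMatcher(None, text_a.split(), text_b.split()).ratio()

# Two location pages that differ only in the city name score very high:
page_a = "Acme Plumbing serves Denver with 24 hour emergency call outs"
page_b = "Acme Plumbing serves Boulder with 24 hour emergency call outs"
print(round(similarity(page_a, page_b), 2))  # prints 0.9
```

In practice you would extract the visible text from each rendered page first; anything scoring near 1.0 across the network is a strong candidate for rewriting or noindexing.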