Multiple Sites Duplicate Content Best Practice
-
Hi there, I have a client (atlantawidgets.com) who has a main site, but also has duplicate sites with different URLs targeting specific geo areas, e.g. widgetmakersinmarietta.com.
-
Would it be best to go ahead and create a static home page at these additional sites and make the rest of each site noindexed?
-
Or should I go in and allow more pages to be indexed and change the content? If so, how many: 3, 5, 8? I don't have tons of time at this point.
If I do change content within the duplicate sites, what percentage do I need to change? Does switching the order of the sentences count, or does it need to be 100% fresh?
Thanks everyone.
-
_A tough choice. As the geo-specific sites are not country-code top-level domains, it will be all the more difficult to make Google understand that the purpose of launching these websites is to serve locally targeted readers. It might even look like an attempt at domain farming, whose main purpose is to rank for location-specific keywords by launching keyword-rich domains.
If your business has a physical address in these locations, you can submit your business details to Google Places. And of course you are free to create location-specific pages on your main website, provided you are adding some genuinely interesting details there. No keyword stuffing or rehashed content just to get higher rankings for competitive terms._
-
If the main and duplicate sites are already indexed, I would suggest implementing the rel="canonical" element on the duplicate sites, with the corresponding main-site URLs as the preferred URLs.
New pages can be kept out of the index by adding a "noindex" robots meta tag to each page.
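A minimal sketch of what that markup could look like in the `<head>` of a page on one of the duplicate sites (the page paths here are hypothetical examples, not taken from the actual sites):

```html
<!-- On a duplicate site's page, e.g. a services page on widgetmakersinmarietta.com -->
<!-- Point search engines at the equivalent page on the main site: -->
<link rel="canonical" href="https://atlantawidgets.com/services" />

<!-- Alternatively, for pages that should stay out of the index entirely: -->
<meta name="robots" content="noindex, follow" />
```

Note you would use one or the other per page, not both: the canonical consolidates signals to the main site's URL, while noindex simply keeps the page out of search results.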
All the best!