Two websites focused on different markets but similar content
-
Hi all! I have a client who wants to branch out into another market: they're currently in Northern California and want to open an office in Southern California. What would happen if we put up a second website with similar content, but aimed exclusively at Southern California, with a different office address and all content geared toward the Southern California market? There would be NO linking between the sites. Would that generate a penalty?
Thanks! BB
-
Thanks for your thoughtful response! Cheers!
-
Great. Thanks!
-
Hi,
I agree with Dennis here.
One question that you can ask yourself:
"Why should Google have two websites with substantially similar content in their index?"
So the answer would be to come up with substantially unique content on both websites, so that each of them can get past Google's duplicate-content filters and justify its presence in the index.
In the past, I worked with a client who had multiple websites operating in the same niche but catering to different states in the US. With almost identical content on many of them, they were not ranking well in the search engines overall. We had the content rewritten so that every site was unique (note that all of these domains still operated in the same niche, but with slightly different keywords/phrases based on differences in naming conventions between states). This worked wonders: the results exceeded all of the client's expectations. There was a dramatic increase in the total number of pages from these domains in Google's index, along with a whopping jump in overall rankings for many competitive terms.
With substantially unique, updated, highly relevant and useful content, you will see healthy and steady growth in your SEO ROI. Those were my two cents, my friend. Good luck to you.
Best regards,
Devanur Rafi
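As a quick way to put a number on "substantially similar" before and after a rewrite, here is a minimal sketch (a hypothetical helper script, not a Moz or Google tool) that compares two blocks of page copy using Jaccard similarity over word shingles. Scores near 1.0 mean the two pages are essentially the same text; the sample copy below is invented for illustration.

```python
import re

def shingles(text, k=5):
    """Split text into overlapping k-word 'shingles' (lowercased)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if len(words) < k:
        return {" ".join(words)} if words else set()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(text_a, text_b, k=5):
    """Jaccard similarity of the two pages' shingle sets (0.0 to 1.0)."""
    a, b = shingles(text_a, k), shingles(text_b, k)
    if not (a or b):
        return 1.0  # two empty pages are trivially identical
    return len(a & b) / len(a | b)

# Placeholder copy for two regional pages sharing most of their text
norcal = ("Our team provides expert tax planning services for small "
          "businesses. Call our Northern California office today.")
socal = ("Our team provides expert tax planning services for small "
         "businesses. Call our Southern California office today.")
print(round(similarity(norcal, socal), 2))  # overlap score for the two blurbs
```

Running this over page pairs across the two sites gives a rough before/after measure of how much the rewrite actually differentiated the copy.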
-
Hi BB
I have tried this before (on different boxes, though) and it didn't cause any issues.
I would still advise changing at least bits of the content if possible. The design and everything else could stay the same.
Related Questions
-
Similar content, targeting different states
I have read many answers about not having duplicated pages target different states (or cities). Here is the problem: we have the same content serving different pages in some provinces in Canada, and we intentionally can't change it much. We don't want these pages to compete within the same province. What would be the best approach to avoid a penalty and keep our SERP positions? Initially we thought about hreflang, but we can't really do it on the province/state attributes. Thanks in advance!
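One option that often comes up for near-duplicate regional pages (offered here only as a sketch, not as the right call for this exact case) is rel=canonical, which tells Google which version to index while the duplicates consolidate into it; the trade-off is that only the canonical URL would then rank. A minimal example, with a placeholder URL:

```html
<!-- In the <head> of each near-duplicate province page; the URL is a placeholder -->
<link rel="canonical" href="https://www.example.ca/services/" />
```

Since the goal here is for each province page to rank in its own province, canonicalizing them away may defeat the purpose; it mainly fits when one preferred version ranking is acceptable.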
Intermediate & Advanced SEO | | MSaffou20180 -
What's the best way to A/B test new version of your website having different URL structure?
Hi Mozzers, hope you're doing well. We have a website that has been up and running for a decent tenure, with millions of pages indexed in search engines. We're planning to go live with a new version of it, i.e. a new experience for our users and some changes in site architecture, which include a change in URL structure for existing URLs and the introduction of some new URLs as well. Now, my question is: what's the best way to run an A/B test with the new version? We can't launch it for only part of our users (say, make it live for 50% of users while the remaining 50% see the old/existing site) because the URL structure has changed, and bots will get confused if they start landing on different versions. Will this work if I reduce the crawl rate to ZERO during the A/B tenure? How will this impact us from an SEO perspective? How will those old-to-new 301 URL redirects affect our users? Have you ever faced/handled this kind of scenario? If yes, please share how you handled it, along with the impact. If this is something new to you, I would love to know your recommendations before taking the final call on this. Note: We're taking care of all existing URLs, properly 301 redirecting them to their newer versions, but there are some new URLs which are supported only on the newer version (the architectural changes I mentioned above); these URLs aren't backward compatible, and we can't redirect them to a valid URL on the old version.
Intermediate & Advanced SEO | | _nitman0 -
Thin Content to Quality Content
How should I modify content from thin to high quality? I realized that my pages where the targeted keywords didn't have much keyword density lost massive rankings after the last update, whereas all the pages that did have that density are ranking well. My concern is that the pages ranking well have all the keywords in a single statement, like "Get ABC pens, ABC pencils, ABC colors, etc." at the end of 300 words of content describing ABC, whereas the pages that dropped had a single keyword repeated just twice in a 500-word article. Can this be the reason for a massive drop? Should I add a single statement like the one on the pages that rank well? Is it fine to add just that single line once the page is indexed, or do I need fresh content once again, along with a keyword sentence like the one I mentioned above?
Intermediate & Advanced SEO | | welcomecure1 -
I have search result pages that are completely different showing up as duplicate content.
I have numerous instances of this same issue in our Crawl Report. We have pages showing up on the report as duplicate content: they are product search result pages for completely different cruise products. Here's an example of 2 pages that appear as duplicates: http://www.shopforcruises.com/carnival+cruise+lines/carnival+glory/2013-09-01/2013-09-30 http://www.shopforcruises.com/royal+caribbean+international/liberty+of+the+seas We've used HTML5 semantic markup to properly identify our navigation <nav> and our search widget as an <aside> (it has a large amount of page code associated with it). We're using different meta descriptions and different title tags, and microformatting is even done on these pages so our rich data shows up in Google search. (Rich snippet example - http://www.google.com/#hl=en&output=search&sclient=psy-ab&q=http:%2F%2Fwww.shopforcruises.com%2Froyal%2Bcaribbean%2Binternational%2Fliberty%2Bof%2Bthe%2Bseas&oq=http:%2F%2Fwww.shopforcruises.com%2Froyal%2Bcaribbean%2Binternational%2Fliberty%2Bof%2Bthe%2Bseas&gs_l=hp.3...1102.1102.0.1601.1.1.0.0.0.0.142.142.0j1.1.0...0.0...1c.1.7.psy-ab.gvI6vhnx8fk&pbx=1&bav=on.2,or.r_qf.&bvm=bv.44442042,d.eWU&fp=a03ba540ff93b9f5&biw=1680&bih=925 ) How is this distinctly different content showing as duplicate? Is SEOmoz's site crawl flawed (or just limited) in not understanding that my pages are not duplicates? Copyscape does not identify these pages as duplicates. Should we take these crawl results more seriously than Copyscape? What action do you suggest we take? </aside> </nav>
Intermediate & Advanced SEO | | JMFieldMarketing0 -
404 for duplicate content?
Sorry, I think this is my third question today... but I have a lot of duplicated content on my site. I use Joomla, so there's a lot of unintentional duplication; for example, www.mysite.com/index.php exists alongside the root URL, etc. Up till now, I thought I had to 301 redirect or rel=canonical these "duplicated" pages. However, can I just 404 them? Is there anything wrong with this practice in regards to SEO?
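For the /index.php duplicate specifically, a 301 to the root is commonly preferred over a 404, since it consolidates any link equity the duplicate URL has earned. A hedged sketch for an Apache .htaccess, assuming mod_rewrite is enabled (the rule may need adapting to the particular Joomla setup):

```apache
RewriteEngine On
# 301-redirect direct browser/bot requests for /index.php to the site root.
# Matching on THE_REQUEST (the raw request line) avoids redirecting Joomla's
# own internal rewrites, which also route through index.php.
RewriteCond %{THE_REQUEST} ^[A-Z]+\s/index\.php[\s?] [NC]
RewriteRule ^index\.php$ https://www.mysite.com/ [R=301,L]
```

A rel=canonical on the duplicate pointing at the root achieves a similar consolidation when a server-level redirect isn't practical.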
Intermediate & Advanced SEO | | waltergah0 -
Different Google result for same keyword in different countries
Why does Google display different results in each country for the same keyword and the same language? How can I take advantage of this to position my website in a specific country? In this case the domain is always a ".com" domain.
Intermediate & Advanced SEO | | abellojuan0 -
Website layout for a new website [Over 50 Pages & targeting Long Tail Keywords]
Hey everyone, We are designing a new website with over 50 pages and I have a question regarding the layout. Should I target my long tail keywords via blog pages? It will be easier to manage, list, and link out to similar articles related to my long tail keywords using a WordPress blog. For this example, let's suppose the website is www.orange.com and we sell 'Oranges'. Am I going about this in the right way?
Main Section:
Main Section 1: Home Page - Keyword Targeted - Orange
Main Section 2: Important Conversion page - 'Buy oranges'
Long Tail Keyword (LTK) 1: www.orange.com/blog/LTK1
Subsections (SS): www.orange.com/blog/LTK1/SS1, www.orange.com/blog/LTK1/SS1a, www.orange.com/blog/LTK1/SS1b
Long Tail Keyword (LTK) 2: www.orange.com/blog/LTK2
Long Tail Keyword (LTK) 3: www.orange.com/blog/LTK3
Subsections (SS): www.orange.com/blog/LTK1/SS3, www.orange.com/blog/LTK1/SS3a, www.orange.com/blog/LTK1/SS3b
All these long tail pages, and the subsections under them, are built specifically to host content targeting these specific long tail keywords. Most of my traffic will initially come via the subsection pages, and it is important for me to rank well for these terms early on. E.g., if someone searches for the keyword 'SS3b' on Google, my corresponding page www.orange.com/blog/LTK1/SS3b should rank well on the results page. For ranking purposes, will using this blog/category structure hurt or benefit me? Or do you think I should build static pages instead? Also, we are targeting more than 50 long tail keywords, building quality content for each, and I assume we will be doing this continuously. So in the long term, which is more beneficial? Do you have any suggestions on whether I am going about this the right way? Apologies for using these random terms (oranges, LTK, SS, etc.) in this example, but I hope the question is clear. Looking forward to some interesting answers on this! Please feel free to share your thoughts. Thank you!
Natasha
Intermediate & Advanced SEO | | Natashadogres0 -
NOINDEX content still showing in SERPS after 2 months
I have a website that was likely hit by Panda or some other algorithm change; the hit finally occurred in September of 2011. In December my developer set the following meta tag on all pages that do not have unique content: <meta name="robots" content="NOINDEX" />. It's been 2 months now and I feel I've been patient, but Google is still showing 10,000+ pages when I do a search for site:http://www.mydomain.com. I am looking for a quicker solution. Adding this many pages to robots.txt does not seem like a sound option, and the pages have been removed from the sitemap (about a month ago). I am trying to determine the best of the following options, or find better ones:
1. 301 all the pages I want out of the index to a single URL based on the page type (location and product). The 301 worries me a bit because I'd have about 10,000 pages all 301ing to one or two URLs. However, I'd get some link juice to those pages, right?
2. Issue an HTTP 404 code on all the pages I want out of the index. The 404 code seems like the safest bet, but I am wondering if it will have a negative impact on my site, with Google suddenly seeing 10,000+ 404 errors.
3. Issue an HTTP 410 code on all the pages I want out of the index. I've never used the 410 code, and while most of those pages are never coming back, eventually I will bring a small percentage back online as I add fresh new content. This one scares me the most, but I am interested if anyone has ever used a 410 code.
Please advise, and thanks for reading.
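If the 410 route above were chosen, it doesn't require touching each page individually; a sketch for an Apache .htaccess, assuming the thin pages share URL prefixes (the path patterns below are placeholders, not the real site's paths):

```apache
# Return "410 Gone" for the retired thin-content sections (placeholder paths).
# mod_alias's RedirectMatch accepts a status code in place of a target URL.
RedirectMatch 410 ^/locations/.*$
RedirectMatch 410 ^/products/thin/.*$
```

Pages brought back later would just need their rules removed, though re-crawling and re-indexing after a 410 can take time.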
Intermediate & Advanced SEO | | NormanNewsome0