Merging four sites into one... Best way to combine content?
-
First of all, thank you in advance for taking the time to look at this.
The law firm I work for once took a "more is better" approach and built multiple websites with keyword-rich domains. We are a family law firm, but we have a specific site for "Arizona Child Custody," as one example. We have four sites in total.
All four of our sites rank well, although I don't know why. Only one site is under my control; the other three are managed by FindLaw. I have no idea why the FindLaw sites do well, other than being listed in the FindLaw directory. They have terrible, spammy page titles, and after running them through Copyscape, I realized that most of the content FindLaw provides for its attorneys is "spun" articles.
So I have a major task and I don't know how to begin.
- First of all: since all four sites rank well for all of the desired phrases, will combining all of that authority into one site rocket us to stardom? The sites all rank very well now, even though they are all technically terrible.
I would hope that if I redirect the child custody site (as one example) to the child custody overview page on the final merged site, we would maintain our current SERP position for "arizona child custody lawyer."
I have strongly encouraged my boss to merge our sites for many reasons, one of them being that the multiple sites are playing havoc with our Google Places listings. On the other hand, if I take down the child custody site, redirect it, and we lose that ranking, I might be out of a job.
Finally, that brings me down to my last question.
- As I mentioned, the child custody site is "done" very poorly. Should I actually keep the spun content and redirect each and every page to a duplicate on our "final" domain, or should I redirect each page to a better article? This is the part that I fear the most.
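Whichever target you choose for each page, a one-to-one redirect map generally preserves more relevance than pointing every old URL at a single page. As a rough sketch of how such a map could be generated (all domains and paths below are hypothetical placeholders, not the actual sites):

```python
# Hypothetical sketch: map every page on an old niche domain to its best
# equivalent on the merged site, then emit one Apache "Redirect 301" rule
# per page for the old site's vhost. Domains and paths are made up.

NEW_BASE = "https://www.ourfirm.example"  # assumed merged domain

# Page-to-page mapping: old path on the niche site -> path on the merged site.
redirect_map = {
    "/": "/child-custody/",                # old homepage -> topic overview page
    "/faq.html": "/child-custody/faq/",
    "/fathers-rights.html": "/child-custody/fathers-rights/",
}

def apache_rules(mapping, new_base):
    """Emit one 'Redirect 301' line per old path, sorted for readability."""
    return [f"Redirect 301 {old} {new_base}{new}"
            for old, new in sorted(mapping.items())]

for rule in apache_rules(redirect_map, NEW_BASE):
    print(rule)
```

Keeping the map explicit like this also makes it easy to review which old pages are being sent to a duplicate versus a better article before any redirect goes live.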
I am considering subdomains: redirecting the child custody site to childcustody.ourdomain.com, for example. I know for a fact that will work flawlessly; I've done it many times for other clients with multiple domains. However, we have seven areas of practice and we don't have seven nice sites, so child custody would be the only practice area with its own subdomain. And I wouldn't really be solving anything that way, would I? We all know 301 redirects work.
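For the subdomain option, the redirect is just a host swap that preserves each page's path and query string. A minimal sketch, again with placeholder domain names rather than the real ones:

```python
# Hypothetical sketch of the subdomain option: rewrite any URL on the old
# child-custody domain to the same path under a subdomain of the main site.
# Both host names are placeholders for illustration.
from urllib.parse import urlsplit, urlunsplit

OLD_HOST = "arizonachildcustody.example"
SUBDOMAIN_HOST = "childcustody.ourfirm.example"

def to_subdomain(url):
    """Return the 301 target for an old-site URL, keeping path and query."""
    parts = urlsplit(url)
    if parts.netloc != OLD_HOST:
        return url  # not one of ours; leave it untouched
    return urlunsplit(("https", SUBDOMAIN_HOST, parts.path, parts.query, ""))

print(to_subdomain("http://arizonachildcustody.example/faq.html?ref=ad"))
# -> https://childcustody.ourfirm.example/faq.html?ref=ad
```

Because the path is preserved one-for-one, this kind of mapping avoids the page-by-page decisions a full merge would require, which is part of why it feels like it isn't really solving anything.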
What I want is to harness all of this individual power into one mega-site.
Between the four sites, I have 800 pages of content.
I need to formulate a plan of action now, and then begin acting on it. I don't want to make the decision alone. Anybody care to chime in?
Thank you in advance for your help. I really appreciate the time it took you to read this.
-
I like this strategy. As you add new, better content, you can add nice links to it from the old content, giving you a constant source of links from relevant sites. Sweet!
-
Well, Google isn't going to punish you for owning four different websites; it's perfectly fine to own multiple web properties that drive traffic to your business. In fact, you're diversifying your risk by having multiple sites: if one drops in the rankings, you still have three others.
If the other sites are spammy, why would you want that content on your main site anyway? Just include links on the three other sites and point them all to your flagship site. That way your main site still gets the SEO boost, you can build it out however you want, and you don't lose the traffic from the other three sites.
-
Thank you Takeshi.
The reason I want to merge them is that we are a major law firm in Phoenix, and I think it's search engine spam to try to nail Google with niche content sites. Because really, other than the front-page optimization, they aren't "niche" sites at all. They just have niche domains and a niche front-page description.
Beyond that, nothing but spun content and crappy pages with titles like this:
Phoenix Arizona Divorce Lawyer | Child Custody Lawyer | Child Custody Help | Phoenix Tempe Mesa Tucson | Child Custody Lawyers
I feel that having multiple sites amounts to "index" spam on our part. What is the "white hat" argument for one law firm having multiple websites in one major metropolitan area? How could I justify it to Matt Cutts?
-
Maybe keep all the microsites and just point them at one domain from here on out.
-
Sounds like a big and risky endeavor, although there could be potential benefits.
Why are you considering merging the sites if they are already ranking well for their keywords? Are they not ranking high enough? If it ain't broke...
A better strategy might be to simply build more links and add more content to the sites you already have. Another area you could focus on is conversion rate optimization: you're already getting all this traffic for desirable keywords, so you can boost the firm's ROI by figuring out how to turn more of that traffic into leads.