Are multiple domains spammy if they're similar but different?
-
A client currently has a domain of johnsmith.com (not actual site name, of course). I’m considering splitting this site into multiple domains, which will include brand name plus keyword, such as:
- Johnsmithlandclearing.com
- Johnsmithdirtwork.com
- Johnsmithdemolition.com
- Johnsmithtimbercompany.com
- Johnsmithhydroseeding.com
- johnsmithtreeservice.com
Each business is unique enough, and the sites will cross-link to each other.
My questions are: 1) Will Google consider the cross-linking spammy? 2) What happens to johnsmith.com? Should it redirect to the new site with the largest market share, or should it become an umbrella for all of them? 3) Do you foresee any pitfalls?
I've done a fair amount of due diligence and feel these separate domains are legit, but am paranoid that Google will not see it that way, or may change direction in the future.
-
Yes, link building will be extra (and often duplicated) effort. AdWords would be another concern; other maintenance issues not so much. My goal was to target a specific visitor segment, with the bonus of having the top keyword in the domain name, but I recognise there are more benefits to keeping it as one site. Thanks for your and Devanur's responses.
-
Your law analogy is the premise on which my thought is based. Ironically, I SEO a law site that has segmented its dive attorney business, and I have found that targeting this specific market separately works well. The keyword in the domain name is a bonus.
-
The only time I would do this is if each business entity had its own group of people dedicated to the business and you had plenty of content to support each site. Lawyers will sometimes break up Family Law, Business Law, Maritime Law, etc. into different microsites. It's a ton of work, though, to keep each site updated and current.
-Bob
-
Keri got it right. With multiple websites, all your SEO efforts are scattered, along with all the other operational overheads. Here is my take on this: one website, very stable, high and wide, that will become an authority in the niche going forward. Let that one website target all of those keywords/phrases. There is really no need to come up with new keyword-specific domains, and there might be some level of keyword cannibalization with multiple domains targeting the same niche.
If you do want to cross-link the domains you own, you should ideally make those links nofollow.
I conclude by saying, one big website with loads of quality content will always win over multiple small websites in many respects. Those were my two cents.
Best,
Devanur Rafi
-
My first thought is why do you want to split up your link building and content efforts among several sites, and have several sites to maintain instead of one?
Related Questions
-
Why is Moz showing a Spam Score for my new domain?
Hi folks, I just registered a new domain for a boring magazine, but I forgot to check the spam score. Recently I checked, and it is showing a spam score of 46% without any backlinks. You can check: the domain is only 30 days old so far. I need your recommendations on how I can reduce it, and on what basis is Moz showing it as spam?
White Hat / Black Hat SEO | ImranZahidAli
-
'SEO Footers'
We have an internal debate going on right now about the use of a link list of SEO pages in the footer. My stance is that they serve no purpose to people (heatmaps consistently show near-zero activity), therefore they shouldn't be used. I believe that if something on a website is user-facing, then it should also be beneficial to a user, not solely there for bots. There are much better ways to get bots to those pages, and for those people who didn't enter through an SEO page, internal linking where appropriate will be much more effective at getting them there. However, I have some opposition to this theory and wanted to get some community feedback on the topic. Anyone have thoughts, experience, or data to share on this subject?
White Hat / Black Hat SEO | LoganRay
-
Increase in spammy links from image gallery websites, e.g. myimagecollection.net
Hi there. I've recently noticed a lot of spammy links coming from image gallery sites that all look the same, i.e.: http://mypixlibrary.co/ http://hdimagegallery.net/ http://myimagecollection.net/ http://pixhder.com/ Has anyone else seen links from these? They have no contact details, and I'm not sure if they are some form of negative SEO or site spam. Any ideas how to get rid of them? Thanks
White Hat / Black Hat SEO | Kerry_Jones
-
Why isn't a 301 redirect removing old style URLs from Google's index?
I have two questions:
1 - We changed the URL structure of our site. Old URLs were in the format kiwiforsale.com/used_fruit/yummy_kiwi. These URLs are 301 redirected to kiwiforsale.com/used-fruit/yummy-kiwi. We are getting duplicate content errors in Google Webmaster Tools. Why isn't the 301 redirect removing the old-style URL from Google's index?
2 - I tried to remove the old-style URL at https://www.google.com/webmasters/tools/removals, however I got the message that "We think the image or web page you're trying to remove hasn't been removed by the site owner. Before Google can remove it from our search results, the site owner needs to take down or update the content." Why are we getting this message? Doesn't the 301 redirect alert Google that the old-style URL is toast and it's gone?
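The slug rewrite described in this question is mechanical, so it helps to separate the mapping rule from the redirect itself. A minimal sketch of the underscore-to-hyphen mapping (the function name is hypothetical; the server must still answer the old URL with an actual 301 status pointing at the result):

```python
from urllib.parse import urlsplit, urlunsplit

def new_style_url(old_url: str) -> str:
    """Map an old underscore-style URL to the new hyphen-style URL.

    Only the path is rewritten; scheme, host, and query string are
    preserved unchanged.
    """
    parts = urlsplit(old_url)
    return urlunsplit(parts._replace(path=parts.path.replace("_", "-")))

# Example, using the hypothetical domain from the question:
print(new_style_url("https://kiwiforsale.com/used_fruit/yummy_kiwi"))
# https://kiwiforsale.com/used-fruit/yummy-kiwi
```

Note that a 301 is a signal, not a removal request: Google can keep the old URL in its index for a while after recrawling it, which is consistent with the Webmaster Tools behaviour described above.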
White Hat / Black Hat SEO | CFSSEO
-
Controlling crawl speed/delay through dynamic server-code and 503's
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts other sites' performance. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritized higher than bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support that. Although my custom CMS system has a solution to centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (and not other bots) and is administered per site. No solution to all three of my problems. Now I came up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. What traffic portion for which bots can be dynamically (at runtime) calculated from total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about your expert opinions...
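The load-based 503 idea described here amounts to a sliding-window throttle whose budget shrinks as server load rises. A minimal sketch under assumed parameters (the class name, window, and budget are hypothetical; a real version would read actual server utilisation and send a Retry-After header with the 503):

```python
import time
from collections import defaultdict, deque

class BotThrottle:
    """Decide per request whether a bot should get content or a 503.

    Keeps a sliding window of request timestamps per user agent and
    compares the request count against a budget that shrinks linearly
    as server load rises.
    """

    def __init__(self, window_seconds=60, base_budget=120):
        self.window = window_seconds        # sliding window length
        self.base_budget = base_budget      # requests allowed at zero load
        self.hits = defaultdict(deque)      # user agent -> recent timestamps

    def allow(self, user_agent, server_load, now=None):
        """Return True to serve content (200), False to serve a 503.

        server_load is a 0.0-1.0 utilisation figure; at full load the
        per-bot budget drops to zero, so all bot traffic gets 503s and
        user traffic keeps the server.
        """
        now = time.time() if now is None else now
        q = self.hits[user_agent]
        # Drop timestamps that have fallen out of the window.
        while q and q[0] <= now - self.window:
            q.popleft()
        budget = int(self.base_budget * max(0.0, 1.0 - server_load))
        if len(q) >= budget:
            return False  # over budget: respond 503 (ideally with Retry-After)
        q.append(now)
        return True
```

Because the decision is a single function of (user agent, load, time), it can sit centrally in front of all sites on the server, which addresses points 2) and 3) in the question.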
White Hat / Black Hat SEO | internetwerkNU
-
Is it a good idea to target similar versions of a keyword?
Salute you all, I am optimizing a site for an attorney. I have done some good research and found the keyword difficulties. Some of my keywords are very similar, and I was wondering whether this is a good idea and safe (white hat) or not? E.g. page title: 1) city immigration lawyer 2) city immigration attorney. My main and first reason is to target all users, since some will search under 'attorney' and some under 'lawyer'. Secondly, one is easier than the other. I appreciate any input from more experienced SEO experts. Chris 🙂
White Hat / Black Hat SEO | Chris-tx
-
Getting links on competitor's blog
An SEO agency I'm working with has asked if we're okay with guest posting on a competitor's blog. What are the negatives of getting a link from a competitor's blog? Two things I thought of: 1) they can remove the link at any time (why wouldn't you, as a competitor?), and 2) I generally don't want to alert my competition to what I'm doing for SEO and how I'm doing it. Is that enough to not pursue those links? Thanks in advance for your thoughts!
White Hat / Black Hat SEO | pbhatt