Server and multiple sites
-
We run multiple sites selling similar products in different ways, but we have always kept them separate on the off chance that Google doesn't like it or penalizes one site.
We have always hosted them on different servers, but since they are currently on shared hosting, we are now thinking of moving them to a single server of our own for performance reasons. We don't know the SEO considerations, though.
We can assign multiple IPs to the server, but I am not 100% sure whether running multiple sites on the same server still has a negative impact, even if each site uses a different IP.
Any help would be appreciated. What I am really asking is: if the sites are on the same server with different IPs, could Google still link them together?
-
If the look and feel is different on each site and each has its own content, it would not really matter. Just be careful about interlinking those sites with each other. If you can assign each website its own IP, that would be best!
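One quick way to see things from the outside is to check what each domain resolves to. This is a minimal sketch, not part of the original answer; the domain names are hypothetical placeholders, and the resolver is injectable so the logic can be checked without live DNS.

```python
import socket

def resolve_ip(domain, resolver=socket.gethostbyname):
    """Resolve a domain to its IPv4 address; resolver is injectable for testing."""
    return resolver(domain)

def share_ip(domain_a, domain_b, resolver=socket.gethostbyname):
    """Return True if both domains currently resolve to the same IPv4 address."""
    return resolve_ip(domain_a, resolver) == resolve_ip(domain_b, resolver)

if __name__ == "__main__":
    # Hypothetical DNS table -- replace with your own sites and drop the
    # resolver argument to query real DNS.
    fake_dns = {"site-one.example": "203.0.113.10",
                "site-two.example": "203.0.113.20"}.get
    print(share_ip("site-one.example", "site-two.example", resolver=fake_dns))  # → False
```

Keep in mind that separate IPs are only one signal among many; shared WHOIS details, analytics IDs, and heavy cross-linking can tie sites together regardless of IP.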
Related Questions
-
Scraped site, hijacked searches for business name.
Hello, I have a site that was scraped (possibly by a competitor's SEO company), who then built links to the duplicate site. When people search for the name of the business, the scraped site is all that comes up, along with the usual third-party sites. They seem to take the site down and put it back up every couple of weeks to maintain the rankings in Google. Has anyone ever dealt with something like this? Any advice or recommendations would be appreciated.
Search: LIC Dental Associates
Scraped site: old-farmshow.net
Legit site: licdentalassociates.com
Thanks, Emery
White Hat / Black Hat SEO | tntdental1
Site De-Indexed except for Homepage
Hi Mozzers, our site has suddenly been de-indexed from Google and we don't know why. All pages are de-indexed in Google Webmaster Tools (except for the homepage and sitemap), starting after 7 September. Please see the screenshot attached: 7 Sept 2014 - 76 pages indexed in Google Webmaster Tools; 28 Sept until current - 3-4 pages indexed, including the homepage and sitemaps. Site is: (removed). As a result, all rankings for child pages have also disappeared in the Moz Pro Rankings Tracker; only the homepage is still indexed and ranking. It seems like a technical issue is blocking the site. I checked robots.txt, noindex, nofollow, and canonical tags, and ran a site crawl for 404 errors, but can't find anything. The site is online and accessible, and no warnings or errors appear in Google Webmaster Tools. One recent change: we moved from a shared to a dedicated server around 7 Sept (same host and location). Prior to the move our preferred domain was www.domain.com WITH www; during the move, the host set our domain as domain.tld WITHOUT the www. Running a site:domain.tld vs site:www.domain.tld command now finds pages indexed under the non-www version, but no longer under the www version. Could this be a cause of the de-indexing? Yesterday we had our host reset the domain to use www again and we resubmitted our sitemap, but there is no change yet in the indexing. What else could be wrong? Any suggestions appreciated. Thanks.
White Hat / Black Hat SEO | emerald
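A www/non-www switch like the one described is usually repaired with permanent (301) redirects from the non-preferred host to the preferred one; temporary redirects (302/307) don't consolidate signals the same way. The sketch below is illustrative only (example.com is a placeholder): it takes a recorded redirect chain as `(status, location)` pairs and checks whether the non-www host permanently redirects to the www host, so the logic can be verified offline.

```python
from urllib.parse import urlparse

def redirect_restores_www(start_url, hops):
    """Check that a non-www URL permanently redirects to the www host.

    `hops` is the redirect chain as a list of (status_code, location)
    pairs, one per response from the server.
    """
    host = urlparse(start_url).hostname or ""
    if host.startswith("www."):
        return True  # already on the preferred host
    for status, location in hops:
        if status not in (301, 308):
            return False  # 302/307 are temporary and don't consolidate signals
        host = urlparse(location).hostname or ""
        if host.startswith("www."):
            return True
    return False
```

After the redirect is in place, resubmitting the sitemap and verifying both host variants in Webmaster Tools is the usual follow-up; reindexing still takes time.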
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst offender (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts other sites' performance. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritized over bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems. Now I have come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a portion of the bot traffic. That portion, per bot, can be calculated dynamically at runtime from total server load at that moment. So if a bot makes too many requests within a certain period (or breaks whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? It will add indexing latency, but slow server response times also hurt rankings, which is arguably worse than indexing latency. I'm curious about the experts' opinions...
White Hat / Black Hat SEO | internetwerkNU1
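The load-shedding idea above can be sketched as a small decision function. This is a minimal, hypothetical sketch, not the poster's actual code: the load thresholds and User-Agent markers are assumptions to tune, and `os.getloadavg()` is POSIX-only. Search engines generally treat 503 as a temporary condition (ideally served with a `Retry-After` header), but serving it persistently for long periods can lead to pages being dropped, so the shedding should stay proportional to load.

```python
import os
import random

# Assumed thresholds -- tune for your own server.
LOAD_SOFT_LIMIT = 4.0   # start refusing some bot requests above this 1-min load
LOAD_HARD_LIMIT = 8.0   # refuse essentially all bot requests above this

BOT_MARKERS = ("bot", "crawler", "spider")  # crude User-Agent sniffing

def is_bot(user_agent):
    """Heuristically classify a request as bot traffic from its User-Agent."""
    ua = (user_agent or "").lower()
    return any(marker in ua for marker in BOT_MARKERS)

def refusal_probability(load_1min):
    """Fraction of bot requests to answer with a 503, scaled by server load."""
    if load_1min <= LOAD_SOFT_LIMIT:
        return 0.0
    if load_1min >= LOAD_HARD_LIMIT:
        return 1.0
    return (load_1min - LOAD_SOFT_LIMIT) / (LOAD_HARD_LIMIT - LOAD_SOFT_LIMIT)

def should_serve_503(user_agent, load_1min=None, rng=random.random):
    """Decide per request whether to shed this bot with a 503 + Retry-After."""
    if load_1min is None:
        load_1min = os.getloadavg()[0]  # POSIX only
    return is_bot(user_agent) and rng() < refusal_probability(load_1min)
```

Because the decision reads the machine-wide load average, one such check in a shared front controller (or reverse proxy) covers all sites and all bots at once, which addresses requirements 1-3 without per-site configuration.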
I Think My Site Has Been Hacked
I am working with a client and have noticed lots of 500 server errors that look very strange in their Webmaster Tools account. I am seeing URLs like blog/?tag=wholesale-cheap-nfl-jerseys-free-0702.html and blog/?tag=nike-jersey-shorts-4297.html; there are 155 similar pages, yet the client does not sell anything like this and hasn't created these URLs. I have updated WP and all plugins and cannot find these links or pages anywhere on the site, but I am guessing they are slowing the site down, as GWT keeps flagging them as errors. Has anybody had experience with these types of hacks who can point me in the right direction on how to clean it up properly? Ta
White Hat / Black Hat SEO | fazza470
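Spam injections like the jersey URLs above are typically generated by obfuscated PHP dropped into the theme or plugin folders, so a first diagnostic step is to grep the tree for telltale patterns. This is a hedged sketch, not a complete cleanup: the pattern list is an assumption based on commonly reported injections, and a clean scan does not prove the site is clean (database-stored payloads won't show up here).

```python
import re
from pathlib import Path

# Patterns commonly seen in injected spam/backdoor code -- extend as needed.
SUSPICIOUS = re.compile(
    r"base64_decode|eval\s*\(|gzinflate|str_rot13|cheap-nfl-jerseys",
    re.IGNORECASE,
)

def scan_tree(root):
    """Yield (path, line_number, line) for suspicious lines in PHP files under root."""
    for path in sorted(Path(root).rglob("*.php")):
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue  # unreadable file; skip rather than abort the scan
        for lineno, line in enumerate(text.splitlines(), start=1):
            if SUSPICIOUS.search(line):
                yield path, lineno, line.strip()
```

Whatever the scan finds, the reliable fix is still restoring core, theme, and plugin files from known-good copies, rotating all credentials, and then requesting removal of the spam URLs.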
Can links from an old site raise DA for another site? Or is it just unethical?
This may be an odd question. A competing company went out of business, and their domain name is now available. Just for research purposes: would you ever buy an expired competitor's domain name and point it at another site to collect its link juice, or would that be unethical? The site only had a DA of 10, but I'm not sure whether one, it's bad to buy a competing company's expired domain, and two, even though it's in the same industry, it would be bad to point it at another site or build a new site on it. Just curious about your thoughts.
White Hat / Black Hat SEO | asbchris0
Google authorship and multiple sites with multiple authors
Hi guys :). I am asking for your help - basically I would like to know the best way to set all of this up. I have two main (e-commerce) sites and a few other big web properties. What I would like to know is whether it is OK to link the main sites to my real G+ account and use alias G+ accounts for the other web properties, or is that a kind of spamming? The thing is that I use one G+ account for the e-commerce sites and would not necessarily want the other web properties linked to the same G+ account, as they are not really related. I hope I was clear. Any insight would be appreciated. Thanks.
White Hat / Black Hat SEO | sumare0
Site Maps
I have provided a sitemap to Google, but although it crawls my site www.irishnews.com at 6:45 AM, the details in the sitemap are not seen on Google for a few days - any ideas on how to get this working better would be great. Example:
<url>
  <loc>http://www.irishnews.com/news.aspx?storyId=1126126</loc>
  <priority>1</priority>
  <lastmod>2012-01-23</lastmod>
  <changefreq>never</changefreq>
</url>
Thanks.
White Hat / Black Hat SEO | Liammcmullen
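One thing worth noting about the entry above: `<changefreq>never</changefreq>` combined with an old `<lastmod>` tells crawlers there is nothing new to fetch, which may not help a news site; `changefreq` and `priority` are hints Google largely ignores anyway, while an accurate `lastmod` is the useful field. The sketch below generates a well-formed entry programmatically; the field values shown are illustrative, not a recommendation from the original thread.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """Serialize (loc, lastmod, changefreq, priority) tuples as sitemap XML."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod, changefreq, priority in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod      # W3C date, e.g. 2012-01-23
        ET.SubElement(url, "changefreq").text = changefreq
        ET.SubElement(url, "priority").text = priority
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    ("http://www.irishnews.com/news.aspx?storyId=1126126",
     "2012-01-23", "daily", "1.0"),
])
```

For time-sensitive stories, a dedicated Google News sitemap is typically the faster route into the index than the standard sitemap, which Google fetches and processes on its own schedule.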
How can I make use of multiple domains to aid my SEO efforts?
About a year ago, the business I work for purchased 20+ domains: sendmoneyfromcanada.com, sendmoneyfromaustralia.com, sendmoneyfromtheuk.com, sendmoneyfromireland.com - the list goes on, but you get the main idea. They thought the domains could be useful to aid http://www.transfermate.com/. I could set up a few microsites on them, but from that point on there would be no one to maintain them. And I'm honestly not happy with hosting multiple sites on one IP and having them all link to the flagship - it is spammy and brings no value to end users. I might be missing something, so my question is: can I use these domains to boost my rankings while avoiding any shady/spammy techniques? P.S. I had the idea of auctioning the domains to cover the domain registration fees.
White Hat / Black Hat SEO | Svetoslav0