Are different IP addresses enough for sites with similar content?
-
Hi all
We're looking at moving our 2 websites onto a cloud hosting package. The content on our sites is very similar (but not duplicated) so at the moment they are on separate servers.
If we move to the cloud, is it enough for them to have different IP addresses on the same cloud system, or should we host in separate clouds?
Thanks in advance
Heather
-
Hi Ryan
Thanks for your response. I can assure you that pursuing black hat methods was never the intention. I don't think anyone on SEOmoz would want that label. But I can see why you'd see it that way.
It was decided that both websites should offer the same products, each focusing on its 2 key products but then saying "we also provide such and such through our sister brand". We are happy to have the association and for Google to know the 2 sites are related.
The reason I asked the original question was advice from an SEO company we previously used, which said we needed to move one of the sites to a different server if we wanted them to rank for their own keywords. I guess they weren't telling us the whole story?
From what you say, a new content strategy is probably the answer.
-
Frankly, it sounds like what you are asking is "how can I trick the search engines into not realizing these two sites are related?" I am sorry if that is blunt, but it seems to be accurate. Google is exceptionally smart and has tons of data to use to detect manipulation. A few examples:
-
Who owns the domain? If at any time the same entity purchased or was listed as an owner of both domains, it is reasonable to think Google is aware of this information, and no matter what action you take the sites may be considered related.
-
Do the sites use the same code? Even if you change logos and text, if the same base code is used for both sites, it is likely Google can recognize unique aspects of the code and relate the sites.
-
You mentioned similar content. Google is also quite capable of recognizing various forms of rewritten versions of the same content.
-
Are these separate servers with the same host? If so, it is likely they share the same C-block (the same /24 IP range), in which case the sites can be related in that manner.
There are many other means by which Google could establish a relation between sites: the same Google WMT account, sites accessed from the same IPs, shared backlinks; the list is quite long.
If the purpose of varying the IP address is to hide the relationship from Google, I would suggest not even worrying about it. You would be pursuing black hat methods, and pulling it off would require extensive experience and resources.
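The C-block point above is easy to check yourself: two IPv4 addresses sit in the same C-block when their first three octets (i.e. the same /24 network) match. A minimal sketch in Python, using made-up documentation-range addresses:

```python
def same_c_block(ip_a: str, ip_b: str) -> bool:
    # Compare the first three octets: a match means both addresses
    # fall inside the same /24 network ("C-block").
    return ip_a.split(".")[:3] == ip_b.split(".")[:3]

print(same_c_block("192.0.2.10", "192.0.2.99"))    # same /24 -> True
print(same_c_block("192.0.2.10", "198.51.100.7"))  # different /24 -> False
```

Hosts on the same shared hosting provider often land in the same /24, which is one of the weaker but easily observed relation signals.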
-
-
Hi Marcus
We have 4 products. 2 come under the main brand, but the other 2 come under a separate brand that is more appropriate for the market they are targeting.
Each site focuses mainly on its 2 primary products, but also has a section for the secondary products, introducing the other brand and sending visitors to the other website.
Due to the industry we're in, content is pretty controlled, so creating completely different content for each site isn't straightforward.
All I need to know is if we need to be on separate servers. My priority is to get the sites moved asap and not rebuild and rewrite content at this stage.
Thanks
Heather
-
Hey Heather, if the sites are so similar, can I ask why you have two? If they are truly similar, and serve the same market / goals etc, why not make your life easier and maintain one, truly unique site?
If there is a distinct reason for both sites to exist, then it is worth investing some time and effort to make sure that the content is truly unique across both sites.
These kinds of questions are very tough to answer without a clear definition of 'similar', so if you wanted to include a link or provide a little more detail about what the similarities are, we could likely assist further.
Hope this helps,
Marcus
-
Hi Heather,
To be honest, it is the similarity of your content, rather than the IP addresses, that will determine any penalty considerations. Because cloud hosting is virtualised (and can therefore offer different IPs), the sites will be seen as separate dedicated servers as long as the virtualised resource is set up that way (i.e. the cloud hosting is split into 2 virtual servers, each with its own dedicated static IP).
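After a move like this, you can sanity-check that each site really does resolve to its own dedicated address. A minimal sketch, assuming hypothetical hostnames for the two sites:

```python
import socket

def resolve_ipv4(hostname: str) -> set:
    # Collect every IPv4 address the hostname resolves to.
    infos = socket.getaddrinfo(hostname, None, socket.AF_INET)
    return {info[4][0] for info in infos}

# Hypothetical domains; after the migration the intersection should be empty.
# shared = resolve_ipv4("site-one.example") & resolve_ipv4("site-two.example")
# print("Shared IPs:", shared or "none")
print(resolve_ipv4("localhost"))
```

If the two hostnames share any address, the virtual servers are not getting the dedicated static IPs Mike describes.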
Good luck!
Mike