Do I need different IP addresses for mini sites?
-
Hi everyone
We are currently building some non-advertorial mini sites that link to a main "money site". These mini sites all run on WordPress or similar and have different designs, but the WHOIS data for all of them stays under one company. So I don't know whether you really need different Class C IPs any more, since Google et al. could just look at the WHOIS records and link the websites up that way.
Is this tactic still worth doing?
Thanks for any input!
-
I would certainly say that search engines look at the IP address when measuring link diversity. So the IP address does matter, no matter what is said officially (Cutts included). Just think about how you would do it. Do you think they tell you the truth on every single detail? Don't be so naive.
-
No, no, I only block the WHOIS data on one website, which is just a fun blog I run to make people laugh in my small community in Florida. Being anonymous is half the fun.
What I try to do when I create mini-sites is somehow provide value to the users - that's the best policy. If you create a mini-site ONLY for SEO, you're not going to add much value. But if you create a mini site that gives your users something different than they can find on your main site, you'll be adding value and SEO value as well.
Here's a for-instance. Let's say you are a mortgage broker and your primary website has tons of content on mortgages and the call to action is to submit an application.
Then, you create a mini site that has a ton of generic info and calculators for mortgages, and has listings of data in whatever location you're focusing on ranking your primary site.
This mini site could be a simple Wordpress install and wouldn't take too much work, but it's adding real value for your users.
Does that make sense?
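As a tiny illustration of the kind of calculator such a mini site could offer, here's a minimal sketch of the standard fixed-rate mortgage payment formula (the figures and function name are my own, just for illustration):

```python
def monthly_payment(principal, annual_rate, years):
    """Monthly payment on a fixed-rate mortgage (standard amortization formula)."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # total number of payments
    if r == 0:
        return principal / n      # zero-interest edge case
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

# e.g. a $300,000 loan at 6% over 30 years
print(round(monthly_payment(300_000, 0.06, 30), 2))  # → 1798.65
```

On a WordPress install this would just be a small embedded widget, but the point is the same: the calculator gives visitors a reason to use the mini site beyond the links.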
-
There is no problem with hosting things on the same server; that's true, no argument there.
But if you're planning on setting up microsites to boost one main site and you set them all up on the same IP address, don't expect them to pass along the same link juice as sites on different IP addresses.
-
So do you have many false WHOIS data records, then? Or are you just upfront? I think honesty is the best policy, maybe?
-
Actually, I disagree with Wissam and Tompt. Google has expressly stated that it doesn't look at IP addresses in its rankings. So if you own a dedicated or virtual private server, you can create all of your sites on that server. That's what I've done for years with no problems.
Here is Matt Cutts on IP addresses.
-
Yeah, we do that as well. Some have been hit by the Panda update, though; Squidoo seems to be pulling through OK, while others have died a death! There are some in there I don't use, so thanks for the heads-up!
Of course, the magic is to create some amazing content / something viral that gets hit on a lot, but as we all know that's difficult when budgets are low and you're dealing with a sales-based site instead of something free / personal.
-
Lawrence,
how about leveraging free blog platforms for the link wheel?
WordPress.com
Blogger (http://blogger.com)
LiveJournal
Blogsome
Bravenet
Friendster
Knol
Windows Live Spaces (formerly MSN Spaces, http://msnspaces.com)
Squidoo
Tumblr
Weebly
Webs
HubPages
-
Thanks for the reply! What would you suggest looking into currently, beyond "link wheel" ideas? Away from Twitter et al... contextual link systems seem to be dead, link buying is a no-no... it's just a minefield at the moment!
-
I agree with Wissam Dandan: you definitely need different IP addresses and the other things he listed (not 100% sure on the WHOIS data, but it's likely; better to be safe than sorry). It's an expensive and desperate move, but if you have no other source of links it can be a viable option.
I've used it in very tight niches before and it has helped, but it's a lot of work and a reasonable amount of money you might be better off spending elsewhere.
In addition, if for some reason it's picked up, you lose all that benefit and effort.
-
This tactic is expensive and takes a lot of time to implement, because each mini site needs to be hosted on a different Class C IP, with a different TLD, different WHOIS data, and different GA / AdSense codes (if implemented).
This tactic is mainly popular in the gambling industry.
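For what it's worth, checking whether two domains already share a Class C block is easy to script. Here's a minimal sketch (the domain names and function names are hypothetical, and the comparison needs live network access to resolve):

```python
import socket

def class_c_prefix(ip):
    """First three octets of an IPv4 address, i.e. its 'Class C' /24 block."""
    return ".".join(ip.split(".")[:3])

def same_class_c(domain_a, domain_b):
    """Resolve two domains and report whether their A records share a /24."""
    ip_a = socket.gethostbyname(domain_a)
    ip_b = socket.gethostbyname(domain_b)
    return class_c_prefix(ip_a) == class_c_prefix(ip_b)

# e.g. same_class_c("minisite1.example", "moneysite.example")
```

Note this only inspects the first A record returned; sites behind CDNs or round-robin DNS can resolve differently on each lookup.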