Shared Hosting vs. VPS Hosting
-
From an SEO perspective, what are the advantages of VPS hosting vs. shared hosting for a local website that has fewer than 200 pages and gets a maximum of 2,000 hits per month?
Is VPS Hosting worth the extra expense for a local Real Estate Website?
-
What is the site built on? If WordPress, check out WP Engine (no affiliation). Just saw this post and wanted to respond, as I had tons of downtime when I migrated a site to a shared environment on another host. The site kept going down no matter what I tried with support. So you can imagine the frustration and how this can impact the business, not just from a search perspective. I think the real question here is how much stability you need. Does shared work? Yeah, sure. But if you are putting something that is the core of your business on a $4.95/month host, then you get $4.95 worth of insurance.
-
The 24–48 hour time frames are highly overstated. There are numerous methods of transferring a site with no downtime. Even if you do have downtime, it is typically a couple-hour window (locally) during which some users will reach the old site while others reach the new one. For a relatively static site that serves a local area, such as a local realty site, the issue is minimal.
Here is the bottom line: your current site is hosted on a shared server so restrictive that you can't even add a 301 redirect. To be candid, a site that has operated under those conditions cannot reasonably be so hyper-sensitive that it couldn't survive the short transition period of a simple server change. If I am mistaken, then you need to hire a professional developer to manage the migration on your behalf.
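For reference, on a host that does allow `.htaccess` overrides, adding a 301 redirect is trivial. A sketch for Apache (assumes `mod_alias` and `mod_rewrite` are enabled; the paths are hypothetical):

```apache
# Permanently redirect one old URL to its new location (mod_alias)
Redirect 301 /old-listing.html /new-listing.html

# Or, with mod_rewrite, redirect an entire old directory
RewriteEngine On
RewriteRule ^old-listings/(.*)$ /listings/$1 [R=301,L]
```

A host that blocks even this is a strong signal to move.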
-
I was also wondering, what is your opinion on the rankings impact of changing host providers?
If done correctly, there is no negative impact at all. The primary risk is during the switchover itself.
The issues of duplicate content and split links should not be a factor. Those issues can potentially occur when URLs change, which does not happen if you are simply changing hosts. The same applies to the www vs. non-www and index.html issues. If these issues are not presently occurring, and you properly move all files associated with your site, and your new server is properly configured, the items you mentioned will not be considerations.
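As a side note, the www/non-www and index.html duplicates you mention are normally handled at the server level rather than by the host's control panel. A hedged sketch for Apache (assumes `mod_rewrite` is enabled; `example.com` is a placeholder for your domain):

```apache
RewriteEngine On

# Canonicalize non-www requests to the www version (placeholder domain)
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Collapse /index.html (at any level) onto the directory root
RewriteRule ^(.*/)?index\.html$ /$1 [R=301,L]
```

Your new host should allow rules like these; confirm that before signing up.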
-
I have reached out to several hosting companies, and they stated that my site will be down for 24 to 48 hours if I transfer it to their service.
I am sure that having the site down would impact my rankings. They said it's because I will need to update the domain's name servers.
Is there any way to prevent downtime while transferring a site?
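The usual zero-downtime approach is to keep the site live on both the old and new servers simultaneously, update DNS, and only cancel the old account once propagation is complete. A small sketch of a propagation check, assuming you know the new server's IP (the domain and IP below are hypothetical):

```python
import socket

NEW_SERVER_IP = "203.0.113.10"  # hypothetical IP of the new host

def dns_points_to(domain, expected_ip, resolver=socket.gethostbyname):
    """Return True once the domain resolves to the new server's IP."""
    try:
        return resolver(domain) == expected_ip
    except socket.gaierror:
        # Domain did not resolve at all
        return False

# Example: poll this until it returns True, then cancel the old host
# dns_points_to("example.com", NEW_SERVER_IP)
```

During the propagation window, visitors hit whichever server their resolver returns, and both serve the same site, so no one sees downtime.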
-
Thank you for your answer. I was also wondering, what is your opinion on the rankings impact of changing host providers?
I ask this because my current web host doesn't allow me to add a 301 redirect.
I have heard that changing providers can negatively impact search rankings because of the change of IP address.
Note: my concern is duplicate content and split links to my site, from the www and non-www versions of the home page along with the index.html version.
-
Is VPS Hosting worth the extra expense for a local Real Estate Website?
In short, absolutely. Longer answer below.
Shared hosting typically involves hundreds of websites on a single server. It is the lowest tier of hosting offered. It is very common to have all types of bad sites (porn sites, mail spammers, etc.) on the same server as your real estate site. Server outages are common, along with numerous other issues such as having your mail server flagged for spam.
A VPS typically divides a server's resources, like a pizza, into 8–12 "slices". You get dedicated resources assigned to your site, along with more than a 90% reduction in the population of users sharing the machine. If you care at all about SEO, which you apparently do based on your presence at SEOmoz, you should consider a VPS at a minimum for site hosting.
I have worked with clients in the past who tried shared hosting with quality hosts. I worked with those hosts to move the sites to other shared servers they deemed more mature and stable; the sites still experienced monthly outages. Typical site owners are not even aware of these outages. As part of a solid SEO plan, you should use a monitoring service to notify you of any site outages. I use Alertra, but there are many similar services available.
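If you'd rather not pay for a monitoring service, even a minimal script run from cron can log outages. A sketch using only the Python standard library (the URL is a placeholder, and a real monitor should probe from outside your own network):

```python
import urllib.request
import urllib.error

def check_site(url, timeout=10, opener=urllib.request.urlopen):
    """Probe a URL once; return (is_up, detail)."""
    try:
        with opener(url, timeout=timeout) as resp:
            # Treat 2xx/3xx as up; 4xx/5xx raise or fall through as down
            return (200 <= resp.status < 400, f"HTTP {resp.status}")
    except urllib.error.URLError as e:
        return (False, f"unreachable: {e.reason}")

# Run from cron every few minutes; alert (email, SMS) whenever is_up is False.
# check_site("http://www.example.com/")
```

Dedicated services are still preferable because they probe from multiple geographic locations, but even this catches the outages a site owner would otherwise never notice.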
-
For a site of that size and traffic volume, I think you'll be fine with shared hosting, as long as you choose a reputable, quality host, which can be difficult when it comes to shared hosting environments!
I hesitate to recommend specific hosting companies because I've had both good and bad experiences with a number of hosts. Do your research and find a host with great support, a solid reputation, and high standards.