Will upgrading my dedicated server improve my site speed?
-
Hi, at the moment I am concerned about the speed of my site, www.in2town.co.uk.
My hosting company is TMD Hosting and my current package is:
- Intel Atom 330, 1.6 GHz, 1 MB L2 cache
- 500 GB storage
- 4 GB RAM
- 10 TB bandwidth
- $159/mo (regularly $189/month)

I am looking at upgrading to:
- 500 GB storage
- 6 GB RAM
- 10 TB bandwidth
- **$219/mo** (regularly $289/month)

Can anyone let me know if this will make a difference to my site speed, please?
-
I'm not familiar with Joomla, but you should be figuring out how to get server-side caching enabled. It looks like Joomla has its own implementation, so you probably don't need an extension.
If you aren't familiar with this kind of configuration, you should probably hire someone qualified to do it.
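For reference, a sketch of what enabling the built-in cache might look like in a Joomla 1.5 `configuration.php`. The values here are illustrative, and the exact variable names can differ between Joomla versions, so check your own file:

```php
// Excerpt from a Joomla 1.5 configuration.php (illustrative values)
var $caching = '1';           // turn on the built-in page cache
var $cachetime = '15';        // cache lifetime in minutes
var $cache_handler = 'file';  // store cached pages on disk
```

The same settings are usually exposed in the admin backend under Global Configuration, which is safer than editing the file by hand.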
-
Thanks for this. What would you say is the problem with the site? I cannot use a CDN; I tried Amazon and it made no difference to my site, and it worked out to be costing me time. I use Joomla 1.5 and have tried adding an Expires header ("Add an Expires or a Cache-Control Header"). On my Joomla site I have GZIP page compression set. I am not really experienced, so I am not sure whether I have covered "Put Stylesheets at the Top" or "Put Scripts at the Bottom". These are the things from the following page, which was recommended by Tommy: http://developer.yahoo.com/performance/rules.html. Any help and advice would be great, as I do not really fancy paying out even more money.
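On Apache, the Expires header and gzip compression are usually set at the server level rather than in Joomla itself. A minimal, hypothetical `.htaccess` sketch, assuming the host has `mod_expires` and `mod_deflate` enabled:

```apacheconf
# Hypothetical .htaccess fragment; requires mod_expires and mod_deflate
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType text/css   "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>

<IfModule mod_deflate.c>
  # Compress text responses before sending them to the browser
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```

You can verify both by checking the `Expires` and `Content-Encoding` response headers in your browser's developer tools.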
-
An increase in monthly payment that high, for 2 gigs of RAM? Ouch.
To answer your question: yes, better hardware generally means better performance. But often the problem is the code or a lack of optimization. Google Chrome has a great developer tool (press F12) with a Network tab that logs requests and the time it took to serve them up. This can help troubleshoot some performance issues. Now switch to the Audits tab and you'll get some more helpful tips to speed up your load time.
Also make sure you have some form of server-side caching enabled. Even the slowest of servers can serve up a page in under a second with proper caching and a method of priming the cache.
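One common way to prime a cache is to request every URL in the site's sitemap once after the cache is cleared, so the server regenerates and stores each page before a real visitor asks for it. A minimal sketch (the sitemap content and URLs here are hypothetical):

```python
import xml.etree.ElementTree as ET

# Sitemaps use this XML namespace on every element
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_from_sitemap(xml_text):
    """Return every <loc> URL found in a sitemap XML document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

# A tiny hypothetical sitemap to demonstrate:
SAMPLE = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.co.uk/</loc></url>
  <url><loc>http://www.example.co.uk/news</loc></url>
</urlset>"""

print(urls_from_sitemap(SAMPLE))
# To actually prime the cache, fetch each URL once, e.g. with
# urllib.request.urlopen(url); each hit populates the server-side cache.
```

A cron job running a script like this right after the cache expires keeps the first real visitor from ever hitting an uncached page.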
-
Hi Diane!
Upgrading your server will help somewhat, but if you want big improvements you had better take a look at these recommendations:
http://developer.yahoo.com/performance/rules.html
Good luck!
-
Hi Diane,
How big is your site? I've seen it, and it doesn't seem like it's hundreds of thousands of pages.
What's weird is that just connecting to the site was slow for me today, but the download speed itself was acceptable. Why don't you speak with TMD before upgrading? There may be no need to pay more; it might be some other issue.