Long load time
-
My site takes double the time per KB compared with my competitors'.
It's hosted on shared hosting with GoDaddy.com.
Any ideas why this may be happening?
-
To be fair, your site isn't really overly slow.
www.appliance-repair-ny.com loads in an average of 3.9 seconds and is 194 KB
www.all-appliance-repair-ny.com loads in an average of 5.6 seconds and is 327 KB
www.newyorkappliancerepair.net loads in an average of 1.5 seconds and is 115 KB
And I think that's measured from Sweden. Your server is in Arizona, so it will be quicker from NY.
You could gzip your CSS, but it's not really going to give you a big improvement.
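If you do want to try gzip, GoDaddy shared plans generally run Apache, so an .htaccess fragment like this is the usual route (whether mod_deflate is enabled on your particular plan is an assumption you'd need to check with them):

```apache
# Sketch of an .htaccess gzip rule - only works if the host has mod_deflate enabled
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css text/javascript application/javascript
</IfModule>
```

You can verify it took effect by checking for a Content-Encoding: gzip response header on your CSS files.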
Yes, shared hosting will always be slower than a dedicated server, but for the cost I don't think it's worth moving to a dedicated server and CDN delivery.
If you really wanted to track it, you could add Webmaster Tools and (do you not have Analytics on the page?) add _gaq.push(['_trackPageLoadTime']); to your Google Analytics snippet. This would let you see what times Google thought your page was loading in.
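For reference, with the old asynchronous (ga.js) tracker that call just sits in the _gaq command queue alongside the pageview; a sketch of the page snippet, with UA-XXXXXXX-1 as a placeholder for your own property ID:

```javascript
// Legacy asynchronous Google Analytics (ga.js) snippet with page-load timing.
// UA-XXXXXXX-1 is a placeholder - substitute your own property ID.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXXX-1']);
_gaq.push(['_trackPageview']);
_gaq.push(['_trackPageLoadTime']); // samples load times into the Site Speed report

// Standard async loader for ga.js
(function() {
  var ga = document.createElement('script');
  ga.type = 'text/javascript';
  ga.async = true;
  ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') +
           '.google-analytics.com/ga.js';
  var s = document.getElementsByTagName('script')[0];
  s.parentNode.insertBefore(ga, s);
})();
```

Note that _trackPageLoadTime only samples a portion of visits, so give it a few days before reading anything into the numbers.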
Speed is unlikely to be a defining ranking factor for you and you should concentrate your efforts more on acquiring links, reviews, and local optimisation.
-
-
What are your site and your competitors' sites? It could be a lot of things.
Related Questions
-
URLs too long, but run an eCommerce site
Hi, When I started out I was pretty green to SEO and didn't consider the usability/SEO impact of URL structure. Flash forward: I'm 5 years deep into using the following: mysite.com/downloads/category/premium-downloads/sub-category/ ("category" is quite literally one rung on the link - thanks, WordPress - however "sub-category" is a placeholder). I run a digital downloads store, and I now have hundreds of internal links beholden to this hideous category linking structure, not to mention external links in Google Ads, etc. I would LOVE to change this, but if I were to do so, what should I consider? For instance, is there a checklist for making a change like this? I was thinking of changing it to something like the following: mysite.com/shop/c/premium/sub-category/ And also, how much damage, if any, would this be doing to my SEO? Thanks in advance,
Lou
Technical SEO | LouCommaTheCreator
Is a short URL path stronger than a long one for an eshop?
Hello Moz Fans! I'm building an eshop website, and while reviewing a few competitors' websites I found something interesting to which I don't have the full answer. Would it be better for me to organise the products in a subfolder or in the root folder (option 1 or option 2)?
Competitor link: shop.com/en/2628-buy-key-origin-the-sims-4-seasons/
Option 1 - normal organisation (+ we can add relevant KW in /products/; product URLs will be one folder deeper):
home/
home/category/ (category page)
home/category/subcategory1
home/category/subcategory2
home/products (this page does not really exist)
home/products/product1
home/products/product2
Option 2 - fewer folders (+ we can add all KW in the link directly; it may be less organised for Google):
home/
home/category/ (category page)
home/category/subcategory1
home/category/subcategory2
home/product1 (all products directly in the root)
home/product2
Technical SEO | kh-priyam
Google Sitemap - How Long Does it Take Google To Index?
We changed our sitemap about 1 month ago and Google has yet to index it. We have run a site: search and we still have many pages indexed, but we are wondering: how long does it take for Google to index our sitemap? The last sitemap we put up had thousands of pages indexed within a fortnight, but for some reason this version is taking way longer. We are also confident that there are no errors in this version. Help!
Technical SEO | JamesDFA
Javascript to manipulate Google's bounce rate and time on site?
I was referred to this "awesome" solution to high bounce rates. It is supposed to "fix" bounce rates and lower them through this simple script. When the bounce rate goes way down, rankings dramatically increase (interesting study, but not my question). I don't know JavaScript, but simply adding a script to the footer and watching everything fall into place seems a bit iffy to me. Can someone with experience in JS help me by explaining what this script does? I think it manipulates the reporting it does to GA, but I'm not sure. It was supposed to be placed in the footer of the page, and then sit back and watch the dollars fly in. 🙂
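For context, scripts sold this way usually just fire a timed Google Analytics interaction event so the session no longer counts as a bounce. A minimal sketch of the common pattern for the old ga.js tracker (the 15-second delay and the event names here are assumptions for illustration, not taken from the actual script):

```javascript
// Hypothetical sketch of the common "bounce rate fix" pattern (legacy ga.js tracker).
// After a delay, it pushes an interaction event; GA then treats the visit as a
// non-bounce, so reported bounce rate drops without any change in real behaviour.
var _gaq = _gaq || [];
setTimeout(function () {
  _gaq.push(['_trackEvent', 'Engagement', 'TimeOnPage', '15_seconds']);
}, 15000);
```

In other words, it only changes what GA reports, not how visitors actually behave, which is a good reason to be sceptical of the claimed ranking effect.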
Technical SEO | BenRWoodard
One H1 tag dead? Long live multiple H1 tags?
Good afternoon from 9°C, mostly cloudy Wetherby, UK. I've been holding on to the mantra of one H1 tag per page, but a developer has challenged me on this by stating you can have multiple H1 tags on the condition that the page is HTML5 and each H1 tag is within its own section or article tag. So the question is: do I need to change my tune? Thanks in advance, David
Technical SEO | Nightwing
How long to reverse the benefits/problems of a rel=canonical
If this wasn't so serious an issue it would be funny... Long story cut short: a client had a penalty on their website, so they decided to stop using the .com and use the .co.uk instead. They got the .com removed from Google using Webmaster Tools (it had to be, as it was ranking for a trademark they didn't own and there are legal arguments about it). They launched a brand new website and placed it on both domains, with all SEO being done on the .co.uk. The web developer was then meant to put the rel=canonical on the .com pointing to the .co.uk (maybe not needed at all, thinking about it, if they had deindexed the site anyway). However, he managed to rel=canonical from the good .co.uk to the .com domain! Maybe I should have noticed it earlier, but you shouldn't have to double-check others' work! I noticed it today after a good 6 weeks or so. We are having a nightmare ranking the .co.uk for terms which should be pretty easy to rank for, given it's a decent domain. Would people say that the rel=canonical back to the .com has harmed the .co.uk, and is harming it while the tag remains in place? I'm of the opinion that it's basically telling Google that the .co.uk domain is a copy of the .com, so go rank that instead. If so, how quickly after removing this tag would people expect any issues caused by its placement to vanish? Thanks for any views on this. I've now got the fun job of double-checking all the coding done by that web developer on other sites!
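For anyone untangling a similar mix-up: the tag in question is a single line in the head, and the direction of the href is everything (domains here are placeholders, not the poster's actual sites):

```html
<!-- Intended: on each page of the penalised .com, point at the .co.uk equivalent -->
<link rel="canonical" href="https://www.example.co.uk/some-page/" />

<!-- What was actually deployed: the good .co.uk pages pointed at the .com -->
<link rel="canonical" href="https://www.example.com/some-page/" />
```

Since rel=canonical lives on the page whose ranking signals it gives away, a quick view-source on a few .co.uk pages is enough to confirm which direction was shipped.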
Technical SEO | Grumpy_Carl
How long does it take for traffic to bounce back from an accidental robots.txt disallow of root?
We accidentally uploaded a robots.txt disallow root for all agents last Tuesday and did not catch the error until yesterday.. so 6 days total of exposure. Organic traffic is down 20%. Google has since indexed the correct version of the robots.txt file. However, we're still seeing awful titles/descriptions in the SERPs and traffic is not coming back. GWT shows that not many pages were actually removed from the index but we're still seeing drastic rankings decreases. Anyone been through this? Any sort of timeline for a recovery? Much appreciated!
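For anyone comparing files after a scare like this, a full-site disallow is only two lines, and the difference from a harmless file is a single character:

```
# Accidental version - blocks all crawlers from the entire site
User-agent: *
Disallow: /

# Corrected version - an empty Disallow blocks nothing
User-agent: *
Disallow:
```

Worth adding robots.txt to whatever deployment checks you run, since the broken and correct versions are so easy to confuse at a glance.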
Technical SEO | bheard
On a dedicated server with multiple IP addresses, how can one address group be slow/time out and all other IP addresses OK?
We utilize a dedicated server to host roughly 60 sites. The server is with a company that utilizes a lady who drives race cars... About 4 months ago we realized we had a group of sites down, thanks to monitoring alerts, and checked it out. All were on the same IP address, and the sites on the other IP addresses were still up and functioning well. When we contacted support, at first we were stonewalled, but eventually they said there was a problem, and it was resolved within about 2 hours. Up until recently we had no problems. As a part of our ongoing SEO we check page load speed for our clients. A few days ago, a client whose site is hosted by the same company was running very slow (about 8 seconds to load without cache). We ran every check we could and could not find a reason on our end. The client called the host and was told they needed to be on some other type of server (with the host) at a fee increase of roughly $10 per month. Yesterday, we noticed one group of sites on our server was down and, again, it was one IP address with about 8 sites on it. On chat with support, they kept saying it was our ISP. (We speed-tested on multiple computers and were 22 MB down and 9 MB up, +/-2 MB.) We ran a trace on the IP address and it went through without a problem on three occasions over about ten minutes. After about 30 minutes the sites were back up. Here's the twist: we had a couple of people in the building who were on other ISPs try, and the sites came up and loaded on their machines. Does anyone have any idea as to what the issue is?
Technical SEO | RobertFisher