Can the template increase the loading time of the site?
-
Hi,
My site was built with WordPress. Very recently I had it redesigned. The problem is that it now takes a long time to load. I spoke with a web designer who checked my site and said that when it was rebuilt, the template that was created included a lot of hard coding. Could this be the reason my site now takes so long to load? The hard-coding factor?
Thank you for your help.
- Sal
P.S.: FYI the site only has a few plug-ins and the server is a good one.
-
Hi,
I think that is exactly what happened: a bunch of new JavaScript and CSS files.
Thank you.
-
I saw a site that was redesigned and then became incredibly slow. Their issue was that a number of widgets each loaded a bunch of different JavaScript and CSS files. There was also an issue where the site wouldn't render until all the images on the page (and there were many dozens) had loaded. I use Chrome's developer tools to see all the elements that are loaded on the page; it's very useful for determining what's taking the most time to load.
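To make that "what's taking longest" check concrete, here is a minimal sketch of pulling the same information from the Resource Timing API in Chrome's DevTools console. The helper name `slowestResources` is my own, not a built-in; only `performance.getEntriesByType` is the browser's API.

```javascript
// Rank a page's loaded resources by how long each took, slowest first.
// Works on the entries returned by performance.getEntriesByType('resource').
function slowestResources(entries, limit = 10) {
  return entries
    .slice() // copy so the original list isn't mutated
    .sort((a, b) => b.duration - a.duration)
    .slice(0, limit)
    .map(e => ({ name: e.name, ms: Math.round(e.duration) }));
}

// In a browser console:
// console.table(slowestResources(performance.getEntriesByType('resource')));
```

The DevTools Network tab shows the same data visually; the snippet is just handy when you want the top offenders as a sortable list.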
-
Hi,
Thank you for the response. I've hired a new web designer with expertise in WordPress who will take care of the factors you have mentioned.
All best.
-
Thank you for informing me about WP Engine. That sounds like the right solution for making WordPress sites faster and more reliable.
All best.
-
Google PageSpeed provides very helpful information on what factors are important for your site. It determines what is slowing your pages down and what you can do to fix it, and it distinguishes between various degrees of importance. You can check out the speed test at: https://developers.google.com/pagespeed/.
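The same check can also be scripted. A minimal sketch, assuming the public v5 PageSpeed Insights API endpoint (the helper name `pagespeedUrl` is mine, and whether you need an API key depends on your request volume):

```javascript
// Build a request URL for the PageSpeed Insights v5 API.
// strategy is 'mobile' or 'desktop'; append &key=... for heavier use.
function pagespeedUrl(site, strategy = 'mobile') {
  const endpoint = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';
  const params = new URLSearchParams({ url: site, strategy });
  return `${endpoint}?${params.toString()}`;
}

// e.g. fetch(pagespeedUrl('https://example.com')).then(r => r.json())
```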
Good luck!
-
It depends on DB requests per page and all sorts of other stuff.
Anyway, SEOmoz recommends these guys, and WordPress backs them as well: http://wpengine.com/
Check it out.
-
Your designer could have used enormous images in the redesign. He could have added code that takes time to execute. There could also be calls to other websites for ads, widgets, or images that are slow to respond.
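One quick way to spot those calls to other websites is to filter the page's Resource Timing entries by host. A sketch, assuming a browser console (the helper name `thirdPartyResources` is my own; pass `location.host` as the page host):

```javascript
// List resources served from hosts other than the page's own,
// i.e. third-party ads, widgets, and remotely hosted images.
function thirdPartyResources(entries, pageHost) {
  return entries
    .filter(e => new URL(e.name).host !== pageHost)
    .map(e => ({ host: new URL(e.name).host, ms: Math.round(e.duration) }));
}

// In a browser console:
// thirdPartyResources(performance.getEntriesByType('resource'), location.host);
```

Anything external and slow in that list is a candidate for lazy-loading or removal, since the browser can't do much to speed up someone else's server.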
-
There are lots of factors that can affect the loading time of a web page; without a URL I cannot give you an indication of where the performance issue is likely to be.
With a URL I can tell you which areas are causing the issues, e.g. creation of the page at the server, poor CSS, etc.
It is very unlikely (though I can't rule it out completely) that hard coding will cause performance issues. Hard-coded markup does cause problems when making changes or updates to a site.
Related Questions
-
Indexed Pages Different when I perform a "site:Google.com" site search - why?
My client has an ecommerce website with approx. 300,000 URLs (a lot of these are parameters blocked from the spiders through the meta robots tag). There are 9,000 "true" URLs being submitted to Google Search Console, and Google says it is indexing 8,000 of them. Here's the weird part: when I do a "site:website" search in Google, it says Google is indexing 2.2 million pages on the URL, but I am unable to view past page 14 of the SERPs. It just stops showing results, and I don't even get a "the next results are duplicate results" message. What is happening? Why does Google say it is indexing 2.2 million URLs, but then won't show me more than 140 pages of them? Thank you so much for your help. I tried looking for the answer, and I know this is the best place to ask!
Intermediate & Advanced SEO | accpar0
-
Can Google read content/see links on subscription sites?
If an article is published in The Times (for example), can Google bypass the subscription sign-in to read the content and index the links in the article? Example: http://www.thetimes.co.uk/tto/life/property/overseas/article4245346.ece In the above article there is a link to the resort's website, but you can't see this unless you subscribe. I checked the source code of the page with the subscription prompt present, and the link isn't there. Is there a way that these sites treat search engines differently from other user agents to allow the content to be crawled and indexed?
Intermediate & Advanced SEO | CustardOnlineMarketing0
-
Checking Mobile Site Response Time
What is the best way to check the response time of a mobile site? Can this be done in Google Analytics/Webmaster Tools?
Intermediate & Advanced SEO | theLotter0
-
Site speed tests
In Webmaster Tools my site is showing that it is taking longer and longer to load, and the time has now doubled. Is there a way to check which pages are the problem? The site is quite large, so I can't check them one at a time.
Intermediate & Advanced SEO | EcommerceSite0
-
How can this site rank post panda/penguin?
I am doing link building for an adult dating comparison website. One of the main competitors, though, having checked their backlink profile, has anchor text that is not varied at all. In fact there are many, many links that are all the same. How can they possibly rank in the post Panda/Penguin era? In fact they're at number 2! The site is an adult site and it is www.f hyphen buddy.co.uk if anyone wants to run a backlink check on OSE. Any help greatly appreciated!
Intermediate & Advanced SEO | SamCUK0
-
Googlebot Can't Access My Sites After I Repair My Robots File
Hello Mozzers, A colleague and I have been collectively managing about 12 brands for the past several months, and we have recently received a number of messages in the sites' Webmaster Tools instructing us that "Googlebot was not able to access our site due to some errors with our robots.txt file." My colleague and I, in turn, created new robots.txt files with the intention of preventing the spider from crawling our 'cgi-bin' directory, as follows:

User-agent: *
Disallow: /cgi-bin/

After creating the robots.txt file and manually re-submitting it in Webmaster Tools (and receiving the green checkbox), I received the same message about Googlebot not being able to access the site, the only difference being that this time it was for a different site that I manage. I repeated the process, and everything looked correct; however, I continued receiving these messages for each of the other sites I manage on a daily basis for roughly a 10-day period. Do any of you know why I may be receiving this error? Is it not possible for me to block Googlebot from crawling the 'cgi-bin'? Any and all advice/insight is very much welcome; I hope I'm being descriptive enough!
Intermediate & Advanced SEO | NiallSmith1
-
Do I have to tell WBT site moved to a subdirectory on another internal site?
I am moving content from one site to another and redirecting the DNS from www.oldsite.com to www.newsite.com/old-site. I have the 301 in place, but I wanted to make sure: do I also have to tell Webmaster Tools that the old site has moved to the new domain? We still want the old domain name to answer and redirect to www.newsite.com/old-site. Thanks
Intermediate & Advanced SEO | GeorgeLaRochelle0
-
Issues with Load Balancers?
Has anyone run into SEO issues with sites utilizing load-balancing systems? We're running into some other technical complications (with using 3rd-party tracking services), but I'm concerned now that the setup could have a not-so-good impact from an SEO standpoint.
Intermediate & Advanced SEO | BMGSEO0