Is it better to put all your CSS in 1 file or is it no problem to use 10 files or more like on most frameworks?
-
Thanks a lot for this useful info, it helped me understand this better.
-
Hi,
From a code management point of view - as Peter says, it's very common practice to split your CSS into multiple files, as they are then much easier to manage and maintain. You can use a tool like Yahoo's YUI Compressor to minify - as Bradley says - and aggregate (merge) these files.
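To illustrate what that merge-and-minify step actually does, here's a toy sketch in Python. Note this is only an illustration: real minifiers like YUI Compressor handle strings, calc(), and many other edge cases that a couple of regexes cannot.

```python
import re

def merge_and_minify(stylesheets):
    """Naively merge CSS strings and strip comments and extra whitespace.

    A toy illustration of what tools like YUI Compressor do; a real
    minifier is far more careful about CSS syntax.
    """
    combined = "\n".join(stylesheets)
    combined = re.sub(r"/\*.*?\*/", "", combined, flags=re.S)   # drop comments
    combined = re.sub(r"\s+", " ", combined)                    # collapse whitespace
    combined = re.sub(r"\s*([{};:,])\s*", r"\1", combined)      # trim around punctuation
    return combined.strip()

print(merge_and_minify([
    "body { margin: 0; }  /* reset */",
    "h1 { font-size: 2em ; }",
]))
```

The point is simply that ten source files can stay separate for maintenance while the build step ships one small file to the browser.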
From a web performance point of view, fewer files does not always mean better performance. Web browsers used to open only up to 2 connections per domain, but now it's pretty standard for them to support 6 or more. See a browser breakdown for Max Connections and Connections per Hostname here: http://www.browserscope.org/?category=network&v=top. I wouldn't recommend deliberately splitting across 6 files, but you might find that one massive CSS file downloads quicker when split up.
There is another disadvantage to having a single CSS file: you're not making the most of browser caching. Every time you change any CSS, all users have to download the entire file again. Again, this may not be a problem for you, but it's something to bear in mind.
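A common way to soften that caching cost is content fingerprinting: name the file after a hash of its contents, serve it with a far-future cache header, and the URL changes automatically whenever the CSS changes. A minimal sketch (the naming scheme here is just an illustration, not something any particular framework prescribes):

```python
import hashlib

def fingerprint(css_text, basename="site"):
    """Return a content-hashed filename for a stylesheet.

    Because the hash changes whenever the CSS changes, unchanged users
    keep their cached copy, and any edit produces a new URL that
    busts the cache.
    """
    digest = hashlib.md5(css_text.encode("utf-8")).hexdigest()[:8]
    return f"{basename}.{digest}.css"

# e.g. fingerprint("body{margin:0}") yields a name like "site.<hash>.css"
```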
My advice would be to point Google PageSpeed at your website's key pages and act on as much of the feedback as possible: https://developers.google.com/speed/pagespeed/. It is a fantastic resource and presents its findings very clearly.
George
@methodicalweb -
That's what I was thinking too. Currently, most of my frameworks have 10 CSS files, which means 10 server requests. Page speed is, in my eyes, a very important factor, hence this question...
-
You could split them up based on where they are needed, but that would become complicated. The advantage of splitting CSS on a large site is really to better organise the CSS by function, e.g. system.css.
Peter
-
For a production environment, I would suggest having one minified CSS file. This reduces file size (through minification) and server requests (one file as opposed to ten), which helps reduce page load time.
Of course, on your staging environment, or in an archive of the website, it would be best to have your stylesheets broken down into an easier to manage system. That might mean multiple CSS files, it might not - it's up to you to manage.
-
Thanks for your answer!
It makes sense, because on large sites you will need different styling on different types of pages. So if you put it all in 1 file, all of this CSS would be loaded on ALL pages, while it's only needed on some of them?
Or what's the advantage here?
-
It really depends on how big your site is and how complex your CSS is. On a small site, or one with minimal CSS, one file is perfectly adequate. On a larger site with lots of pages and CSS, it makes sense to break the CSS down by function.
Peter