CSS and Spiders
-
I am using a website auditor for my on-site SEO and I am getting a ton of CSS warnings. How important is it to fix these issues? Where would you place this in your priority list?
-
None of those. Is there anything else we should avoid?
-
Minify your CSS, use shorthand properties, avoid @import, and remove old, unsupported CSS.
Since I'm not sure what kind of warnings you are getting, it is really tough to answer your question.
Here is a great CSS tool: http://www.minifycss.com/css-compressor/
and you can validate it here: http://jigsaw.w3.org/css-validator/
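To make those fixes concrete, here is a minimal sketch; the selector names and values are made up for illustration:

```css
/* Longhand: four declarations a minifier or auditor will flag */
.card {
  margin-top: 10px;
  margin-right: 15px;
  margin-bottom: 10px;
  margin-left: 15px;
}

/* Shorthand equivalent: one declaration, same result */
.card {
  margin: 10px 15px;
}

/* Avoid @import: it forces stylesheets to download sequentially.
   Instead of this in your CSS...
   @import url("theme.css");
   ...reference theme.css with its own <link> tag in the HTML head,
   so the browser can fetch both files in parallel. */
```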
-
Can you please give me an example?
-
Again, it all depends on the error. There are CSS errors that hurt the performance of the site.
-
None of those; these are only CSS errors, and they affect styling only.
-
Definitely.
The error could be a broken URL (a 404 Not Found), or maybe a redirected URL (a 301 or 302 redirect).
It is really hard to diagnose unless we know the exact warnings you are getting.
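As a hypothetical illustration (the file paths here are invented), those two cases usually look like this inside a stylesheet:

```css
/* A "broken URL" warning points at a reference whose target
   no longer exists, e.g. after a file was moved or deleted: */
.hero {
  background-image: url("/images/old-banner.jpg"); /* returns 404 */
}

/* A 301/302 warning means the asset still loads, but only after
   a redirect, which costs an extra round trip on every page view: */
@font-face {
  font-family: "Brand";
  src: url("http://example.com/fonts/brand.woff"); /* 301s to https */
}
```

In both cases the fix is the same: update the url() to point directly at the asset's final location.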
-
Oh, is there a difference in the severity of the errors? I thought errors were just errors...
-
It all depends on the warning. What are they?
I treat every warning or error as a priority since if a tool can find them, so can Google.
Related Questions
-
Content that's behind CSS..
For content that's loaded onto the page but requires a click to be revealed (as in a slider or a tab, to save space or for a page's organization), what are your thoughts on Google counting or weighting this content? It would make sense for Google to give it partial or no weighting: if Google attributes the content to the page, it's confusing for a user to land there and have to click around to find it. Sorry if this is an obvious question to SEOs. I've always assumed that as long as the content was loaded, it would mostly be counted, but I'm beginning to doubt my assumption. Thanks!
On-Page Optimization | | speedcommerce0 -
Optimize CSS Delivery
Hi, I am loading 3 CSS files here: http://www.viatrading.com/wholesale/9/Domestics.html. PageSpeed is telling me I "should fix" the delivery of these CSS files (see image). I read https://developers.google.com/speed/docs/insights/OptimizeCSSDelivery, but can't figure out which case is mine. The CSS files are big, but even if I split them into several files, all of them still show up as render-blocking. I moved them to the header/footer, but the message still appears. Do you know what the problem might be and how to solve it? Thank you. Screen_Shot_2015_09_10_at_4_44_23_PM.png
On-Page Optimization | | viatrading10 -
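One common pattern for the render-blocking warning (file names below are placeholders) is to inline the small set of rules needed for above-the-fold content and load the full stylesheet without blocking rendering:

```html
<head>
  <!-- Critical CSS inlined: styles the visible viewport immediately -->
  <style>
    body { margin: 0; font-family: sans-serif; }
    .site-header { background: #333; color: #fff; }
  </style>

  <!-- Full stylesheet loaded non-blocking: media="print" stops it
       from blocking render, onload switches it on for all media -->
  <link rel="stylesheet" href="/css/main.css"
        media="print" onload="this.media='all'">
  <noscript><link rel="stylesheet" href="/css/main.css"></noscript>
</head>
```

Splitting a file doesn't help on its own, because every plain `<link rel="stylesheet">` in the head is render-blocking regardless of size; what matters is deferring the non-critical portion.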
Do Search Engine Spiders Read Commented Out Content?
Do Search Engine Spiders Read Commented Out Content? Is commented out content detrimental?
On-Page Optimization | | lbohen0 -
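For reference, here is what commented-out content looks like in the source a crawler parses. The general understanding is that search engines skip HTML comments entirely, so the main cost is extra page weight rather than any ranking effect:

```html
<p>This paragraph is visible and indexable.</p>

<!--
<p>This paragraph is commented out. It is not rendered, and
comment content is not indexed or weighted, so the practical
downside is simply a larger file to download and crawl.</p>
-->
```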
Having a terrible time ordering the CSS Styles and Scripts in my header
Hi guys, I am having a terrible time trying to get the correct optimized (for speed, non-blocking, etc.) order for loading my external CSS and JS. I follow the recommendations from Google PageSpeed or the Chrome audit, and it seems no matter where I move the CSS file (top or bottom), it complains about more blocking and stopped rendering of the page. My URL is http://www.MyFairyTaleBooks.com; if some smart person out there could help me figure out what I am doing wrong and the order in which my styles and scripts should be organized, I'd appreciate it! Oh, I'm not a developer, but I can re-arrange text in a file! 😉 Thank you!
On-Page Optimization | | MyFairyTaleBooks
Dinesh0 -
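A conventional baseline ordering (file names here are placeholders) is stylesheets in the head and scripts deferred at the end of the body:

```html
<head>
  <!-- Stylesheets belong in the head so the page renders styled
       on first paint instead of flashing unstyled content -->
  <link rel="stylesheet" href="/css/main.css">
</head>
<body>
  <!-- ...page content... -->

  <!-- Scripts go last, with defer, so they download in parallel
       but only execute after the document has been parsed -->
  <script src="/js/app.js" defer></script>
</body>
```

Note that audit tools will keep flagging any stylesheet in the head as render-blocking; past this baseline, the remaining gains come from inlining critical CSS rather than from reordering.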
Content in forum signatures being spidered, does it matter?
Hello, first post here; just started with SEOmoz, so I hope it's relevant. I've searched a fair bit on this without getting a good answer either way, so I'm interested to get some opinions. The core of the site I run is a forum dedicated to collecting; for the sake of argument, let's say cars. A good percentage of the users have signatures which list their collection, for example: 1968 Car A - 1987 Car B - 1998 Car D and so on. These signature lists can be 20 items or more; some hotlink the signatures back to the relevant post on the forum, some not. The signatures show on every post the user makes. What I'm noting is: a) SEOmoz is reporting a LOT of links on every forum page, due mainly to these signatures, I guess; and, of more interest, b) the content of the signatures is being spidered. So, for example, if you search for '1968 Car A' you might get a couple of good results directly relevant to '1968 Car A' from my site, but you also get a lot of other non-relevant threads as results, because the user just happens to have posted on them. Obviously this is much more apparent in the site's Google search. So what is the best approach? Leave as is? Hide the signatures from the bots? Another approach?
On-Page Optimization | | rutteger0 -
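One middle-ground option, sketched here with made-up thread URLs, is to keep the signatures visible to users but stop their links from accumulating crawlable link volume:

```html
<!-- Signature block rendered under each post: links are marked
     rel="nofollow" so crawlers don't follow or weight them,
     while readers can still click through -->
<div class="signature">
  1968 Car A -
  <a href="/threads/car-b-restoration" rel="nofollow">1987 Car B</a> -
  <a href="/threads/car-d-build" rel="nofollow">1998 Car D</a>
</div>
```

This trims the reported link count without hiding content from users; it does not stop the signature text itself from being indexed, which would require a different approach.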
Unused CSS
Is there any advantage to removing unused CSS in a WP theme template? If removed, will it not merely be re-added with the next update?
On-Page Optimization | | casper4340 -
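If the concern is theme updates re-adding the rules you removed, the usual WordPress answer is a child theme, so your edits live outside the parent's files. A minimal child-theme stylesheet header (names below are placeholders) looks like this:

```css
/*
 Theme Name:  Example Theme Child
 Template:    example-theme
 Description: Overrides live in this file, so parent theme
              updates never touch your trimmed styles.
*/

/* Only the rules you actually use go here; the parent's
   stylesheet can be dequeued or replaced from functions.php. */
.site-header { background: #333; }
```

The `Template:` line must match the parent theme's directory name; with that in place, updates to the parent no longer overwrite your changes.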
Will google see you bold/emphasis words if done in css?
We do not use header tags on our website. I understand bolding or emphasizing words can be equally effective, but if done in CSS, will the Google crawlers and spiders be able to put a weighted value on this style of code?
On-Page Optimization | | gsbureau0 -
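For context, crawlers read the HTML source rather than the computed styles, so the two approaches look very different to them:

```html
<!-- Semantic markup: the emphasis is in the HTML itself,
     where a crawler can see it -->
<p>Our <strong>wholesale pricing</strong> beats retail.</p>

<!-- Same visual result via a class: the emphasis lives only
     in the stylesheet, which crawlers don't weight -->
<style>
  .bold { font-weight: bold; }
</style>
<p>Our <span class="bold">wholesale pricing</span> beats retail.</p>
```

If the goal is any semantic signal at all, `<strong>`/`<em>` (or proper heading tags) are the safer choice, since a CSS-only class carries no meaning in the markup.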
How deep should I let my forum be spidered
I run quite a niche website that's been running since late 1999, and over that time I've built up something like 4000 resources, which consist of either text articles or image galleries and reviews, alongside another few thousand news stories relating to the niche interest. On top of the main site I also have a forum which isn't especially optimised for SEO, and I was wondering, whilst I was cleaning it up, whether anyone has any tips / suggestions / best practices for forum SEO. Because it is all UGC, the quality of the posts can be quite weak, so I was wondering whether I should block robots completely from the forum, which seems a little harsh; whether I should let the whole forum be spidered (which seems a little excessive and potentially a bad thing); or whether I should restrict things so that only the main index and perhaps one page of topics and their posts are accessible to robots, and then nofollow the rest? Any thoughts?
On-Page Optimization | | StevenMapes0
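A middle path between blocking everything and spidering everything can be sketched in robots.txt; the paths here are hypothetical and would need to match the forum software's actual URL structure:

```text
# Allow topic pages to be crawled, but keep low-value forum
# surfaces (member profiles, search results, deep pagination)
# out of the crawl
User-agent: *
Disallow: /forum/memberlist
Disallow: /forum/search
Disallow: /forum/*?page=
```

Alternatively, a `<meta name="robots" content="noindex,follow">` tag on weak pages keeps them out of the index while still letting crawlers follow their links, which is often gentler than a robots.txt block.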