CSS Issue or not?
-
Hi Mozzers,
I am doing an audit for one of my clients and would like to know whether the website I am dealing with has any issues when CSS is disabled. So I installed the Web Developer Google Chrome extension, which is great for disabling cookies, CSS, and so on.
When executing "Disable CSS", I can see most of the content of the page, but what is weird is that in certain sections images appear in the middle of a sentence. Another image appears in the background of one of the internal link sections (attached pic).
Since I am not an expert in CSS, I am wondering if this represents a CSS issue, and therefore a potential SEO issue? If so, why can it be an SEO issue?
Can you guys tell me what sort of CSS issues I should expect when disabling it? What should I look at? Whether the content and nav bar are present, or something else?
Thank you
-
Thank you John for your help!
-
The point I'm trying to make is that a CSS problem likely won't result in any huge changes in your SEO. There's a CSS problem if you can visually see something positioned or sized incorrectly on your pages with CSS enabled, not disabled.
Search bots will do some CSS/JavaScript rendering, but more towards seeing how large things are on the pages (and trying to find your headers), making sure you're not hiding text (setting text colors the same as background colors), and things of that nature.
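To illustrate the hidden-text pattern John mentions, here is a minimal CSS sketch (the class name is made up for the example): text styled this way is invisible to visitors but still sits in the source a crawler indexes, which is exactly what bots try to catch.

```css
/* White text on a white background: invisible in the browser,
   but fully present in the page source a crawler reads. */
.hidden-keywords {
  background-color: #ffffff;
  color: #ffffff;
}
```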
-
Hey John,
So how can I judge if there is a CSS problem? Should I go through all the CSS code? Can you give me an example? A screenshot would help.
Thanks!
-
Not to identify anything in particular; it's just that that's what the bots are reading and indexing. The bots read the source of the page and will attempt to do some CSS and JavaScript rendering, but they're not reading the page the way it's seen in a browser.
-
Hey John,
I realized I never answered you sorry about that! Thanks for the help!
One quick question though: "If you want to see how a page looks to search bots, view the source of the page, don't disable the CSS."
View the source of the page and identify what exactly?
Thanks John!
-
CSS positions things on the page, so if you remove it, it's not surprising that lots of elements overlap. The page isn't going to look good. This is nothing to worry about.
If you want to see how a page looks to search bots, view the source of the page, don't disable the CSS.
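As a rough illustration of what "reading the source" means, here is a toy Python sketch (the HTML string is made up) that pulls the title and headings out of raw markup using only the standard library — the kind of structural signals a bot gets from the source before rendering any CSS:

```python
from html.parser import HTMLParser

class HeadingExtractor(HTMLParser):
    """Collect the text of <title> and <h1>-<h6> tags from raw HTML,
    roughly the structural signals a crawler pulls from page source."""
    def __init__(self):
        super().__init__()
        self.headings = []
        self._current = None
        self._buf = []

    def handle_starttag(self, tag, attrs):
        # Track title and h1..h6 tags; ignore everything else.
        if tag == "title" or (len(tag) == 2 and tag[0] == "h" and tag[1].isdigit()):
            self._current = tag
            self._buf = []

    def handle_endtag(self, tag):
        if tag == self._current:
            self.headings.append((tag, "".join(self._buf).strip()))
            self._current = None

    def handle_data(self, data):
        if self._current:
            self._buf.append(data)

html = "<html><head><title>Tutu Shop</title></head><body><h1>How to Make Tutus</h1><p>...</p></body></html>"
p = HeadingExtractor()
p.feed(html)
print(p.headings)  # → [('title', 'Tutu Shop'), ('h1', 'How to Make Tutus')]
```

This is why overlapping elements with CSS disabled don't matter: none of that positioning is visible in what the parser above sees.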
Related Questions
-
Best way to fix duplicate content issues
Another question for the Moz Community. One of my clients has 4.5k duplicate content issues. For example: http://www.example.co.uk/blog and http://www.example.co.uk/index.php?route=blog/blog/listblog&year=2017. Most of the issues are coming from product pages. My initial thoughts are to set up 301 redirects in the first instance and if the issue persists, add canonical tags. Is this the best way of tackling this issue?
Technical SEO | Laura-EMC
-
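A sketch of the canonical-tag option mentioned in the question above, using its own example URLs: on the parameterised duplicate, a canonical link points search engines at the clean version.

```html
<!-- Placed in the <head> of the duplicate, parameterised page,
     e.g. /index.php?route=blog/blog/listblog&year=2017 -->
<link rel="canonical" href="http://www.example.co.uk/blog" />
```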
404 issues
Hello, Some time ago (something like a month and a half) I removed all 404 errors from the Google index, and Webmaster Tools has already removed them. However, yesterday Moz found the same 404 errors that I had removed from indexing (those pages were deleted or redirected by the site developer). What could be the issue here, and why is Webmaster Tools not registering those 404 errors while Moz Analytics does? The other question is: if those pages do not exist, can I track where they are placed? I tried downloading the Moz crawl test, but the referring source was not provided. I would highly appreciate anyone's help. Thank you
Technical SEO | rikomuttik
-
Issue Missing Meta Description Tag
Hello Friends, Today I found missing meta description tags when SEOmoz updated my website crawl diagnostics. I fixed the other missing meta description tags, but I don't understand how to fix this type of page. Here are some examples: http://www.example.com/blog/page/2/ http://www.example.com/blog/page/3/ http://www.example.com/blog/page/4/ Links continue... Thanks KLLC
Technical SEO | KLLC
-
Penguin update: Penalty caused from onsite issues or link profile?
Back in April, before the Penguin update, our website home page ranked in the #1 position for several of our keywords and on page 1 for dozens of other keywords. But immediately after the Penguin update in April, our rankings dropped to below #100 for nearly all keywords. The sharp drop was obviously a penalty of some kind. We worked on removing some bad backlinks that were questionable. Over the past 7 months many of the bad links have dropped off and our link profile is improving. Our rankings, however, have not improved at all. In Yahoo and Bing we remain strong and rank on page 1 for many of our keywords.

I joined SEOmoz because I've heard about their great tools and resources for SEO. The first thing I learned is that I had a lot of errors and warnings that need to be addressed, and I'm optimistic that these items, once addressed, will get me out of the dreadful penalty box we've been in for 7 months now. So with that quick summary of our SEO problems, I have a few questions that I hope to get some direction on.

1. Crawl Diagnostics for my site in SEOmoz reports 7 errors and 19 warnings, including missing meta description tags, temporary redirects, duplicate page content, duplicate page titles, 4xx client errors, and title elements that are too long. Could these errors and warnings be what has landed my website in some kind of penalty or filter?

2. A couple of the errors were duplicate page title and duplicate page content, so there appears to be a duplicate home page. Here are the two pages: howtomaketutus.com/ and howtomaketutus.com/?p=home. They are the same page, but it looks like Google is seeing them as duplicate content. Do I need to do a 301 redirect in the .htaccess file? I'm not sure how that would work since they are the same page. If that is possible, how would I go about doing it?

3. Finally, based on what I've described above, is it more likely that the penalty we are experiencing is because of on-site issues or because of our link profile?

We would really appreciate any help or direction anyone can offer on these issues. Thanks
Technical SEO | 123craft
-
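On question 2 above, a hypothetical mod_rewrite sketch of the kind of .htaccess rule that 301-redirects the ?p=home variant to the root URL (untested — it would need to be adjusted to the site's actual rewrite setup):

```apache
RewriteEngine On
# Match requests for the root path carrying exactly ?p=home
RewriteCond %{QUERY_STRING} ^p=home$
# Permanent redirect to /, with a trailing ? to drop the query string
RewriteRule ^$ /? [R=301,L]
```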
Standard Responses Causing Duplication Issues
Hi Guys, We have a Q&A section on our site where we reply to customers using standard responses which have already been approved. This is causing a lot of duplication errors; however, due to the nature of our business, we need to use these responses. Is there anything we can do to stop this? Matthew
Technical SEO | EwanFisher
-
Issues with Google Analytics since 3/15 @ 6:00AM ET
Our site, IrishCentral.com, has been experiencing issues with GA since 6:00 AM ET on 3/15. Our "realtime" analytics within the new GA interface have been fine, and no changes have been made to the site code at all. I'm wondering if anyone else is experiencing these issues and if there is a resolution. We are fine without them as long as we know the aggregation of the data is delayed and not forgotten. We are reaching 1 million uniques this month and it would be a shame to lose this data. Any help is greatly appreciated. Joe
Technical SEO | Irishcentral
-
See any issues with this tabbed content page?
When I view source and view as Googlebot, it shows as one long page of content = good. However, the developer uses some redirects and dynamic page generation to pull this off. I didn't see any issues from a search perspective, but would appreciate a second opinion: Click here Thanks!
Technical SEO | 540SEO
-
OnPage Issues with UTF-8 and ISO-8859-1
Hi guys, I hope somebody can help me figure this out. On one of my sites I set the charset to UTF-8 in the content-type meta tag. The file itself is also UTF-8. If I type German special chars like ä, ö, ß and the like, they get displayed as a tilted square with a question mark inside. If I change the charset to ISO-8859-1 they are displayed properly in the browser, but services like Twitter still have the issue and stop "importing" content once they reach one of those special chars. I would like to avoid having to HTML-encode all on-page content, so my preference would be using UTF-8. You can see it in action when you visit this URL for example: http://www.skgbickenbach.de/aktive/1b/artikel/40-minuten-fußball-reichen-nicht_1045?charset=utf-8 (remove the ?charset parameter and the charset is set to ISO-8859-1). Hope somebody has an answer or can push me in the right direction. Thanks in advance and have a great day all. Jan
Technical SEO | jmueller
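The symptom described above usually means the bytes on disk are not actually UTF-8 even though the page declares UTF-8: the replacement-character square appears when a decoder hits invalid byte sequences. A small Python sketch of that mismatch (the sample string is made up):

```python
text = "40 Minuten Fußball"

# Saved as Latin-1 but declared/decoded as UTF-8: 'ß' is the single
# byte 0xDF, which is not a valid UTF-8 sequence, so strict decoding fails...
latin1_bytes = text.encode("iso-8859-1")
try:
    latin1_bytes.decode("utf-8")
except UnicodeDecodeError as e:
    print("strict UTF-8 decode fails:", e.reason)

# ...and lenient decoding shows the "square with a question mark"
# replacement character where the special char should be.
print(latin1_bytes.decode("utf-8", errors="replace"))  # → 40 Minuten Fu�ball

# Saved as UTF-8 and decoded as UTF-8: round-trips cleanly.
assert text.encode("utf-8").decode("utf-8") == text
```

So keeping the meta tag at UTF-8 is fine; the fix is making sure the files (and any database output) are genuinely encoded as UTF-8 as well.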