CSS Issue or not?
-
Hi Mozzers,
I am doing an audit for one of my clients and would like to know whether the website I am dealing with has any issues when CSS is disabled. So I installed the Web Developer Chrome extension, which is great for disabling cookies, CSS, and so on.
When executing "Disable CSS", I can see most of the content of the page, but what is weird is that in certain sections images appear in the middle of a sentence. Another image appears in the background of one of the internal link sections (attached pic).
Since I am not an expert in CSS, I am wondering if this represents a CSS issue, and therefore a potential SEO issue? If so, why would it be an SEO issue?
Can you guys tell me what sort of CSS issues I should expect when disabling it? What should I look at: whether the content and nav bar are present, or something else?
Thank you
-
Thank you John for your help!
-
The point I'm trying to make is that a CSS problem likely won't result in any huge changes in your SEO. There's a CSS problem if you can visually see something positioned or sized incorrectly on your pages with CSS enabled, not disabled.
Search bots will do some CSS/JavaScript rendering, but more towards seeing how large things are on the pages (and trying to find your headers), making sure you're not hiding text (e.g., setting text the same color as the background), and things of that nature.
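To illustrate, the hidden-text patterns bots look for might resemble something like this (illustrative CSS only, with made-up class names, not rules from any particular site):

```css
/* Text the same color as the background: bots can flag this as hidden text. */
.keyword-block {
  color: #ffffff;
  background-color: #ffffff;
}

/* Text pushed far off-screen: another classic hiding technique. */
.offscreen-text {
  text-indent: -9999px;
}
```

If your stylesheet does things like this for legitimate reasons (e.g., image-replacement techniques), it's worth knowing bots may still scrutinize it.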
-
Hey John,
So how can I judge if there is a CSS problem? Should I go through all the CSS code? Can you give me an example? A screenshot would help.
Thanks!
-
Not to identify anything in particular; it's just that the source is what the bots are reading and indexing. The bots read the source of the page and will attempt some CSS and JavaScript rendering, but they're not reading the page the way it's seen in a browser.
-
Hey John,
I realized I never answered you sorry about that! Thanks for the help!
One quick question though: "If you want to see how a page looks to search bots, view the source of the page, don't disable the CSS."
View the source of the page and identify what, exactly?
Thanks John!
-
CSS positions things on the page, so if you remove it, it's not surprising that lots of elements overlap. The page isn't going to look good. This is nothing to worry about.
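To sketch why this happens (illustrative CSS with hypothetical class names, not taken from the site in question): a stylesheet often pulls an image out of the normal document flow, and once the stylesheet is disabled the image falls back to wherever its tag sits in the HTML source, which may well be mid-sentence.

```css
/* With CSS enabled, this image sits in a corner, outside the text flow: */
.promo-image {
  position: absolute; /* removes the element from the normal document flow */
  top: 0;
  right: 0;
}

/* With CSS disabled, the <img> renders inline wherever its tag appears in
   the HTML, so it can show up in the middle of a sentence or behind a
   link section, exactly as described above. */
```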
If you want to see how a page looks to search bots, view the source of the page, don't disable the CSS.