CSS Issue or not?
-
Hi Mozzers,
I am doing an audit for one of my clients and would like to know whether the website I am dealing with actually has any issues when CSS is disabled. So I installed the Web Developer Google Chrome extension, which is great for disabling cookies, CSS, and so on.
When executing "Disable CSS", I can see most of the content of the page, but what is weird is that in certain sections images appear in the middle of a sentence. Another image appears in the background of one of the internal link sections (attached pic).
Since I am not an expert in CSS, I am wondering if this represents a CSS issue, and therefore a potential SEO issue? If so, why would it be an SEO issue?
Can you guys tell me what sort of CSS issues I should expect when disabling it? What should I look at: whether the content and nav bar are present, or something else?
Thank you
-
Thank you John for your help!
-
The point I'm trying to make is that a CSS problem likely won't result in any huge changes in your SEO. There's a CSS problem if you can visually see something positioned or sized incorrectly on your pages with CSS enabled, not disabled.
Search bots will do some CSS/JavaScript rendering, but mostly to see how large things are on the page (and to find your headings), to make sure you're not hiding text (e.g., setting text colors the same as background colors), and things of that nature.
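To illustrate, here's the classic hidden-text pattern bots check for: a hypothetical sketch, not taken from any real site, with made-up class names.

```css
/* Hidden-text anti-pattern: text styled to be invisible to visitors.
   Search engines treat this as an attempt to stuff keywords
   that users never actually see. */
body {
  background-color: #ffffff;
}
.stuffed-keywords {
  color: #ffffff;   /* same color as the background: invisible to users */
  font-size: 1px;   /* shrinking text to nothing is another variant */
}
```

A page whose visible rendering matches its source text has nothing to worry about here; it's the deliberate mismatch between the two that looks suspicious.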
-
Hey John,
So how can I judge if there is a CSS problem? Should I go through all the CSS code? Can you give me an example? A screenshot would help.
Thanks!
-
Nothing in particular to identify; it's just that that's what the bots are reading and indexing. The bots read the source of the page and will attempt some CSS and JavaScript rendering, but they're not reading the page the way it's seen in a browser.
-
Hey John,
I realized I never answered you sorry about that! Thanks for the help!
One quick question though: "If you want to see how a page looks to search bots, view the source of the page, don't disable the CSS."
View the source of the page and identify what, exactly?
Thanks John!
-
CSS positions things on the page, so if you remove it, it's not surprising that lots of elements overlap. The page isn't going to look good. This is nothing to worry about.
If you want to see how a page looks to search bots, view the source of the page, don't disable the CSS.
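To make "view the source" concrete, here's a minimal sketch (standard library only, run against a made-up HTML snippet rather than any real page) of how a crawler reads raw HTML: it extracts the headings and text from the source order alone, without ever applying CSS.

```python
from html.parser import HTMLParser

class BotView(HTMLParser):
    """Collects headings and text from raw HTML, roughly the way
    a search bot reads the source of a page (no CSS, no layout)."""
    def __init__(self):
        super().__init__()
        self.headings = []
        self.text = []
        self._in_heading = False

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            self._in_heading = True

    def handle_endtag(self, tag):
        if tag in ("h1", "h2", "h3"):
            self._in_heading = False

    def handle_data(self, data):
        stripped = data.strip()
        if not stripped:
            return
        if self._in_heading:
            self.headings.append(stripped)
        self.text.append(stripped)

# A made-up page: CSS may position the image anywhere visually,
# but the source order below is what the bot actually sees.
html = """
<h1>Blue Widgets</h1>
<p>Our widgets are the best.</p>
<img src="widget.png" alt="A blue widget">
<h2>Shipping</h2>
<p>We ship worldwide.</p>
"""

parser = BotView()
parser.feed(html)
print(parser.headings)  # ['Blue Widgets', 'Shipping']
```

The point of the sketch: whether an image visually overlaps a sentence when CSS is disabled never enters into it; the bot only sees the tags, attributes, and text in the source.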