Are W3C Validators too strict? Do errors create SEO problems?
-
I ran an HTML markup validation tool (http://validator.w3.org) on a website and got 140+ errors and 40+ warnings. Our IT team says, "W3C validators are overly strict and would deny many modern constructs that browsers and search engines understand."
What a browser can understand and display to visitors is one thing, but what a search engine can read comes down entirely to the code.
My question is this: if a search engine crawler is reading through the code and comes upon an error like this:
…ext/javascript" src="javaScript/mainNavMenuTime-ios.js"> </script>');}
The element named above was found in a context where it is not allowed. This could mean that you have incorrectly nested elements -- such as a "style" element in the "body" section instead of inside "head" -- or two elements that overlap (which is not allowed).
One common cause for this error is the use of XHTML syntax in HTML documents. Due to HTML's rules of implicitly closed elements, this error can create cascading effects. For instance, using XHTML's "self-closing" tags for "meta" and "link" in the "head" section of an HTML document may cause the parser to infer the end of the "head" section and the beginning of the "body" section (where "link" and "meta" are not allowed; hence the reported error).
...and this:
…t("?");document.write('>');}
(The validator repeats the same explanation for this second error.)
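For what it's worth, both snippets look like the same classic gotcha: JavaScript that uses document.write to emit a script tag, with a literal </script> inside the string. The HTML parser ends the enclosing script element at the first </script> it sees, even inside a JavaScript string, so the leftover ');} spills into the page as stray content, and the validator flags elements "in a context where it is not allowed." Here is a hedged reconstruction -- the source is truncated in the validator output, so the isIOS check and surrounding logic are guesses:

<!-- Broken: the literal </script> in the string closes the OUTER script element early -->
<script type="text/javascript">
if (isIOS) {
  document.write('<script type="text/javascript" src="javaScript/mainNavMenuTime-ios.js"> </script>');
}
</script>

<!-- Fixed: escape the slash so the HTML parser never sees a closing tag -->
<script type="text/javascript">
if (isIOS) {
  document.write('<script type="text/javascript" src="javaScript/mainNavMenuTime-ios.js"> <\/script>');
}
</script>

Escaping the slash (or splitting the string, e.g. '</scr' + 'ipt>') keeps the tag out of the HTML parser's view, which would make both reported errors go away.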
Does this mean the crawlers don't know where the code ends and the body text begins, or what they should and shouldn't be focusing on?
-
Google is a different case when run through the validator. I actually read an article on why Google's sites don't validate: they serve so much traffic that leaving out markup that doesn't matter -- things like the self-closing / on an img tag -- actually saves them a good amount of money.
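To illustrate (a generic sketch, not Google's actual markup), HTML5 makes a lot of that minimalism legal: the html, head, body, p, and li closing tags are all optional, and void elements like img take no trailing slash, so a page like this is smaller yet still valid and parses unambiguously:

<!doctype html>
<title>Valid, minimal HTML</title>
<p>Optional closing tags can simply be omitted.
<ul>
  <li>No trailing slash needed on void elements: <img src="logo.png" alt="logo">
  <li>The next li implicitly closes the previous one.
</ul>

At Google's traffic volume, every byte shaved off a page served billions of times adds up.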
While I don't think validation is a ranking factor, I wouldn't totally dismiss it. It makes code easier to maintain, and it has actually gotten me jobs before: clients have run my site through a validator and then hired me.
Plus, funny little things work out too. Someone tested my site on Nibbler and it came back as one of the top 25 sites, and now I get a few hundred hits a day from that. I'll take traffic anywhere I can get it.
-
I agree with Sheldon, and, just for perspective, try running http://www.google.com through the same W3C HTML validator. That should be an excellent illustration: a page with almost nothing on it, coded by the brilliant folks at Google, still shows 23 errors and 4 warnings. I'd say don't obsess over this too much unless something is interfering with the rendering of the page or your page load speed.
Hope that helps!
Dana
-
Generally speaking, I would agree that validation is often too strict.
Google seems to handle this well, however. In fact, I seem to recall Matt C. once saying that the VAST majority of websites don't validate. I think he may have been talking strictly about HTML, though.
Validation isn't a ranking factor, of course, and most prevalent browsers will compensate for minor errors and render a page, regardless. So I really wouldn't be too concerned about validation just for validation's sake. As long as your pages render in most common browsers and neither page functionality nor user experience is adversely affected, I'd consider it a non-issue. As to whether a bot could be fooled into thinking the head had ended and the body had begun, I suppose it's possible, but I've never seen it happen, even with some absolutely horrible coding.
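If you're curious, you can watch that error recovery happen using the browser's own parser via the standard DOMParser API. A quick sketch (crawlers aren't guaranteed to parse identically, but the major ones use similarly error-tolerant HTML5 parsing):

<script>
// Deliberately broken markup: XHTML-style self-closing tags,
// an unclosed paragraph, and a stray list item.
var broken = '<meta charset="utf-8" /><title>Test</title>' +
             '<p>Unclosed paragraph<li>Stray list item';
var doc = new DOMParser().parseFromString(broken, 'text/html');
console.log(doc.head.innerHTML); // the meta and title still land in head
console.log(doc.body.innerHTML); // the parser recovers a sensible body
</script>

Even with all of those errors, the parser still produces a clean head/body split.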