The W3C Markup Validation Service - Good, Bad or Impartial?
-
Hi guys,
It seems that nowadays it's almost impossible to achieve zero errors when testing a site with the W3C Markup Validation Service (https://validator.w3.org). With analytics codes, pixels, and all kinds of tracking and social media scripts running, getting a clean result seems to be an unachievable task.
I have two questions for you fellow SEOs out there:
1. How important is validation to you, and how much effort do you put in when you technically review a site and decide what needs to be fixed and what isn't worth bothering with?
2. How do you argue your corner when explaining to your clients that it's impossible to achieve 100% validation?
As a note, I'm mostly referring to WordPress-driven sites.
Would love to hear your take.
Daniel.
-
I am my own client, so I can be as picky as I want, and I take care of the details that I feel are important.
I pay close attention to how the site is responding and rendering when I pretend that I am a visitor. I pay even more attention when a customer or visitor writes to me with a complaint. In my opinion, if the site is working great, then all is good.
W3C validation seems to be of jugular importance to W3C evangelists. They will tell you that you will burn in Hell if you don't achieve it with flying colors. People who want to sell you their services will point at any fault that can be detected.
Practical people have a different opinion. I try to be as practical as possible.
-
I agree with Andy.
I use it as a guidance tool on any website I build. It serves a purpose: checking that your markup conforms to a predetermined standard. But like any other automated tool, it compares against a fixed set of requirements that can't always be met, and it can't recognize and approve the legitimate exceptions.
As long as you understand the error it's pointing out and why it's pointing it out, and you know that despite it the code renders correctly and everything works as expected, then there is no problem.
From an SEO standpoint, as long as Google sees your site the way you want it to, I think it's a very, very minor factor. Hell, even Google's own pages return validation errors of some variety.
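To make that triage concrete, here is a minimal sketch of how you might pull the validator's findings programmatically and filter out the noise you've already reviewed. It uses the W3C Nu HTML Checker's JSON interface; the page URL and the accepted-snippet patterns below are placeholder assumptions, not part of the original discussion:

```python
import requests

# The W3C Nu HTML Checker exposes a JSON interface: "doc" is the page
# to check and "out=json" requests machine-readable results.
CHECKER_URL = "https://validator.w3.org/nu/"
PAGE_URL = "https://example.com/"  # placeholder: the page you want to audit

# Assumed patterns: fragments of tracking/analytics snippets you have
# already reviewed and decided not to fix.
ACCEPTED_SNIPPETS = ("gtag(", "fbq(")

resp = requests.get(
    CHECKER_URL,
    params={"doc": PAGE_URL, "out": "json"},
    headers={"User-Agent": "validation-triage-sketch"},
    timeout=30,
)
resp.raise_for_status()

for msg in resp.json().get("messages", []):
    extract = msg.get("extract", "")
    if any(snippet in extract for snippet in ACCEPTED_SNIPPETS):
        continue  # known third-party noise; skip it
    # "type" is "error" for hard errors and "info" for warnings/notices.
    print(f"[{msg.get('type', 'info')}] line {msg.get('lastLine', '?')}: "
          f"{msg.get('message', '')}")
```

A filtered list like this also makes the client conversation easier: you can show exactly which errors are yours to fix and which come from scripts you've knowingly accepted.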
-
Hi Yiannis,
I tend to add these in as an advisory to my clients because, for the most part and unless I see something specific, the results have absolutely no effect on SEO. If they wish to act on them, it is for their developers to handle.
I don't argue my corner, really - I've never had to. I just tell them like it is: the site is rendering fine everywhere with no issues, so fix the errors if you have the time and resources.
As I said, unless I spot something that is an actual problem, it tends to just get bypassed.
-Andy