The W3C Markup Validation Service - Good, Bad or Impartial?
-
Hi guys,
It seems that nowadays it is almost impossible to achieve zero errors when testing a site via the W3C Markup Validation Service (https://validator.w3.org). With analytics codes, pixels, and all kinds of tracking and social media scripts running, it seems to be an unachievable task.
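For what it's worth, the validator exposes a machine-readable endpoint (the Nu HTML Checker behind validator.w3.org), so you can tally errors yourself instead of eyeballing the web UI. A rough Python sketch, assuming the page URL you pass in is your own:

```python
import json
import urllib.parse
import urllib.request

def count_validator_messages(page_url):
    """Ask the W3C Nu HTML Checker to validate page_url and
    return a tally of its messages by type."""
    api = "https://validator.w3.org/nu/?" + urllib.parse.urlencode(
        {"doc": page_url, "out": "json"}
    )
    req = urllib.request.Request(api, headers={"User-Agent": "validator-check/0.1"})
    with urllib.request.urlopen(req) as resp:
        messages = json.load(resp)["messages"]
    return tally(messages)

def tally(messages):
    """Count messages by their 'type' field ('error', 'info', ...)."""
    counts = {}
    for msg in messages:
        counts[msg["type"]] = counts.get(msg["type"], 0) + 1
    return counts
```

Running this against a typical WordPress site will usually show that most of the noise is warnings and info messages rather than hard errors, which is a useful number to show a client.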
My questions to you fellow SEOs out there are two:
1. How important is validation to you, and how much effort do you put in when you technically review a site and decide what needs to be fixed and what isn't worth bothering with?
2. How do you argue your corner when explaining to your clients that it's impossible to achieve 100% validation?
*As a note, I'm mostly referring to WordPress-driven sites.
Would love to hear your take.
Daniel.
-
I am my own client, so I can be as picky as I want, and I take care of the details that I feel are important.
I pay close attention to how the site is responding and rendering when I pretend that I am a visitor. I pay even more attention when a customer or visitor writes to me with a complaint. In my opinion, if the site is working great then all is good.
W3C validation seems to be of jugular importance to W3C evangelists. They will tell you that you will burn in Hell if you don't achieve it with flying colors. People who want to sell you their services will point at any fault that can be detected.
Practical people have a different opinion. I try to be as practical as possible.
-
I agree with Andy,
I use it as a guidance tool on any website I build. It serves a purpose: checking that things are understood the way they should be, against a predetermined standard. But like any other automated tool, it compares against set requirements that cannot always be met, and it cannot identify and approve those exceptions.
As long as you understand the error it's pointing out and why, and know that despite it the code is rendering correctly and everything is working as expected, there is no problem.
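In practice that triage can be as simple as keeping a list of validator messages you've already reviewed and accepted (e.g. attributes injected by tracking or social scripts), so only new, unexplained errors surface on each check. A hypothetical sketch — the patterns below are illustrative, not real validator output for any particular site:

```python
# Message substrings you have already reviewed and accepted.
# These are example patterns only; build your own list from
# your site's actual validator output.
ACCEPTED_PATTERNS = [
    "Attribute “data-gtm",   # e.g. tag-manager attributes (example)
    "Attribute “fb-",        # e.g. social-widget attributes (example)
]

def unexplained_errors(messages):
    """Return only error-type messages matching none of the accepted patterns."""
    return [
        m for m in messages
        if m.get("type") == "error"
        and not any(p in m.get("message", "") for p in ACCEPTED_PATTERNS)
    ]
```

Anything this returns is worth a human look; anything it filters out is a known, deliberate trade-off you can explain to the client up front.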
From an SEO standpoint, as long as Google sees your site the way you want it to, I think it is a very, very minor factor. Hell, even Google's own pages return errors of some variety.
-
Hi Yiannis,
I tend to add these in as an advisory to my clients because for the most part, and unless I see something specific, the results have absolutely no effect on SEO. If they wish to act on them, it is for their developers to handle.
I don't argue my corner really - never had to. I just tell them like it is - the site is rendering fine in everything and with no issues, so fix errors if you have the time and resources.
As I said, unless I spot something that is an actual problem, then it tends to just get bypassed.
-Andy