The W3C Markup Validation Service - Good, Bad or Impartial?
-
Hi guys,
it seems that nowadays it is almost impossible to achieve zero errors when testing a site via the W3C Markup Validation Service (https://validator.w3.org). With analytics codes, pixels, and all kinds of tracking and social media scripts running, it seems to be an unachievable task.
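For anyone who wants to put a number on it, the validator's backend (the Nu HTML Checker) exposes a JSON output you can script against. A minimal sketch in Python - the endpoint and parameters are the checker's public API, but the target URL is a placeholder and the third-party requests library is assumed:

```python
# Minimal sketch: fetch a page's validation report from the W3C Nu HTML
# Checker's JSON endpoint and tally messages by severity.
# Assumes the "requests" library; the target URL is a placeholder.
import requests

def count_validation_messages(page_url: str) -> dict:
    resp = requests.get(
        "https://validator.w3.org/nu/",
        params={"doc": page_url, "out": "json"},
        headers={"User-Agent": "validation-audit-sketch/0.1"},
        timeout=30,
    )
    resp.raise_for_status()
    counts = {"error": 0, "warning": 0, "info": 0}
    for msg in resp.json().get("messages", []):
        if msg.get("type") == "error":
            counts["error"] += 1
        elif msg.get("subType") == "warning":
            # The checker reports warnings as type "info" with subType "warning".
            counts["warning"] += 1
        else:
            counts["info"] += 1
    return counts

print(count_validation_messages("https://example.com"))
```

Running this against a page before and after installing a tracking plugin makes it easy to show which errors are yours and which arrived with the third-party scripts.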
My questions to you fellow SEOs out there are these two:
1. How important is validation to you, and how much effort do you put in when you technically review a site and decide what needs to be fixed and what isn't worth bothering with?
2. How do you argue your corner when explaining to your clients that it's impossible to achieve 100% validation?
*As a note, I'm mostly referring to WordPress-driven sites.
Would love to hear your take.
Daniel.
-
I am my own client, so I can be as picky as I want, and I take care of the details that I feel are important.
I pay close attention to how the site is responding and rendering when I pretend that I am a visitor. I pay even more attention when a customer or visitor writes to me with a complaint. In my opinion, if the site is working great then all is good.
W3C validation seems to be of vital importance to W3C evangelists. They will tell you that you will burn in Hell if you don't achieve it with flying colors. People who want to sell you their services will point at any fault that can be detected.
Practical people have a different opinion. I try to be as practical as possible.
-
I agree with Andy,
I use it as a guidance tool on any website I build. It serves a purpose: to check that things are understood the way they should be, against a predetermined standard. But like any other automated tool, it compares your code to a fixed set of requirements that cannot always be met, and it cannot identify and OK the exceptions.
As long as you understand the error it's pointing out and why, and you know that despite it the code is rendering correctly and everything is working as expected, then there is no problem.
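One practical way to do that triage at scale is to pull the same JSON report and rank distinct error messages by frequency: repeated errors usually come from a template (one fix clears them all), while singletons are often noise from embeds. A rough sketch - the ranking heuristic is my own suggestion, not anything the validator prescribes, and the target URL is again a placeholder:

```python
# Rough triage sketch: fetch the Nu HTML Checker's JSON report for a page
# and rank distinct error messages by how often they occur.
# Assumes the "requests" library; the ranking heuristic is illustrative.
from collections import Counter
import requests

def rank_errors(page_url: str, top: int = 10) -> list:
    resp = requests.get(
        "https://validator.w3.org/nu/",
        params={"doc": page_url, "out": "json"},
        headers={"User-Agent": "validation-triage-sketch/0.1"},
        timeout=30,
    )
    resp.raise_for_status()
    errors = Counter(
        m["message"]
        for m in resp.json().get("messages", [])
        if m.get("type") == "error"
    )
    return errors.most_common(top)

for message, count in rank_errors("https://example.com"):
    print(f"{count:4d}  {message}")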
From an SEO standpoint, as long as Google sees your site how you want it to, I think it is a very, very minor factor. Hell, even Google's own pages return validation errors of some variety.
-
Hi Yiannis,
I tend to include these as an advisory to my clients because, for the most part and unless I see something specific, the results have absolutely no effect on SEO. If they wish to act on them, it is for their developers to handle.
I don't really argue my corner - I've never had to. I just tell it like it is: the site is rendering fine everywhere with no issues, so fix the errors if you have the time and resources.
As I said, unless I spot something that is an actual problem, it tends to just get bypassed.
-Andy