The W3C Markup Validation Service - Good, Bad or Impartial?
-
Hi guys,
It seems that nowadays it is almost impossible to achieve zero errors when testing a site via the W3C Markup Validation Service (https://validator.w3.org). With analytics codes, pixels and all kinds of tracking and social media scripts running, it seems to be an unachievable task.
My two questions to you fellow SEOs out there:
1. How important is validation to you, and how much effort do you put in when you technically review a site and decide what needs to be fixed and what you shouldn't bother with?
2. How do you argue your corner when explaining to your clients that it's impossible to achieve 100% validation?
*As a note, I will say that I mostly refer to WordPress-driven sites.
Would love to hear your take.
Daniel.
-
I am my own client, so I can be as picky as I want, and I take care of the details that I feel are important.
I pay close attention to how the site is responding and rendering when I pretend that I am a visitor. I pay even more attention when a customer or visitor writes to me with a complaint. In my opinion, if the site is working great then all is good.

W3C validation seems to be of jugular importance to W3C evangelists. They will tell you that you will burn in Hell if you don't achieve it with flying colors. People who want to sell you their services will point at any fault that can be detected.
Practical people have a different opinion. I try to be as practical as possible.
-
I agree with Andy,
I use it as a guidance tool on any website I build. It serves a purpose: to check that things are interpreted the way they should be against a predetermined standard. But like any other automated tool, it compares against a set of requirements that cannot always be met, and it cannot identify and approve those exceptions.
As long as you understand the error it's pointing out and why, and know that despite this the code is rendering correctly and everything is working as expected, then there is no problem.
From an SEO standpoint, as long as Google sees your site the way you want it to, I think it is a very, very minor factor. Hell, even Google's own pages return errors of some variety.
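For anyone who wants to triage validator output programmatically rather than chasing zero errors by hand, here is a minimal sketch. It assumes the Nu HTML Checker behind validator.w3.org accepts a ?doc=<url>&out=json request and returns a "messages" list with "type" and "extract" fields; treat the endpoint, parameters, field names and the third-party substrings below as assumptions to verify, not a finished tool.

```python
# Rough sketch: split validator errors into "your own markup" vs. "markup
# injected by third-party snippets" so you know what is actually worth fixing.
import requests

CHECKER = "https://validator.w3.org/nu/"
# Substrings that often identify markup injected by analytics, pixels and
# social widgets rather than your own templates (adjust to your stack).
THIRD_PARTY_HINTS = ["googletagmanager", "facebook", "twitter", "analytics"]

def triage(url):
    resp = requests.get(
        CHECKER,
        params={"doc": url, "out": "json"},
        headers={"User-Agent": "validation-triage-sketch"},
        timeout=30,
    )
    resp.raise_for_status()
    own, third_party = [], []
    for msg in resp.json().get("messages", []):
        if msg.get("type") != "error":
            continue  # skip warnings/info for this pass
        extract = (msg.get("extract") or "").lower()
        bucket = third_party if any(h in extract for h in THIRD_PARTY_HINTS) else own
        bucket.append(msg.get("message", ""))
    print(f"{url}: {len(own)} errors in your own markup, "
          f"{len(third_party)} from third-party snippets")
    return own, third_party

if __name__ == "__main__":
    triage("https://example.com/")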
-
Hi Yiannis,
I tend to add these in as an advisory to my clients because for the most part, and unless I see something specific, the results have absolutely no effect on SEO. If they wish to act on them, it is for their developers to handle.
I don't argue my corner really - never had to. I just tell them like it is - the site is rendering fine everywhere with no issues, so fix errors if you have the time and resources.
As I said, unless I spot something that is an actual problem, it tends to just get bypassed.
-Andy
Related Questions
-
Are on-site content carousels bad for SEO?
Hi, I didn't find an answer to my question in the forum. I attached an example of a content carousel; this is what I'm talking about. I understand that Google no longer has a problem with tabbed content and accordions (collapsible content). But now I'm wondering about textual carousels. I'm not talking about an image slider, I'm talking about text. Is a text carousel harder for Google to read than plain text or tabs? Of course, I'm not talking about a carousel using Flash. Let's say the code is proper... Thanks for your help.
-
Product Schema Markup for All Products
Hi Team, Google Search Console used to allow you to use their structured data markup helper (https://www.google.com/webmasters/markup-helper/u/0/) to mark up multiple product pages at once that were similar. I do not see this feature anymore in the new Search Console. Does anyone have a recommendation for marking up multiple product pages without having to have schema markup firing in GTM for each product page?
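One common alternative is to generate the markup at template level from the product catalogue, so every product page emits its own JSON-LD without a GTM tag per page. The sketch below is a rough illustration of that idea; the product dictionary shape and the sample values are made-up examples, and the schema.org property names should be checked against Google's current Product documentation.

```python
# Sketch: build Product JSON-LD from catalogue data so each product page can
# embed it in a <script type="application/ld+json"> tag at render time.
import json

def product_jsonld(product):
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": product["name"],
        "sku": product["sku"],
        "image": product["image_url"],
        "description": product["description"],
        "offers": {
            "@type": "Offer",
            "price": str(product["price"]),
            "priceCurrency": product["currency"],
            "availability": "https://schema.org/InStock",
            "url": product["url"],
        },
    }
    return json.dumps(data, indent=2)

# Example with placeholder catalogue data.
print(product_jsonld({
    "name": "Example Widget", "sku": "EX-123", "price": 19.99,
    "currency": "USD", "image_url": "https://example.com/widget.jpg",
    "description": "A sample product.", "url": "https://example.com/widget",
}))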
-
Subpage with own homepage and navigation - good or bad?
Hi everybody, I have the following question. At the company where I work, we deliver several services. We help people buy the right second-hand car (technical inspections), but we also have an import service. Because those services are so different, I want to split them on our website. So our main website is all about the technical inspections. Then, when you click on import, you go to www.example.com/import: a subpage with its own homepage and navigation, all about the import service. It's like you have an extra website on the same domain. Does anyone have experience with this in terms of SEO? Thank you for your time! Kind regards, Robert
-
Adding Reviews to JSON Product Schema Markup
Hi everyone, Below is an example of some JSON product schema markup I'd like to integrate into my site. My question is, what do I need to do to incorporate the individual reviews on a product page as well? I've tried a few different things but I can't get it to validate.
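Since the markup in question isn't reproduced above, the sketch below is a generic illustration of the review-related properties that schema.org's Product type uses: an aggregateRating plus a review list, each review carrying an author and a reviewRating. The values are placeholders; validate the generated block with Google's Rich Results Test before relying on it.

```python
# Sketch: Product JSON-LD carrying aggregateRating and individual reviews,
# with placeholder values to show the expected nesting.
import json

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "27",
    },
    "review": [
        {
            "@type": "Review",
            "author": {"@type": "Person", "name": "Jane Doe"},
            "datePublished": "2024-01-15",
            "reviewBody": "Sturdy and easy to use.",
            "reviewRating": {"@type": "Rating", "ratingValue": "5"},
        },
    ],
}
# Output can be pasted into a <script type="application/ld+json"> tag.
print(json.dumps(product, indent=2))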
-
Word mentioned twice in URL? Bad for SEO?
Is a URL like the one below going to hurt SEO for this page? /healthcare-solutions/healthcare-identity-solutions/laboratory-management.html I like to match the URL and H1s as closely as possible, but in this case it looks a bit funky.
-
Are subdomains a good SEO strategy for a multistore e-commerce site?
Hi there, I'm wondering what the best strategy is for working with multi-stores in Magento: to use or not to use subdomains? Suppose we have www.website.com and we configure it to use multistore. The URL base will not include the store ID, so it will not be like www.website.com/store1 and www.website.com/store2. It will simply rely on the user session, so if we have two categories, one for each store, they will be accessed using: www.website.com/category1 (for store 1) www.website.com/category2 (for store 2). The homepage will always be set on www.website.com, so we would have a single page for several "home pages" (depending on the user session / store being accessed). I guess this is not a good option if we want to rank for different keywords for each store. So I was wondering if it is a good solution to set: store1.website.com store2.website.com This way we have two "home pages", each one able to rank. Does it make sense? Is it good or bad for SEO? Another option I was considering was: www.website.com (for store 1) store2.website.com (for store 2) store3.website.com (for store 3) www.website.com/blog (for blog) Can this work? Good or bad for SEO? Best regards
-
Good alternatives to Xenu's Link Sleuth and AuditMyPc.com Sitemap Generator
I am working on scraping title tags from websites with 1-5 million pages. Xenu's Link Sleuth seems to be the best option for this at this point. Sitemap Generator from AuditMyPc.com seems to be working too, but it starts hanging when the sitemap file the tool is working on becomes too large. So basically, the second one looks like it won't be good for websites of this size. I know that Scrapebox can scrape title tags from a list of URLs, but this is not needed, since this comes with both of the above-mentioned tools. I also know about DeepCrawl.com, but it is paid and would be very expensive with this number of pages and websites (5 million URLs is $1,750 per month; I could get a better deal for multiple websites, but this obviously does not make sense for me - it needs to be free, more or less). SEO Spider from Screaming Frog is not good for large websites. So, in general, what is the best way to work on something like this that is also time-efficient? Are there any other options? Thanks.
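As a rough, free DIY alternative, a short script with a thread pool can pull title tags from a list of URLs. The sketch below is only a starting point: for millions of pages you would also want retries, crawl-delay politeness and resumable output, none of which are included here, and the URLs shown are placeholders.

```python
# Sketch: fetch pages concurrently and extract the <title> tag from each.
import concurrent.futures
import re
import requests

TITLE_RE = re.compile(r"<title[^>]*>(.*?)</title>", re.IGNORECASE | re.DOTALL)

def fetch_title(url):
    try:
        resp = requests.get(url, timeout=15,
                            headers={"User-Agent": "title-scraper-sketch"})
        match = TITLE_RE.search(resp.text)
        return url, match.group(1).strip() if match else ""
    except requests.RequestException as exc:
        return url, f"ERROR: {exc}"

def scrape(urls, workers=20):
    # Prints tab-separated URL/title pairs; redirect to a file for big runs.
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        for url, title in pool.map(fetch_title, urls):
            print(f"{url}\t{title}")

if __name__ == "__main__":
    scrape(["https://example.com/", "https://example.org/"])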
-
Move established site from .co.uk to .org - good or bad idea?
I am currently considering moving our site from the current .co.uk domain to the .org version, which we also own. The site is established and has been indexed for 7 years, ranks well and has circa 10k visits per month, mainly UK & US traffic. The reason for the change to the .org domain is to make the site more global-facing and give us the opportunity to develop the site into multiple languages within directories (.org/es/ etc.) and then target those to the local search engines. For the kind of site it is (community based), it wouldn't really work to split this into lots of separate country-targeted domains. So the choice is to either stick with the .co.uk and add the other foreign-language content in directories within the .co.uk, or move to the .org and do the same (there is also a potential third option of purchasing the .com, which is currently unused, but that could be pricey!). We are also planning a big overhaul of the site with a redesign, lots of added content and a reorganisation of the site - but we are thinking it would be better to move the domain on a 1:1 basis first, with the current design, content and URL structure in place, and then do the other changes 2 or 3 months down the line. I have read up on SEOmoz, Google guidelines etc. on moving a site to a new domain and understand the theoretical approach and the steps to take (1:1 301 redirects, sitemaps on old and new, etc.), and I will retain ownership of the .co.uk so the redirects can remain in place indefinitely. However, having worked so hard to get the site to where it is in the search engines and traffic levels, I am very worried about whether the domain change is a good move. I am more than happy to accept a temporary fluctuation in rankings & traffic for 1-4 weeks, as reportedly may happen, as long as I can be sure it will return after a temporary period and be as strong (or almost as strong) as before. Looking for people's experiences to give me the confidence / reassurance to go ahead with this, or any info on why I shouldn't. Thanks in advance for your advice. Adrian.
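If you do go ahead, a small script can spot-check that each old .co.uk URL returns a single 301 hop to its exact .org equivalent, matching the 1:1 mapping described above. The sketch below uses placeholder domains and a hand-picked list of paths; it is a sanity check, not a full crawl.

```python
# Sketch: verify that old URLs 301 directly to the same path on the new domain.
import requests

OLD_HOST = "https://www.example.co.uk"
NEW_HOST = "https://www.example.org"

def check_redirect(path):
    # allow_redirects=False so we see the first hop rather than the final page.
    resp = requests.get(OLD_HOST + path, allow_redirects=False, timeout=15)
    expected = NEW_HOST + path
    ok = resp.status_code == 301 and resp.headers.get("Location") == expected
    print(f"{'OK ' if ok else 'FAIL'} {OLD_HOST + path} -> "
          f"{resp.status_code} {resp.headers.get('Location')}")
    return ok

if __name__ == "__main__":
    for p in ["/", "/about/", "/community/some-thread/"]:
        check_redirect(p)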