Should we get our W3C validation errors fixed for SEO? How important is it?
-
Hi All,
We've implemented most of the recommended things on our website, and most recently we added Schema.org markup. However, one area we haven't tackled is fixing our W3C validation errors.
My developer thinks they're not important as such and that it's more about ticking boxes, but does anyone have experience where fixing all of these actually had an SEO/ranking benefit?
Most of our URLs are indexed and Google recrawls regularly, so I'm not sure of its importance.
Also, we have a mobile-responsive version, so I wasn't sure if that makes it more important.
From what I've read, I can't see any benefit from fixing it all, but I just wanted some other opinions.
thanks
Pete
-
Many thanks LindaLV
Pete
-
Matt Cutts has said a number of times that invalid HTML in and of itself does not cause a penalty. Here's one example: http://youtu.be/j3KgrbiB1pc
"So Google does not penalize you if you have invalid HTML because there would be a huge number of webpages like that and some people know the rules and then decided to make things a little bit faster or to tweak things here there and so their pages don't validate and there are enough pages they don't validate that we said OK this would actually hurt search quality if we said only the pages that validate are allowed to rank or rank those a little bit higher."
Yes, you should write good HTML. But, unless it is so bad that it causes a bad user experience, simply having some W3C errors is not going to be a problem for SEO. (At least for now...)
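If you do want a quick local smoke test for badly broken markup, here is a rough sketch. To be clear, this is only a crude tag-balance check, not a substitute for the W3C validator, and the sample HTML snippets are made up:

```python
from html.parser import HTMLParser

# Void elements never take closing tags, so they are skipped.
VOID = {"area", "base", "br", "col", "embed", "hr", "img", "input",
        "link", "meta", "param", "source", "track", "wbr"}

class TagBalanceChecker(HTMLParser):
    """Crude check for unclosed/mismatched tags -- not a real validator."""
    def __init__(self):
        super().__init__()
        self.stack = []
        self.errors = []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in VOID:
            return
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.errors.append(f"unexpected </{tag}>")

def check(html):
    """Return a list of tag-balance problems found in the HTML string."""
    checker = TagBalanceChecker()
    checker.feed(html)
    checker.errors.extend(f"unclosed <{t}>" for t in checker.stack)
    return checker.errors

print(check("<div><p>ok</p></div>"))  # []
print(check("<div><p>oops</div>"))    # mismatch and unclosed tags reported
```

For a real audit, the W3C's validator is still the tool to use; a check like this only catches the grossest structural breakage, which is also the kind most likely to affect how a page renders.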
Related Questions
-
Javascript and SEO
I've done a bit of reading and I'm having difficulty grasping it. Can someone explain it to me in simple language? What I've gotten so far: JavaScript can block search engine bots from fully rendering your website. If bots are unable to render your website, they may not be able to see important content and may discount that content from their index. To know if bots can render your site, check the following:
- Google Search Console Fetch and Render
- Turn off JavaScript in your browser and see whether any site elements disappear
- An online tool such as Technical SEO's Fetch and Render
- Screaming Frog's Rendered Page tab
- GTmetrix results: if "Defer parsing of JavaScript" appears as a recommendation, that means there are elements being blocked from rendering (???)
Using our own site as an example, I ran it through all the tests listed above. Results:
- Google Search Console: rendered only the header image and text; anything below wasn't rendered. The resources Googlebot couldn't reach include Google Ad Services, Facebook, Twitter, our call tracker, and Sumo, all "Low" or blank severity.
- JavaScript turned off: shows only the logo and navigation menu; anything below didn't render or appear.
- Technical SEO Fetch and Render: our page rendered fully for Googlebot and Googlebot Mobile.
- Screaming Frog: the Rendered Page tab is blank; it says "No Data".
- GTmetrix: "Defer parsing of JavaScript" was recommended.
From all these results across all the tools I used, how do I know what needs fixing? Some tests didn't render our site fully while others did. With varying results, I'm not sure where to go from here.
Intermediate & Advanced SEO | | nhhernandez1 -
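A crude way to automate the "turn off JavaScript" test above is to check whether key phrases appear in the raw server HTML at all; anything missing there is presumably injected client-side. A minimal sketch, where the page snippet and phrases are hypothetical and the raw HTML would be fetched separately (e.g. with curl):

```python
# Sketch: approximate the "JavaScript off" test by checking whether
# key phrases are present in the raw (pre-JavaScript) server HTML.
# Phrases missing from the raw HTML are likely rendered client-side.

def missing_from_raw_html(raw_html, key_phrases):
    """Return the phrases that do NOT appear in the raw HTML."""
    lowered = raw_html.lower()
    return [p for p in key_phrases if p.lower() not in lowered]

# Hypothetical page: the reviews section is an empty div filled by JS.
raw = "<html><body><h1>Acme Widgets</h1><div id='reviews'></div></body></html>"
print(missing_from_raw_html(raw, ["Acme Widgets", "Customer Reviews"]))
# ['Customer Reviews']
```

Anything this flags is content that depends on the rendered DOM, which is exactly the content worth verifying in Fetch and Render.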
Webjaguar SEO shortcomings
Hey All. I have a client whose ecommerce site is built in Webjaguar. Does anyone have experience with this platform? It appears to be loaded with technical SEO challenges (duplicate content, weird URLs, etc.). Interestingly, when I Google "webjaguar SEO challenges" and things like that, nothing comes up. Suspicious, methinks. I appreciate any thoughts from SEO folks. Thanks!
Intermediate & Advanced SEO | | JBMediaGroup0 -
Best way to fix 404 crawl errors caused by Private blog posts in WordPress?
Going over the Moz crawl error report and WMT's crawl errors for a new client site, I found 44 high-priority crawl errors (404 Not Found). Those 44 blog pages were set to Private mode (WordPress theme), causing the 404 issue.
Intermediate & Advanced SEO | | SEOEND
I was reviewing the blog content for those 44 pages to see why those 2010 blog posts were set to Private mode, and I noticed that all 44 posts were pretty much copied from other external blog posts. So I'm thinking the previous agency placed those pages under Private mode to avoid getting hit with duplicate content issues. All blog posts published after 2011 looked like unique, non-scraped content. So my question to all is: what is the best way to fix the issue caused by these 44 pages?
A. Remove the 44 blog posts that used verbatim scraped content from other external blogs.
B. Update the content on each of the 44 blog posts, then set them to Public mode instead of Private.
C. ? (open to recommendations)
I didn't find any external links pointing to any of those 44 blog pages, so I was considering removing them. However, I'm not sure if that will affect the site in any way. Open to recommendations before making a decision.
Thanks0 -
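If the posts do get removed, it's cleaner to 301-redirect the old URLs than to leave 404s behind. A small sketch that generates Apache mod_alias rules for an .htaccess file; the paths and the target URL are hypothetical examples:

```python
# Sketch: generate "Redirect 301" rules sending removed private-post
# URLs to the blog index, so the crawled 404s resolve cleanly.
# The paths and target below are hypothetical.

def redirect_rules(old_paths, target="/blog/"):
    """Build one Apache 'Redirect 301' line per removed post path."""
    return [f"Redirect 301 {path} {target}" for path in old_paths]

removed = ["/2010/01/scraped-post-one/", "/2010/02/scraped-post-two/"]
print("\n".join(redirect_rules(removed)))
```

Since there are no external links pointing at these pages, redirecting to the blog index (or simply letting them 404/410) are both defensible; the redirect just keeps crawl reports tidy.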
Should I fix my 404 errors?
We have about 250 404 errors due to changing a lot of page names throughout our site. I've read some articles saying to leave them and eventually they will go away. Normally I would do a 301 redirect. What's the best solution?
Intermediate & Advanced SEO | | JimDirectMailCoach0 -
Are Silos Still Important for SEO?
I am in the process of migrating www.nyc-officespace-leader.com from Drupal to WordPress, and my developer is of the opinion that it is not necessary to implement silos to achieve favorable rankings for competitive keywords. I know a lot has changed in the last two years with Panda and Penguin. Is it an SEO best practice to implement silos in the course of the redesign? Will this make a significant difference for SEO? Thanks, Alan Rosinsky
Intermediate & Advanced SEO | | Kingalan10 -
Optimal URLs for SEO and UX
We are considering restructuring the URL scheme on one of the websites we maintain. We have a few options. Currently news article URLs are as follows:
Intermediate & Advanced SEO | | Peter264
http://domain.com/news/1234/article-title-name/
Download section URLs are as follows:
http://domain.com/downloads/files/1234/file-title-of-download-here/
Forum URLs:
http://forum.domain.com/forum/topic/1234/title-of-forum-topic-here/
We feel that these are a bit too long for both SEO and user experience, and we want to remove as many directories from the URLs as possible. From experience, what do you recommend changing for the example URLs above? We have some ideas below. We need to keep the ID in the URLs, however, which I know is a little frustrating.
Some ideas for news articles:
http://domain.com/news/article-title-shorter-1234
http://domain.com/article-title-shorter-n1234
Some ideas for the download pages:
http://domain.com/downloads/file-title-shorter-d1234
http://domain.com/downloads/files/file-title-shorter-1234
http://domain.com/file-title-shorter-d1234
Some ideas for the forum URLs:
http://forum.domain.com/topic-title-shorter-t1234
http://forum.domain.com/topic/topic-title-shorter-1234
What do you think of these suggestions? Any other URL ideas? Recommended URL length? The purpose of this question is to find the perfect URLs for the site we are working on; your thoughts, suggestions, and tips are very much appreciated.0 -
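One practical note on the trailing-ID schemes above: whichever pattern is chosen, the routing layer only needs to recover the numeric ID, so the title slug can change freely without breaking lookups. A sketch under the assumption that the ID is always the trailing number, optionally prefixed by a single type letter (n, d, t, as in the examples):

```python
import re

# Sketch: extract the numeric ID from trailing-ID URL slugs like
# /article-title-shorter-n1234, so lookups key off the ID alone and
# the title portion can change without breaking anything.
# Assumes the ID is the trailing number, optionally type-prefixed.

ID_RE = re.compile(r"-[a-z]?(\d+)/?$")

def extract_id(path):
    """Return the trailing numeric ID from a URL path, or None."""
    m = ID_RE.search(path)
    return int(m.group(1)) if m else None

print(extract_id("/news/article-title-shorter-1234"))    # 1234
print(extract_id("/article-title-shorter-n1234"))        # 1234
print(extract_id("/downloads/file-title-shorter-d1234")) # 1234
```

This is also an argument for the trailing-ID variants over the current /news/1234/title/ form: the canonical lookup key sits in a fixed, easily parsed position regardless of how short the slug gets.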
How important are domain names?
Hi All, Question: How important are domain names when trying to rank for a competitive keyword? Thanks
Intermediate & Advanced SEO | | wazza19850