Do you validate your websites?
-
Do you consider the guidelines from http://validator.w3.org/ when setting up a new website?
As far as I know, validation errors don't influence rankings ... What is your opinion on that topic?
-
I am with you on this. It's good to check for any issues. Before focusing on SEO, functionality is my main concern.
-
I always validate the HTML of sites I'm working on, particularly if it has been coded by a third party. My reasons for doing so are a careful balance between ensuring spiders can crawl the page without bumping into hideous HTML errors and ensuring the website is accessible on as many devices and browsers as possible.
If a webpage doesn't adhere to standards, it could indicate problems with how the pages render across the myriad of browsers and devices out there. So there's a User Experience issue to consider as well.
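If you do this regularly, the check is easy to script. Here's a minimal sketch against the W3C Nu HTML Checker's JSON interface - assuming the third-party requests package, and with example.com standing in for whatever page you'd actually check:

```python
# Minimal sketch: send a page's HTML to the W3C Nu HTML Checker
# and print any messages it returns.
# Requires the third-party "requests" package (pip install requests).
import requests

CHECKER = "https://validator.w3.org/nu/?out=json"
PAGE = "https://www.example.com/"  # placeholder - use your own page

html = requests.get(PAGE, timeout=30).text

resp = requests.post(
    CHECKER,
    data=html.encode("utf-8"),
    headers={
        "Content-Type": "text/html; charset=utf-8",
        # The checker may reject requests without a descriptive user agent.
        "User-Agent": "validation-check-sketch/0.1",
    },
    timeout=30,
)
resp.raise_for_status()

for msg in resp.json().get("messages", []):
    # "type" is "error" or "info"; warnings come back as info
    # messages with subType "warning".
    print(f'{msg["type"]}: line {msg.get("lastLine", "?")}: {msg["message"]}')
```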
-
It depends on the project. I find that it's sometimes plugins that stop my code from validating. If the plugin is useful enough and the site renders fine in all the major browsers, I stick with what I have, even if it doesn't validate.
-
We don't bother. I know we probably should, but half of the sites we work on are built on CMSes that just don't validate well anyway. Plus it takes time, which could be spent on more SEO.
-
Like I said.... Google doesn't validate their website... Of course, Danny answered this question for Matt, sooooo.... there is no official statement from Google on this one.
-
New webmaster video from Matt Cutts about that topic:
-
I find the W3C validator to be more of an accolade than anything else. You're right about it not influencing rankings - there are so many practices that don't validate but actually lead to an unchanged or even improved UX.
IMO, getting W3C validation is like getting MozPoints, except MozPoints are worth something.
But that's not to say I'm knocking anyone who does follow validator guidelines - fair play to them!
-
Sure.
We do it because it's a great sales tool. Rarely do we find a competitor that builds W3C-valid websites. In our sales pitch we talk about how our websites are W3C valid: they adhere to a set of rules and guidelines, and the code is generally cleaner, which can improve load times.
We tell them they can display a W3C valid button on their site; most of them like that.
It's also a matter of doing things the right way... you can build a frame out of anything, but there is a right way and a wrong way to build a door frame. We choose to do it all according to standards and best practices.
It's almost like a commitment to excellence type of thing.
-
Hi David, thank you for your reply.
Would you mind sharing your arguments for why you find it important? I would be curious how many pros you can list - I like your point of view.
-
It's very important to my company that all websites for our clients validate. Why? Because we feel they pay for a service and we want to provide the highest quality service.
It's like building a house and not sticking to code. We'd rather stick to code and do it the "right" way, rather than just have something that "works".
It's also a sales tool! Because none of our competitors build compliant sites, our sales guys use this and it works well. We explain what the W3C is and why it matters, and although it doesn't help rankings, we feel it's important because it's simply a matter of doing things the right way. They like that!
-
I don't validate my website... but neither does Google.
-
I don't think it affects rankings, but it can perhaps affect the ability to be crawled. It's also good practice for the user visiting the site. As with most SEOs today, we're not just responsible for getting people to the page, but for making sure they stay on the site and convert. :)
-
I have one guy in the company who is obsessed with it, so no matter what I do he will go back and make sure we comply! I've seen at least one W3C nazi in each web company I've had the chance to work with.
-
Even though W3C errors won't influence SEO directly, there could be instances where CSS issues impact page speed, resulting in slower spider crawls and, through page speed, an influence on rankings. We tend to look at these reports once every quarter.
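For that quarterly spot check, here's a minimal sketch against Google's PageSpeed Insights v5 API - again assuming the requests package, with example.com as a placeholder; light use works without credentials but regular use needs an API key:

```python
# Minimal sketch: quarterly page-speed spot check via the
# PageSpeed Insights v5 API. Requires "requests" (pip install requests).
import requests

PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
PAGE = "https://www.example.com/"  # placeholder - use your own page

# PSI can take a while to run Lighthouse, so allow a generous timeout.
resp = requests.get(PSI, params={"url": PAGE, "strategy": "mobile"}, timeout=120)
resp.raise_for_status()
report = resp.json()

# The Lighthouse performance score is 0..1; scale to 0..100 for readability.
score = report["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Performance score for {PAGE}: {score * 100:.0f}/100")
```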
-
Using Google or any of its websites as an SEO example is a mistake in itself.
-
lol - yes, the resemblance is remarkable! That's my boss's name :-).
It would be interesting if there were two otherwise identical websites, with just minor differences causing validation issues on one of them ... whether the one without "faults" would rank better.
I think I even remember Matt Cutts once saying that this is not a ranking factor. Even if you put google.com into the validator, you get several faults.
The "normal" person looking at the webpage doesn't care about the faults flagged in the background either. So whom should I please with a W3C-clean website? I suppose "just" to have a proper webpage....
-
Personally, it's not my first worry.
But running a validation check doesn't cost a lot of time, so I usually do it. If it finds red-marked problems, I solve them, but I don't go crazy over the many less important ones.
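That triage is easy to automate too. Here's a minimal sketch using the same Nu checker JSON interface as above, failing only on errors and ignoring the less important warning-level messages (example.com is a placeholder):

```python
# Minimal sketch: fail only on validator errors, ignore warnings.
# Uses the Nu checker's doc=<url> mode. Requires "requests".
import sys
import requests

PAGE = "https://www.example.com/"  # placeholder - use your own page

resp = requests.get(
    "https://validator.w3.org/nu/",
    params={"doc": PAGE, "out": "json"},
    headers={"User-Agent": "validation-check-sketch/0.1"},
    timeout=30,
)
resp.raise_for_status()

messages = resp.json().get("messages", [])
errors = [m for m in messages if m["type"] == "error"]
warnings = [m for m in messages if m.get("subType") == "warning"]

print(f"{len(errors)} errors, {len(warnings)} warnings")
for e in errors:
    print(f'  line {e.get("lastLine", "?")}: {e["message"]}')

# Treat only the "red-marked" problems as blocking.
sys.exit(1 if errors else 0)
```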
-
Hehehe... this old profiles database gives weird results.
-
Hansj, you look remarkably like Petra!
As a former designer wannabe, I would always shoot for validation if possible. But since I've been concentrating more on SEO these days, like you I personally don't think it affects rankings.