Do you validate your websites?
-
Do you consider the guidelines from http://validator.w3.org/ when setting up a new website?
As far as I know they don't influence rankings ... What is your opinion about that topic?
-
I am with you on this. Good to check for any issues. Before focusing on SEO, functionality is my main concern.
-
I always validate HTML with sites I'm working on, particularly if it has been coded by a third party. My reasons for doing so are a careful balance between ensuring spiders can crawl the page without bumping into hideous HTML errors and ensuring a website is accessible on as many devices and browsers as possible.
If the webpage doesn't adhere to standards, it could indicate issues with viewing the pages correctly in the myriad of browsers and devices out there. So there's a user-experience issue to consider.
-
It depends on the project. I find that it is sometimes plugins that make my code not validate. If the plugin is useful enough and the site renders fine in all the major browsers, I stick with what I have, even if it doesn't validate.
-
We don't bother. I know we probably should, but half of the sites we work on are CMS-based and just don't validate well anyway. Plus it takes time that could be spent on more SEO.
-
Like I said.... Google doesn't validate their website... Of course, Danny answered this question for Matt, sooooo.... there is no official statement from Google on this one.
-
New webmaster video from Matt Cutts about that topic:
-
I find the w3 validator to be more of an accolade than anything else. You're right about them not influencing rankings - there are so many practices that don't validate but actually lead to an unchanged or even improved UX.
IMO, getting w3 validation is like getting MozPoints, except MozPoints are worth something. But that's not to say I'm knocking anyone who does follow validator guidelines - fair play to them!
-
Sure.
We do it because it's a great sales tool. Rarely do we ever find a competitor that builds W3C-valid websites. In our sales pitch we talk about how our websites are W3C valid: they adhere to a set of rules and guidelines, and the code is generally cleaner, which can improve load times.
We tell them they can display a W3C valid button on their site, most of them like that.
It's also a matter of doing things the right way... you can build a frame out of anything but there is a right way and a wrong way to build a door frame. We choose to do it all according to standards and best practices.
It's almost like a commitment to excellence type of thing.
-
Hi David, thank you for your reply.
Would you mind sharing your arguments why you find it is important? I would be curious how many pros you find - I like your point of view.
-
It's very important to my company that all websites for our clients validate. Why? Because we feel they pay for a service and we want to provide the highest quality service.
It's like building a house and not sticking to code. We'd rather stick to code and do it the "right" way, rather than just have something that "works".
It's also a sales tool! Because none of our competitors build sites that are compliant, our sales guys use this and it works well. We explain what W3C is, why it's important, and although it doesn't help rankings, we feel it's important because it's simply a matter of doing it the right way. They like that!
-
I don't validate my website... but neither does Google.
-
I don't think it affects rankings, but perhaps the ability to be crawled. It is also good practice for the user when visiting the site. As with most SEOs today, we are not just responsible for getting users to the page, but for making sure they stay on the site and convert. : )
-
I have one guy in the company who is obsessed with it, so no matter what I do he will go back and ensure we comply! I've seen at least one W3C nazi in each web company I have had a chance to work with.
-
Even though W3C errors will not influence SEO directly, there could be instances where CSS issues impact page speed, resulting in slower spider crawls and, in turn, a page-speed ranking influence. We do tend to look at these reports once every quarter.
-
To use Google or any of its websites as an SEO example is by itself a mistake.
-
lol - yes, the resemblance is remarkable! That's the name of my boss :-).
It would be interesting if there were two otherwise identical websites with just minor differences causing some validation issues ... whether the one without "faults" would rank better.
I think I even remember that Matt Cutts once said that this is not a ranking factor. Even if you put google.com into the validator, you get several faults.
The "normal" person who looks at the webpage doesn't care which faults are flagged in the background either. So whom should I please with a w3c.org-clean website? I suppose it's "just" about having a proper webpage....
-
Personally it is not my first worry.
But running a validation check doesn't cost a lot of time, so I usually do it. If it flags red-marked problems, I solve them, but I don't get crazy over the many less important ones.
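A minimal sketch of that kind of quick check, in case anyone wants to automate it: it assumes the public Nu HTML Checker at validator.w3.org/nu accepts a ?doc=<url>&out=json request and returns a JSON body with a "messages" list carrying a "type" field; the endpoint details, field names, and the example URL are assumptions rather than an official API reference, and for anything beyond an occasional check you would run the checker locally instead of hitting the shared service.

```python
# Hypothetical sketch: list only the hard errors for a page, skipping the
# "many less important" warnings and informational messages.
import requests

CHECKER = "https://validator.w3.org/nu/"  # assumed public Nu HTML Checker endpoint


def validation_errors(page_url: str) -> list[str]:
    resp = requests.get(
        CHECKER,
        params={"doc": page_url, "out": "json"},
        headers={"User-Agent": "validation-check-sketch"},
        timeout=30,
    )
    resp.raise_for_status()
    messages = resp.json().get("messages", [])  # assumed response shape
    # Keep only entries reported as errors; drop info/warning messages.
    return [m.get("message", "") for m in messages if m.get("type") == "error"]


if __name__ == "__main__":
    for err in validation_errors("https://example.com/"):  # example URL
        print(err)
```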
-
Hehehe... this old profiles database gives weird results.
-
Hansj, you look remarkably like Petra!
As a former designer wannabe, I would always shoot for validation if possible. But since concentrating more on SEO issues these days, like you, I personally don't think it affects rankings.