Do you validate your websites?
-
Do you consider the guidelines from http://validator.w3.org/ when setting up a new website?
As far as I know they don't influence rankings ... What is your opinion about that topic?
-
I am with you on this. It's good to check for any issues. Before focusing on SEO, functionality is my main concern.
-
I always validate HTML on sites I'm working on, particularly if a site has been coded by a third party. My reasons for doing so are a careful balance between ensuring spiders can crawl the page without bumping into hideous HTML errors and ensuring the website is accessible on as many devices and browsers as possible.
If the webpage doesn't adhere to standards, it could indicate problems with viewing the pages correctly in the myriad of browsers and devices out there. So there's a user experience issue to consider as well.
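For anyone who'd rather script that check than paste pages into the validator by hand, here's a rough Python sketch (not an official W3C client). It assumes the public Nu HTML Checker endpoint at https://validator.w3.org/nu/ with its JSON output mode and the third-party requests library; the file name and User-Agent string are just placeholders.

```python
# Rough sketch: send a saved page to the W3C Nu HTML Checker and list what it flags.
# Assumes the public endpoint at https://validator.w3.org/nu/ (out=json) and the
# third-party "requests" library; "index.html" is a placeholder for your own page.
import requests

with open("index.html", "rb") as f:
    html = f.read()

resp = requests.post(
    "https://validator.w3.org/nu/",
    params={"out": "json"},  # machine-readable output instead of the HTML report
    headers={
        "Content-Type": "text/html; charset=utf-8",
        "User-Agent": "html-validation-check/0.1",  # placeholder; a descriptive UA is polite for public W3C services
    },
    data=html,
    timeout=30,
)
resp.raise_for_status()

# Each message carries a type ("error" or "info"), a human-readable text and usually
# the line it refers to, which is enough to hand a fix list to whoever coded the page.
for msg in resp.json().get("messages", []):
    print(f"{msg.get('type')}: {msg.get('message')} (line {msg.get('lastLine', '?')})")
```

Run something like that against a saved copy of the page before launch and you have a ready-made fix list to send back to the third party.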
-
It depends on the project. I find that it is sometimes plugins that stop my code from validating. If the plugin is useful enough and the site renders fine in all the major browsers, I stick with what I have, even if it doesn't validate.
-
We don't bother. I know we probably should, but half of the sites we work on are built on CMSs that just don't validate well anyway. Plus it takes time, which could be spent on more SEO.
-
Like I said.... Google doesn't validate their website... Of course, Danny answered this question for Matt, sooooo.... there is no official statement from Google on this one.
-
New webmaster video from Matt Cutts about that topic:
-
I find the w3 validator to be more of an accolade than anything else. You're right about it not influencing rankings - there are so many practices that don't validate but actually lead to an unchanged or even improved UX.
IMO, getting w3 validation is like getting MozPoints, except MozPoints are worth something. But that's not to say I'm knocking anyone who does follow validator guidelines - fair play to them!
-
Sure.
We do it because it's a great sales tool. Rarely do we ever find a competitor that builds W3C-valid websites. In our sales pitch we talk about how our websites are W3C valid: it means adhering to a set of rules and guidelines, and it generally means cleaner code, which can improve load times.
We tell them they can display a W3C valid button on their site; most of them like that.
It's also a matter of doing things the right way... you can build a frame out of anything, but there is a right way and a wrong way to build a door frame. We choose to do it all according to standards and best practices.
It's almost like a commitment to excellence.
-
Hi David, thank you for your reply.
Would you mind sharing your arguments for why you find it important? I would be curious how many pros you come up with - I like your point of view.
-
It's very important to my company that all websites for our clients validate. Why? Because we feel they pay for a service and we want to provide the highest quality service.
It's like building a house and not sticking to code. We'd rather stick to code and do it the "right" way, rather than just have something that "works".
It's also a sales tool! Because none of our competitors build sites that are compliant, our sales guys use this and it works well. We explain what W3C is, why it's important, and although it doesn't help rankings, we feel it's important because it's simply a matter of doing it the right way. They like that!
-
I don't validate my website... but neither does Google.
-
I don't think it affects rankings, but it can perhaps affect the ability to be crawled. It is also good practice for the users visiting the site. As with most SEOs today, we are not just responsible for getting people to the page, but also for making sure they stay on the site and convert. : )
-
I have one guy in the company who is obsessed with it, so no matter what I do he will go back and ensure we comply! I've seen at least one W3C zealot in each web company I have had the chance to work with.
-
Even though W3C errors will not influence SEO directly, there could be instances where CSS issues impact page speed, resulting in slower spider crawls and thereby some page-speed ranking influence. We do tend to look at these reports once every quarter.
-
Using Google or any of its websites as an SEO example is a mistake in itself.
-
lol - yes, the resemblance is remarkable! That's the name of my boss :-).
It would be interesting if there were two otherwise identical websites with just minor differences causing some validation issues ... whether the one without "faults" would rank better.
I think I even remember Matt Cutts once saying that this is not a ranking factor. Even if you put google.com into the validator, you get several faults.
The "normal" person who looks at the webpage doesn't care which faults are flagged in the background either. So whom should I please with a w3c.org-clean website? I suppose "just" to have a proper webpage....
-
Personally, it is not my first worry.
But running a validation check doesn't cost a lot of time, so I usually do it. If it finds red-marked problems, I solve them, but I don't go crazy over the many less important ones.
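If anyone wants to automate that quick pass, here's a minimal sketch along the same lines. It assumes the public Nu HTML Checker at https://validator.w3.org/nu/ (its doc and out=json parameters) plus the requests library, and it reports only outright errors while skipping warnings - which matches the "fix the red ones, don't sweat the rest" approach. The URL and User-Agent below are placeholders.

```python
# Minimal sketch of a "red problems only" check: ask the Nu HTML Checker to fetch
# a live URL and report errors, skipping warnings/info messages.
# Endpoint and parameters are the public validator.w3.org/nu/ JSON interface;
# the URL and User-Agent are placeholders, not a recommendation.
import requests

def list_errors(url: str) -> list[str]:
    resp = requests.get(
        "https://validator.w3.org/nu/",
        params={"doc": url, "out": "json"},
        headers={"User-Agent": "validation-spot-check/0.1"},
        timeout=30,
    )
    resp.raise_for_status()
    messages = resp.json().get("messages", [])
    # "error" entries are the red-marked problems; "info" entries (warnings) are ignored here.
    return [m.get("message", "") for m in messages if m.get("type") == "error"]

if __name__ == "__main__":
    for err in list_errors("https://www.example.com/"):
        print(err)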
-
Hehehe... this old profile database gives weird results.
-
Hansj, you look remarkably like Petra!
As a former designer wannabe, I would always shoot for validation if possible. But since I've been concentrating more on SEO issues these days, like you, I personally don't think it affects rankings.