W3C: My site has 157 errors and 146 warnings. Is it an issue?
-
Is having this number of W3C errors and warnings an issue, and will it impact my site's performance?
When the site was built six months ago, my developers told me it "was nothing to worry about", but I have since read that any errors aren't good, let alone the huge number my site has.
Your advice, please.
Thanks
Ash
-
My website, which is ranking well, has errors too:
Result: 59 Errors, 4 warning(s)
So far the site is still ranking well, so I am not very worried.
But I know a friend who engaged an SEO company in India to solve this problem.
-
"On the other URL checker you gave me the site scored 3 F's & 3 A's, but I am not sure whether that is good or bad."
I use that test mainly to check first-load and reload times; here are your results:
http://www.webpagetest.org/result/140208_3Y_B5M/
This was run on a 1.5 Mbps DSL connection from London, UK. 18 seconds and 10 seconds are definitely on the slower side, and caching could definitely help, as you want to aim for 2-5 second loads.
But from a rough look through the validation errors, I didn't spot anything that would decrease your performance; in your case the slowness is most likely not due to the validation issues. Again, I only did a rough look through, so don't take my word as fact. Maybe others can chime in if they see something out of the ordinary in the errors.
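As an aside, one quick way to see whether caching is the issue on repeat views is to look at the page's HTTP response headers. Here is a minimal sketch of that check; the header values below are made up for illustration, not taken from the site in question:

```python
# Minimal sketch: classify a page's HTTP response headers by whether
# they allow repeat-view (browser) caching. Header values are illustrative.
def caching_summary(headers):
    """Return a short verdict from a dict of HTTP response headers."""
    cache_control = headers.get("Cache-Control", "")
    if "no-store" in cache_control or "no-cache" in cache_control:
        return "caching disabled"
    if "max-age" in cache_control or "Expires" in headers:
        return "caching enabled"
    return "no caching headers set"

# The second case is what WebPageTest typically flags on slow repeat views.
print(caching_summary({"Cache-Control": "max-age=86400"}))  # caching enabled
print(caching_summary({"Content-Type": "text/html"}))       # no caching headers set
```

If most of your static assets come back with no caching headers set, that alone can explain a slow 10-second repeat view.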
-
Hi Vadim, here is the URL for the W3C results.
On the other URL checker you gave me, the site scored 3 F's and 3 A's, but I am not sure whether that is good or bad.
I queried these issues with my developer when they launched the site but was told not to worry about it; as my knowledge has grown, I am no longer sure whether I should be worrying or not!
I have posted other questions on Moz recently about why I can see two menus when looking at the Google cache of the site, and why you can only see the site in text and not the full version.
Plus, using the SEOTools crawler tool, it shows me all the HTML coding as well as body text when I do a keyword search.
Hopefully the W3C results will help you or someone shed some light on whether I am OK or not?
Thanks
Ash
-
Hi Ash,
Most would say the validator is at times too strict, or in some ways slower to adapt to changing technology. A good example: validating facebook.com still yields 45 errors and 4 warnings.
Now, 157 errors may be significant, depending on the errors. But assuming your developers did a good job, those may or may not matter. For example, nytimes.com has 500+ errors. Does that mean their site is slow or broken? Not necessarily. It is interesting that you are asking about this six months after the issue was first brought up, and not right then and there.
The first check I would run is an actual performance check on your site; here is a good option for you: http://www.webpagetest.org/
You can test various browsers, internet connections, and locations to see whether the results are reasonable for your situation. Also, posting your error codes here would help, and one of us can tell you whether your errors are a big deal or not!
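If it helps with that triage, the W3C Nu validator can also return its results as JSON (https://validator.w3.org/nu/ with out=json), which makes it easy to tally how many of your issues are real errors versus warnings. A minimal sketch, using a made-up sample payload rather than real output for any site:

```python
# Minimal sketch: count errors vs. warnings in a W3C Nu validator JSON
# response. In that format, errors have type "error" and warnings are
# "info" messages with subType "warning". The sample payload is made up.
def tally(messages):
    """Split validator messages into (error_count, warning_count)."""
    errors = sum(1 for m in messages if m.get("type") == "error")
    warnings = sum(1 for m in messages if m.get("subType") == "warning")
    return errors, warnings

sample = [
    {"type": "error", "message": "Element div not allowed as child of span."},
    {"type": "info", "subType": "warning", "message": "Section lacks heading."},
]
print(tally(sample))  # (1, 1)
```

Sorting the counts this way makes it easier to see whether you are looking at 157 real problems or mostly cosmetic warnings.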
Hope this helps!
Related Questions
-
Incorporating Spanish Page/Site
We bought an exact-match domain (in Spanish) to incorporate with our regular website for a particular keyword. This is our first attempt at this, and while we do have Spanish-speaking staff who will translate/create a nice, quality page, we're not going to redo every page in Spanish. Any advice on how to implement this? Do I need to create a whole other website in Spanish? Will that be duplicate content if I do? Can I just set it up to show the first page in Spanish, but if they click on anything else it redirects to our site? I'm pretty clueless on this, so if anything I've suggested is off-the-wall or a violation, I'm really just spit-balling, trying to figure out how to implement this. Thanks, Ruben
Web Design | KempRugeLawGroup
-
Curious why a site isn't ranking; it seems to be penalized for duplicate content, but there are no issues in Google Webmaster Tools...
So we have a site, ThePowerBoard.com, and it has some pretty impressive links pointing back to it. It is obviously optimized for the keyword "Powerboard", but it is not even in the top 10 pages of Google rankings. If you site:thepowerboard.com the site, and/or Google just the URL thepowerboard.com, you will see that it appears in the search results. However, if you quote-search just the title of the home page, you will see oddly that the domain doesn't show up; rather, at the bottom of the results you will see where Google places "In order to show you the most relevant results, we have omitted some entries very similar to the 7 already displayed". If you click on the link below that, then the site shows up toward the bottom of those results. Is this a case of duplicate content? Also, the developer that built the site said the following: "The domain name is www.thepowerboard.com and it is on a shared server in a folder named thehoverboard.com. This has caused issues trying to SSH into the server, which forces us to SSH into it via its IP address rather than by domain name. So I think it may also be causing your search bot indexing problem. Again, I am only speculating at this point. The folder name difference is the only thing different between this site and any other site that we have set up." (Would this be the culprit? Looking for some expert advice, as it makes no sense to us why this domain isn't ranking.)
Web Design | izepper
-
Is it Bad to Break Up A Site into Multiple Sites?
I have a big, cluttered website with endless pages. It's a non-profit that has content for patients, researchers, therapists, etc. Would it be a bad idea to turn this cluttered site into three or more completely different sites, each focused on its specific demographic? Or should I just figure out how to organize the one site better? Thanks for your help!
Web Design | bosleypalmer
-
URL Re-Mapping Question: Do I need the theme of my business in my URL structure even though GWT knows what my site is about?
Hi All, I am currently planning to do some URL remapping on my hire website, as a lot of my most important pages are far too many levels deep from the root domain. This is also making my sitemap untidy. In GWT, Google knows that the theme of my website is hire, as it's the top word. Therefore, do I still need to use the word "hire" in all my new URL categories/structures or not? Examples: http://goo.gl/BFmvk2 I was thinking of remapping to www.xxxxxxx.co.uk/tool-hire-birmingham, and http://goo.gl/pC9Bdp I was thinking of remapping to www.xxxxxx.co.uk/cleaning-equipment. Notice in the latter example, I do not have the word hire in the URL. Any advice is much appreciated, thanks, Peter
Web Design | PeteC12
-
Website Drops Some Traffic after Redesign. What's Happening?
What it is NOT: No link was broken; I have used Moz, Screaming Frog, Excel, etc., and there are no broken links. We have not added spammy links. We kept the same number of links and amount of content on the homepage, with an exception of one or two. All the pages remained canonical. Our blog uses rel=prev/rel=next, and each page is canonicalized to itself. We do not index duplicated content; our tags are content="noindex,follow". We are using the Genesis Framework (we were not before). Load time is quicker; we now have a dedicated server. Webmaster Tools has not reported any crawl problems. What we did that should have improved our rankings and traffic: implemented schema.org and a responsive design. Our bounce rate is down, and average visit length is up. Any ideas?
Web Design | Thriveworks-Counseling
-
URL parameters causing duplicate content errors
My ISP implemented product reviews. In doing so, each page has a possible parameter string of ?wr=1. I am now receiving duplicate page content and duplicate page title errors for all my product URLs. The report shows the base URL and the base URL?wr=1. My ISP says that the search engines won't have a problem with the parameters, and a check of Google Webmaster Tools for my site says I don't have any errors and recommends against configuring URL parameters. How can I get SEOmoz to stop reporting these errors?
Web Design | NiftySon
-
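For parameter duplicates like the ?wr=1 case above, one common fix (assuming the parameter-free URL is the version you want indexed; the domain and path below are placeholders) is a rel=canonical tag served on both URL variants:

```html
<!-- Served in the <head> of both /product.html and /product.html?wr=1,
     this points crawlers at the parameter-free URL as the preferred one. -->
<link rel="canonical" href="http://www.example.com/product.html" />
```

With the canonical in place, crawlers (including Moz's) should consolidate the two URLs, and the duplicate-content warnings should drop off over subsequent crawls.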
Ajax pagination and filters for ecommerce site
Hi There, Is it OK to use Ajax for product filters and pagination? In this case the URL doesn't change when you navigate to the 2nd or 3rd page, or when you filter by colours, etc. If not, what's your advice?
Web Design | Jvalops
-
Landing Page/Home Page issues
Hi. I was speaking with my designer last night (we are setting up a new website) and we were discussing the design of our homepage. The designer said he wanted the first page of the website to be a sort of landing page where the visitor has to click to enter; I'm sure everyone has come across these before. However, I am concerned about the SEO implications of this. Any help, guys?
Web Design | CompleteOffice