W3C: My site has 157 errors and 146 warnings. Is this an issue?
-
Is having this number of W3C errors and warnings an issue, and will it impact my site's performance?
When the site was built six months ago my developers told me it "was nothing to worry about", but I have read that any errors aren't good, let alone the huge number my site has.
Your advice please.
Thanks
Ash
-
My website, which is ranking well, has errors too:
Result: 59 Errors, 4 warning(s)
So far the site is still ranking well, so I am not very worried.
But I know my friend engaged an SEO company in India to solve his problem.
"On the other URL checker you gave me the site scored 3 F's & 3 A's, but I am not sure whether that is good or bad."
I use that test mainly to check first-load and reload times; here are your results:
http://www.webpagetest.org/result/140208_3Y_B5M/
This was run over a 1.5 Mbps DSL connection from London, UK. 18 seconds and 10 seconds are definitely on the slower side, and caching could definitely help, as you want to aim for 2-5 second loads.
But roughly looking at the validation errors, I didn't spot anything that would decrease your performance; in your case the slowness is most likely not due to the validation issues. Again, I just did a rough look through, so don't take my word as fact. Maybe others can chime in if they see something out of the ordinary with the errors.
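For a rough sense of where those numbers fall, the 2-5 second target above can be expressed as a tiny helper. The thresholds here are just this thread's rule of thumb, not an official standard:

```python
def rate_load_time(seconds: float) -> str:
    """Bucket a page load time against the rough 2-5 second target."""
    if seconds <= 2:
        return "fast"
    if seconds <= 5:
        return "acceptable"  # within the 2-5 second target zone
    return "slow"

# The two results from the WebPageTest run above:
for t in (18, 10):
    print(f"{t}s -> {rate_load_time(t)}")
```

Both of the measured times land in the "slow" bucket, which is why caching is worth looking at first.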
-
Hi Vadim, here is the URL for the W3C results.
On the other URL checker you gave me, the site scored 3 F's & 3 A's, but I am not sure whether that is good or bad.
I queried these issues with my developer when they launched the site and was told not to worry about it, but as my knowledge has grown I am no longer sure whether I should be worrying or not!
I have posted other questions on Moz recently about why I can see two menus when looking at the Google cache of the site, and why you can only see the site in text form and not the full version.
Plus, using the SEOTools crawler tool, it shows me all the HTML coding as well as body text when I do a keyword search.
Hopefully the W3C results will help you or someone else shed some light on whether I am OK or not.
Thanks
Ash
-
Hi Ash,
Most would say the validator is at times too strict, or slow to keep up with changing technology. A good example: validating facebook.com still yields 45 errors and 4 warnings.
Now, 157 errors may be significant, depending on the errors. But assuming your developers did a good job, those may or may not matter. For example, nytimes.com has 500+ errors. Does that mean their site is slow or broken? Not necessarily. It is interesting that you are asking about this six months after the issue was first brought up, rather than right then and there.
The first check I would run is an actual performance check on your site; here is a good option for you: http://www.webpagetest.org/
You can test various browsers, internet connections, and locations to see if the results are reasonable for your situation. Also, posting your error codes here would help, and one of us can actually tell you whether your errors are a big deal or not!
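If you do want to share the raw error list, the W3C Nu checker can emit its report as JSON (an out=json mode at validator.w3.org/nu), which makes it easy to tally errors versus warnings before posting. A minimal sketch - the tally helper and the sample report are ours, and you would substitute your own URL:

```python
import json
from urllib.parse import quote
from urllib.request import Request, urlopen

def count_messages(report: dict) -> dict:
    """Tally errors vs. warnings in a Nu-checker JSON report.

    Errors arrive with type "error"; warnings arrive with type "info"
    and subType "warning".
    """
    counts = {"errors": 0, "warnings": 0}
    for msg in report.get("messages", []):
        if msg.get("type") == "error":
            counts["errors"] += 1
        elif msg.get("subType") == "warning":
            counts["warnings"] += 1
    return counts

def check(url: str) -> dict:
    """Fetch a live validation report for one URL (network required)."""
    req = Request(
        "https://validator.w3.org/nu/?out=json&doc=" + quote(url, safe=""),
        headers={"User-Agent": "validator-tally/0.1"},  # the checker rejects blank UAs
    )
    with urlopen(req) as resp:
        return count_messages(json.load(resp))

# A sample report in the checker's shape, so the tally can be tried offline:
sample = {"messages": [
    {"type": "error", "message": "Stray end tag."},
    {"type": "info", "subType": "warning", "message": "Consider adding a lang attribute."},
]}
print(count_messages(sample))
```

Running `check("http://www.example.com/")` against your own domain would give you the same error/warning totals the web form shows.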
Hope this helps!
Related Questions
-
How long should an old site redirecting to a new site remain activated on a server?
Once I switch a site to a new domain (with redirects to the corresponding pages), will I have to keep the old site live forever for those links to work, or how long should I wait before I deactivate the old site on our server?
-
Redirects Not Working / Issue with Duplicate Page Titles
Hi all. We are being penalised in Webmaster Tools and Crawl Diagnostics for duplicate page titles, and I'm not sure how to fix it. We recently switched from HTTP to HTTPS, but when we first switched over, we accidentally set a permanent redirect from HTTPS to HTTP for a week or so(!). We now have a permanent redirect going the other way, HTTP to HTTPS, and we also have canonical tags in place pointing to the HTTPS versions. Unfortunately, it seems that because of this short time with the permanent redirect the wrong way round, Google is confused and sees our HTTP and HTTPS sites as duplicate content. Is there any way to get Google to recognise the new (correct) permanent redirect and completely forget the old (incorrect) one? Any ideas welcome!
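As a side note on diagnosing situations like the one described above: the goal state is that every https:// URL stops redirecting to its http:// twin, and the reverse 301 is the only hop. A toy classifier for redirect pairs, purely illustrative (in practice you would run curl -I against your own URLs and feed in the Location headers):

```python
def redirect_direction(src: str, dest: str) -> str:
    """Classify a redirect between the http/https variants of a URL."""
    if src.startswith("https://") and dest.startswith("http://"):
        return "wrong-way"  # the accidental HTTPS -> HTTP redirect described above
    if src.startswith("http://") and dest.startswith("https://"):
        return "correct"
    return "other"

print(redirect_direction("https://example.com/", "http://example.com/"))
print(redirect_direction("http://example.com/", "https://example.com/"))
```

Any pair that comes back "wrong-way" is a leftover of the old misconfiguration and needs fixing at the server level.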
-
Best way to indicate multiple lang/locales for a site in the sitemap
So here is a question that may be obvious, but I'm wondering if there is some nuance here that I may be missing. Question: consider an ecommerce site that has multiple sites around the world which are all variations of the same thing, just in different languages. Now let's say some of these exist on a normal .com domain while others exist on different ccTLDs. When you build out the XML sitemap for these sites, especially the ones on the other ccTLDs, we want to ensure that using

<loc>http://www.example.co.uk/en_GB/</loc>
<xhtml:link rel="alternate" hreflang="en-AU" href="http://www.example.com.AU/en_AU/" />
<xhtml:link rel="alternate" hreflang="en-NZ" href="http://www.example.co.NZ/en_NZ/" />

is the correct way of doing this. I know I have to change this for each different ccTLD, but it just looks weird when you start putting 10-15 different language/locale variations as alternate links. I guess I am just looking for a bit of reaffirmation that I am doing this right. Thanks!
-
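Hand-writing those alternate links for 10-15 locales gets unwieldy and error-prone; one way to keep the entries consistent is to generate each <url> block from a single locale-to-URL map. A sketch using the made-up example URLs from the question (note that each locale's entry should list the full set of alternates, itself included):

```python
locales = {
    "en-GB": "http://www.example.co.uk/en_GB/",
    "en-AU": "http://www.example.com.AU/en_AU/",
    "en-NZ": "http://www.example.co.NZ/en_NZ/",
}

def url_entry(loc: str, alternates: dict) -> str:
    """Build one sitemap <url> entry with hreflang alternate links."""
    lines = ["<url>", f"  <loc>{loc}</loc>"]
    for hreflang, href in alternates.items():
        lines.append(
            f'  <xhtml:link rel="alternate" hreflang="{hreflang}" href="{href}" />'
        )
    lines.append("</url>")
    return "\n".join(lines)

# One entry per locale; all entries share the same alternate set.
sitemap_body = "\n".join(url_entry(url, locales) for url in locales.values())
print(sitemap_body)
```

The same map drives every ccTLD's sitemap, so adding a locale means adding one dictionary entry rather than editing a dozen files by hand.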
Site Doing Horribly After Redesign
Hello Fellow Forum Members: Thank you all for taking the time to read this. This is a follow-up to one of my previous questions, but I now have more information. I will try to be as concise as possible, and want to sincerely thank anybody who invests time in answering this.

Around February 9, 2013, we launched our new site on the Bigcommerce platform, moving from Volusion after 6 years. We had paid the Bigcommerce partner for an upgraded 301 redirect package, as I was thoroughly concerned about losing rankings. By the end of February our rankings were diminishing. We expected a slight dip due to the new site, but as of May our organic traffic had dropped by 82%. Google WMT is showing 1,500+ 404 errors. Many have to do with review-page-type URLs, and some apparently were never redirected at all.

In May, we hired a wonderful SEO company that is a heavy contributor to the Moz community. They have been generous and wonderful to work with. By the end of this last week it was determined that most of the coding suggestions our SEO was making could NOT be implemented in Bigcommerce, because Bigcommerce will not allow our developer access to the PHP files, thus hindering the execution of these suggestions. Some of these were: move the blog to the root, use a canonical on the home page, use canonicals for pagination, stop the indexing of https URLs, and a few more.

Today, June 25, we are at a complete loss and trying to just keep our business alive. The opinion of both the SEO and the developer is that my choice of Bigcommerce as a platform was not the best. So my main question is: what are the odds our rankings have decreased due to missing 301 redirects during our migration to Bigcommerce, versus the rankings decreasing due to Bigcommerce being a bad choice of platform? We are being advised to redevelop our entire site on an open-source platform such as Wordpress or Magento, but if that's not needed I certainly don't want to have to do that.
I hope I have provided a decent amount of history and information. Thank you for any help or advice you are willing to offer.
-
Site health - webmaster tools
A bit of an odd one: in Webmaster Tools there's the option to order sites by site health. When we do this, our site - http://www.neooptic.com/ - is near the bottom, despite there being few or no crawl errors. Any ideas why this could be happening?
-
Wordpress Blog Providing SEO to Main Site
Hi, I recently started a very much "learn on the job" SEO position, transitioning from a copywriting background. We currently have a Wordpress blog up and running (and producing some decent-quality content too, I hope!) at example.com/blog/, and a sign-up page located at example.com (sorry, can't put the address right now) for a site that is being custom built, as it's got some nifty software linking to back-end systems. My question is whether the content on the blog will bring SEO benefits to the main domain, or whether it'll just benefit the blog itself? If the latter, should we move the blog onto a separate page of the main site? Thanks so much! I'm learning as much as I can as quickly as I can, but some things still get me in a bit of a tizzy.
-
Time On Site and SEO?
Does time on site impact rankings? If a person visits your site from the SERPs, or visits directly by typing your name into the search field, and then leaves within a minute, will that hurt your rankings? What is the best way to increase time on site?
-
How is an SEO's time best used?
We have over 50 highly varied, niche sites in our company. Each website is for an annual event, spread across the calendar. I am the solo SEO person here and was wondering what, in your opinion, would bring the greatest SEO benefit in my limited daily allotment: link building? Keywords? Content? Oh, and to make my life even easier - it's all based on SharePoint 2007!