W3C: My site has 157 errors and 146 warnings. Is that an issue?
-
Is having this number of W3C errors and warnings an issue, and will it impact my site's performance?
When the site was built six months ago my developers told me it "was nothing to worry about", but I have since read that any errors aren't good, let alone the huge number my site has.
Your advice, please.
Thanks
Ash
-
My website, which is ranking well, has errors too:
Result: 59 errors, 4 warnings
So far the site is still ranking well, so I am not very worried.
But I know a friend who engaged an SEO company in India to solve a similar problem.
-
"On the other URL checker you gave me, the site scored 3 F's & 3 A's, but I am not sure whether that is good or bad."
I use that test mainly to check first-load and repeat-load times. Here are your results:
http://www.webpagetest.org/result/140208_3Y_B5M/
This was run on a 1.5 Mbps DSL connection from London, UK. 18 seconds and 10 seconds are definitely on the slower side, and caching could definitely help; you want to aim for 2-5 second loads.
Looking roughly through the validation errors, I didn't spot anything that would hurt your performance; in your case the slowness is most likely not due to the validation issues. Again, I only did a quick pass, so don't take my word as fact. Maybe others can chime in if they see something out of the ordinary in the errors.
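On the caching point above: the repeat-load time only improves if the server actually tells browsers they may reuse assets. As a rough, hypothetical sketch (not something from this thread's tooling), here is one way to summarize a page's caching headers, assuming standard HTTP `Cache-Control` and `Expires` semantics:

```python
def caching_summary(headers):
    """Give a rough verdict on whether a response tells browsers to cache.

    `headers` is a dict of HTTP response headers (e.g. from
    urllib.request.urlopen(url).headers). This only checks the common
    Cache-Control directives and the legacy Expires header.
    """
    cache_control = headers.get("Cache-Control", "").lower()
    if "no-store" in cache_control or "no-cache" in cache_control:
        return "caching disabled"
    if "max-age" in cache_control or "Expires" in headers:
        return "caching enabled"
    return "no caching headers set"


# Example verdicts for typical header combinations:
print(caching_summary({"Cache-Control": "max-age=3600"}))  # caching enabled
print(caching_summary({"Cache-Control": "no-store"}))      # caching disabled
print(caching_summary({}))                                 # no caching headers set
```

If static assets (images, CSS, JS) come back with "no caching headers set", that alone can explain a slow repeat-load time like the 10 seconds above.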
-
Hi Vadim, here is the URL for the W3C results.
On the other URL checker you gave me, the site scored 3 F's & 3 A's, but I am not sure whether that is good or bad.
I queried these issues with my developer when they launched the site and was told not to worry, but as my knowledge has grown I am no longer sure whether I should be worrying about it or not.
I have posted other questions on Moz recently about why I can see two menus when looking at Google's cache of the site, and why you can only see the site as text rather than the full version.
Also, when I do a keyword search with the SEO Tools crawler, it shows me all the HTML code as well as the body text.
Hopefully the W3C results will help you or someone else shed some light on whether I am OK or not.
Thanks
Ash
-
Hi Ash,
Most would say the validator is at times too strict, or slow to keep up with changing technology. A good example: validating facebook.com still yields 45 errors and 4 warnings.
Now, 157 errors may be significant, depending on what the errors are. But assuming your developers did a good job, they may or may not matter. For example, nytimes.com has 500+ errors; does that mean their site is slow or broken? Not necessarily. It is interesting that you are asking about this six months after the issue was first raised rather than right away.
The first check I would run is an actual performance test on your site. Here is a good option: http://www.webpagetest.org/
You can test various browsers, connection speeds, and locations to see whether the results are reasonable for your situation. Also, posting your error codes here would help; then one of us can tell you whether your errors are a big deal or not!
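If you do want to triage or share the raw errors, the W3C Nu HTML Checker (https://validator.w3.org/nu/) can return its results as JSON. A minimal sketch of tallying that output follows; the field names (`type`, `subType`) assume the checker's documented JSON message format, so double-check against an actual response:

```python
def summarize_validator_messages(messages):
    """Count errors and warnings in W3C Nu HTML Checker JSON output.

    `messages` is the "messages" list from the checker's JSON response.
    Errors have type == "error"; warnings are reported as type == "info"
    with subType == "warning".
    """
    errors = sum(1 for m in messages if m.get("type") == "error")
    warnings = sum(
        1
        for m in messages
        if m.get("type") == "info" and m.get("subType") == "warning"
    )
    return {"errors": errors, "warnings": warnings}


# A toy messages list standing in for a real checker response:
sample = [
    {"type": "error", "message": "Stray end tag."},
    {"type": "info", "subType": "warning", "message": "Obsolete attribute."},
    {"type": "info", "message": "Informational note."},
]
print(summarize_validator_messages(sample))  # {'errors': 1, 'warnings': 1}
```

Grouping the error messages this way (or by message text) usually shows that a scary-looking total like 157 boils down to a handful of repeated template issues.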
Hope this helps!