Is an Improvement in Page Speed Worth a Compromise on HTML Validation?
-
Our developer has improved page speed, particularly for mobile. However, the price for this improvement has been an HTML validation error that cannot be removed without compromising page load speed. Is the improvement in speed worth living with the validation error? The concern is paying a high SEO price for this "fatal error". Or perhaps this error is in fact not serious?
-
Fatal Error: Cannot recover after last error. Any further errors will be ignored.
From line 699, column 9; to line 699, column 319
OUR DEVELOPER'S COMMENT: "This was made following Google PageSpeed Insights recommendations. If we remove that, we will lose page load performance."
The domain URL is www.nyc-officespace-leader.com
-
Yes, load sequence is also important when it's time to go granular and find the true opportunities. Up-front evaluation time spent identifying issues can often lead to faster, easier, more template-driven ways to speed up everything on a larger scale with less effort.
That doesn't mean it's okay to ignore other bottlenecks. Just that the more clarity of understanding you have, the more likely real, sustainable success can be achieved.
-
I agree with Alan's points. I have also found WebSiteTest.com really useful. It allows for multiple runs on multiple devices, and you can download the results as CSV. Expanding on Alan's point about looking at bottleneck points: when you use these tools, you need to take time to understand the waterfall chart, as that is where you can see how the browser interacts with all of these files (HTML, CSS, JS, images, etc.).
I have been doing a ton of reading on front-end optimization recently. Aside from all of the above, you could have issues with the critical rendering path (great resources here and here). Many times folks look at a single asset and say, "This JavaScript file is too big, let's minify it and get faster!" That is a good thing and will help you. That said, you have to look at the render path, because that same smaller JS file may be blocking other downloads that are needed first to render the page. Optimizing the render path can give you some additional gains.
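To make the render-blocking point concrete, here is a minimal sketch (the file name is invented for illustration, not taken from your site). The same script hurts render time far less when it is allowed to download in parallel and execute after parsing:

```html
<!-- Render-blocking: the HTML parser stops here until the script downloads and executes -->
<head>
  <script src="/js/widgets.js"></script>
</head>

<!-- Non-blocking: "defer" downloads the file in parallel and runs it after the document is parsed -->
<head>
  <script src="/js/widgets.js" defer></script>
</head>
```

(`async` is a similar option when the script does not depend on other scripts or on document order.)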
Good luck!
-
Kingalan1
I'm not a programmer by trade - the way I begin even considering these things is by running tests on various tool platforms.
For example, put a page you think is slow into URIValet.com - test as Googlebot. The resulting report has a block of information regarding the total size of files processed, broken down by file type. Look at the CSS/JS lines - if they total more than 50k to 100k for either CSS or JS, there is almost certainly inefficiency in there, and likely unnecessary bloat.
Go to WebPageTest.org and do the same - put in the URL you want to check, choose a server location, DSL (which gives a fair mid-range speed evaluation), and Chrome as the browser emulator. The resulting report gives you a lot of information; however, the one page in that report that may be most helpful in this situation is the "Details" report. If you go there and scroll down, you'll get to the section that lists, line by line, every single file, script, image and asset processed for that page, along with timing data for each step of the way (such as First Byte Time, DNS lookup, SSL negotiation, and more). Those can reveal several individual bottleneck points.
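If you want a quick, rough way to check those CSS/JS totals yourself without a third-party tool, a sketch like the one below can be pasted into the browser's DevTools console on the page in question. It simply sums the transferred bytes of stylesheet and script files; note that transferSize can report 0 for cross-origin files served without a Timing-Allow-Origin header (common with CDNs), so treat the result as an estimate rather than a definitive measurement.

```js
// Rough estimate of total CSS and JS weight for the current page.
// Run in the browser DevTools console after the page has finished loading.
const totals = { css: 0, js: 0 };
for (const entry of performance.getEntriesByType('resource')) {
  const url = entry.name.split('?')[0]; // strip query strings before checking the extension
  if (url.endsWith('.css')) totals.css += entry.transferSize || 0;
  else if (url.endsWith('.js')) totals.js += entry.transferSize || 0;
}
console.log('CSS: ' + (totals.css / 1024).toFixed(1) + ' KB, JS: ' + (totals.js / 1024).toFixed(1) + ' KB');
```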
-
Thanks for your excellent, highly detailed response!!
Is there a way to test the CSS files that my developer has created to see if they are coded in an efficient and concise manner?
We use a virtual private server at Inmotion Hosting and Amazon CDN for images. So I would think that the hosting service is adequate. Traffic does not exceed 3,000 unique visitors a month, so the load on the server is minimal.
-
1. Taking shortcuts that are not sound, sustainable methods in order to gain value somewhere else is almost certainly going to become a problem when you least expect it at some future date, and this is a great example. Moving CSS and/or JS below their proper location is a recipe for complete page display failure on any number of devices that may or may not currently exist.
Have you tested your pages with Google's Fetch and Render to ensure they load properly, or to see whether they get a "partial" result? If they get a "partial" result, that's a red flag warning that you ignore at your own peril.
2. You haven't provided numbers - is the page speed improvement a case of going from 20 seconds down to 5 seconds? Or is it going from 8 seconds to 6 seconds? Or something else? This matters when evaluating what to care about and expend resources on.
3. If just moving those files to their proper place in the page head section causes speeds to slow down dramatically, you have bigger problems. The first one that comes to mind: why do those scripts/CSS files cause so much slowdown? It's likely they're bloated and need to be reduced in size, or they're housed on a pathetic cloud server that is itself doing you more harm than good.
-
I'm not sure if it would affect the current page speed, but it would fix the invalid HTML error from the validator. If the validation errors concern you, it might be worth giving it a try and testing the result. It's good to make sure that pages clear at least the high-priority validation issues, to be sure there are no possible display or rendering issues in different browsers now or in the future.
-
Would correcting the code in this manner, so the HTML validates, result in a slower page load time?
-
That error is coming up from the validator because the links to your stylesheets are outside the closing body and html tags. Stylesheet links normally go within the head tags at the top, but I understand from what you've said that, for page speed, these have been moved to the bottom of the page. However, no tags - HTML, stylesheets, JavaScript, etc. - should sit outside the closing body and html tags.
If you move the CSS stylesheet references and the comments so they are where the JavaScript files are (before the closing tags), that would fix the fatal error you are seeing.
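As a rough sketch of what that fix looks like (the file names here are invented for illustration, not copied from your site), the difference is simply whether the late-loaded references sit inside the closing tags or after them:

```html
<!-- Invalid: nothing may appear after the closing html tag -->
  </body>
</html>
<link rel="stylesheet" href="/css/styles.css">

<!-- Valid: keep the late-loaded references just before the body closes -->
    <script src="/js/scripts.js"></script>
    <link rel="stylesheet" href="/css/styles.css">
  </body>
</html>
```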
Hope that helps!
-
Thanks so much. I understand most errors are not too important. However, I wonder whether a "fatal" error shouldn't be of greater concern.
Thanks, Alan
-
I am not a developer, so any developer with an SEO background can tell you better, but in general page load speed is important both from the user's point of view and for search engine rankings. As far as W3C validation is concerned, there are quite a few errors that you can ignore in order to stick with your page load speed.
Hope this helps!