Is an Improvement in Page Speed Worth a Compromise on HTML Validation?
-
Our developer has improved page speed, particularly for mobile. However, the price for this improvement has been an HTML validation error that cannot be removed without compromising the page load speed. Is the improvement in speed worth living with the validation error? The concern is paying a high SEO price for this "fatal error". Or perhaps this error is in fact not serious?
-
Fatal Error: Cannot recover after last error. Any further errors will be ignored.
From line 699, column 9; to line 699, column 319
OUR DEVELOPER'S COMMENT: "This was made following Google PageSpeed Insights recommendations. If we remove that, we will lose page load performance."
The domain URL is www.nyc-officespace-leader.com
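For context: this particular fatal error typically fires when markup appears after the closing html tag, which matches the description of the fix further down in this thread. A hypothetical sketch of the kind of structure that triggers it (placeholder file names, not the site's actual code):

```html
<html>
  <head>
    <title>Example page</title>
  </head>
  <body>
    <p>Page content…</p>
  </body>
</html>
<!-- Invalid: the validator gives up here, because nothing may
     follow the closing </html> tag. -->
<link rel="stylesheet" href="/css/styles.css">
```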
-
Yes, the sequence in which assets load also matters once it's time to go granular and find the true opportunities. Up-front evaluation time spent identifying issues can often reveal faster, easier, more template-driven ways to speed everything up on a larger scale with less effort.
That doesn't mean it's okay to ignore other bottlenecks. Just that the more clarity of understanding you have, the more likely real, sustainable success can be achieved.
-
I agree with Alan's points. I have also found WebSiteTest.com really useful. It allows for multiple runs on multiple devices, and you can download the results as CSV. Expanding on Alan's point about looking at bottleneck points: when you use these tools, take the time to understand the waterfall chart, as that is where you can see how the browser interacts with all of these files (HTML, CSS, JS, images, etc.).
I have been doing a ton of reading on front-end optimization recently. Aside from all of the above, you could have issues with the critical rendering path (great resources here and here). Many times folks look at a single asset and say, "This JavaScript file is too big, let's minify it and get faster!" That is a good thing and will help you. That said, you have to look at the render path, as that same smaller JS file may be blocking other downloads that are needed first to render the page. Optimizing the render path can give you some additional gains.
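To illustrate what "blocking" means in practice (a generic sketch; the file names are placeholders, not from this site): a plain script tag in the head halts HTML parsing until the file downloads and executes, while the standard defer and async attributes let parsing continue.

```html
<head>
  <!-- Render-blocking: parsing stops until app.js is fetched and executed. -->
  <script src="/js/app.js"></script>

  <!-- The same script made non-blocking with defer: downloads in parallel,
       runs after parsing finishes, and preserves order relative to other
       deferred scripts. -->
  <script src="/js/app.js" defer></script>

  <!-- Non-blocking, but runs as soon as it arrives (order not guaranteed);
       suited to scripts independent of the DOM, such as analytics. -->
  <script src="/js/analytics.js" async></script>
</head>
```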
Good luck!
-
Kingalan1
I'm not a programmer by trade - the way I begin even considering these things is by running tests on various tool platforms.
For example, put a page you think is slow into URIValet.com and test as Googlebot. The resulting report has a block of information about the total size of files processed, broken down by file type. Look at the CSS/JS lines: if either CSS or JS totals more than 50k to 100k, there is almost certainly inefficiency in there, and likely unnecessary bloat.
Go to WebPageTest.org and do the same: put in the URL you want to check, choose a server location, DSL (which gives a fair mid-range speed evaluation), and Chrome as the browser emulator. The resulting report gives you a lot of information, but the page that may be most helpful in this situation is the "Details" report. If you go there and scroll down, you'll reach the section that lists, line by line, every single file, script, image, and asset processed for that page, along with timing data for each step of the way (such as First Byte Time, DNS lookup, SSL lookup, and more). Those can reveal several individual bottleneck points.
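If you'd rather pull similar numbers straight from the browser, here is a small console sketch using the standard Resource Timing API to total compressed transfer size by resource type (a generic snippet, not tied to any of the tools above):

```js
// Paste into the browser's DevTools console on the page you want to audit.
// Note: cross-origin resources report transferSize as 0 unless the server
// sends a Timing-Allow-Origin header.
const totals = {};
for (const entry of performance.getEntriesByType('resource')) {
  const type = entry.initiatorType || 'other'; // 'script', 'link', 'img', ...
  totals[type] = (totals[type] || 0) + entry.transferSize;
}
for (const [type, bytes] of Object.entries(totals)) {
  console.log(`${type}: ${(bytes / 1024).toFixed(1)} KB`);
}
```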
-
Thanks for your excellent, highly detailed response!!
Is there a way to test the CSS files that my developer has created to see if they are coded in an efficient and concise manner?
We use a virtual private server at InMotion Hosting and Amazon's CDN for images. So I would think the hosting service is adequate. Traffic does not exceed 3,000 unique visitors a month, so the load on the server is minimal.
-
1. Taking shortcuts that are not sound, sustainable methods in order to gain value somewhere else is almost certainly going to become a problem when you least expect it, and this is a great example. Moving CSS and/or JS below their proper location is a recipe for complete page display failure on any number of devices that may or may not currently exist.
Have you tested your pages with Google's Fetch and Render to ensure they load properly, or to see whether they get a "partial" result? If they get a "partial" result, that's a red flag warning that you ignore at your own peril.
2. You haven't provided numbers. Is the page speed improvement a case of going from 20 seconds down to 5 seconds? Or from 8 seconds to 6 seconds? Or what? This matters when evaluating what to care about and expend resources on.
3. If just moving those files to their proper place in the page's head section causes speeds to slow down dramatically, you have bigger problems. The first that comes to mind: why do those scripts / CSS files cause so much slowdown? It's likely they're bloated and need to be reduced in size, or they're housed on a pathetic cloud server that is itself doing you more harm than good.
-
I'm not sure whether it would affect the current page speed, but it would fix the invalid HTML error from the validator. If the validation errors concern you, it might be worth giving it a try and testing the result. It's good to make sure pages clear at least the high-severity validation issues, to be sure of no possible display or rendering problems in different browsers now or in the future.
-
Would correcting the code in this manner, so the HTML validates, result in a slower page load time?
-
That error is coming up from the validator because the links to your stylesheets sit outside the closing body and html tags. Stylesheet links normally go within the head tags at the top of the page, but I understand from what you've said that these were moved to the bottom of the page for speed. However, no tags (HTML, stylesheets, JavaScript, etc.) should appear outside the closing body and html tags.
If you move the CSS stylesheet references and the comments so they sit where the JavaScript files are (just before those closing tags), that would fix the fatal error you are seeing.
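A rough sketch of that corrected structure (file names are placeholders; the site's actual markup isn't shown in this thread):

```html
<html>
  <head>
    <title>Example page</title>
    <!-- Critical above-the-fold CSS can be inlined here so the page still
         paints quickly while the full stylesheet loads lower down. -->
  </head>
  <body>
    <p>Page content…</p>

    <!-- Valid: late-loaded CSS and JS go just before </body>,
         never after </html>. -->
    <link rel="stylesheet" href="/css/styles.css">
    <script src="/js/app.js"></script>
  </body>
</html>
```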
Hope that helps!
-
Thanks so much. I understand most errors are not too important. However, I wonder whether a "fatal" error should not be of greater concern.
Thanks, Alan
-
I am not a developer, so any developer with an SEO background can tell you better, but in general page load speed is important both from the user's point of view and for search engine rankings. As far as W3C validation is concerned, there are quite a few errors you can ignore in order to stick with your page load speed.
Hope this helps!