Is an Improvement in Page Speed Worth a Compromise on HTML Validation?
-
Our developer has improved page speed, particularly for mobile. However, the price of this improvement has been an HTML validation error that cannot be removed without compromising page load speed. Is the improvement in speed worth living with the validation error? The concern is paying a high SEO price for this "fatal error". Or perhaps this error is in fact not serious?
-
Fatal Error: Cannot recover after last error. Any further errors will be ignored.
From line 699, column 9; to line 699, column 319
OUR DEVELOPER'S COMMENT: "This was made following Google PageSpeed Insights recommendations. If we remove that, we will lose page load performance."
The domain URL is www.nyc-officespace-leader.com
-
-
Yes, the load sequence also matters once it's time to go granular and find the true opportunities. The up-front evaluation time spent identifying issues can often lead to faster, easier, more template-driven ways to speed everything up on a larger scale with less effort.
That doesn't mean it's okay to ignore other bottlenecks. Just that the clearer the understanding, the more likely real, sustainable success can be achieved.
-
I agree with Alan's points. I have also found WebSiteTest.com really useful: it allows for multiple runs on multiple devices, and you can download the results as CSV. Expanding on Alan's point about looking at bottleneck points: when you use these tools, take time to understand the waterfall chart, because that is where you can see how the browser interacts with all of these files (HTML, CSS, JS, images, etc.).
I have been doing a ton of reading on front-end optimization recently. Aside from all of the above, you could have issues with the critical rendering path (great resources here and here). Many times folks look at a single asset and say, "This JavaScript file is too big, let's minify it and get faster!" That is a good thing and will help you. That said, you have to look at the render path, because that same smaller JS file may be blocking other downloads that need to finish first for the page to render faster. Optimizing the render path can give you some additional gains.
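For example, here is a minimal sketch of a render-blocking script versus the non-blocking alternatives (the file names are placeholders, not the actual assets on the site in question):

```html
<!-- Sketch only - file names are placeholders for whatever scripts the page loads. -->

<!-- Before: a plain script tag in the head blocks HTML parsing until the
     file has been downloaded and executed. -->
<script src="/js/app.js"></script>

<!-- After: "defer" downloads in parallel and runs once parsing is finished,
     preserving script order. -->
<script src="/js/app.js" defer></script>

<!-- "async" downloads in parallel and runs as soon as it arrives, with no
     order guarantee - fine for independent scripts such as analytics. -->
<script src="/js/analytics.js" async></script>
```

Combined with the waterfall view, that usually makes it obvious which files are actually holding back the first render.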
Good luck!
-
Kingalan1
I'm not a programmer by trade - the way I even begin to consider these things is by running tests on various tool platforms.
For example, put a page you think is slow into URIValet.com and test as Googlebot. The resulting report includes a block of information on the total size of files processed, broken down by file type. Look at the CSS/JS lines - if either CSS or JS totals more than 50k to 100k, there is almost certainly inefficiency in there, and likely unnecessary bloat.
Go to WebPageTest.org and do the same - put in the URL you want to check, choose a server location, DSL (which gives a fair mid-range speed evaluation), and Chrome as the browser emulator. The resulting report gives you a lot of information; the page that may be most helpful in this situation is the "Details" report. If you go there and scroll down, you'll reach the section that lists, line by line, every single file, script, image and asset processed for that page, along with data on the speed of each processing step (such as First Byte Time, DNS lookup, SSL lookup, and more). Those can reveal several individual bottleneck points.
-
Thanks for your excellent, highly detailed response!!
Is there a way to test the CSS files that my developer has created to see if they are coded in an efficient and concise manner?
We use a virtual private server at InMotion Hosting and Amazon's CDN for images, so I would think the hosting service is adequate. Traffic does not exceed 3,000 unique visitors a month, so the load on the server is minimal.
-
1. Taking shortcuts that are not sound, sustainable methods in order to gain value somewhere else is almost certainly going to become a problem when you least expect it at some future date, and this is a great example. Moving CSS and/or JS below their proper location is a recipe for complete page display failure on any number of devices that may or may not currently exist.
Have you tested your pages with Google's Fetch and Render to ensure they load properly, or to see whether they get a "partial" result? If they get a "partial" result, that's a red flag warning that you ignore at your own peril.
2. You haven't provided numbers - is the page speed improvement a case of going from 20 seconds down to 5 seconds? Or from 8 seconds to 6 seconds? Or what? This matters when evaluating what to care about and where to expend resources.
3. If just moving those files to their proper place in the page's head section causes speeds to slow down dramatically, you have bigger problems. The first one that comes to mind is: why do those scripts / CSS files cause so much slowdown? It's likely they're bloated and need to be reduced in size, or they're housed on a pathetic cloud server that is itself doing you more harm than good. One alternative approach is sketched below.
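For what it's worth, one common way to keep the speed gain without pushing markup outside the document is to inline a small amount of critical CSS and load the full stylesheet asynchronously. This is only a sketch under that assumption - the path and the inlined rules are placeholders, not the site's actual markup:

```html
<!-- Hypothetical sketch, not the site's actual markup. -->
<head>
  <style>
    /* Inline only the minimal "critical" CSS needed to paint the
       above-the-fold content. */
    body { margin: 0; font-family: sans-serif; }
  </style>

  <!-- Load the full stylesheet without blocking the first render; once it
       finishes downloading, the onload handler switches it to a live stylesheet. -->
  <link rel="preload" href="/css/main.css" as="style"
        onload="this.onload=null; this.rel='stylesheet'">

  <!-- Fallback for browsers with JavaScript disabled. -->
  <noscript><link rel="stylesheet" href="/css/main.css"></noscript>
</head>
```

Whether this preserves the PageSpeed Insights gain depends on how much CSS is genuinely needed above the fold, so it is worth re-testing after any change.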
-
I'm not sure whether it would affect the current page speed, but it would fix the invalid HTML error from the validator. If the validation errors concern you, it might be worth giving it a try and testing the result. It's good to make sure pages clear at least the high-priority validation issues, so you can be confident there are no display or rendering problems in different browsers now or in the future.
-
Would correcting the code in this manner, so the HTML validates, result in a slower page load time?
-
That error is coming up from the validator because the links to your stylesheets are outside the closing </body> and </html> tags. The stylesheet links normally go within the <head> tags at the top, but I understand from what you've said that, for page speed, these have been moved to the bottom of the page. However, no tags / HTML / stylesheets / JavaScript etc. should sit outside the closing </body> and </html> tags.
If you move the CSS stylesheet references and the comments so they are where the JavaScript files are (just before the closing </body> tag), that would fix the fatal error you are seeing.
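In other words, the structure would change roughly like this (an illustrative sketch only - the file names are placeholders, not the page's actual assets):

```html
<!-- Invalid: markup placed after the closing tags is what triggers the
     validator's fatal error. -->
  <script src="/js/app.js"></script>
</body>
</html>
<link rel="stylesheet" href="/css/main.css">

<!-- Suggested fix: keep the stylesheet links grouped with the scripts, just
     before the closing body tag, so nothing sits outside </body> and </html>. -->
  <link rel="stylesheet" href="/css/main.css">
  <script src="/js/app.js"></script>
</body>
</html>
```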
Hope that helps!
-
Thanks so much. I understand most errors are not too important. However, I wonder whether a "fatal" error should not be of greater concern.
Thanks, Alan
-
I am not a developer, so any developer with an SEO background can tell you better, but in general page load speed is important both from the user's point of view and for search engine rankings. As far as W3C validation is concerned, there are quite a few errors you can safely ignore in order to stick with your page load speed.
Hope this helps!