Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Can you use multiple videos without sacrificing load times?
-
We're using a lot of videos on our new website (www.4com.co.uk), but our immediate discovery has been that this has a negative impact on load times. We use a third party (Vidyard) to host our videos but we also tried YouTube and didn't see any difference.
I was wondering if there's a way of using multiple videos without seeing this load speed issue or whether we just need to go with a different approach.
Thanks all, appreciate any guidance!
Matt
-
Thank you very much for that, my guys are having a look into Wistia, and also into if/how we can defer videos using either Vidyard or YouTube.
Thanks again,
Matt
-
I use Wistia as well and recommend them, though I do not recommend using their plug-in.
You can defer loading of the videos so that the site loads very quickly and page speed is almost not affected at all.
- https://varvy.com/pagespeed/defer-videos.html
- https://varvy.com/pagespeed/defer-many-javascripts.html
- Use this tool to check your JavaScript: https://varvy.com/tools/js/
- Overall page speed test: https://varvy.com/pagespeed/
- Best practices: https://kinsta.com/learn/page-speed/
- https://varvy.com/pagespeed/defer-loading-javascript.html
- https://varvy.com/pagespeed/critical-render-path.html
How to defer videos
To do this we need to mark up our embed code and add a small, extremely simple piece of JavaScript. I will show the method I actually used for this page.
The html
<iframe width="560" height="315" src="" data-src="//www.youtube.com/embed/OMOVFvcNfvE" frameborder="0" allowfullscreen=""></iframe>
In the above code I took the embed code from YouTube and made two small changes. First, I emptied the "src" attribute by removing the URL from it, as below.
src=""
Second, I added the URL I cut from "src" to a "data-src" attribute.
data-src="//www.youtube.com/embed/OMOVFvcNfvE"
The javascript
Script to call external javascript file
This code should be placed in your HTML just before the closing </body> tag (near the bottom of your HTML file), where "defer.js" is the name of the external JS file.
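A minimal sketch of the kind of script that does the swap (the function name here is just illustrative, not part of the original embed code) looks like this: once the page has finished loading, it copies each iframe's "data-src" into "src", so the videos only start downloading after everything else has rendered. You can put it inline or in defer.js:

```javascript
// Copy each iframe's data-src into src. Kept as a plain helper
// so the swap logic does not depend on the browser environment.
function activateDeferredFrames(frames) {
  for (var i = 0; i < frames.length; i++) {
    var dataSrc = frames[i].getAttribute('data-src');
    if (dataSrc) {
      frames[i].setAttribute('src', dataSrc);
    }
  }
}

// In a browser, run the swap once the whole page (images, CSS,
// other scripts) has finished loading.
if (typeof window !== 'undefined') {
  window.addEventListener('load', function () {
    activateDeferredFrames(document.getElementsByTagName('iframe'));
  });
}
```

Because the swap runs on the window "load" event, the video players never compete with your own page assets for bandwidth, which is the whole point of the technique.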
I hope this helps, Tom
-
I'm very doubtful that hosting the video off-site would have much effect on site speed, especially with YouTube. Personally I use Wistia, mainly due to the level of analytics they provide. The only time this may be an issue is if you have a large quantity of videos on a single page; in that case I would try to split them across several different pages by means of categories or something similar.
To me it sounds like there may be a programming problem.
The other thing is that it may not be the videos that are slowing the site down.
Just a few thoughts, don't know if it helps.