Page load increases with Video File - SEO Effects
-
We're trying to use a Flash video as a product image, so the size increase will be significant: somewhere around 1.5-2 MB on a page that is about 400 KB before the video. There is an SEO concern with page speed, and we're thinking that perhaps putting the Flash video inside an iframe might overcome the speed issues.
We're trying to provide a better experience with the video, but the increase in page size, and therefore load time, will be significant. The rest of the page will load, including a fallback static image, so we're really trying to understand how to mitigate the page-load-speed impact of the video. Any thoughts?
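One way to keep the initial payload near the 400 KB baseline is to ship only the static fallback image up front and pull in the heavy video embed on demand. A minimal sketch of that click-to-load pattern (the paths, the id, and the embed URL are all hypothetical):

```html
<!-- The page ships only the lightweight poster image; the heavy
     video embed (hypothetical URL) is injected on user request. -->
<div id="video-slot">
  <img src="/images/product-poster.jpg" alt="Product video preview"
       width="640" height="360">
  <button type="button" onclick="loadVideo()">Play video</button>
</div>
<script>
  function loadVideo() {
    var frame = document.createElement('iframe');
    frame.src = '/video/product-embed.html'; // hypothetical embed page
    frame.width = '640';
    frame.height = '360';
    var slot = document.getElementById('video-slot');
    slot.innerHTML = '';
    slot.appendChild(frame);
  }
</script>
```

Because the embed sits behind an interaction, crawlers and first-time visitors only pay for the lightweight page; the 1.5-2 MB cost is incurred only by visitors who actually press play.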
-
Thanks, Adam, for all your comments. I agree that Google encourages the use of video; my concern is the load time moving from 2 seconds to 11 seconds. This will be a good test!
-
I do not think you will experience any drop in rankings due to embedding a Flash video on your webpage. I certainly have not seen ranking drops on pages with embedded videos. Remember, page speed is a very small signal in Google's ranking algorithm, and Google encourages the use of web video (video sitemaps, etc.).
-
Our concern is with the significant page size increase. Are you still saying not to be concerned about the algorithm? If so, could we lose placement because our competitors have a quicker page speed?
-
Just answered a nearly identical question the other day that you may find useful: http://www.seomoz.org/q/google-s-algorithm-on-file-size-use-iframe-or-not
Related Questions
-
How will changing my website's page content affect SEO?
Our company is looking to update the content on our existing web pages, and I am curious what the best way to roll out these changes is in order to maintain good SEO rankings for certain pages. The infrastructure of the site will not be modified except for maybe adding a couple of new pages, and existing domains will stay the same. If the domains are staying the same, does it really matter if I just update one page every week or so, versus updating them all at once? Just looking for some insight into how freshening up the content on the back-end pages could potentially hurt SEO rankings initially. Thanks!
Intermediate & Advanced SEO | Bankable
-
How does Googlebot evaluate performance/page speed on Isomorphic/Single Page Applications?
I'm curious how Google evaluates pagespeed for SPAs. Initial payloads are inherently large (resulting in 5+ second load times), but subsequent requests are lightning fast, as these requests are handled by JS fetching data from the backend. Does Google evaluate pages on a URL-by-URL basis, looking at the initial payload (and "slow"-ish load time) for each? Or do they load the initial JS+HTML and then continue to crawl from there? Another way of putting it: is Googlebot essentially "refreshing" for each page and therefore associating each URL with a higher load time? Or will pages that are crawled after the initial payload benefit from the speedier load time? Any insight (or speculation) would be much appreciated.
Intermediate & Advanced SEO | mothner
-
I've got duplicate pages. For example, blog/page/2 is the same as author/admin/page/2. Is this something I should just ignore, or should I create author/admin/page/2 and then 301 redirect?
I'm going through the crawl report, and it says I've got duplicate pages. For example, blog/page/2 is the same as author/admin/page/2/. Now, I can't even find author/admin/page/2 in WordPress, but it is the same thing as blog/page/2 nonetheless. Is this something I should just ignore, or should I create author/admin/page/2 and then 301 redirect it to blog/page/2?
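If the redirect route is chosen, the rule itself is small. A sketch assuming an Apache host with mod_alias available (paths taken from the question; adjust to the real permalink structure):

```apache
# Illustrative 301 from the duplicate author archive to the blog archive
Redirect 301 /author/admin/page/2/ /blog/page/2/
```

In practice a pattern-based rule (e.g. RedirectMatch, or a WordPress redirect plugin) would cover every page number rather than just page 2.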
Intermediate & Advanced SEO | shift-inc
-
SEO mobile optimization: are multiple link alternate media tags (one per device) acceptable on a desktop page?
Hi all, I hope someone can answer this question, because I haven't found a clear solution on the internet so far. I have one desktop website (let's say www.example.com) and a different mobile website for each main device (let's say iphone.example.mobi, android.example.mobi, and winphone.example.mobi). In order to optimize my mobile websites according to Google's guideline for the separate-URLs configuration, I should add a link alternate media tag to each desktop page and a canonical tag to the corresponding mobile page, to create a connection between them. However, I need to keep a 1-to-1 connection between desktop and mobile pages (Google recommends having one desktop page linked to one mobile page and vice versa, and discourages 1-to-many connections). In my case, I would have to add to a single desktop page (e.g. www.example.com/category1/) three link alternate media tags: one for iphone.example.mobi, one for android.example.mobi, and one for winphone.example.mobi. Furthermore, I would have to add a canonical tag to every corresponding mobile page across the three mobile site versions, pointing to my desktop page www.example.com/category1/. My worries are: is a single desktop page with three different link alternate tags pointing to three different mobile websites (one each) aligned with Google's mobile SEO guideline or not? If not, how should I configure my desktop website and my three mobile web applications (iPhone, Android, Windows Phone) to meet Google's requirements for the separate-URLs configuration? Thanks, Massimliano
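For reference, the annotation scheme in question would look something like this (domains are taken from the question; the media queries and the /category1/ path are assumptions):

```html
<!-- On the desktop page www.example.com/category1/ :
     three alternates, one per device-specific mobile site -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://iphone.example.mobi/category1/">
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://android.example.mobi/category1/">
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://winphone.example.mobi/category1/">

<!-- On each of the three corresponding mobile pages -->
<link rel="canonical" href="http://www.example.com/category1/">
```

Whether Google treats three alternates on one desktop page as a valid 1-to-1 mapping is exactly the open question here; this sketch only shows how the tags themselves would be written.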
Intermediate & Advanced SEO | AdiRste
-
How can I prevent duplicate pages being indexed because of load balancer (hosting)?
The site that I am optimising has a problem with duplicate pages being indexed as a result of the load balancer (which is required and was set up by the hosting company). The load balancer passes the site through to two different URLs: www.domain.com and www2.domain.com. Somehow, Google has indexed the same URLs on both (which I was obviously hoping they wouldn't): the first on www and the second on www2. The hosting is a mirror image of itself (www and www2), meaning I can't upload a robots.txt to the root of www2.domain.com disallowing all. Also, I can't add a canonical tag to the website header of www2.domain.com pointing the individual URLs to www.domain.com, etc. Any suggestions as to how I can resolve this issue would be greatly appreciated!
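One workaround sometimes suggested for mirrored document roots is to vary the response by hostname at the web-server level, so the same file system can still serve different robots rules per host. A sketch assuming the stack is Apache with mod_rewrite (the robots-www2.txt filename is made up):

```apache
# Serve a separate, all-blocking robots file only on the www2 hostname
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www2\.domain\.com$ [NC]
RewriteRule ^robots\.txt$ /robots-www2.txt [L]
```

robots-www2.txt would then contain a blanket `User-agent: *` / `Disallow: /`. The same hostname test could instead emit a canonical tag at the application layer, if that is an easier place to hook in than the server config.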
Intermediate & Advanced SEO | iam-sold
-
Effect duration of robots.txt file.
On my website there is also a demo site that has been indexed in Google, but it is not needed now, so I created a robots file and uploaded it to the server yesterday. In the demo folder there are some HTML files, and I want to remove everything in the demo folder from Google, but Webmaster Tools still shows them. The robots.txt contains:
User-agent: *
Disallow: /demo/
How long will this take to be removed from Google? And is there an alternative way of doing it?
Intermediate & Advanced SEO | innofidelity
-
Could you use a robots.txt file to disallow a duplicate content page from being crawled?
A website has duplicate content pages to make it easier for users to find the information from a couple of spots in the site navigation. The site owner would like to keep it this way without hurting SEO. I've thought of using the robots.txt file to disallow search engines from crawling one of the pages. Would you consider this a workable/acceptable solution?
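As a side note, a Disallow rule can be sanity-checked locally before relying on it. A quick sketch using Python's standard-library robots.txt parser (the rule and URLs are hypothetical stand-ins for the duplicate page):

```python
# Sanity-check a robots.txt Disallow rule with the standard library.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /duplicate-page/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The disallowed copy is blocked for all user agents...
print(parser.can_fetch("Googlebot", "https://example.com/duplicate-page/"))  # False
# ...while the preferred copy remains crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/preferred-page/"))  # True
```

Worth remembering that a Disallow only stops crawling; an already-known URL can still appear in the index, which is why a rel=canonical on the duplicate page is often suggested for this situation instead.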
Intermediate & Advanced SEO | gregelwell