Fading in content above the fold on window load
-
Hi,
We'd like to render a font stack from Typekit and paint a large cover image above the fold of our homepage after the document has finished loading. Since asynchronously loaded content generally looks choppy as it pops in, we fade the affected elements in once loading is done. This gives a much smoother feel and keeps load times fast, but I have a concern about SEO.
While Typekit loads, the h1, h2, and the page's leading paragraph are sent down the wire with an invisible style (though they still exist in the static HTML). Even though they appear to a user only milliseconds later, I'm concerned that a search engine's initial request is met with a page whose best descriptive assets are marked as invisible.
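For context, the setup looks roughly like this; the markup below is a simplified sketch rather than our actual code, and the class names are made up:

```html
<!-- Simplified sketch (class names illustrative): the headings ship
     already hidden in the static HTML and are revealed by script. -->
<style>
  .fade-on-load { opacity: 0; }
  .fade-on-load.is-visible { opacity: 1; transition: opacity 0.4s ease-in; }
</style>

<h1 class="fade-on-load">Main headline</h1>
<h2 class="fade-on-load">Supporting headline</h2>
<p class="fade-on-load">Leading paragraph…</p>

<script>
  // Once the window has finished loading (Typekit, cover image, etc.),
  // reveal the hidden elements with a short fade.
  window.addEventListener('load', function () {
    var hidden = document.querySelectorAll('.fade-on-load');
    for (var i = 0; i < hidden.length; i++) {
      hidden[i].classList.add('is-visible');
    }
  });
</script>
```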
Both UX and SEO are highly valuable to our business model, so we're asking for some perspective to make the right trade-off. Our site has high domain authority compared to our competition, and competition for our sales keywords is strong. Will this UX improvement damage our on-page SEO? If so, purely from an SEO perspective, roughly how serious would the impact be?
We're eager to hear any advice or comments on this. Thanks a lot.
-
Hi,
For starters, you could use the 'Fetch as Google' option in Webmaster Tools to see what your page looks like to search engines, or use a tool like browseo.net to do the same thing. Alternatively, make sure the page is indexable, link to it from somewhere, and then do a search for "one of your hidden text strings" (in quotes) to see whether that content has been crawled and indexed.
If you can't see your content, then you may have a problem, and since crawlers can distinguish between hidden and non-hidden text, the issue may go beyond the hidden content simply failing to help you rank. It might actually look like you're trying to stuff keywords into your content without showing them to the user.
I think the simplest fix would be to remove the class that makes these elements invisible from the static HTML, and instead add that class dynamically with a little bit of jQuery, so it only applies to users with scripts enabled.
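A rough sketch of that approach (the selectors and class name below are illustrative, since I don't know your actual markup):

```html
<!-- Illustrative sketch: the content is visible by default in the
     static HTML; the hiding class is only ever added by JavaScript,
     so crawlers and no-JS visitors always get the visible version. -->
<style>
  h1, h2, .lead { transition: opacity 0.4s ease-in; }
  .hide-until-loaded { opacity: 0; }
</style>

<h1>Main headline</h1>
<h2>Supporting headline</h2>
<p class="lead">Leading paragraph…</p>

<script src="https://code.jquery.com/jquery-1.11.3.min.js"></script>
<script>
  // Hide the above-the-fold elements only for visitors running JavaScript.
  var $aboveTheFold = $('h1, h2, .lead');
  $aboveTheFold.addClass('hide-until-loaded');

  // Fade them back in once the window (Typekit, cover image) has loaded.
  $(window).on('load', function () {
    $aboveTheFold.removeClass('hide-until-loaded');
  });
</script>
```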
This way, when a crawler (or a user with JavaScript disabled) visits your site, the page is served with the content visible; it is only hidden, and then faded in, when the visitor has JavaScript enabled.
Hope this helps,
Tom