Fading in content above the fold on window load
-
Hi,
We'd like to render a font stack from Typekit and paint a large cover image above the fold of our homepage after document completion. Since asynchronously loaded content generally looks choppy as it pops in, we fade in the affected elements once loading is done. It gives a much smoother feel and fast load times, but I have a concern about SEO.
While Typekit loads, the h1, h2, and the page's leading paragraph are sent down the wire with an invisible style (but still technically exist as static HTML). Even though they appear to a user only milliseconds later, I'm concerned that a search engine's initial request is met with a page whose best descriptive assets are marked as invisible.
Both UX and SEO have high value to our business model, so we're asking for some perspective to make the right kind of trade-off. Our site has a high domain authority compared to our competition, and sales keyword competition is high. Will this UX improvement damage our On-Page SEO? If so, and purely from an SEO perspective, roughly how serious will the impact be?
We're eager to hear any advice or comments on this. Thanks a lot.
-
Hi,
For starters, you could use the 'Fetch as Google' option in Webmaster Tools to see what your page looks like to search engines, or use a tool like browseo.net to do the same thing. Or you could make sure the page is indexable, link to it from somewhere, and do a search for "one of your hidden text strings" (in quotes) to see if that content has been crawled and indexed.
If you can't see your content then you may have a problem, and since crawlers can distinguish between hidden and non-hidden text, it may be more than just the hidden content failing to help you rank: it might actually look like you're trying to stuff keywords into your content without showing them to the user.
I think the simplest fix would be to remove the class which makes these elements invisible, and dynamically add that class with a little bit of jQuery just for users with scripts enabled:
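Something along these lines — a minimal sketch, where the `fade-target` and `is-hidden` class names and the small helper function are placeholders of mine, not from the original setup (a pure helper is included so the class-toggling logic can run outside a browser):

```javascript
// Elements ship visible in the static HTML, so crawlers and no-JS users
// always receive the content as visible. When JavaScript runs, the hiding
// class is added immediately and removed again on window load; a CSS
// opacity transition on the elements handles the actual fade.
// NOTE: "fade-target" and "is-hidden" are placeholder names.

// Pure helper: returns a class attribute string with `cls` added (on=true)
// or removed (on=false). Included so the toggling logic is testable in Node.
function toggleClass(classAttr, cls, on) {
  var classes = classAttr.split(/\s+/).filter(Boolean)
    .filter(function (c) { return c !== cls; });
  if (on) classes.push(cls);
  return classes.join(' ');
}

// Browser-only wiring (jQuery), guarded so this file also runs under Node.
if (typeof jQuery !== 'undefined') {
  // Run as early as possible so the hide happens before first paint.
  jQuery('.fade-target').addClass('is-hidden');
  jQuery(window).on('load', function () {
    // On window load, remove the hiding class; CSS fades the content in.
    jQuery('.fade-target').removeClass('is-hidden');
  });
}
```

Pair it with CSS along the lines of `.fade-target { transition: opacity .3s } .is-hidden { opacity: 0 }` so the reveal fades rather than snaps.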
This way, when a crawler (or a user with JavaScript disabled) visits your site, it is served the page with the content visible; the content is only hidden, then faded in, for visitors browsing with JavaScript enabled.
Hope this helps,
Tom
Related Questions
-
Content above the Fold, or Below
Hi, I have an ecommerce site with several categories that I consider good landing pages. In order to get better search results I add content to these pages, usually above the fold, and the products are listed after the content. Example: https://www.carburetor-parts.com/Carburetor-Kits_c_568.html I worry that customers get to the page and, since they don't see the products above the fold, they move on. Should I be putting content in the footer instead of the header, and if so, how does that affect SEO? This has been bugging me for a long time. Thanks
On-Page Optimization | MikeCarbs
-
What to do with blog content that is no longer relevant to our business
We are a marketing agency and we have LOTS of blog posts still on our website from when we used to specialize in e-commerce services. We've since shifted focus, so there are a lot of old and frankly irrelevant blog posts on our website. My questions: Should we remove these from the website to better "shape" our content profile towards the services we actually offer? Should we attempt to update them so they are still relevant even though we don't offer those services? If we get rid of those pages, what should we redirect them to? The main blog page?
On-Page Optimization | WhittingtonConsulting
-
What constitutes duplicate content on a page?
I am working on SEO for a Shopify store. Their products are very similar, hence the pages are so similar that Moz shows them as duplicate content. The only difference in the product pages is the title and model number. I am going to "go for the gold" and try re-writing all the product descriptions. It's incredibly difficult due to the products being nearly identical with just a minor variation. I know I could go down the road of just creating variants, but the customer is not down for that. Here's my question: what constitutes duplicate content? 80% of the content, 90%? If I am going to re-write the descriptions, what should I aim for? Thank you!
On-Page Optimization | steve_linn
-
Duplicate and thin content - advanced..
Hi guys, two issues to sort out. We have a website that lists products and has many pages for: a) The list pages, which list all the products for an area. b) The detailed pages, which, when clicked into from a list page, show the specific product in full. On the list page we perhaps have half the description written down; when clicked into, you see the full description. If you search Google for a phrase on a detailed page, you will see results for that specific page, including multiple list pages it appears on. For example, let's say we are promoting 'trees' situated in Manhattan, and we are also promoting trees in Brooklyn; there is a crossover, so a tree listed in Manhattan will also be listed in Brooklyn as it's close by (I'm not from America, so don't laugh if I have the areas muddled). We then have quite a few pages with the same content as a result. I read a post a while back from the mighty Cutts who said not to worry about duplicates unless they're spammy, but what is good for one person is spammy to another. Does anyone have any ideas as to whether this is a genuine problem, and how would you solve it? Also, we know we have a lot of thin content on the site, but we don't know how to identify it. It's a large site, so this needs something automated (I think). Thanks in advance, Nick
On-Page Optimization | nick-name123
-
Google showing my content on the serps in a different domain
Hi all, recently a partner of ours discovered that Google is showing a meta description on the SERPs for his homepage that is not his but ours. On his site he sells add-ons for our software, so the name of our software appears many times, and there are many links pointing to our site. He claims he hasn't copied this text from us, and I have used some tools to verify this. I don't understand how Google can get confused and show our text as the meta description on the SERPs for his homepage. Any idea why this happened?
On-Page Optimization | Paessler
-
Should I use this Facebook comment content on my related blog post?
I have a blog post that ranks pretty high for the term "justin bieber tickets". We are running a ticket giveaway and have received tons of responses on Facebook and G+. The responses are often poorly written, in the sense that they are from younger fans, but it is a bunch of related content that I thought could be a good addition of unique content to the post. Is this a good idea in general? Is it still a good idea if the comments are poorly written and contain lots of slang and exclamation points? Is it bad form to put people's Facebook comments live on the web, even though it is a public page? Here is the post. Example of what this would look like in the post: http://cl.ly/1Q3N0t091V0w3m2r442G Source of comments: http://www.facebook.com/SeatGeek Another less aggressive option would be to curate some of my favorite comments... Thanks for any thoughts.
On-Page Optimization | chadburgess
-
Do videos count as duplicate content?
If we allow users to embed our videos on their site, would that count as duplicate content? I imagine not, given that Google can't usually 'see' the content of videos, but just want to double check.
On-Page Optimization | nicole.healthline
-
Cross Domain Duplicate Content
Hi, my client has a series of websites: one main website and several mini websites. Articles are created and published daily and weekly; one will go on the main website and the others on one, two, or three of the mini sites. To combat duplication, I only ever allow one article to be indexed (I apply noindex to articles that I don't want indexed by Google, so if three sites have the same article, two sites will have the noindex tag added to the head). I am not completely sure if this is OK, and whether there are any negative effects, apart from the articles tagged as noindex not being indexed. Are there any obvious issues? I am aware of the canonical link rel tag, and know that this can be used on the same domain, but can it be used cross-domain, in place of the noindex tag? If so, is it exactly the same in structure as the 'same domain' canonical link rel tag? Thanks Matt
On-Page Optimization | mattys