Fading in content above the fold on window load
-
Hi,
We'd like to render a font stack from Typekit and paint a large cover image above the fold of our homepage after the document has finished loading. Since asynchronously loaded content generally looks choppy as it pops in, we fade in the affected elements once loading is done. That gives a much smoother feel and fast load times, but I have a concern about SEO.
While Typekit loads, the h1, h2 and the page's leading paragraph are sent down the wire with an invisible style (though they still exist as static HTML). Even though they become visible to a user only milliseconds later, I'm concerned that a search engine's initial request is met with a page whose best descriptive assets are marked as invisible.
Both UX and SEO are highly valuable to our business model, so we're asking for some perspective to make the right kind of trade-off. Our site has high domain authority compared to our competition, and competition on our sales keywords is fierce. Will this UX improvement damage our on-page SEO? If so, purely from an SEO perspective, roughly how serious will the impact be?
We're eager to hear any advice or comments on this. Thanks a lot.
-
Hi,
For starters, you could use the ‘Fetch as Google’ option in Google Webmaster Tools to see what your page looks like to search engines, or use a tool like browseo.net to do the same thing. You could also make sure the page is indexable, link to it from somewhere, and then search for “one of your hidden text strings” (in quotes) to see whether that content has been crawled and indexed.
If you can’t see your content, you may have a problem, and since crawlers can distinguish between hidden and non-hidden text, the issue may go beyond hidden content simply not helping you rank. It might actually look like you’re trying to stuff keywords into your page without showing them to the user.
I think the easiest fix would be to remove the class that makes these elements invisible from the markup, then add that class dynamically with a little bit of jQuery, only for users with scripts enabled:
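A minimal sketch of that pattern (the `js-hidden` class name and the selectors are assumptions for illustration, not from the original setup). In the browser this would be a couple of jQuery lines, shown in the comments; the tiny stand-in element below just lets the class-toggling logic run outside a browser:

```javascript
// In the browser, roughly:
//   $('h1, h2, .lead').addClass('js-hidden');           // as soon as JS runs
//   $(window).on('load', function () {
//     $('h1, h2, .lead').removeClass('js-hidden').hide().fadeIn();
//   });
// CSS: .js-hidden { visibility: hidden; }
// The helpers below model the same logic on a plain object.

function addClass(el, name) {
  if (!el.classList.includes(name)) el.classList.push(name);
}

function removeClass(el, name) {
  el.classList = el.classList.filter(function (c) { return c !== name; });
}

// The markup ships with the content visible: no hiding class in the HTML,
// so crawlers and script-disabled visitors always see the text.
var heading = { classList: [] };

// As soon as JavaScript runs, hide the elements for the fade-in effect.
addClass(heading, 'js-hidden');

// On window load, remove the class and fade the content back in.
removeClass(heading, 'js-hidden');

console.log(heading.classList.length); // 0: content visible again
```

The key point is that hiding only ever happens inside script, so the no-JavaScript baseline (which is what a crawler sees) is the fully visible page.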
This way, when a crawler (or a user with JavaScript disabled) visits your site, it is served the page with the content visible; the content is only hidden for visitors browsing with JavaScript enabled.
Hope this helps,
Tom