Fading in content above the fold on window load
-
Hi,
We'd like to render a font stack from Typekit and paint a large cover image above the fold of our homepage after document completion. Since asynchronously loading anything generally looks choppy, we fade in the affected elements once loading is done. It gives a much smoother feel while keeping load times fast, but I have a concern about SEO.
While Typekit loads, the h1, h2, and the page's leading paragraph are sent down the wire with an invisible style (though they still exist as static HTML). Even though they appear to a user only milliseconds later, I'm concerned that a search engine's initial request is met with a page whose best descriptive assets are marked as invisible.
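For context, a setup like the one described typically looks something like this sketch (selectors and class names are illustrative, not taken from the actual site; Typekit's Web Font Loader toggles `wf-loading` / `wf-active` classes on the `<html>` element while fonts load):

```html
<style>
  /* While Typekit is still loading, the headline content
     ships in the static HTML but is marked invisible —
     this is exactly what a crawler's initial request sees. */
  .wf-loading h1,
  .wf-loading h2,
  .wf-loading .lead-paragraph {
    visibility: hidden;
  }
</style>
```

The content is present in the source, but hidden at the moment of the first request, which is the crux of the SEO worry.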
Both UX and SEO have high value to our business model, so we're asking for some perspective to make the right kind of trade-off. Our site has high domain authority compared to our competition, and competition for our sales keywords is strong. Will this UX improvement damage our on-page SEO? If so, and purely from an SEO perspective, roughly how serious will the impact be?
We're eager to hear any advice or comments on this. Thanks a lot.
-
Hi,
For starters, you could use the ‘Fetch as Google’ option in Google Webmaster Tools to see what your page looks like to search engines, or use a tool like browseo.net to do the same thing. Alternatively, make sure the page is indexable, link to it from somewhere, and then search for “one of your hidden text strings” (in quotes) to see whether that content has been crawled and indexed.
If you can’t see your content, then you may have a problem. Since crawlers can distinguish between hidden and non-hidden text, the issue may go beyond the hidden content simply failing to help you rank: it might actually look like you’re trying to stuff keywords into your content without showing them to the user.
I think the easiest and simplest fix would be to remove the class that makes these elements invisible, and instead add that class dynamically with a little bit of jQuery, so it only ever applies for users with scripts enabled:
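A minimal sketch of that approach (class and selector names here are illustrative, not from the original site). The markup ships with everything visible; the hiding class exists in the stylesheet but is only ever applied by script, and the fade-in runs on window load once the fonts and cover image are in:

```html
<style>
  /* This class is only ever added via JavaScript, so crawlers
     and no-JS visitors never receive hidden content. */
  .js-fade {
    visibility: hidden;
  }
</style>

<script>
  // Hide the elements as early as possible for script-enabled visitors...
  $('h1, h2, .lead-paragraph').addClass('js-fade');

  // ...then fade them back in once everything has finished loading.
  $(window).on('load', function () {
    $('.js-fade')
      .hide()                              // switch to display:none so fadeIn can animate
      .css('visibility', 'visible')        // undo the visibility:hidden from the class
      .fadeIn(400);
  });
</script>
```

The key design point is progressive enhancement: the hidden state is an enhancement layered on top of a fully visible default, rather than the default itself.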
This way, when a crawler (or a user with JavaScript disabled) visits your site, it is served the page with the content visible; the content is only ever hidden for visitors accessing the site with JavaScript enabled.
Hope this helps,
Tom