Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will remain viewable) we have locked both new posts and new replies.
Fading in content above the fold on window load
-
Hi,
We'd like to render a font stack from Typekit and paint a large cover image above the fold of our homepage after document completion. Since asynchronously loading anything generally looks choppy, we fade in the affected elements once loading completes. It gives a much smoother feel and fast load times, but I have a concern about SEO.
While Typekit loads, the h1, h2 and the page's leading paragraph are sent down the wire with an invisible style (though they still technically exist as static HTML). Even though they appear to a user only milliseconds later, I'm concerned that a search engine's initial request is met with a page whose best descriptive assets are marked as invisible.
Both UX and SEO have high value to our business model, so we're asking for some perspective to make the right kind of trade-off. Our site has a high domain authority compared to our competition, and sales keyword competition is high. Will this UX improvement damage our on-page SEO? If so, and purely from an SEO perspective, roughly how serious would the impact be?
We're eager to hear any advice or comments on this. Thanks a lot.
-
Hi,
For starters, you could use the ‘Fetch as Google’ option in Webmaster Tools to see what your page looks like to search engines, or use a tool like browseo.net to do the same thing. Alternatively, make sure the page is indexable, link to it from somewhere, and then search for “one of your hidden text strings” (in quotes) to see whether that content has been crawled and indexed.
If you can’t see your content then you may have a problem, and since crawlers can distinguish between hidden and non-hidden text, the issue may be more than just your hidden content failing to help you rank: it might actually look like you’re trying to stuff keywords into your content without showing them to the user.
I think the easiest and simplest fix would be to remove the class which makes these elements invisible, and dynamically add that class with a little bit of jQuery just for users with scripts enabled:
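A minimal sketch of that approach (the selectors, the "js-hidden" class name, and the fade duration are illustrative assumptions, not from the original answer; it assumes jQuery is already loaded on the page):

```javascript
// Pair this with a stylesheet rule such as:
//   .js-hidden { opacity: 0; }

function setupFadeIn($) {
  // Add the hiding class only when JavaScript actually runs, so
  // crawlers and no-JS visitors always receive visible static HTML.
  $('h1, h2, .lead-paragraph').addClass('js-hidden');

  // Once the window "load" event fires (fonts and the cover image
  // have arrived), fade the content back in.
  $(window).on('load', function () {
    $('.js-hidden').css('opacity', 0).removeClass('js-hidden')
      .animate({ opacity: 1 }, 400);
  });
}

// In the page: setupFadeIn(jQuery);
```

Because the hiding class is added by script rather than baked into the markup, the HTML a crawler fetches never contains invisible text.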
This way, when a crawler (or a user with JavaScript disabled) visits your site, it will be served the page with the content visible; the content is only hidden, and then faded in, for visitors with JavaScript enabled.
Hope this helps,
Tom
Related Questions
-
Should I redirect or add content, to 47 Pages?
We have an insurance agency website with 47 pages that have duplicate/low content warnings. What's the best way to handle this? Am I right in thinking I have two options: either add new content or redirect the page? Thanks in advance 🙂
On-Page Optimization | | laurentjb1 -
Can lazy loading of images affect indexing?
I am trying to diagnose a massive drop in Google rankings for my website and noticed that the date of the ranking and traffic drop coincides with Google suddenly indexing only about 10% of my images, whereas previously it was indexing about 95% of them. Could the addition of a lazy-load script to the images (so they don't load from the server until visible in the browser) be causing this index blocking?
On-Page Optimization | | Gavin.Atkinson1 -
How does Indeed.com make it to the top of every single search despite having aggregated or duplicate content?
How does Indeed.com make it to the top of every single search despite having duplicate content? Google says it prefers original content and will give preference to those who have original content, but this statement seems contradicted when I see Indeed.com: they aggregate content from other sites yet still rank higher than the original content providers.
On-Page Optimization | | vivekrathore0 -
Do quotation marks in content affect SERPs?
Some of my art object products have words and phrases engraved on them. The words relate to the images on the product. In the product descriptions, I have been putting quotes around the entire list. Would I get better long-tail results if I didn't use the quotation marks? In other words, do the quotes make everything between them an exact match phrase? For example:
Current product description:
The words around the edge of the lazy susan read, "Explore nature. Dream big. Take time to smell the flowers. Enjoy the changing seasons. Seize the day. Relish the night. Live life to the fullest." Thank you for helping with this; all comments on how to present this kind of content are welcome. - Stephen
On-Page Optimization | | stephenfishman0 -
SEO value of old press releases (as content)?
Howdy Moz Community, I'm working with a client on migrating content to a new site/CMS and am wondering whether anyone has thoughts on the value of old press releases. I'm familiar with the devaluation of press release links from early 2013, but I'm wondering more about their value as content. Does importing old press releases (3-5 years old) create contextual depth of content that has some value for the site as a whole (even though the news contained within is useless)? Or do these old press releases just create clutter and waste time (in migration)? The site has a wealth of additional content (articles and videos), so the press releases wouldn't be covering up for thin content. I'm just wondering whether there are any best practices or a general rule of thumb. Thanks!
On-Page Optimization | | MilesMedia0 -
Multilingual site with untranslated content
We are developing a site that will have several languages. There will be several thousand pages; the default language will be English. Several sections of the site will not be translated at first, so the main content will be in English but the navigation/boilerplate will be translated. We have hreflang alternate tags set up for each individual page pointing to each of the other languages: e.g. the English version points to the Spanish and French versions, the Spanish version points to the French and English versions, and so on. My question is: is this sufficient to avoid a duplicate content penalty from Google for the untranslated pages? I am aware that from a user perspective having untranslated content is bad, but in this case it is unavoidable at first.
On-Page Optimization | | jorgeapartime0 -
Duplicate Content for Spanish & English Product
Hi There, Our company provides training courses and I am looking to provide the Spanish version of a course that we already provide in English. As it is an e-commerce site, our landing page for the English version gives the full description of the course and all related details. Once the course is purchased, a flash based course launches within a player window and the student begins the course. For the Spanish version of the course, my target customers are English speaking supervisors purchasing the course for their Spanish speaking workers. So the landing page will still be in English (just like the English version of the course) with the same basic description, with the only content differences on that page being the inclusion of the fact that this course is in Spanish and a few details around that. The majority of the content on these two separate landing pages will be exactly the same, as the description for the overall course is the same, just that it's presented in a different language, so it needs to be 2 separate products. My fear is that Google will read this as duplicate content and I will be penalized for it. Is this a possibility or will Google know why I set it up this way and not penalize me? If that is a possibility, how should I go about doing this correctly? Thanks!
On-Page Optimization | | NiallTom0 -
Best practice for franchise sites with duplicated content
I know that duplicated content is a touchy subject, but I work with multiple franchise groups and each franchisee wants their own site; however, almost all of the sites use the same content. I want to make sure that Google sees each one of these sites as unique sites and does not penalize them for the following issues: all sites are hosted on the same server, and therefore share the same IP address; all sites use generally the same content across their product pages, which are very, very important pages (templated content approved by corporate); almost all sites have the same design (a few of the groups we work with have multiple design options). Any suggestions would be greatly appreciated. Thanks again, Aaron
On-Page Optimization | | Shipyard_Agency0