Content Below the Fold
-
Hi
I wondered what the view is on content below the fold?
We have the H1, product listings & then some written content under the products - will Google just ignore this?
I can't hide it under a tab or put a lot of content above products - so I'm not sure what the other option is?
Thank you
-
Hi Becky,
Here is what I found:
The pros and cons of hiding content using JavaScript and CSS (display: none) have been a topic of some debate within the SEO industry, and Google's comments over time have somewhat added to the confusion.
- **November 2014** – Google's John Mueller stated that Google _"may not"_ index or rank hidden content. In a Google+ Hangout the following month, John repeated this, stating that hidden content would be _"discounted"_ and had been for a number of years.
- **21 July 2015** – Google's Gary Illyes, contributing to a Stack Overflow forum thread, clarified this by stating that this type of content is given _"way less weight in ranking"_.
- **27 July 2015** – In a separate Stack Overflow thread on the same topic, Gary Illyes again confirmed that _"[Google] will index that but the content's weight will be lower since it's hidden"_.
So the content will still be indexed, but deemed less important by the crawlers.
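If it helps to picture what "hidden content" means in these quotes, it's typically the tab/accordion pattern sketched below: the text is present in the HTML source, but it's styled `display: none` until the user clicks. The element IDs here are purely illustrative, not from your site.

```typescript
// Rough sketch of the tab pattern under discussion: the panel markup is in the
// source HTML (so Google can crawl it), but it is hidden until the user clicks.
// The element IDs are hypothetical.
const tabButton = document.querySelector<HTMLButtonElement>("#details-tab");
const tabPanel = document.querySelector<HTMLElement>("#details-panel");

if (tabButton && tabPanel) {
  tabPanel.style.display = "none"; // hidden on page load - the content Google "discounts"

  tabButton.addEventListener("click", () => {
    // Reveal or re-hide the panel on user interaction.
    tabPanel.style.display = tabPanel.style.display === "none" ? "block" : "none";
  });
}
```

The panel text can still be crawled because it's in the markup; the reduced weighting described above comes from it not being visible on load.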
-
Yeah, it's disappointing.
I've tried having some content behind a tab and some under the products and I am not seeing either one as having much of an effect.
Unless I remove it altogether, I'm not sure what else I can do with it?
-
Hi
Yes, I tried different pages and it's still the same. I think it's to do with things we have blocked in robots.txt...
-
I'm not seeing a problem in my GoogleBot simulators, Becky, but the one within your Google Search Console is still the best judge. Have you tried reloading the Fetch as... a couple of times? And tried it on different pages?
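If it does turn out to be robots.txt, a quick way to sanity-check which asset paths are disallowed is something like the rough sketch below. The domain and path patterns are placeholders, it assumes Node 18+ for the built-in `fetch`, and a proper check would also respect User-agent groups.

```typescript
// Rough sketch: pull a site's robots.txt and flag Disallow rules that could be
// blocking CSS/JS/image paths - the usual reason Fetch as Google renders a
// stripped-down version of the page. Domain and path patterns are placeholders.
async function findBlockedAssetRules(origin: string): Promise<string[]> {
  const res = await fetch(`${origin}/robots.txt`);
  const lines = (await res.text()).split("\n");

  return lines
    .map((line) => line.trim())
    .filter((line) => line.toLowerCase().startsWith("disallow:"))
    .map((line) => line.slice("disallow:".length).trim())
    // Keep only rules that look like they cover scripts, stylesheets or images.
    .filter((path) => /\.(css|js)\b|\/(assets|static|images|media)\//i.test(path));
}

// Example usage:
findBlockedAssetRules("https://www.example.com")
  .then((rules) => console.log("Potentially render-blocking Disallow rules:", rules));
```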
-
Yup - Google still says content that can only be seen after a user interaction is given less importance. Kinda stupid, given that things like tabs/accordions are a major usability enhancement, but that's still where we are.
P.
-
Hi
So I did Fetch as Google, and I'm seeing the page quite differently as Googlebot vs. as a visitor.
It just sees a few big images; I can't see it rendering any product listings or content. Do I now have a bigger problem?
Thank you
-
Hi
Thank you for the replies. I don't want to hide it; I just can't have it pushing the products down the page so they can't be seen.
I thought Google's webmaster guidelines included a comment saying they will ignore content behind tabs?
Becky
-
Any content below the fold will still be read. Are you trying to hide it but still get the SEO value? If that's the case, I would create a collapsible tab to keep the content on the page but hidden. If you want it to be visible, leave it as is and don't worry about Google not reading it—it will be read.
-
While that sounds theoretically logical, Google's own John Mueller stated last week that code-to-text ratio has absolutely no effect on the crawling of a site, and in a follow-up question he directly told me that text-to-code ratio has no effect as a ranking factor either.
These used to be very minor considerations back when search engine crawlers weren't as powerful, but no longer.
Fully agree with Pia that the idea of "above the fold" influencing ranking is nonsense as well. Given that the sweet spot for consistently high-ranking pages is ~2,200 words, the idea that only the first paragraph or two matter is unsupportable.
Hope that helps?
Paul
-
Additionally, do check the content using Fetch as Google in Google Search Console / Webmaster Tools. It will really help you see how spiders see the content compared to users. This is an important aspect of SEO which a lot of people ignore: you want to check whether the spiders see a structured view of the content rather than a messy one. I hope this helps; if you have further questions, please feel free to ask. Regards, Vijay
-
There's no manipulation whatsoever. In fact, Google encourage website developers and SEOs to optimise/tidy their code and keep a good code-to-content ratio. This is why Google gives us so many tools in order to do so. It makes our sites easier to crawl for Google, and in return Google may even like us more for it!
Just found an article that sums it up quite nicely:
"Essentially what is being stated is a fairly logical conclusion: reduce the amount of code on your page and the content (you know, the place where your keywords are) takes a higher priority. Additionally compliance will, by necessity, make your site easy to crawl and, additionally, allow you greater control over which portions of your content are given more weight by the search engines. The thinking behind validating code for SEO benefits is that, once you have a compliant site, not only will your site be a better user experience on a much larger number of browsers, but you’ll have a site with far less code, that will rank higher on the search engines because of it."
- http://www.hobo-web.co.uk/official-google-prefers-valid-html-css/
But going back to your original post, "above-the-fold is dead", yadda yadda... So long as your content in the source is metaphorically "above the fold" and not drowning in heavy code, on the page itself just worry about giving your users the "experience" that they're looking for. And not how many pixels from the top of the browser your content is. Hope that makes more sense!
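For anyone wondering what the "code-to-content ratio" actually measures (whatever weight Google really gives it), a deliberately crude sketch is below. The URL is a placeholder, it assumes Node 18+ for the built-in `fetch`, and the tag-stripping is rough, so treat the number as indicative only.

```typescript
// Crude sketch of the "text-to-code ratio" metric itself - not an endorsement
// of it as a ranking factor. The URL is a placeholder.
async function textToCodeRatio(url: string): Promise<number> {
  const html = await (await fetch(url)).text();

  // Strip scripts, styles and markup to approximate the visible text.
  const text = html
    .replace(/<script[\s\S]*?<\/script>/gi, "")
    .replace(/<style[\s\S]*?<\/style>/gi, "")
    .replace(/<[^>]+>/g, " ")
    .replace(/\s+/g, " ")
    .trim();

  return text.length / html.length; // e.g. 0.25 means ~25% of the page bytes are visible text
}

textToCodeRatio("https://www.example.com/category-page")
  .then((ratio) => console.log(`Text-to-code ratio: ${(ratio * 100).toFixed(1)}%`));
```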
-
Great, thank you - you read so many conflicting articles that it's difficult to know what to believe.
I'll see if we can look at our code, but I'd want to be mindful of not manipulating Google.
Thank you!
-
I feel prioritising elements to be "above the fold" is a bit of an outdated concept these days.
Where is the fold? Different devices and screen resolutions will have different folds, and more websites are being designed now to make the traditional "above the fold" section more visually interesting and designed for user experience, rather than packed full of content.
The higher the content is in the source code itself, the more weight it will have on the page. This doesn't necessarily translate to the "visually higher the content is on the page". Google is going to be reading from top to bottom of your code, so naturally you want the most important content/links to be found first. As long as you meet (or exceed!) the user's expectation of the content upon arrival, and you keep the code tidy in terms of how much Google has to read before it gets to the real valuable content, I doubt Google's going to worry about whether users have to scroll a little to get to it.
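To make that concrete for your layout, Becky: you can keep the written copy early in the source but display it after the product grid using CSS ordering. A minimal sketch with hypothetical element IDs follows - in a real build this would live in the stylesheet rather than a script.

```typescript
// Minimal sketch: the copy sits early in the HTML source (crawled first), while
// flexbox ordering paints it below the product grid for visitors.
// The element IDs are hypothetical; in practice this belongs in the CSS.
const wrapper = document.querySelector<HTMLElement>("#category-page");
const copy = document.querySelector<HTMLElement>("#category-copy");    // first in the source
const products = document.querySelector<HTMLElement>("#product-grid"); // second in the source

if (wrapper && copy && products) {
  wrapper.style.display = "flex";
  wrapper.style.flexDirection = "column";
  products.style.order = "1"; // shown first, at the top of the page
  copy.style.order = "2";     // shown after the products, yet still early in the code
}
```

That way the copy stays high in the code that gets read top to bottom, while visitors still see the products first.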
-
Hi Becky,
As far as I understand, Google will not ignore it; however, Google does treat some parts of the page as more important than others. For instance, if you have written a description of the product and part of that description is hidden,
Google will take the part displayed to the user as the important piece of content and the hidden part as the least important.
I don't see any reason for Google to ignore content below the fold.
-
Content below the fold is still read; however, less value is placed on it. So it is still worth producing content for below the fold, as it will still help that page rank.
Show the user what they want to see when they land on the page; the majority of the time, in doing this you will also be showing Google what it needs to rank you.