Is Having Content 'Above The Fold' Still Relevant for Website Design and SEO?
-
Hey there,
So I have a client who recently 're-skinned' their website, and now there is little to no content above the fold. I've also noticed that since the transition to this new front-end design there has been a drop in rankings for a number of keywords related to one of the topics we're targeting.
Is there any correlation here? Is having content 'above the fold' still a relevant factor in determining a website's search visibility? I appreciate you reading and look forward to hearing from all of you. Have a great day!
-
Above the fold my site always has a title that perfectly matches the query, a seductive subtitle that I hope will elicit reading, a spectacular image that I am willing to spend big bucks for, and a carefully crafted opening paragraph that states the basics of the topic.
What else is above the fold? Not much, just my domain name, which is killer. The page looks like a newspaper, with the full intent that nothing else will detract from the content or the domain (there are a couple of visible ads, which pay for all this, that do not interfere with the content).
I know for a fact that having your relevant content above the fold is essential. Someone once designed me a fantastic template, and the image at the top was killer killer killer... it was spectacular, but irrelevant for most pages of the site. Visitors bounced, and I had enough traffic at the time to see that engagement tanked within a couple of hours. Ad revenue suffered too.
I removed that image, and the only things left above the fold were the domain, the title, the article image, and the text (plus a couple of ads that interfere with nothing). Within the next 60 minutes people started exploring the site again, and ad revenue was up by multiples.
It's like some athletic events... run as close to naked as you can.
-
Unfortunately, there is actually a correlation there. Google wants the content above the fold, and so do users: you shouldn't have to scroll to see the site's main content. Google's Page Layout algorithm has been around for several years now, and we've seen time and time again that it pays to have your content above the fold.
To look into this further, I would study Google's Page Layout algorithm and the penalties associated with it.
Related Questions
-
Less relevant/not optimized competitor sites ranking higher in SERPs?
Has anyone else noticed their rank positions falling to competitor sites that aren't optimized and are less relevant? I've noticed that we've lost some rankings over the past few weeks, and the competitor pages that have replaced us haven't been optimized, aren't as relevant, and it doesn't look like there have been any updates (looking through archived versions). For example, their main "shoes" gallery is ranking for more specific shoe types, like "sandals", and "sandals" isn't even mentioned in their metadata and they have no on-page copy. Their DA is slightly higher, but our site has a denser link profile (although, yes, I do need to go through and see what kind of links, exactly, we've gained). Has anyone else seen this happen recently, or have any ideas of why, or what we could do to get our rank positions back? My main initiatives have been to create and implement fresh on-page copy and metadata and to manage 404s/301 redirects, but I'm thinking this issue is beyond a quick copywriting tweak.
-
Google is indexing HTTPS sites by default now, so where's the Moz blog about it?
Hello and good morning / happy Friday! Last night an article from, of all places, VentureBeat, titled "Google Search starts indexing and letting users stream Android apps without matching web content", was sent to me, and as I read it I got a bit giddy, since we had just implemented a full sitewide HTTPS certificate rather than a cart-only SSL. I quickly searched for other sources to see if this was indeed true, and the writing on the wall seems to indicate so:
Google Webmaster Blog - http://googlewebmastercentral.blogspot.in/2015/12/indexing-https-pages-by-default.html
http://www.searchenginejournal.com/google-to-prioritize-the-indexing-of-https-pages/147179/
http://www.tomshardware.com/news/google-indexing-https-by-default,30781.html
https://hacked.com/google-will-begin-indexing-httpsencrypted-pages-default/
https://www.seroundtable.com/google-app-indexing-documentation-updated-21345.html
I found it a bit ironic to read about this on mostly unsecured sites. I wanted to share the eight key conditions that Google will factor in when indexing HTTPS pages from now on, and see what you all felt about this. Google will now begin to index HTTPS equivalents of HTTP web pages, even when the former don't have any links to them. However, Google will only index an HTTPS URL if it meets these conditions:
- It doesn't contain insecure dependencies.
- It isn't blocked from crawling by robots.txt.
- It doesn't redirect users to or through an insecure HTTP page.
- It doesn't have a rel="canonical" link to the HTTP page.
- It doesn't contain a noindex robots meta tag.
- It doesn't have on-host outlinks to HTTP URLs.
- The sitemap lists the HTTPS URL, or doesn't list the HTTP version of the URL.
- The server has a valid TLS certificate.
One rule that confuses me a bit is: "It doesn't redirect users to or through an insecure HTTP page." Does this mean that if you just moved over to HTTPS from HTTP your site won't pick up the HTTPS boost, since most sites have HTTP-to-HTTPS redirects? Thank you!
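For what it's worth, a few of the on-page conditions in that list can be checked statically. Here's a rough Python sketch (stdlib only, and the class and function names are my own invention); the robots.txt, redirect, sitemap, and TLS conditions would need live requests, so they're deliberately left out:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse


class HttpsReadinessParser(HTMLParser):
    """Collects the on-page problems from the checklist above.

    Only conditions visible in the HTML itself are checked here;
    robots.txt, redirect chains, sitemaps, and TLS validity would
    all need live requests.
    """

    def __init__(self, host):
        super().__init__()
        self.host = host
        self.problems = []

    def handle_starttag(self, tag, attrs):
        a = {k: (v or "") for k, v in attrs}
        # Condition: no noindex robots meta tag.
        if tag == "meta" and a.get("name", "").lower() == "robots" \
                and "noindex" in a.get("content", "").lower():
            self.problems.append("noindex robots meta tag")
        # Condition: no rel=canonical pointing at the HTTP page.
        if tag == "link" and a.get("rel", "").lower() == "canonical" \
                and a.get("href", "").startswith("http://"):
            self.problems.append("rel=canonical points at the HTTP page")
        # Condition: no insecure dependencies (resources loaded over plain HTTP).
        if tag in ("img", "script", "iframe") or \
                (tag == "link" and a.get("rel", "").lower() == "stylesheet"):
            url = a.get("src") or a.get("href") or ""
            if url.startswith("http://"):
                self.problems.append("insecure dependency: " + url)
        # Condition: no on-host outlinks to HTTP URLs.
        if tag == "a":
            href = a.get("href", "")
            if href.startswith("http://") and urlparse(href).hostname == self.host:
                self.problems.append("on-host HTTP outlink: " + href)


def https_problems(html, host):
    parser = HttpsReadinessParser(host)
    parser.feed(html)
    return parser.problems
```

Running something like this against the rendered HTML of your key templates should surface most of the easy-to-fix issues before worrying about the server-side conditions.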
-
Images added to the website automatically become URLs - is this an issue?
Hello Mozzers! I've just been trawling through a website and noticed all of the images have their own URLs. There's a bespoke CMS, and that's how it works with images... So out of 1,447 URLs, 1,314 are images. Firstly, is this an issue from an SEO perspective? If it is, how should I deal with it? Thanks in advance, Luke
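As a first step, it can help to establish whether those 1,314 URLs are actual image files (usually harmless, since search engines treat them as images) or thin HTML wrapper pages, which are the ones worth worrying about. A minimal sketch of that first pass, with a made-up helper name, could look like:

```python
from urllib.parse import urlparse

# Common raster/vector extensions; extend as needed for the site in question.
IMAGE_EXTENSIONS = (".jpg", ".jpeg", ".png", ".gif", ".webp", ".svg")


def split_crawl(urls):
    """Split a crawl export into page URLs and image-file URLs by extension.

    Anything left in `pages` that the CMS generated around an image is a
    candidate thin wrapper page worth reviewing.
    """
    pages, images = [], []
    for url in urls:
        path = urlparse(url).path.lower()
        (images if path.endswith(IMAGE_EXTENSIONS) else pages).append(url)
    return pages, images
```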
-
SEO for sub-locations for specific services
Hey guys, I am currently creating a website for my business, which I will be marketing very heavily through SEO. I live in NYC, and I'd like to rank for the individual locations such as Queens, Brooklyn, and Long Island, and eventually, once my domain authority and other long-haul metrics kick in, NYC itself. What I find very tiring is targeting these locations separately; it means I need to create essentially the same site four times with completely different and unique content. Should this setup work for me, and is there a risk that Google will see four web design pages and decide that, even though the content is unique, I'm targeting web design plus a location too many times? From my understanding this is not a problem now, but is it a future risk? It also becomes extremely difficult to handle site navigation with about pages, contact pages, and other pages that either have to be duplicated or all shown in the sidebar for navigation. Please share your thoughts with me. Thanks!!!
-
How do you get great content for a small business?
We always talk about great, engaging content being the way forward for sites. As a small business, this is an expensive commodity to outsource when you have probably in the region of 250 pages that could all use some work. To that end, I have some questions: How do you make a product or category description engaging? Should they still contain a certain number of words (personally, I hate reading reams of text)? For on-page SEO, what should we be striving to achieve? I am sure this has all been asked before, but what's the general consensus right now?
-
Does changing website design reduce traffic? I don't think so.
Hi, around November I was working on the website and, for various reasons, had to change its design. I saw my traffic going down and down (70-100 visits/day), so I rolled back to the previous design. After that it improved a little, but not to previous levels (250-300/day). Question: all URLs, content, and links are the same, so how can the design change affect traffic? We have removed all the errors shown in the SEOmoz report, but traffic is still an issue. We are working hard on SEO and trying to recover. I'm looking forward to your answers on how I can overcome this. Thanks, regards
-
Should I block non-informative pages from Google's index?
Our site has about 1,000 pages indexed, and the vast majority of them are not useful and/or contain little content. Some of these are:
- Galleries
- Pages of images with no text except for navigation
- Popup windows that contain further information about something but have no navigation, and sometimes only a couple of sentences
My question is whether or not I should put a noindex in the meta tags. I think it would be good because the ratio of quality to low-quality pages right now is not good at all. I am apprehensive because if I'm blocking more than half my site from Google, won't Google see that as a suspicious or bad practice?
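Before committing to noindexing half the site, it may help to triage exactly which pages would be affected. A rough sketch like this, where the 50-word threshold and the page-type labels are entirely hypothetical and would need tuning for the real site, can turn the question into a concrete list:

```python
def noindex_candidates(pages, min_words=50):
    """Flag thin pages as candidates for a noindex meta tag.

    `pages` is a list of dicts like {"url": ..., "word_count": ..., "type": ...}.
    Both the 50-word threshold and the page-type labels below are made up
    for illustration; adjust them to however your crawl data is labeled.
    """
    thin_types = {"gallery", "popup", "image-only"}
    return [p["url"] for p in pages
            if p.get("type") in thin_types or p.get("word_count", 0) < min_words]
```

Reviewing the resulting list by hand, rather than noindexing by rule alone, keeps pages that are thin on text but still valuable to visitors out of the cut.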