Is there still a fold, Virginia? Or has scrolling taken away the need?
-
Some people have declared the 'fold' dead because people scroll. Others, citing eye-tracking studies, hold that most attention is still focused at the top of the page: 80.3% of users' attention fell above the fold (the top 600-800 pixels). The case becomes especially strong on mobile devices, where it is more inconvenient than ever to reach content far down the page on a screen ranging from 3.5″ to 5″.
Opinions?
-
The_Sage's answer is excellent, in my opinion.
Personally, I am a modern user, but the large majority of visitors to the websites I manage are not.
There are a few ways of checking what kind of visitors you have using Google Analytics: https://www.google.it/webhp?q=google+analytics+users+scroll
-
My theory is that there are now two ways of using the Web. Modern, experienced Web users don't really rely on the "fold" to read a site; their first action is to skim. But there's still a class of Web users who treat the Web like a television: they click onto a site and "view" it. How does your audience use your website?
Related Questions
-
Need help with list schema!
Hi all, I am trying out list schema on my site, but in Google's structured data testing tool I'm having an issue with the URL section. Whenever I have the same URL for each position, it says that duplicate URLs aren't allowed; then, when I use different URLs, it says that they all have to be the same URL. Does anyone have any pointers to help make my list schema error-free? Here's my schema:
Technical SEO | Saba.Elahi.M.0
-
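Since the schema itself didn't come through above, here is a minimal ItemList sketch (with hypothetical URLs) in the "summary page" style, where each ListItem carries its own distinct detail-page URL — the shape the testing tool expects when list items point to separate pages:

```python
import json

# Minimal ItemList sketch with hypothetical URLs. In the "summary page"
# pattern, every ListItem points to a DISTINCT detail-page URL.
item_list = {
    "@context": "https://schema.org",
    "@type": "ItemList",
    "itemListElement": [
        {"@type": "ListItem", "position": i + 1, "url": url}
        for i, url in enumerate([
            "https://example.com/item-one",
            "https://example.com/item-two",
            "https://example.com/item-three",
        ])
    ],
}

print(json.dumps(item_list, indent=2))
```

The error pattern described usually means the list mixes the two styles: either every item is a distinct URL (summary page) or the list describes content all living on one page — not both.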
Should you use the Google URL remover if older indexed pages are still being kept?
Hello, A client did a redesign a few months ago, reducing 700 pages to 60, mostly due to a Panda penalty and low interest in the products on those pages. Google is still indexing a good number of them (around 650) when we only have 70 on our sitemap; on average it indexes about 115 URLs for our site when only 60 need indexing. I would have thought these URLs would be crawled and returned as not found, but it is taking a very long time. Our rankings haven't recovered as much as we'd hoped, and we believe the indexed older pages are causing this. Would you agree, and would removing those old URLs via the removal tool be the best option? It would mean using the URL removal tool for 650 pages. Thank you in advance
Technical SEO | Deacyde0
-
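Before reaching for the removal tool, it helps to enumerate exactly which indexed URLs are no longer wanted. A minimal sketch (hypothetical URLs; in practice the first set would come from a `site:` query export and the second from your sitemap):

```python
# Hypothetical example: which URLs does Google still index that are
# absent from the current sitemap? Plain set difference answers it.
indexed_urls = {
    "https://example.com/old-product-1",
    "https://example.com/old-product-2",
    "https://example.com/current-page",
}
sitemap_urls = {
    "https://example.com/current-page",
    "https://example.com/new-page",
}

stale_urls = sorted(indexed_urls - sitemap_urls)
print(stale_urls)
```

The resulting list is what you would verify returns 404/410 (so Google eventually drops it on its own) or feed to the removal tool.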
Google still listing pages from old domain after 2 change requests
Good morning, I put forward the following question in December 2014, https://moz.com/community/q/google-still-listing-old-domain, as pages from our old domain www.fhr-net.co.uk were still indexed in Google. We have submitted two change requests in WMT; the most recent was over six months ago, yet the old pages are still being indexed and we can't see why that would be. Any advice would be appreciated.
Technical SEO | Ham19790
-
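One common reason old-domain pages linger in the index after a change-of-address request is that the old URLs are not serving permanent redirects. A hedged sketch, assuming Apache and a placeholder new domain, of a site-wide 301 from the old domain:

```apache
# Hypothetical virtual host for the old domain: 301 every path to the
# same path on the new domain (the new domain here is a placeholder).
<VirtualHost *:80>
    ServerName www.fhr-net.co.uk
    Redirect permanent / https://www.example-new-domain.co.uk/
</VirtualHost>
```

Worth confirming with a header check that each old URL actually answers `301 Moved Permanently` rather than 200 or 302.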
New pages need to be crawled & indexed
Hi there, When you add pages to a site, do you need to regenerate the XML sitemap and resubmit it to Google/Bing? I see the option in Google Webmaster Tools, under the "Fetch as Google" tool, to submit individual pages for indexing, which I am doing right now. Thanks,
Sarah
Technical SEO | SSFCU0
-
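On the regeneration question: the usual practice is to rebuild the sitemap whenever pages are added and let search engines re-fetch it from its known location. A minimal sketch of generating one with the Python standard library (hypothetical page list; a real build would pull URLs from the CMS):

```python
import xml.etree.ElementTree as ET

# Hypothetical page list; a real build would pull these from the CMS.
pages = [
    "https://example.com/",
    "https://example.com/new-page",
]

# Build <urlset> per the sitemaps.org protocol, one <url>/<loc> per page.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

sitemap_xml = ET.tostring(urlset, encoding="unicode")
print(sitemap_xml)
```

Submitting individual URLs via "Fetch as Google" still works for urgent pages, but a fresh sitemap covers everything at once.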
Recommendations Needed: Hosting Company
I need a new hosting company ASAP. I am based in Costa Rica but need a reliable international service that supports MODX. Any suggestions would be greatly appreciated!
Technical SEO | | Llanero0 -
Need help with Joomla duplicate content issues
One of my campaigns is for a Joomla site (http://genesisstudios.com), and when my full crawl was done and I reviewed the report, I had significant duplicate content issues. They seem to come from the automatic creation of /rss pages. For example, http://www.genesisstudios.com/loose is the page, but the duplicate content shows up as http://www.genesisstudios.com/loose/rss. It appears that Joomla creates feeds for every page automatically, and I'm not sure how to address the problem they create. I have been chasing down duplicate content issues for some time and thought they were gone, but now I have about 40 more instances of this type. It also appears that even though a canonicalization plugin is present and enabled, the crawl report shows 'false' for rel=canonical tags. Anyone got any ideas? Thanks so much... Scott
Technical SEO | sdennison0
-
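One blunt option, assuming the /rss suffix pattern holds site-wide, is a wildcard disallow in robots.txt (a sketch only — this blocks crawling rather than indexing, so disabling feeds in Joomla's layout options or getting the canonical plugin actually emitting tags would be more thorough):

```text
User-agent: *
Disallow: /*/rss
```

Note that a page blocked in robots.txt can't pass a rel=canonical hint, so pick one approach rather than stacking both on the same URLs.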
Why are old versions of images still showing for my site in Google Image Search?
I have a number of images on my website with a watermark. We changed the watermark on all of our images in May, but when I search for my site getmecooking in Google Image Search, it still shows the old watermark (the old one is grey, the new one is orange). Is Google not updating the images in its search results because they are cached? Or because it downloaded them once and is now ignoring them? Should we be giving our images a version number at the end of the file name? Our website cache is set to 7 days, so that's not the issue. Thanks.
Technical SEO | | Techboy0 -
We have a ton of legacy links that include /?ref=tracking-goes-here. We need to reconcile this; can the canonical tag be used to fix it? How?
www.firehost.com/?ref=pressrelease example - http://cl.ly/2O1d1x2m3b1b3K1K0h2J
Technical SEO | FirePowered0
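This is the textbook use of rel=canonical: the parameterized URLs keep working for tracking, while each page declares its clean URL as the canonical version. A sketch for the example above:

```html
<!-- In the <head> of www.firehost.com/?ref=pressrelease
     (and every other /?ref=... variant of the homepage) -->
<link rel="canonical" href="https://www.firehost.com/" />
```

With that in place, search engines consolidate the ?ref= variants onto the clean URL, and the legacy links keep functioning unchanged.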