Infinite Scrolling on Publisher Sites - is VentureBeat's implementation really SEO-friendly?
-
I've just begun a new project auditing the site of a news publisher. At some point in the past, in order to increase pageviews and thus advertising revenue, they implemented a feature that loads as many as five different articles on each article page. All of the articles load at the same time, and judging by Google's cache and the errors flagged in Search Console, Google treats the result as one big mass of content rather than separate pages. It's also worth noting that as a user scrolls down, the URL does change when they reach the next article.
My initial thought was to remove this functionality and just load one article per page. However, I happened to notice that VentureBeat.com uses something similar.
They use infinite scrolling so that the other articles on the page (presented in a 'feed' style) only load when a user scrolls to the bottom of the first article. I checked Google's cached versions of the pages, and it seems Google also only reads the first article, which looks like an ideal solution. It additionally has the benefit of speeding up the initial load time of the page.
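For what it's worth, here's roughly how that kind of scroll-triggered load could be wired up. This is a minimal sketch assuming hypothetical element IDs and a hypothetical next-article URL; only VentureBeat know what their actual code does.

```typescript
// A minimal sketch of scroll-triggered article loading with a URL swap.
// The element IDs and the next-article URL are hypothetical; this is an
// approximation, not VentureBeat's actual implementation.
const feed = document.getElementById("article-feed") as HTMLElement;
const sentinel = document.getElementById("end-of-first-article") as HTMLElement;
const nextArticleUrl = "/2016/11/11/some-related-article/"; // hypothetical

const observer = new IntersectionObserver(async (entries, obs) => {
  if (!entries.some((entry) => entry.isIntersecting)) return;
  obs.unobserve(sentinel); // fetch the next article only once

  // A crawler that doesn't scroll never triggers this fetch, which is
  // consistent with Google's cache showing only the first article.
  const html = await (await fetch(nextArticleUrl)).text();
  const next = document.createElement("article");
  next.innerHTML = html;
  feed.appendChild(next);

  // Swap the address-bar URL to match the article now in view
  // (simplified here to the moment the article is appended).
  history.replaceState(null, "", nextArticleUrl);
});

observer.observe(sentinel);
```

Because nothing past the first article exists in the HTML until a real user scrolls, the crawler-visible page stays small and fast, which would match both the cache behaviour and the load-time benefit above.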
My question is: is VentureBeat's implementation actually that SEO-friendly, or not?
VentureBeat have 'sort of' followed Google's guidelines on implementing infinite scrolling (https://webmasters.googleblog.com/2014/02/infinite-scroll-search-friendly.html) by using rel=prev/next tags for pagination (https://support.google.com/webmasters/answer/1663744?hl=en). However, isn't the point of pagination to link together multiple pages in a series (i.e. page 2, page 3, page 4, etc.) rather than just other related articles?
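For comparison, here's a hedged sketch of what that paginated-series pattern normally produces: rel=prev/next head tags that chain page 2 back to page 1 and forward to page 3, and so on. The URLs and the ?page= parameter are illustrative, not VentureBeat's actual markup.

```typescript
// Sketch: generate the rel="prev"/rel="next" head tags that Google's
// pagination guidance describes for a multi-page series. The URLs and
// the ?page= parameter are illustrative, not VentureBeat's markup.
function paginationLinkTags(baseUrl: string, page: number, lastPage: number): string {
  const pageUrl = (n: number): string =>
    n === 1 ? baseUrl : `${baseUrl}?page=${n}`;

  const tags: string[] = [];
  if (page > 1) tags.push(`<link rel="prev" href="${pageUrl(page - 1)}">`);
  if (page < lastPage) tags.push(`<link rel="next" href="${pageUrl(page + 1)}">`);
  return tags.join("\n");
}

// Page 2 of a three-page series links back to page 1 and forward to page 3.
console.log(paginationLinkTags("http://example.com/tag/ar-vr-weekly/", 2, 3));
// <link rel="prev" href="http://example.com/tag/ar-vr-weekly/">
// <link rel="next" href="http://example.com/tag/ar-vr-weekly/?page=3">
```

Pointing rel=next at a merely related article, rather than at the next page of the same series, is where VentureBeat's usage seems to diverge from that pattern.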
Here's an example - http://venturebeat.com/2016/11/11/facebooks-cto-explains-social-networks-10-year-mission-global-connectivity-ai-vr/
It would be interesting to know if anyone has dealt with this first-hand, or just has an opinion.
Thanks in advance!
Daniel
-
Totally agreed, Daniel! I'd also say it's our job to set expectations and be clear about when something is a test vs. when something will more than likely work. Consulting is all about setting expectations!
-
Thanks a lot for your thoughts on this, John. Really appreciate you taking the time to look into it.
You make a great point about not copying competitors without testing first. When something is rolled out on such a wide scale, it's always going to be a hard case to put to the client, since they know they'll lose out on advertising revenue in the short term. Regardless, I think it's our job as SEOs to first and foremost propose the most SEO-friendly implementation possible.
-
This is actually a really interesting question. I looked at their category pages (e.g. http://venturebeat.com/tag/ar-vr-weekly/) and those seem to be set up correctly for infinite scroll, as they send search engines to the next page.
I've not come across this with infinite scroll on articles, though. I'm sure they've tested extensively to figure out the best way to send search engines to subsequent articles, but who really knows if it's being effective. If it's still in place, I'd assume they've seen positive signs, but it is definitely a non-standard implementation of rel=next/prev!
This does bring up a good point about copying (or not copying) a competitor's strategy. They have this implemented, but would it work for your own site or business? Maybe, but maybe not. We can't be sure until we test it ourselves (or speak with someone at VentureBeat who wants to share their learnings :-)). If you know when it was rolled out, you could benchmark from there, use SEMrush or another tool to track their organic visibility, and draw at least some correlation, if not causation.
Thanks for flagging this up! It's cool to see.
-
It depends on the application and other design aspects.
I have seen websites that implement the same thing and, like morons, keep a never-reachable footer there as well... you have no idea how impossible it was to get to the social bar/links at the bottom.
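For what it's worth, one common remedy is to cap automatic loading and fall back to an explicit "Load more" button so the footer stays reachable. Here's a sketch with hypothetical element IDs and a placeholder batch loader, not anything from a specific site:

```typescript
// Sketch of one common fix: auto-load only a couple of batches, then
// switch to a manual "Load more" button so the footer stays reachable.
// Element IDs and the batch loader are hypothetical placeholders.
const MAX_AUTO_LOADS = 2;
let autoLoads = 0;

const sentinel = document.getElementById("feed-sentinel") as HTMLElement;
const loadMoreButton = document.getElementById("load-more") as HTMLButtonElement;

function appendNextBatch(): void {
  // ...fetch and append the next batch of feed items here...
}

const autoLoader = new IntersectionObserver((entries, obs) => {
  if (!entries.some((entry) => entry.isIntersecting)) return;
  if (autoLoads < MAX_AUTO_LOADS) {
    autoLoads += 1;
    appendNextBatch();
  } else {
    // Stop auto-loading; the user now has to click, so the page stops
    // growing under them and the footer can finally be reached.
    obs.disconnect();
    loadMoreButton.hidden = false; // the button starts hidden in the markup
  }
});
autoLoader.observe(sentinel);

loadMoreButton.addEventListener("click", appendNextBatch);
```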
To be honest, you have to think about the user experience: while there may be good technical reasons for such a design, in the end you must consider what the user goes through and what they want to get out of it. A/B testing these kinds of things wouldn't hurt either.
But honestly, only "feeds" should work this way: the Facebook feed, the Twitter feed, a news feed; and even then each application should be considered with care.
Disclosure: I personally hate this behavior by default... basically the only places I find it acceptable are Facebook and Twitter.