Infinite Scrolling on Publisher Sites - is VentureBeat's implementation really SEO-friendly?
-
I've just begun a new project auditing the site of a news publisher. To increase pageviews (and thus advertising revenue), at some point in the past they implemented functionality whereby up to five different articles load on each article page. All of the articles load at the same time, and judging by Google's cache and the errors flagged in Search Console, Google treats the page as one big mass of content rather than as separate pages. It's also worth noting that as a user scrolls down, the URL does in fact change when they reach the next article.
My initial thought was to remove this functionality and just load one article per page. However I happened to notice that VentureBeat.com uses something similar.
They use infinite scrolling so that the other articles on the page (in a 'feed' style) only load when a user scrolls to the bottom of the first article. I checked Google's cached versions of the pages, and it seems that Google also only reads the first article, which looks like an ideal solution. It has the added benefit of speeding up page load time too.
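(For context, the scroll-triggered URL change described above is usually done with the browser History API: as each article enters the viewport, the page swaps in that article's URL. Here's a minimal sketch of the underlying logic, using hypothetical slugs, that decides which URL and prev/next neighbours each article position should expose — not VentureBeat's actual code:)

```javascript
// Hypothetical slugs for the articles stacked on one infinite-scroll page.
const articles = [
  "/2016/11/11/first-article/",
  "/2016/11/11/second-article/",
  "/2016/11/11/third-article/",
];

// For the article currently in the viewport, work out the URL the address
// bar should show, plus its previous/next neighbours in the feed.
function paginationState(articles, index) {
  return {
    canonical: articles[index],
    prev: index > 0 ? articles[index - 1] : null,
    next: index < articles.length - 1 ? articles[index + 1] : null,
  };
}

// In the browser, an IntersectionObserver would call this as each article
// scrolls into view, then update the URL without a reload:
//   history.replaceState(null, "", paginationState(articles, i).canonical);
const state = paginationState(articles, 1);
console.log(state.canonical); // the second article's slug
```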
My question is: is VentureBeat's implementation actually that SEO-friendly or not?
VentureBeat have 'sort of' followed Google's guidelines on implementing infinite scrolling (https://webmasters.googleblog.com/2014/02/infinite-scroll-search-friendly.html) by using prev and next tags for pagination (https://support.google.com/webmasters/answer/1663744?hl=en). However, isn't the point of pagination to list multiple pages in a series (i.e. page 2, page 3, page 4, etc.) rather than just other related articles?
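(For reference, the markup Google's pagination guidance describes goes in the head of each page in a series; the URLs below are hypothetical, just to show the shape of a 'true' paginated series rather than a feed of related articles:)

```html
<!-- On hypothetical page 2 of a three-page series: -->
<link rel="prev" href="http://example.com/long-article?page=1">
<link rel="next" href="http://example.com/long-article?page=3">
```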
Here's an example - http://venturebeat.com/2016/11/11/facebooks-cto-explains-social-networks-10-year-mission-global-connectivity-ai-vr/
Would be interesting to know if someone has dealt with this first-hand or just has an opinion.
Thanks in advance!
Daniel
-
Totally agreed, Daniel! I'd also say it's our job to set expectations and be clear about when something is a test vs when something will more than likely work. Consulting is all about setting expectations!
-
Thanks a lot for your thoughts on this, John. Really appreciate you taking the time to look into it.
You make a great point about not always copying competitors without testing first. If something is rolled out on such a wide scale, it's always going to be a hard case to put to the client, knowing that they'll lose out on advertising revenue in the short term. Regardless, I think it's our job as SEOs to first and foremost propose the most SEO-friendly implementation possible.
-
This is actually a really interesting question. I looked at their category pages (e.g. http://venturebeat.com/tag/ar-vr-weekly/) and those seem to be set up correctly for infinite scroll, as they point search engines to the next page.
I've not come across infinite scroll on articles like this before, though. I'm sure they've tested it extensively to figure out the best way to point search engines at further articles, but who really knows whether it's effective. If it's still in place, I'd assume they've seen positive signs, but it is definitely a non-standard implementation of rel-next/prev!
This does bring up a good point about copying (or not copying) a competitor's strategy. They have this implemented, but would it work for your own site or business? Maybe, but maybe not. We can't be sure until we test it ourselves (or speak with someone at VentureBeat who wants to share their learnings :-)). If you know when it was rolled out, you could benchmark from there and use SEMrush or another tool to track their organic visibility, and from that draw at least some correlation, if not causation.
Thanks for flagging this up! It's cool to see.
-
It depends on the application and other design aspects.
I have seen websites that implement the same thing and, like morons, keep a never-accessible footer there as well... you have no idea how impossible it was to get to the social bar/links at the bottom.
To be honest, you have to think of the user experience: while there may be good technical reasons for such a design, in the end you must consider what the user goes through and what they want to get out of it. A/B testing these kinds of things wouldn't hurt either.
But honestly, only "feeds" should work this way - a Facebook feed, a Twitter feed, a news feed - and even then each application should be considered with care.
Disclosure: I personally hate this behavior by default... basically the only places I find it acceptable are Facebook and Twitter.