What's the best way to test an Angular JS-heavy page for SEO?
-
Hi Moz community,
Our tech team has recently decided to try switching our product pages to be JavaScript-dependent; this includes rendering links, product descriptions, and elements like breadcrumbs in JS. Given my concerns, they will create a proof of concept with a few product pages in a QA environment so I can test the SEO implications of these changes. They are planning to use Angular 5 client-side rendering without any prerendering. I suggested Angular Universal, but they said the lift was too great, so we're testing to see if this works.
I've read many of the articles in this guide to all things SEO and JS, and I'm fairly confident I understand when a site relies on JS and how to troubleshoot to make sure everything is getting crawled and indexed.
https://sitebulb.com/resources/guides/javascript-seo-resources/
However, I am not sure I'll be able to test the QA pages, since they aren't indexable and live behind a login. I will be able to crawl them using Screaming Frog, but that generally shows what a crawler should be able to crawl, not what Googlebot will actually crawl and index.
Any thoughts on this? Is this concern valid?
Thanks!
-
Hi Zack,
I think your concern here is valid (your render with Screaming Frog or any other client is unlikely to be precisely representative of what Googlebot will see/index). That said, I'm not sure there's much you can do to eliminate this knowledge gap for your QA process.
For instance, while we have seen Googlebot time out JS rendering around the ~5s mark when using the "Fetch & Render as Googlebot" functionality in Search Console (see slide 25 of Max Prin's slide deck here), there's no confirmation that this time limit represents Googlebot's behavior in the wild.
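If you want to probe that timeout on your QA pages, a minimal sketch (the delay values and element IDs here are just placeholders) is to deploy a few page variants that inject a unique marker into the DOM after different delays, then compare what Fetch & Render shows for each variant:

```typescript
// Hypothetical timeout probe: inject a marker into the DOM after a
// configurable delay. Deploy variants at e.g. 1s, 4s, 6s, and 10s and
// check which markers survive into the rendered/indexed snapshot.
const RENDER_DELAY_MS = 6000; // vary this per test variant

window.setTimeout(() => {
  const marker = document.createElement('p');
  marker.id = 'render-timeout-probe';
  marker.textContent = `Injected after ${RENDER_DELAY_MS}ms`;
  document.body.appendChild(marker);
}, RENDER_DELAY_MS);
```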
Additionally, we know that Googlebot crawls with limited JS support. For instance, when a script uses JS to generate a random number, my colleague Tom Anthony found that Googlebot's Math.random() function is deterministic (it returns a predictable sequence), so it's clear they have modified the headless version of Chrome they use to conserve computational expense in this way. We can only assume they've taken other cost-saving steps as well, and none of this is baked into Screaming Frog or any other crawling tool.
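A minimal sketch of that kind of check (not Tom's original test, just the same idea): print a few Math.random() values into the page and fetch-and-render it repeatedly. A normal browser yields a different sequence on every load, while a deterministic renderer shows the same one each time.

```typescript
// RNG probe: write a short sequence of Math.random() values into the DOM.
// If every render of this page shows an identical sequence, the
// renderer's random number generator has been made deterministic.
const out = document.createElement('pre');
out.id = 'rng-probe';
out.textContent = Array.from({ length: 5 }, () => Math.random()).join('\n');
document.body.appendChild(out);
```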
We have seen that with a 5s rendering timeout set in Screaming Frog, the rendered result is pretty close to what the "Fetch & Render as Googlebot" functionality shows. And with the ubiquity of JS-driven content on the web today, provided links and content are rendered into the DOM quickly (well ahead of that 5s mark), we've seen Google render and index JS content fairly reliably.
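If you want a second opinion beyond Screaming Frog for those login-protected QA pages, one rough local approximation is to render them in headless Chrome yourself and diff the rendered DOM against the raw HTML source. This is only a sketch, assuming Puppeteer is available in your environment; the login selectors and URLs are hypothetical placeholders:

```typescript
// Sketch: render a QA page behind login with headless Chrome and dump
// the post-JS DOM, so you can compare it against the raw page source.
import puppeteer from 'puppeteer';

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  // Authenticate first (hypothetical login form).
  await page.goto('https://qa.example.com/login');
  await page.type('#username', process.env.QA_USER ?? '');
  await page.type('#password', process.env.QA_PASS ?? '');
  await Promise.all([page.click('#login-submit'), page.waitForNavigation()]);

  // Load the product page and give scripts a bounded window to run,
  // mirroring the ~5s budget discussed above.
  await page.goto('https://qa.example.com/product/123', {
    waitUntil: 'networkidle0',
    timeout: 30000,
  });
  console.log(await page.content()); // rendered DOM after JS execution

  await browser.close();
})();
```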
The ideal would be for your dev team to code these pages to degrade gracefully, so that even with JS support totally disabled, navigation and content elements are still rendered: they should be delivered in the page source, then enhanced with JS, if possible.
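To illustrate that enhancement pattern (the element names here are hypothetical): the links and description ship in the server-delivered HTML, so a crawler with no JS support still sees them, and the script only layers behavior on top.

```typescript
// Sketch of graceful degradation: the markup below arrives in the raw
// page source, so it is visible even with JS disabled:
//
//   <nav class="breadcrumbs"><a href="/shoes">Shoes</a> ...</nav>
//   <div id="product-description">Full description in page source...</div>
//
// JS then enhances the plain links with client-side navigation.
document.addEventListener('DOMContentLoaded', () => {
  document
    .querySelectorAll<HTMLAnchorElement>('nav.breadcrumbs a')
    .forEach((link) => {
      link.addEventListener('click', (event) => {
        event.preventDefault();
        history.pushState({}, '', link.href); // SPA-style route change
        // ...trigger the client-side render here; without JS the <a>
        // still works as an ordinary link.
      });
    });
});
```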
Failing that, the best you're likely to achieve here is reasonable confidence that Googlebot can crawl, render, and index these pages; there'll be some risk when you publish them to production.
Hope this helps somewhat - best of luck!
Thanks,
Mike