What's the best way to test an Angular JS-heavy page for SEO?
-
Hi Moz community,
Our tech team has recently decided to try switching our product pages to be JavaScript-dependent; this includes rendering links, product descriptions, and things like breadcrumbs in JS. Given my concerns, they will create a proof of concept with a few product pages in a QA environment so I can test the SEO implications of these changes. They are planning to use Angular 5 client-side rendering without any prerendering. I suggested Angular Universal (server-side rendering), but they said the lift was too great, so we're testing to see whether this works.
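For reference, a client-side-only Angular build means something like this standard bootstrap (a generic sketch, not our actual code), where the HTML a crawler fetches is little more than an empty <app-root> shell until the bundle executes:

```typescript
// main.ts - typical Angular client-side bootstrap. Until this runs in the
// browser, links, product descriptions, and breadcrumbs don't exist in the
// document at all; they are created by JS at runtime.
import { platformBrowserDynamic } from '@angular/platform-browser-dynamic';
import { AppModule } from './app/app.module';

platformBrowserDynamic()
  .bootstrapModule(AppModule)
  .catch(err => console.error(err));
```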
I've read a lot of the articles in this guide to all things SEO and JS, and I'm fairly confident I understand when a site uses JS and how to troubleshoot to make sure everything is getting crawled and indexed.
https://sitebulb.com/resources/guides/javascript-seo-resources/
However, I am not sure I'll be able to test the QA pages, since they aren't indexable and live behind a login. I will be able to crawl them using Screaming Frog, but its render is generally regarded as what a crawler should be able to crawl, not necessarily what Googlebot will actually be able to crawl and index.
Any thoughts on this? Is this concern valid?
Thanks!
-
Hi Zack,
I think your concern here is valid (your render with Screaming Frog or any other client is unlikely to be precisely representative of what Googlebot will see/index). That said, I'm not sure there's much you can do to eliminate this knowledge gap for your QA process.
For instance, while we have seen Googlebot timing out JS rendering around the 5-second mark when using the "Fetch & Render as Googlebot" functionality in Search Console (see slide 25 of Max Prin's slide deck here), there's no confirmation that this time limit represents Googlebot's behavior in the wild.
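If you want to probe that timeout yourself, one approach (a hypothetical sketch of the technique, not code from Max's deck) is to inject marker text into the DOM after increasing delays and then check which markers appear in the rendered result:

```typescript
// Append a visible marker to the page after each delay. After a
// "Fetch & Render", whichever markers are missing tells you roughly
// where the rendering cutoff falls.
const delaysMs = [0, 1000, 3000, 5000, 8000];

delaysMs.forEach((delay) => {
  window.setTimeout(() => {
    const marker = document.createElement('p');
    marker.textContent = `rendered-after-${delay}ms`;
    document.body.appendChild(marker);
  }, delay);
});
```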
Additionally, we know that Googlebot crawls with limited JS support. For instance, my colleague Tom Anthony found that when a script uses JS to generate a random number, Googlebot's random() function is deterministic (it returns a predictable sequence of values) - so it's clear they have modified the headless version of Chrome they use to conserve computational resources in this way, and we can only assume they've taken other cost-saving steps too. None of this is baked into Screaming Frog or any other crawling tool.
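You can replicate that sort of experiment yourself - a minimal sketch (my illustration, not Tom's exact test) is to render the random value into the page and compare what gets rendered and indexed across repeated fetches:

```typescript
// In a real browser, every page load shows a fresh value. If the value
// Googlebot renders repeats across fetches, its random() is deterministic.
const probe = document.createElement('p');
probe.textContent = `random-probe-${Math.random()}`;
document.body.appendChild(probe);
```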
We have seen that with a 5s rendering timeout set in Screaming Frog, the rendered result is pretty close to what the "Fetch & Render as Googlebot" functionality shows. And given the ubiquity of JS-driven content on the web today, provided links and content are rendered into the DOM quickly (well ahead of that 5s mark), we've seen Google render and index JS content fairly reliably.
The ideal would be for your dev team to code these pages to degrade gracefully, so that even with JS support totally disabled, navigation and content elements are still rendered: they should be delivered in the page source, then enhanced with JS where possible.
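In practice, that means the server sends real, crawlable markup and JS only upgrades the experience. A rough sketch of the pattern (the selectors and behaviour here are illustrative, not from your site):

```typescript
// The <a href> links already exist in the server-delivered HTML, so crawlers
// and JS-disabled users can follow them. JS intercepts clicks to provide
// app-like navigation, but nothing breaks if the script never runs.
document.querySelectorAll<HTMLAnchorElement>('a.product-link').forEach((link) => {
  link.addEventListener('click', (event) => {
    event.preventDefault();
    history.pushState({}, '', link.href); // client-side route change
    // ...then render the product view via your router or a fetch() call
  });
});
```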
Failing that, the best you're likely to achieve here is reasonable confidence that Googlebot can crawl, render, and index these pages - there will still be some risk when you publish them to production.
Hope this helps somewhat - best of luck!
Thanks,
Mike