What's the best way to test Angular JS heavy page for SEO?
-
Hi Moz community,
Our tech team has recently decided to try switching our product pages to be JavaScript-dependent; this includes rendering links, product descriptions, and elements like breadcrumbs in JS. Given my concerns, they will create a proof of concept with a few product pages in a QA environment so I can test the SEO implications of these changes. They are planning to use Angular 5 client-side rendering without any prerendering. I suggested Angular Universal, but they said the lift was too great, so we're testing to see whether this works.
I've read a lot of the articles in this guide to all things SEO and JS, and I'm fairly confident I can recognize when a site relies on JS and troubleshoot to make sure everything is getting crawled and indexed.
https://sitebulb.com/resources/guides/javascript-seo-resources/
However, I'm not sure I'll be able to test the QA pages, since they aren't indexable and live behind a login. I will be able to crawl them using Screaming Frog, but its render is generally regarded as showing what a crawler could crawl, not what Googlebot will actually be able to crawl and index.
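For reference, here's roughly how I was planning to spot-check the rendered output myself. This is just a minimal sketch assuming Node with Puppeteer; the QA URL and basic-auth credentials are hypothetical placeholders:

```typescript
import puppeteer from 'puppeteer';

// Compare the raw HTML response with the rendered DOM for a login-protected
// QA page. The URL and credentials below are hypothetical placeholders.
async function compareRawVsRendered(url: string): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  // HTTP Basic auth for the QA environment (swap in a form login if needed).
  await page.authenticate({ username: 'qa-user', password: 'qa-pass' });

  const response = await page.goto(url, { waitUntil: 'networkidle0' });
  const rawHtml = response ? await response.text() : '';

  // Give client-side rendering roughly the window Googlebot seems to allow.
  await new Promise(resolve => setTimeout(resolve, 5000));
  const renderedHtml = await page.content();

  // Crude signal: links present only after rendering are fully JS-dependent.
  const countLinks = (html: string) => (html.match(/<a\s/gi) || []).length;
  console.log(`Links in raw HTML: ${countLinks(rawHtml)}`);
  console.log(`Links in rendered DOM: ${countLinks(renderedHtml)}`);

  await browser.close();
}

compareRawVsRendered('https://qa.example.com/product/123');
```

Of course, this only tells me what a headless Chrome I control can render, which is exactly the gap I'm worried about.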
Any thoughts on this? Is this concern valid?
Thanks!
-
Hi Zack,
I think your concern here is valid (your render with Screaming Frog or any other client is unlikely to be precisely representative of what Googlebot will see/index). That said, I'm not sure there's much you can do to eliminate this knowledge gap for your QA process.
For instance, while we have seen Googlebot timing out JS rendering around the ~5s mark using the "Fetch & Render as Googlebot" functionality in Search Console (see slide 25 of Max Prin's slide deck here), there's no confirmation this time limit represents Googlebot's behavior in the wild.
Additionally, we know that Googlebot crawls with limited JS support. For instance, when a script uses JS to generate a random number, my colleague Tom Anthony found that Googlebot's Math.random() function is deterministic (it returns a predictable sequence), so it's clear Google has modified the headless version of Chrome it uses in order to reduce computational expense. We can only assume they've taken other cost-saving steps too. None of this is baked into Screaming Frog or any other crawling tool.
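If you want to probe for that kind of behavior yourself, here's a rough sketch (my own illustration, not Tom's original test) that renders a fresh headless page twice and compares the Math.random() sequences. Against stock headless Chrome the two runs will almost certainly differ; the point is that it's the sort of check you can run against any renderer you control:

```typescript
import puppeteer from 'puppeteer';

// Sample a few Math.random() values from a fresh headless page.
async function sampleRandoms(): Promise<number[]> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  const values = await page.evaluate(() =>
    Array.from({ length: 5 }, () => Math.random())
  );
  await browser.close();
  return values;
}

async function main(): Promise<void> {
  const first = await sampleRandoms();
  const second = await sampleRandoms();
  console.log('Run 1:', first);
  console.log('Run 2:', second);
  console.log(
    first.every((v, i) => v === second[i])
      ? 'Identical sequences: this renderer looks deterministic'
      : 'Sequences differ: Math.random() looks genuinely random here'
  );
}

main();
```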
We have seen that with a 5s rendering timeout set in Screaming Frog, the rendered result is pretty close to what the "Fetch & Render as Googlebot" functionality demonstrates. And given the ubiquity of JS-driven content on the web today, provided links and content are rendered into the DOM quickly (well ahead of that 5s mark), we've seen Google render and index JS content fairly reliably.
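One thing you can measure directly in your QA environment is how quickly your key content actually lands in the DOM. Here's a minimal sketch (hypothetical URL and selector) that times this with Puppeteer; if the number comes in comfortably under that ~5s mark, you're at least inside the window we've observed:

```typescript
import puppeteer from 'puppeteer';

// Time how long after navigation a JS-rendered element appears in the DOM.
// The URL and '.product-description' selector are hypothetical placeholders.
async function timeToContent(url: string, selector: string): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  const start = Date.now();
  await page.goto(url, { waitUntil: 'domcontentloaded' });
  await page.waitForSelector(selector, { timeout: 10000 });
  console.log(`${selector} rendered after ${Date.now() - start}ms`);

  await browser.close();
}

timeToContent('https://qa.example.com/product/123', '.product-description');
```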
The ideal would be for your dev team to build these pages to degrade gracefully, so that even with JS support totally disabled, navigation and content elements are still rendered: deliver them in the page source, then enhance them with JS where possible.
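As a rough illustration of what that enhancement layer might look like (the selectors and API endpoint here are hypothetical - the key point is that the script only adds behavior on top of content already present in the HTML source):

```typescript
// The breadcrumb and description markup arrive in the initial HTML payload;
// this script only *enhances* them. If JS never runs, the content survives.
document.addEventListener('DOMContentLoaded', () => {
  const breadcrumb = document.querySelector('.breadcrumbs'); // already in source
  if (breadcrumb) {
    breadcrumb.classList.add('enhanced'); // e.g. attach collapsing behavior
  }

  // Client-side-only extras (live stock, personalization) are layered on top,
  // so a failure here never removes crawlable content.
  fetch('/api/stock/12345') // hypothetical endpoint
    .then(res => res.json())
    .then(data => {
      const el = document.querySelector('.stock-status');
      if (el) el.textContent = data.inStock ? 'In stock' : 'Out of stock';
    })
    .catch(() => { /* degrade silently; core content is unaffected */ });
});
```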
Failing that, the best you're likely to achieve here is reasonable confidence that Googlebot can crawl, render, and index these pages; there will be some residual risk when you publish them to production.
Hope this helps somewhat - best of luck!
Thanks,
Mike