Is using REACT SEO friendly?
-
Hi Guys
Is React SEO friendly? Has anyone used React, and what were the results? Or do you recommend something else that is better suited for SEO?
Many thanks for your help in advance.
Cheers
Martin
-
@martin1970 said in Is using REACT SEO friendly?:
Is React SEO friendly? Has anyone used React, and what were the results? Or do you recommend something else that is better suited for SEO?
React itself isn't inherently bad for SEO, but you do need to take extra care to optimize how it serves content to search engines. Many successful websites use React; they still invest in SEO.
Consider a framework such as Next.js, which handles server-side rendering and makes SEO-friendly development much easier. If your content rarely changes, a static site generator can be even more efficient.
-
@martin1970 said in Is using REACT SEO friendly?:
Is React SEO friendly? Has anyone used React, and what were the results? Or do you recommend something else that is better suited for SEO?
React can be SEO-friendly, but there are considerations to keep in mind due to its default client-side rendering. When search engines crawl websites, they traditionally expect server-rendered HTML for indexing. React applications often render content on the client side, which can pose challenges for search engine optimization (SEO).
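To make the client-side rendering problem concrete, here is an illustrative sketch (the URLs and page content are made up) comparing the initial HTML payload of a typical client-rendered React app with a server-rendered response for the same page:

```javascript
// Hypothetical illustration: the initial HTML of a client-side-rendered React
// app carries no indexable content -- just an empty mount point that is filled
// in only after the JavaScript bundle runs.
const clientRenderedHtml = `<!DOCTYPE html>
<html>
  <head><title>My Store</title></head>
  <body>
    <div id="root"></div>
    <script src="/bundle.js"></script>
  </body>
</html>`;

// What a server-rendered response for the same page might look like:
const serverRenderedHtml = `<!DOCTYPE html>
<html>
  <head><title>My Store</title></head>
  <body>
    <div id="root"><h1>Hand-made Picture Frames</h1><p>Browse 200 products.</p></div>
    <script src="/bundle.js"></script>
  </body>
</html>`;

// A crawler that does not execute JavaScript only ever sees the first payload.
console.log(clientRenderedHtml.includes('Picture Frames')); // false
console.log(serverRenderedHtml.includes('Picture Frames')); // true
```

Googlebot can execute JavaScript in a second rendering pass, but other crawlers may not, which is why the strategies below pre-render the HTML instead.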
To address this issue, there are a few strategies:
-
Server-Side Rendering (SSR):
- SSR involves rendering React components on the server before sending HTML to the client. This ensures that search engines receive fully rendered HTML, making content easily indexable.
- Tools like Next.js, a React framework, support SSR, providing a smoother SEO experience.
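The core SSR idea can be sketched without React itself: a "component" is just a function from props to markup, and the server runs it per request so the crawler receives complete HTML. (In a real app, Next.js or ReactDOMServer.renderToString performs this step; the component and product data below are invented for illustration.)

```javascript
// Minimal sketch of server-side rendering, with a plain function standing in
// for a React component.
const ProductPage = ({ name, price }) =>
  `<main><h1>${name}</h1><p>Price: $${price}</p></main>`;

// The server renders the component for every request, so the response body is
// complete, indexable HTML rather than an empty shell.
function renderPage(component, props) {
  const body = component(props);
  return `<!DOCTYPE html><html><head><title>${props.name}</title></head><body><div id="root">${body}</div></body></html>`;
}

const html = renderPage(ProductPage, { name: 'Custom Mat Board', price: 19 });
console.log(html.includes('Custom Mat Board')); // true
```

In React proper, the client then "hydrates" this server-sent HTML, attaching event handlers without re-rendering the content.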
-
Static Site Generation (SSG):
- SSG generates static HTML files during the build process. This approach ensures that content is pre-rendered, enhancing SEO performance.
- Next.js also supports SSG, making it a versatile choice for projects requiring strong SEO.
-
Prerendering:
- Prerendering involves generating static HTML for specific pages at build time. This approach combines the benefits of SSR and SSG, allowing developers to target critical pages for SEO optimization.
Several companies and developers have successfully implemented React with SEO in mind. By using SSR or SSG, they've achieved positive results in search engine rankings and overall visibility.
It's essential to note that while React can be SEO-friendly, other frameworks like Angular or Vue.js may also offer SEO solutions. The choice depends on the project's specific requirements and the developer's familiarity with the framework.
In summary, React can be made SEO-friendly through practices like SSR, SSG, or prerendering. Many developers have experienced success in maintaining good SEO performance with React, especially when using tools like Next.js. However, the decision should be based on the project's needs, available resources, and the development team's expertise. Always ensure that your chosen approach aligns with current SEO best practices to achieve optimal results.
-
I have been doing some research on this issue since there are lots of mixed opinions on it. According to friends who work closely on this, Google, Bing, Yahoo, and DuckDuckGo should all be able to fetch React-based single-page applications.
Custom Mat Board (which cuts customized mat boards for any Amazon or IKEA picture frames) is a React-based application, and it works well. Please check out Fetch as Google and note whether there are any major differences between what Googlebot sees and what humans see. If there are significant differences, you should do something about it. But in my experience, Googlebot and humans do see the same thing.
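A rough, assumption-laden heuristic for the check described above: strip the tags from the raw (pre-JavaScript) HTML and see how much visible text is left. A near-empty result suggests the content only exists after JavaScript runs, so a non-JS crawler would miss it. (This regex-based stripping is a quick sanity check, not a real HTML parser.)

```javascript
// Estimate how much human-visible text exists in raw HTML, before any
// JavaScript executes.
function visibleTextLength(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, '') // drop inline JS
    .replace(/<[^>]+>/g, ' ')                   // drop tags
    .replace(/\s+/g, ' ')                       // collapse whitespace
    .trim().length;
}

// An empty client-rendered shell vs a server-rendered page:
console.log(visibleTextLength('<html><body><div id="root"></div></body></html>')); // 0
console.log(visibleTextLength('<html><body><h1>Custom Mat Board</h1></body></html>')); // 16
```

You would feed this the HTML returned by a plain HTTP fetch of your page and compare it with what users see in the browser.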
PM me if you have any questions. Cheers!
WJ
-
Thanks for discussing this, Martijn.
Aside from Google, is there any concern that other search engines would have issues rendering a JS website, whether the site uses React, Angular or another framework?
Thanks
-SB
-
Hi Martin,
It can be; that's the honest answer. React loads its pages and content with JavaScript in most cases. Google and other search engines are able to read that content, but in these cases you should always verify what actually gets indexed. I've worked with many sites using React, and the outcome depends on whether they use server-side or client-side rendering. Start there to figure out what you can use for your client or company. Some teams are drawn to client-side rendering, which is a little more dangerous because Google cannot always see the actual content. With server-side rendering, I've seen it go well in most cases.
Let me know if you have any specific questions, happy to answer them!
Martijn.