Is using React SEO friendly?
-
Hi Guys
Is React SEO friendly? Has anyone used React, and what were the results? Or do you recommend something else that is better suited for SEO?
Many thanks for your help in advance.
Cheers
Martin
-
@martin1970 said in Is using React SEO friendly?:
Is React SEO friendly? Has anyone used React, and what were the results? Or do you recommend something else that is better suited for SEO?
React itself isn't inherently bad for SEO, but you need to take extra care to optimize it for search. Many successful websites use React; SEO work simply remains essential.
Consider a framework such as Next.js, which handles server-side rendering and makes SEO-friendly development easier. For content that rarely changes, a static site generator can be even more efficient.
-
@martin1970 said in Is using React SEO friendly?:
Is React SEO friendly? Has anyone used React, and what were the results? Or do you recommend something else that is better suited for SEO?
React can be SEO-friendly, but there are considerations to keep in mind due to its default client-side rendering. When search engines crawl websites, they traditionally expect server-rendered HTML for indexing. React applications often render content on the client side, which can pose challenges for search engine optimization (SEO).
To address this issue, there are a few strategies:
-
Server-Side Rendering (SSR):
- SSR involves rendering React components on the server before sending HTML to the client. This ensures that search engines receive fully rendered HTML, making content easily indexable.
- Tools like Next.js, a React framework, support SSR, providing a smoother SEO experience.
-
Static Site Generation (SSG):
- SSG generates static HTML files during the build process. This approach ensures that content is pre-rendered, enhancing SEO performance.
- Next.js also supports SSG, making it a versatile choice for projects requiring strong SEO.
-
Prerendering:
- Prerendering involves generating static HTML for specific pages at build time. This approach combines the benefits of SSR and SSG, allowing developers to target critical pages for SEO optimization.
Several companies and developers have successfully implemented React with SEO in mind. By using SSR or SSG, they've achieved positive results in search engine rankings and overall visibility.
It's essential to note that while React can be SEO-friendly, other frameworks like Angular or Vue.js may also offer SEO solutions. The choice depends on the project's specific requirements and the developer's familiarity with the framework.
In summary, React can be made SEO-friendly through practices like SSR, SSG, or prerendering. Many developers have experienced success in maintaining good SEO performance with React, especially when using tools like Next.js. However, the decision should be based on the project's needs, available resources, and the development team's expertise. Always ensure that your chosen approach aligns with current SEO best practices to achieve optimal results.
-
I have been doing some research on this issue since there are a lot of mixed opinions on it. According to friends of mine who work closely on this, Google, Bing, Yahoo, and DuckDuckGo should all be able to fetch React-based single-page applications.
Custom Mat Board (which cuts customized mat boards for any Amazon or IKEA picture frame) is a React-based application, and it works well. Try Fetch as Google and note whether there are any major differences between what Googlebot sees and what humans see. If there are significant differences, you should do something about it. But in my experience, Googlebot and humans do see the same thing.
PM me if you have any questions. Cheers!
WJ
-
Thanks for discussing this, Martijn.
Aside from Google, is there any concern that other search engines would have issues rendering a JS website, whether the site uses React, Angular or another framework?
Thanks
-SB
-
Hi Martin,
It can be, and that's the honest answer. React uses JavaScript to load its pages and, in most cases, the content as well. Google and other search engines are able to read that content, but in these cases you always need to verify what the actual rendered result is. I've worked with many sites using React, and the outcome depends on whether they use server-side or client-side rendering. Start there to figure out what you can use for your client or company. Some teams are drawn to client-side rendering, which is a bit riskier, since Google can't always see the actual content. With server-side rendering, I've seen it go well for most sites.
Let me know if you have any specific questions, happy to answer them!
Martijn.