If Fetch As Google can render the website, should it appear in the SERPs?
-
Hello everyone and thank you in advance for helping me.
I have a React application built with Create React App (zero configuration). It connects via Axios to an API built with CodeIgniter (PHP).
Before switching to React, this website was at the top of Google's SERPs for specific keywords. After moving to React and changing some URLs, with no redirects in .htaccess or anywhere else, I lost my search engine visibility! I guessed it was caused by a Google penalty.
I tried "react-snap", "react-snapshot", and similar tools for prerendering, but ran into many problems with them. I also tried Prerender.io, but unfortunately my hosting provider wouldn't help me configure it on the shared host!
Finally, I found a great article, and my website now displays in the Rendering section of Fetch As Google. The dynamic content still doesn't appear in the Fetching section, but I can see my entire website in both "This is how Googlebot saw the page" and "This is how a visitor to your website would have seen the page" for all pages, without any problem.
If Fetch As Google can render the entire website, is it possible that my pages will be indexed after a while and appear in Google's SERPs?
-
Absolutely not a problem. I do think that SSR would be a really positive way forward for your website! Hopefully that will get the trend-line going up again instead of down.
-
Thank you, Effectdigital, for this response and for spending your time on it. I read it twice to fully understand, and it explained everything in detail. I'm going to research some of the terms you mentioned above. I plan to implement SSR in a few months and put this problem behind me.
-
From the sounds of it, it's not a penalty - it's just a botched migration (with no redirects) to a new platform which is less search-accessible than the previous platform.
Fetch and render has many pitfalls. It (WRONGLY) makes webmasters think that every crawl Google does will be to that level of depth. What you get with fetch and render is a best-case scenario, where Google is deploying all of its crawling and rendering technologies for you, including rendered browsing (to capture generated content).
You have your base (un-modified) source code, and then you have your modified source code. To get at the latter (which is far richer, especially for sites that are mostly generated) you have to run a crawler that uses a headless browser (something like Selenium or Windmill, driven through something like Python) to fire the scripts and harvest the modified source data. These days that doesn't take an extreme amount of time in absolute terms, but it does compared to base-source scraping (on average 10x longer). It may still seem like seconds to you, but believe me, it takes much more time than near-instant source-code scraping.
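To make the two kinds of crawl concrete, here's a rough sketch. This is not Google's actual pipeline, just an illustration of the difference in work involved; running the rendered version live assumes Selenium and a browser binary are installed.

```python
# Base-source scrape vs. rendered crawl - an illustrative sketch.
import urllib.request

def base_source(url):
    """Base-source scrape: a plain HTTP fetch. Scripts never fire,
    so a React shell comes back almost empty."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

def rendered_source(driver, url):
    """Rendered crawl: load the page in a (headless) browser, let the
    JavaScript run, then read the modified DOM back out. `driver` is
    any object with the Selenium WebDriver interface (get + page_source)."""
    driver.get(url)
    return driver.page_source
```

With Selenium that would be something like `rendered_source(webdriver.Chrome(options=headless_opts), url)` - the extra browser start-up and script execution is exactly where the ~10x cost over a plain fetch comes from.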
Google's mission is to index the web. Do you really think they're going to take a 10x efficiency hit just because modern devs have decided that more modified content is faster and better?
Well... they will and they won't. Google has confirmed that it can and does crawl in this way. But results from moves just like yours constantly show that they don't deploy this tech for everyone - and even when they do, they don't use it for every crawl (scrape).
If you're in control of a huge site that Google can't afford to lose from their index (like Compare the Market, Barclays, Coca-Cola, etc.) then you have a lot more room to play in this area and reap the benefits of a lightning-fast CMS (and front-end deployment, and obviously better UX).
If you're not in that position, don't be surprised when these things happen. You have to have some perspective on yourself and what your site is worth to the web. To you it's everything; to Google it's one grain of sand on a vast ocean floor. And it's one grain of sand that is making Google's life harder, by hitting the efficiency of their core mission.
There may be some stuff you can do to fix this, or it may be time to swallow a bitter pill and do a roll-back.
Looking at your source code:
^ the above link will only work in Google Chrome!
It is obvious that it's extremely bare
Let's download the 'base' source code to a PHP file:
It's actually just 3 lines of code, but it probably takes up the space of... well, a lot more than that (maybe a hundred lines).
But here's your modified source code:
It's WAY BIGGER - 49 lines of code - and even then it's highly condensed.
My assertion to you is that not enough of your code and content resides within the 'base' source code; most of it is in the modified source code.
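You can sanity-check that assertion yourself by stripping the tags and counting the visible words in each version of the source. The two HTML strings below are hypothetical stand-ins for a typical Create React App shell and its rendered output, not your actual pages:

```python
from html.parser import HTMLParser

class VisibleText(HTMLParser):
    """Counts the words in an HTML document's text nodes."""
    def __init__(self):
        super().__init__()
        self.words = 0

    def handle_data(self, data):
        self.words += len(data.split())

def visible_words(html):
    parser = VisibleText()
    parser.feed(html)
    return parser.words

# Hypothetical base source: the empty shell a plain scraper sees.
base = '<html><body><div id="root"></div><script src="/static/js/main.js"></script></body></html>'
# Hypothetical rendered source: the DOM after the scripts have fired.
rendered = '<html><body><div id="root"><h1>Our Services</h1><p>All the content your rankings were built on.</p></div></body></html>'

print(visible_words(base), visible_words(rendered))  # the base shell is 0 - nothing for a base-source crawl to index
```

If the base number is near zero and the rendered number is where all your content lives, a crawler that only reads base source sees an empty page.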
It's a tough lesson to learn. Yeah, Google 'can' do many things. Yeah, their analysis tools put their best foot forward and show you what they 'can' do. But 'can' and 'will'... they're different cookies, man.
If you have a powerful enough server (and if you don't, maybe it's time to get one!), you could have all the scripts fire server-side and then serve users (and search engines) the pre-rendered base source. Or do something clever like that. This is not game over, but you'll need to get really smart now. And I wouldn't recommend doing any of that without retrospectively going back (FAST) and doing a full, URL-to-URL 301 redirect migration project (using .htaccess or web.config).
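For the .htaccess route, a minimal sketch of the kind of rules that migration would use - the paths here are hypothetical placeholders; the real project maps every old URL to its actual new equivalent:

```apache
# .htaccess sketch (Apache with mod_rewrite enabled). Paths are illustrative only.
RewriteEngine On

# One explicit rule per migrated URL - old PHP path to new React route:
RewriteRule ^about-us\.php$ /about [R=301,L]
RewriteRule ^services\.php$ /services [R=301,L]

# A catch-all pattern for a whole section, if old and new slugs line up:
RewriteRule ^articles/(.*)$ /blog/$1 [R=301,L]
```

The 301 status is what carries the old URLs' equity across to the new ones; a plain 302 or no redirect at all leaves it stranded.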
The faster you act, the more likely your recovery.