If Fetch As Google can render the website, should it appear in the SERPs?
-
Hello everyone and thank you in advance for helping me.
I have a React application built with Create React App (zero configuration). It connects via Axios to an API built with CodeIgniter (PHP).
Before moving to React, this website was at the top of Google's SERPs for specific keywords. After switching to React and changing some URLs, with no redirects in .htaccess or anywhere else, I lost my search engine visibility! I assumed it was caused by a Google penalty!
I tried "react-snap", "react-snapshot", and similar tools for prerendering, but there were so many problems with them. I also tried Prerender.io, but unfortunately my hosting provider wouldn't help me configure it on the shared host!
Finally, following a great article, I got my website to display in the Rendering box of Fetch As Google. The dynamic content still doesn't appear in the Fetching box, but I can see my entire website in both "This is how Googlebot saw the page" and "This is how a visitor to your website would have seen the page" for all pages, without any problem.
If Fetch As Google can render the entire website, will my pages be indexed after a while and appear in Google's SERPs?
-
Absolutely not a problem. I do think that SSR would be a really positive way forward for your website! Hopefully that will get the trend line going up again instead of down.
-
Thank you, Effectdigital, for this response and for spending your time on me. I read it twice to fully understand it, and it explained everything in detail. I'm going to research some of the terms you mentioned above. I plan to implement SSR in a few months and finally put this problem to rest.
-
From the sounds of it, it's not a penalty - it's just a botched migration (with no redirects) to a new platform which is less search-accessible than the previous platform.
Fetch and render has many pitfalls. It (WRONGLY) makes webmasters think that every crawl Google does will be to that level of depth. What you get with fetch and render is a best-case scenario, where Google deploys all of its crawling and rendering technologies for you, including rendered browsing (to capture generated content).
You have your base (unmodified) source code, and then you have your modified source code. To get at the latter (which is far richer, especially for sites which are mostly generated) you have to run a crawler which uses a headless browser (something like Selenium or Windmill, driven through something like Python) in order to fire the scripts and harvest the modified source data. These days that doesn't take an extreme amount of time in absolute terms, but it does when you compare it to base-source scraping (on average 10x longer). It may still seem like seconds to you, but believe me, it takes much more time than near-instant source-code scraping - see the sketch below.
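To make that base-vs-rendered distinction concrete, here's a minimal sketch of the two kinds of scrape, assuming Python 3 with the requests and selenium packages and a local Chrome install; the URL is a placeholder, not your actual site:

```python
# Minimal sketch (assumes Python 3, the `requests` and `selenium` packages,
# and a local Chrome install; Selenium 4.6+ fetches chromedriver itself).
# https://example.com/ is a placeholder, not the poster's real site.
import time

import requests
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

URL = "https://example.com/"  # placeholder URL

# 1) Base (unmodified) source: one cheap HTTP request, no scripts fired.
start = time.time()
base_html = requests.get(URL, timeout=10).text
print(f"Base source: {len(base_html)} chars in {time.time() - start:.2f}s")

# 2) Modified source: a headless browser loads the page, fires the scripts,
#    and we harvest the generated DOM afterwards. Far heavier per URL.
opts = Options()
opts.add_argument("--headless=new")  # plain "--headless" on older Chrome
driver = webdriver.Chrome(options=opts)
try:
    start = time.time()
    driver.get(URL)
    time.sleep(3)  # crude wait for client-side rendering to settle
    rendered_html = driver.page_source
    print(f"Rendered source: {len(rendered_html)} chars "
          f"in {time.time() - start:.2f}s")
finally:
    driver.quit()
```

Run against a client-rendered site, the second fetch returns far more markup and takes an order of magnitude longer - which is exactly the efficiency trade-off in question.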
Google's mission is to index the web. Do you really think they're going to take a blanket 10x efficiency hit because modern devs have decided that generating content client-side is faster and better?
Well... they will and they won't. Google have confirmed that they can and do crawl in this way. But results from moves just like yours are constantly showing us that they don't deploy this tech for everyone - and even when they do, they don't use it all the time for every crawl (scrape).
If you're in control of a huge site that Google can't afford to lose from their index (like Compare the Market, Barclays, Coca-Cola, etc.) then you have a lot more room to play in this area and reap the benefits of a lightning-fast CMS (and front-end deployment, and obviously better UX).
If you're not in that position, don't be surprised when these things happen. You have to have some perspective on yourself and what your site is worth to the web. To you it's everything; to Google it's one grain of sand on a vast ocean floor. And it's one grain of sand which is making Google's life harder, by hitting the efficiency of their core MO (mission objective).
There may be some stuff you can do to fix this, or it may be time to swallow a bitter pill and do a roll-back.
Looking at your source code (via a view-source: link, which will only work in Google Chrome!), it is obvious that it's extremely bare.
Let's download the 'base' source code to a PHP file:
It's actually just 3 lines of code, though it probably takes up the space of... well, a lot more than that (a hundred lines, maybe).
But here's your modified source code:
It's WAY BIGGER: 49 lines of code, and even then it's highly condensed.
My assertion to you is that not enough of your code and content resides within the 'base' source code; most of it is in the modified source code.
It's a tough lesson to learn. Yeah, Google 'can' do many things. Yeah, their analysis tools put their best foot forward and show you what they 'can' do. But 'can' and 'will'... they're different cookies, man.
If you have a powerful enough server (and even if you don't, maybe it's time to get one!), maybe you could have all the scripts fire server-side and then just serve users (and search engines) the pre-rendered base source. Or do something clever like that. This is not game over, but you'll need to get really smart now. I wouldn't recommend bothering with any of that without first going back (FAST) and doing a full, URL-to-URL 301 redirect migration project (using .htaccess or web.config) - a sketch of the redirect side is below.
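For the redirect piece, here's a hedged sketch of what a URL-to-URL 301 migration can look like in an Apache .htaccess file (assuming mod_alias and mod_rewrite are enabled; the old and new paths below are made-up examples, not your real URLs):

```apache
# Minimal .htaccess sketch for a URL-to-URL 301 migration.
# Every old URL should point at its closest equivalent on the new site.
RewriteEngine On

# Explicit one-to-one redirects (mod_alias) - hypothetical paths:
Redirect 301 /old-page.php /new-page
Redirect 301 /services/old-service.php /services/new-service

# Pattern-based fallback for a whole renamed section (mod_rewrite):
RewriteRule ^blog/(.*)$ /articles/$1 [R=301,L]
```

The same mapping can be done in web.config on IIS; the important part is that it's one-to-one wherever possible, not a blanket redirect of everything to the homepage.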
The faster you act, the more likely your recovery.