Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
No meta description pulling through in SERP with React website - requesting indexing & submitting to Google with no luck
-
Hi there,
A year ago I launched a website using react, which has caused Google to not read my meta descriptions. I've submitted the sitemap and there was no change in the SERP. Then, I tried "Fetch and Render" and request indexing for the homepage, which did work, however I have over 300 pages and I can't do that for every one. I have requested a fetch, render and index for "this url and linked pages," and while Google's cache has updated, the SERP listing has not. I looked in the Index Coverage report for the new GSC and it says the urls and valid and indexable, and yet there's still no meta description.
I realize that Google doesn't have to index all pages, and that Google may not always use your meta description, but I want to make sure I do my due diligence in making the website crawlable. My main questions are:
1. If Google didn't reindex ANYTHING when I submitted the sitemap, what might be wrong with my sitemap?
2. Is submitting each URL manually bad, and if so, why?
3. Am I simply jumping the gun, since it's only been a week since I requested indexing for the main URL and all the linked URLs?
4. Any other suggestions?
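For what it's worth, a minimal sketch of the kind of sitemap sanity check that can rule out malformed entries before pointing the finger at Google (Node, no dependencies; the sitemap XML below is a made-up example, not this site's actual sitemap):

```javascript
// Minimal sketch: extract <loc> URLs from a sitemap so you can spot
// missing or malformed entries. The XML below is a stand-in example.
const sampleSitemap = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>`;

function extractSitemapUrls(xml) {
  // A real tool should use a proper XML parser; a regex is enough
  // for a quick sanity check of <loc> entries.
  const matches = xml.match(/<loc>(.*?)<\/loc>/g) || [];
  return matches.map((m) => m.replace(/<\/?loc>/g, "").trim());
}

const urls = extractSitemapUrls(sampleSitemap);
console.log(urls); // every page you expect Google to index should be here
```

If the count here doesn't match the number of pages on the site, the sitemap is the problem; if it does, the issue is more likely rendering.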
-
Hi David,
The Fetch and Render looked blank, but I know Google can still read the code, since it picked up on the schema we added less than a week after we added it. I sent the JavaScript guides over to our developers, but I would still really appreciate you looking at the URL if possible. I can't find a way to DM you on here, so I've sent you a LinkedIn request. Feel free to ignore it if there's a better way to communicate.

- JW
-
That is an interesting question.
-
Hi,
I would mostly look into the site itself. From what you've mentioned here, I don't think the problem is in your sitemap but more on the React side. Are you using server-side or client-side rendering for the pages in React? That can have a big impact on how Google is able to see the different pages and pick up on content (including meta tags).
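To illustrate the difference (a simplified sketch with plain strings, not the actual site's code): with client-side rendering, the initial HTML Googlebot downloads often contains no meta description at all - it only appears after JavaScript runs - whereas server-side rendering puts it in the HTML up front.

```javascript
// Simplified contrast between CSR and SSR output, showing what the
// *initial* HTML response contains in each case.

// Client-side rendered: the server ships an empty shell; the meta tag
// would only be added later by JavaScript running in the browser.
function renderClientSide() {
  return `<!doctype html><html><head><title>My Page</title></head>
<body><div id="root"></div><script src="/bundle.js"></script></body></html>`;
}

// Server-side rendered: the meta description is in the HTML before any
// JavaScript executes, so even a non-rendering crawler sees it.
function renderServerSide(description) {
  return `<!doctype html><html><head><title>My Page</title>
<meta name="description" content="${description}"></head>
<body><div id="root">...server-rendered markup...</div></body></html>`;
}

const csr = renderClientSide();
const ssr = renderServerSide("Hand-written summary of the page.");
console.log(csr.includes('name="description"')); // false: the shell has no meta tag
console.log(ssr.includes('name="description"')); // true: visible on first fetch
```

In a real React app the SSR branch would typically come from a framework like Next.js or from ReactDOMServer plus something like React Helmet, but the principle is the same.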
Martijn.
-
Hi DigitalMarketingSEO,
This sounds like Google having some issues rendering your React website.
There are plenty of good JavaScript SEO guides out there that I would recommend reading through:
https://www.elephate.com/blog/ultimate-guide-javascript-seo/
https://builtvisible.com/javascript-framework-seo/
https://www.briggsby.com/dealing-with-javascript-for-seo
How did the "Fetch and Render" look? Was Googlebot able to see your page exactly as a human user would?
Can you share the URL here (or PM me)? I've done a lot of work on JS sites and I'd be happy to take a quick look to see if I can give some more specific advice.
Cheers,
David
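A quick self-serve check along the same lines (a hypothetical sketch; in practice you'd feed in your own page's raw HTML, e.g. from `curl` or view-source): if the meta description isn't in the raw HTML, a non-rendering crawl can't see it.

```javascript
// Sketch: pull the meta description out of raw HTML, i.e. what a
// non-rendering crawler sees before any JavaScript runs. The HTML
// below is a stand-in; swap in the real page source.
function getMetaDescription(html) {
  const match = html.match(
    /<meta\s+name=["']description["']\s+content=["']([^"']*)["']/i
  );
  return match ? match[1] : null;
}

const rawHtml = `<html><head>
<meta name="description" content="Example description in the raw HTML.">
</head><body></body></html>`;

console.log(getMetaDescription(rawHtml)); // found in the raw HTML
console.log(getMetaDescription("<html><head></head><body></body></html>")); // null
```

If this returns null for your pages' raw source, that's consistent with the blank Fetch and Render: the description only exists after client-side rendering.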