Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Do Google and other search engines crawl meta tags if we set them using React.js?
-
We have a site with only one URL; all the other pages are components of it rather than separate pages. Whichever page we click, React.js renders it, and the meta title and meta description change accordingly. Is using React.js this way good or bad for SEO?
Website: http://www.mantistechnologies.com/
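The setup described above — one entry point, with components swapped per route and the title and description rewritten on navigation — can be sketched in plain JavaScript. The route table and copy below are invented for illustration; real React apps typically do this with a router plus a head manager such as react-helmet:

```javascript
// Hypothetical route-to-metadata table for a single-page app.
const routes = {
  "/": {
    title: "Mantis Technologies",
    description: "Software development and SEO services",
  },
  "/services": {
    title: "Services | Mantis Technologies",
    description: "What we offer",
  },
};

// Resolve the metadata for a given path, falling back to the home page.
function metaFor(path) {
  return routes[path] || routes["/"];
}

// In the browser, a router's navigation handler would apply it like this:
function applyMeta(path) {
  const meta = metaFor(path);
  document.title = meta.title;
  document
    .querySelector('meta[name="description"]')
    .setAttribute("content", meta.description);
}
```

Because `applyMeta` runs only after the JavaScript bundle executes, the raw server-sent HTML never contains the per-route values — which is exactly why the tags differ between "View Source" and the rendered page.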
-
Hi Robin,
There's no indication Google is having any trouble picking up the separate URLs and their associated Titles and Descriptions properly - a site: search for your domain returns all pages I'm able to find manually, and each page has a unique and accurate Title and Description snippet.
React is one of the most widely used JavaScript libraries, with a lot of momentum in the development community, especially on high-traffic sites, and Google has adapted its crawling to support JavaScript: it renders pages with a headless version of Chrome.
"View Source" is no longer a reliable way to see page code as Google renders it. Google crawls with JavaScript support, so JS interactions and modifications of the source are visible to it. "Inspect Element" in Chrome shows a more accurate representation of what Google can crawl and render.
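To make this point concrete: extracting the `<title>` from the raw server HTML (what "View Source" shows) can give a different result from the rendered DOM (what "Inspect Element" and Google's renderer see). Both HTML strings below are invented examples:

```javascript
// Extract the <title> from an HTML string, as it appears in that source.
function extractTitle(html) {
  const match = html.match(/<title>([^<]*)<\/title>/i);
  return match ? match[1] : null;
}

// What "View Source" returns: the server-sent HTML, before any JS runs.
const rawSource =
  "<html><head><title>Mantis Technologies</title></head><body></body></html>";

// What "Inspect Element" shows after React has run and rewritten
// document.title for the current route.
const renderedDom =
  "<html><head><title>Services | Mantis Technologies</title></head><body></body></html>";

console.log(extractTitle(rawSource));   // placeholder title from the server
console.log(extractTitle(renderedDom)); // route-specific title after rendering
```

A crawler without JS rendering would only ever see the first value; a renderer like Google's headless Chrome sees the second.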
In short: I see no negatives for SEO here, and I expect at this point your analytics and Search Console data will show that your pages are indexed and eligible for traffic (potentially already getting traffic) from Google.
Best,
Mike
Have a look at the view source of every page URL. Website URL: http://www.mantistechnologies.com/
All the pages show the same meta title and description in the view source, but the right values are dynamically set and displayed at render time. When checking with the Moz browser plugin and the Open Stats browser plugin, everything shows up correctly. So does that mean my site is set up correctly or not? Does it harm my site in terms of SEO?
I need an advanced opinion about my site from the Moz team. Please take a deep look at my site URL mentioned above.
Related Questions
-
Inbound links to internal search with pharma spam anchor text: negative SEO attack
Suddenly in October I had a spike in inbound links from forums and spam sites. Each one had set up hundreds of links, which go to the WordPress internal search. Example: mysite.com/es/?s=⚄
White Hat / Black Hat SEO | Arlinaite470
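A common mitigation for this pattern (general practice, not advice given in this thread) is to keep internal search result URLs out of crawling entirely, for example with a robots.txt rule, assuming WordPress's default `?s=` search parameter:

```
# Block crawling of WordPress internal search result pages
User-agent: *
Disallow: /?s=
Disallow: /*?s=
```

Google honors the `*` wildcard in these patterns (so `/es/?s=...` is covered too). Note that blocked URLs can still appear in the index if they are heavily linked, so serving `noindex` on search result pages is a common complement.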
How to fix site breadcrumbs in mobile Google search
For the past month I have been doing research on how to fix this issue on my website, but none of my efforts have worked out. I really need help with this issue because I'm worried about it. I was hoping that Google would cache or understand the structure of my site and correct the error. The breadcrumb works correctly on desktop but is not shown on mobile. For example, take a look at: https://www.xclusivepop.com/omah-lay-bad-influence/
White Hat / Black Hat SEO | Ericrodrigo0
White H1 Tag Hurting SEO?
Hi, We're having an issue with a client not wanting the H1 tag to display on their site, preferring an image of their logo instead. We made the H1 tag white (we did not deliberately hide it with CSS), and I just read an article where this is considered black hat SEO. https://www.websitemagazine.com/blog/16-faqs-of-seo The only reason we want to hide it is that it looks redundant appearing there alongside the brand-name logo. Does anyone have any suggestions? Would putting the brand logo image inside an H1 tag be OK? Thanks for the help
White Hat / Black Hat SEO | AliMac261
Can I leave off HTTP/HTTPS in a canonical tag?
We are working on moving our site to HTTPS, and my dev team asked whether it is required to declare HTTP or HTTPS in the canonical tag. I know that relative URLs are acceptable, but I cannot find anything about HTTP/HTTPS. Example of what they would like to do: a canonical URL with the protocol left off. Has anyone done this? Any reason not to leave off the protocol?
White Hat / Black Hat SEO | Shawn_Huber0
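For reference, a protocol-relative URL is simply the absolute URL with the scheme removed, so the canonical inherits the protocol of the page it appears on. A tiny illustration with a hypothetical helper (note that most current guidance recommends declaring absolute https:// canonicals after a migration rather than relying on this):

```javascript
// Strip the scheme from an absolute URL, producing a protocol-relative form
// such as "//www.example.com/page".
function toProtocolRelative(url) {
  return url.replace(/^https?:/i, "");
}

console.log(toProtocolRelative("https://www.example.com/page"));
// -> "//www.example.com/page"
```

A browser resolves `//www.example.com/page` as HTTP on an HTTP page and HTTPS on an HTTPS page, which is exactly the ambiguity the question is weighing.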
Pointless WordPress Tagging: Keep or Unindex?
Simple as that: pointless random tags that serve no purpose other than adding apparent bulk to the website. They just create duplicate content, and they are literally random keywords. For the most part, each tag is only used on one page. If I remove them, however, they will probably drop our site from around 650 pages to 450 (assuming I keep any tags that were used more than once). I have read through some of the other posts on here, and I know that Google will do some work as far as duplicate content is concerned. As far as UX is concerned, all these tags are worthless. Thoughts?
White Hat / Black Hat SEO | HashtagHustler0
Hidden H1 Tags
I am trying to triple-check this: I have a client whose H1 tags are all hidden. As far as I am concerned, anything hidden is not a good thing for SEO. I am debating with their online store provider, arguing that this is not good practice; everything I am reading agrees. Their response: "My SEO experience would suggest otherwise. In addition, the H1 adds semantic value for users with disabilities to help give them context for the content of the page." Did I miss something? They are a large brand and have not been penalized. This has been going on for 8 months.
White Hat / Black Hat SEO | smulto0
Interesting case of IP-wide Google Penalty, what is the most likely cause?
Dear SEOmoz Community, Our portfolio of around 15 internationalized web pages received a significant, seemingly IP-wide, Google penalty starting in November 2010 and has yet to recover from it. We have undertaken many measures to lift the penalty, including reconsideration requests without luck, and I am now hoping the SEOmoz community can give us some further tips. We are very interested in the community's help and judgment on what else we can try to lift the penalty. As quick background information:
- The sites in question offer sports results data and are translated into several languages. Each market (equals language) has its own TLD domain using the central keyword, e.g. <keyword_spanish>.es, <keyword_german>.de, <keyword_us>.com
- The content is highly targeted to each market, which means there are no duplicate content pages across the domains; all copy is translated, content reprioritized, etc. However, the core results content in the body of the pages obviously needs to stay about 80% the same.
- An SEO agency of ours used semi-automated link-building tools in mid-2010 to acquire link partnerships.
- There are some promotional one-way links to sports betting and casino sites positioned on the pages.
- The external linking structure of the pages is very keyword- and main-page-focused, i.e. 90% of the external links point to the front page with one particular keyword.
- All sites have strong domain authority and have been running under the same owner for over 5 years.
As mentioned, we have experienced dramatic ranking losses across all our properties starting in November 2010. The applied penalties are indisputable given that rankings for the main keywords in local Google search engines dropped from position 3 to position 350 after the sites had ranked in the top 10 for over 5 years. A screenshot of the ranking history for one particular domain is attached; the same behavior can be observed across domains.
Our questions are:
1. Is there something like an IP-specific Google penalty that can apply to web properties across an IP, or can we assume Google just picked all pages registered in Google Webmaster Tools?
2. What is the most likely cause of our penalty given the background information? Given that the drops started in November 2010, we doubt the Panda updates had any correlation to this issue.
3. What are the best ways to resolve our issues at this point? We have significant historical data available, such as tracking records. Our actions so far have been reducing external links, on-page links, and C-class internal links.
4. Are there any other factors/metrics we should look at to help troubleshoot the penalties?
5. After all this time without resolution, should we be moving on to new domains and forwarding all content as 301s to the new pages? Are there things we need to try first?
Any help is greatly appreciated. SEOmoz rocks. /T cxK29.png
White Hat / Black Hat SEO | tomypro0
How do I find out if a competitor is using black hat methods, and what can I do about it?
A competitor of mine has appeared out of nowhere with various websites targeting slightly different keywords, all in the same industry. They don't have as many links as I do, and their site structure and code are truly awful (multiple H1s on the same page, tables for non-tabular data, etc.), yet they outperform my site and many of my other competitors. It's a long story, but I know someone who knows the people who run these sites, and from what I can gather they are using black hat techniques. That is all I know, and I would like to find out more so I can report them.
White Hat / Black Hat SEO | kevin11