Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely - many posts will still be viewable - we have locked both new posts and new replies.
Do Google and other search engines crawl meta tags if we set them using React.js?
-
We have a site with only one URL; all the other "pages" are React components, not separate pages. Whichever page we click, React.js renders it, and the meta title and meta description change accordingly. Will using React.js this way be good or bad for SEO?
Website: http://www.mantistechnologies.com/
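For context, the pattern described above - a single-URL app whose components swap in per "page", updating the title and description as they mount - is commonly implemented with a head-management library. Here is a minimal sketch assuming react-helmet; the component name and copy are hypothetical, not taken from the site in question:

```jsx
import React from 'react';
import { Helmet } from 'react-helmet';

// Hypothetical page component: when React renders this "page",
// react-helmet rewrites the document's <title> and description
// <meta> tag in the DOM, even though the raw HTML served for
// every route is identical.
function ServicesPage() {
  return (
    <div>
      <Helmet>
        <title>Services | Example Company</title>
        <meta
          name="description"
          content="What we build and how we work."
        />
      </Helmet>
      <h1>Our Services</h1>
    </div>
  );
}

export default ServicesPage;
```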
-
Hi Robin,
There's no indication Google is having any trouble picking up the separate URLs and their associated Titles and Descriptions properly - a site: search for your domain returns all pages I'm able to find manually, and each page has a unique and accurate Title and Description snippet.
ReactJS is one of the most widely used JavaScript frameworks, with strong momentum in the development community, especially on high-traffic sites, and Google has updated its crawling technology to support JavaScript (it renders pages with a headless version of Chrome) to adapt to this kind of platform.
"View Source" is no longer a reliable way to see the page as Google renders it: Google crawls with JavaScript support, so JS-driven changes to the source are visible to Google. Using "Inspect Element" in Chrome shows a more accurate representation of what Google can crawl and render.
In short: I see no negatives for SEO here, and I expect at this point your analytics and Search Console data will show that your pages are indexed and eligible for traffic (potentially already getting traffic) from Google.
Best,
Mike
-
Have a look at the "View Source" output for every page URL. Website URL: http://www.mantistechnologies.com/
All the pages show the same meta title and description in the source, but the site dynamically loads and displays the right values. When checking with the Moz browser plugin and the Open Stats browser plugin, everything shows correctly. So does that mean my site is set up correctly or not? Does this harm my site in terms of SEO?
I need an expert opinion about my site from the Moz team. Please take a deep look at my site URL mentioned above.
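One way to see what's happening here is to compare the raw HTML (what "View Source" shows) against the rendered DOM (closer to what Google indexes). A minimal sketch, assuming Node.js with the puppeteer package installed; the URL is the one from this thread:

```js
const puppeteer = require('puppeteer');

async function compareTitles(url) {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  // The raw response body is what "View Source" shows --
  // before any JavaScript has run.
  const response = await page.goto(url, { waitUntil: 'networkidle0' });
  const rawHtml = await response.text();
  const rawTitle = (rawHtml.match(/<title[^>]*>([^<]*)<\/title>/i) || [])[1];

  // page.title() reads the rendered DOM -- after React has
  // updated the document, and closer to what Google indexes.
  const renderedTitle = await page.title();

  console.log('Raw <title>:     ', rawTitle);
  console.log('Rendered <title>:', renderedTitle);

  await browser.close();
}

compareTitles('http://www.mantistechnologies.com/');
```

If the two titles differ, the rendered one is generally what matters for a JavaScript-capable crawler.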
Related Questions
-
What to do with internal spam URLs Google indexed?
I have been in SEO for years but have never met this problem. I have a client whose web page was hacked, and hundreds of links were posted on it. These links have been indexed by Google. These links are not in comments but are normal external URLs. What is the best way to remove them - use the Google disavow tool, or just redirect them to some page? The web page is new but ranks well on Google and has a domain authority of 24. I think these spam URLs improved rankings too 🙂 What would be the best strategy to solve this? Thanks.
White Hat / Black Hat SEO | AndrisZigurs
-
Client Wants To Use A .io Domain Name - How Bad For Organic?
Hi, I have a U.S. client who is stuck on a name that he wants to register as a .io (British Indian Ocean Territory) domain for a new site. Aside from the user confusion/weirdness, how much harder do you think a .io domain makes this site's organic performance in the U.S. going forward? FYI, the other part of the domain name he wants to use is short, meaningless, and implies nothing in and of itself. Thanks!
White Hat / Black Hat SEO | 94501
-
Pointless Wordpress Tagging: Keep or unindex?
Simple as that. Pointless random tags that serve no purpose other than adding apparent bulk to the website. They just create duplicate content and are literally random keywords serving almost no purpose; most of the tags are only used on one page. If I remove them, however, the site will probably drop from around 650 pages to 450 (assuming I keep any tags that were used more than once). I have read through some of the other posts on here, and I know that Google will do some work as far as duplicate content is concerned. Now as far as UX is concerned, all these tags are worthless. Thoughts?
White Hat / Black Hat SEO | HashtagHustler
-
Hidden H1 Tags
I am trying to triple-check this - I have a client whose H1 tags are all hidden. As far as I am concerned, anything hidden is not a good thing for SEO. I am debating with their online store provider that this is not good practice; everything I am reading says it is not good practice. Their response: "My SEO experience would suggest otherwise. In addition, the H1 adds semantic value for users with disabilities to help give them context for what the content of the page is." Did I miss something? They are a large brand and have not been penalized. This has been happening for 8 months.
White Hat / Black Hat SEO | smulto
-
A Branded Local Search Strategy utilizing Microsites?
Howdy Moz, over and over we hear of folks using microsites in addition to their main brand to target keyword-specific niches. The main point of concern most folks have is either duplicate content or being penalized by Google, which is also our concern. However, in one of our niches we notice a lot of competitors have set up secondary websites to rank in addition to the main website (basically taking up more room on the SERPs). They are using different domains, on different IPs, on different servers, etc. We verified this because we called, and they all rang through to the same competitors.

So our thought was: why not take the fight to them (so to speak), but with a branding and content strategy? The company has many good content pieces that we can utilize - company mottos, mission statements, special projects, community outreach - that can be turned into microsites with unique content. Our strategy idea is to take a company called "ACME Plumbing" and brand for specific keywords with locations, like sacramentoplumberwarranty.com, where the site's content revolves around plumber warranty info, measures of a good warranty, plumbing warranty news (newsworthy issues), blogs, RCS - you get the idea - and send both referral traffic and links to the main site. The idea is then to repeat the process with another company aspect, like napaplumbingprojects.com, where the content of the site is focused on cool projects, images, RCS, etc., again referring traffic and link juice to the main site.

We realize that this adds to the amount of RCS that needs to be done, but that's exactly why we're here. Also, any thoughts on intentionally tying the brand to the location so you get URLs like acmeplumbingsacarmento.com?
White Hat / Black Hat SEO | AaronHenry
-
A site is using their competitors' names in their Meta Keywords and Descriptions
I can't imagine this is a white hat SEO technique, but they don't seem to be punished for it by Google - yet. How does Google treat the use of your competitors' names in your meta keywords and descriptions? Is it a good idea?
White Hat / Black Hat SEO | PeterConnor
-
Benefit of using 410 Gone over 404?
It seems like it takes Google Webmaster Tools forever to realize that some pages, well, are just gone. Truth is, the 30k-plus pages showing 404 errors were due to a big change in the site's URL architecture. I wonder: is there any benefit to using 410 Gone as a temporary measure to speed things up in this case? Or, when would you use a 410 Gone? (A sketch of serving 410s follows after this list.) Thanks
White Hat / Black Hat SEO | bjs2010
-
Interesting case of an IP-wide Google penalty - what is the most likely cause?
Dear SEOmoz Community, our portfolio of around 15 internationalized web pages received a significant, seemingly IP-wide, Google penalty starting in November 2010 and has yet to recover from it. We have undergone many measures to lift the penalty, including reconsideration requests without luck, and are now hoping the SEOmoz community can give us some further tips. We are very interested in the community's help and judgment on what else we can try. As quick background information:
- The sites in question offer sports results data and are translated into several languages. Each market (i.e., language) has its own TLD domain using the central keyword, e.g., <keyword_spanish>.es, <keyword_german>.de, <keyword_us>.com.
- The content is highly targeted to each market, which means there are no duplicate content pages across the domains: all copy is translated, content is reprioritized, etc. However, the core results content in the body of the pages obviously needs to stay about 80% the same.
- An SEO agency of ours used semi-automated link-building tools in mid-2010 to acquire link partnerships.
- There are some promotional one-way links to sports betting and casino sites positioned on the pages.
- The external linking structure of the pages is very keyword- and main-page-focused, i.e., 90% of the external links point to the front page with one particular keyword.
- All sites have strong domain authority and have been running under the same owner for over 5 years.
As mentioned, we have experienced dramatic ranking losses across all our properties starting in November 2010. The applied penalties are indisputable, given that rankings for the main keywords in local Google search engines dropped from position 3 to position 350 after the sites had ranked in the top 10 for over 5 years. A screenshot of the ranking history for one particular domain is attached. The same behavior can be observed across domains. Our questions are:
- Is there something like an IP-specific Google penalty that can apply to web properties across an IP, or can we assume Google just picked all pages registered in Google Webmaster Tools?
- What is the most likely cause of our penalty, given the background information? Given that the drops started in November 2010, we doubt the Panda updates had any correlation to this issue.
- What are the best ways to resolve our issues at this point? We have significant historical data available, such as tracking records. Our actions so far have been reducing external links, on-page links, and C-class internal links.
- Are there any other factors/metrics we should look at to help troubleshoot the penalties?
- After all this time without resolution, should we be moving to two new domains and forwarding all content via 301s to the new pages? Are there other things we should try first?
Any help is greatly appreciated. SEOmoz rocks. /T
White Hat / Black Hat SEO | tomypro
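Referenced from the "410 Gone over 404" question above: a minimal sketch of answering removed URL paths with 410 Gone, assuming a Node.js/Express server. The framework choice and the path list are illustrative, not taken from the original question:

```js
const express = require('express');
const app = express();

// Hypothetical list of paths removed in the URL restructuring.
const removedPaths = ['/old-category/widget-1', '/old-category/widget-2'];

app.use((req, res, next) => {
  if (removedPaths.includes(req.path)) {
    // 410 signals "intentionally and permanently gone",
    // versus 404's more ambiguous "not found".
    return res.status(410).send('Gone');
  }
  next();
});

app.listen(3000);
```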