Do Google and other search engines crawl meta tags if we set them using React.js?
-
We have a site that has only one URL; all the other pages are React components rather than separate pages. Whichever page we click, React.js renders and displays it, and the meta title and meta description change accordingly. Is using React.js this way good or bad for SEO?
Website: http://www.mantistechnologies.com/
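For context, here is a minimal sketch (TypeScript/TSX, not our actual code; the hook, component, and values are made up) of how an app like ours can swap the title and meta description per "page" on the client:

```tsx
// Minimal illustration only - hypothetical hook and component, not the
// real Mantis Technologies source.
import { useEffect } from "react";

type PageMeta = { title: string; description: string };

// Updates document.title and the meta description whenever a "page"
// component mounts or its meta values change.
function usePageMeta({ title, description }: PageMeta) {
  useEffect(() => {
    document.title = title;

    let tag = document.querySelector<HTMLMetaElement>('meta[name="description"]');
    if (!tag) {
      tag = document.createElement("meta");
      tag.name = "description";
      document.head.appendChild(tag);
    }
    tag.content = description;
  }, [title, description]);
}

function ServicesPage() {
  usePageMeta({
    title: "Our Services | Mantis Technologies",               // example value
    description: "Example description for the Services page.", // example value
  });
  return <h1>Our Services</h1>;
}
```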
-
Hi Robin,
There's no indication Google is having any trouble picking up the separate URLs and their associated Titles and Descriptions properly - a site: search for your domain returns all pages I'm able to find manually, and each page has a unique and accurate Title and Description snippet.
ReactJS is one of the most widely used JavaScript libraries, with a lot of momentum in the development community, especially on high-traffic sites, and Google has updated its crawling technology to support JavaScript (it renders pages with a headless version of Google Chrome) to adapt to frameworks like this.
"View Source" is no longer valid for interpreting page code as Google will render it - they crawl with JS support, so JS interactions and modifications of source code are visible to Google. Using "Inspect Element" in Chrome shows a more accurate representation of what Google can crawl/render.
In short: I see no negatives for SEO here, and I expect at this point your analytics and Search Console data will show that your pages are indexed and eligible for traffic (potentially already getting traffic) from Google.
Best,
Mike -
Have a look at the page source (view-source) of every page URL. Website URL: http://www.mantistechnologies.com/
In the page source, all pages show the same meta title and description, but the site dynamically loads and displays the correct values. When I check it with the Moz browser plugin and the Open Stats browser plugin, everything shows up correctly. So does that mean my site is set up correctly or not? Does it harm my site in terms of SEO?
I need an expert opinion on my site from the Moz team. Please take a deep look at my site URL mentioned above.
Related Questions
-
Trying to escape from Google algorithm ranking drop
In 2010 our website was ranking number 1 for many keywords. We suddenly saw a crash a few years ago and have since identified that we were hit by several shades of the Panda and Penguin updates, mainly due to low-quality backlinks and poor content (some duplicates). Since then we have done a major overhaul of our backlink profile. We have recovered rankings that fell from number 1 to positions 60-70 for many keywords; we are now placed at around positions 11 to 18. We have also addressed our duplicate content issues, removed all duplicate content, and introduced a blog with fresh twice-daily updates in an attempt to gain traffic. We also amalgamated many small, low-quality pages into larger, higher-quality content pages. We are now mobile friendly with a dynamic site, our site speed is good (around 80), we have switched to HTTPS, and we have upgraded our website for better conversions. We have looked at the technical issues of the site and don't have many major ones, although we do have 404s coming up in Google Webmaster Tools for old pages we removed due to duplicate content. We are link building at a pace of around 40 mentions a month: some are nofollow, some dofollow, and some are unlinked mentions. We are diversifying anchors to include branding in addition to target keywords. We have pretty much exhausted every avenue we can think of, but we cannot jump onto page 1 for any significant keywords we are targeting. Our competitors' websites are not that powerful, and their metrics are similar to ours, if not lower.
1. Please can you advise anything else you can think of that we should look at?
2. We are even considering moving to a new domain and 301-redirecting all pages to it in an attempt to shake off the algorithmic filter (penalties). Has anyone done this? How long should we expect before the new domain reaches at least the same rankings if we 301 all URLs to it? Do you think it's worth it? We know the risk of doing this, so we wanted to seek some advice.
3. On the other hand, we have considered that having disavowed so many links (70%) could be the cause of the page-two problem; however, we are link building according to Moz and Majestic metric standards with no benefit. Do you think we should increase link building?
Advice is appreciated!
White Hat / Black Hat SEO | Direct_Ram -
Black hat: raising CTR to get a better rank in Google
We all know that Google uses click-through rate (CTR) as one of its ranking factors. I came up with an idea and would like to know whether someone has seen or tried it before. If you search Google for the term "SEO", for example, you will see the moz.com website at rank 3, and if you check the source code you will see that result 3 links to this URL: https://www.google.com.sa/url?sa=t&rct=j&q=&esrc=s&source=web&cd=3&cad=rja&uact=8&ved=0CDMQFjAC&url=https%3A%2F%2Fmoz.com%2Fbeginners-guide-to-seo&ei=F-pPVaDZBoSp7Abo_IDYAg&usg=AFQjCNEwiTCgNNNWInUJNibqiJCnlqcYtw That URL will redirect you to moz.com. OK, what if we use linkbucks.com or any other cheap targeted-traffic network and run a campaign that sends traffic to the URL I showed you? Will that count as traffic from Google, so that it increases the CTR from Google?
White Hat / Black Hat SEO | Mohtaref1 -
80% of traffic lost overnight, Google penalty?
Hi all.
I have a website called Hemjakt (http://www.hemjakt.se/), which is a search engine for real estate, currently available only on the Swedish market. The application crawls real estate websites and collects all listings in a single searchable application. The site has been live for a few months and had seen steady growth since release, increasing by 20% weekly up to ~900 visitors per day. Three days ago, overnight, I lost 80% of my traffic. Instead of 900 visitors per day I'm at ~100 visitors per day, and when I search for long, specific queries such as "Åsgatan 15, Villa 12 rum i Alsike, Knivsta" (address, house type, rooms, area, city), I'm now only found on the fifth page. I suspect I have become the subject of a Google penalty. How do I get out of this mess?
Just like all search engines and aggregators, I do crawl other websites and scrape their content. My content is ~90% unique from the source material, and I add user value by giving visitors the ability to compare houses, see a lot more data on pricing and history, and use extra functionality that the source sites do not offer. My analytics data show good user engagement. Here is one example of a source page and the corresponding page on my site:
Source: http://www.hemnet.se/bostad/villa-12rum-alsike-knivsta-kommun-asgatan-15-6200964
My site: http://www.hemjakt.se/bostad/55860-asgatan-15/
So: How do I actually confirm that this is the reason I lost my traffic? When I search for my branded query, I still get results, and I'm still indexed by Google. If I am penalized: I'm not attempting anything black hat, and I really believe the app gives a lot of value to users. What tweaks or changes to the application would you suggest so that I can continue running the service in a way Google is fine with?
White Hat / Black Hat SEO | Hemjakt -
Do some sites get preference over others by Google just because? Grandfathered theory
So I have a theory that Google "grandfathers" in a handful of old websites in every niche, and that no matter what the site does, it will always have the authority to rank highly for the relevant keywords in that niche.
I have a website in the crafts/cards/printables niche. One of my competitors is http://printable-cards.gotfreecards.com/. This site ranks for everything... http://www.semrush.com/info/gotfreecards.com+(by+organic)
Yet when I visit their site, I notice duplicate content all over the place (extremely thin content, if anything at all, for some pages that rank for highly searched keywords), paginated pages that should be getting noindexed, bad URL structure, and an overall unfriendly user experience. Also, the backlink profile isn't very impressive, as most of the good links come from their other site, www.got-free-ecards.com.
Can someone tell me why this site is ranking for what it is, other than the fact that it's around 5 years old and potentially has some type of preference from Google?
White Hat / Black Hat SEO | WebServiceConsulting.com -
Google disavow and penalty lifted, please help?
We disavowed 80% of our backlink profile because our last SEO built cheap, nasty links, and we filed a reconsideration request (we had the Google Webmaster Tools "notice of detected unnatural links to http://www.xxx.co.uk" penalty for a year from the 24th of March 2012, but thought it best to clean up before round 2, even though we had no real penalty and we did some decent link building that moved us up). We then received a successful "penalty lifted" note (on the 22nd of May 2013), but our rankings dropped (because the bad links had been propping us up). Since then we have built a fair few high-quality links, but our rankings do not seem to be moving much if at all (7 weeks clear now). Has anyone had any experience with the above (are we in a sandbox-type situation)? Thank you for your time. Thanks, Bob
White Hat / Black Hat SEO | BobAnderson -
Link Building after Google updates!
Hello all, I just wanted to ask a question to start a discussion on link building after the Google updates. I haven't been very proactive with link building lately, due to the updates and not wanting to get penalised! Are there any link building trends or techniques people are using since the changes? Thanks, seo_123
White Hat / Black Hat SEO | TWPLC_seo -
Local Listing Spam - Why is Google Missing this?
I have a competitor that ranks in Google Search for a top-dollar keyword in the organic results with a normal listing; however, just below that, Google shows another result that contains the local city name followed by their business name. In the URL breadcrumb they have domain.com > Local, and below the description it shows a map for a totally different location, even though this competitor only has one location... Once I clicked the link, I found that the title, description, H1, and footer body content all talk about the local area but not their product, and when you click the breadcrumbs you can go back to a directory of all the other cities and states they are targeting with doorway pages using the same layout, except the anchor text is cityname + keyword. How are they getting away with this?
White Hat / Black Hat SEO | Ben-HPB -
Should we add our site to Google Webmaster Tools
Hello, should we add our site nlpca(dot)com to Google Webmaster Tools? Everything's very white hat, but we do have a section on each of our 4 sites for "Our Other Sites" that links to the others. It's been there for many years. We're looking for clues as to why we've dropped in rank. Thanks!
White Hat / Black Hat SEO | BobGW