I am using <noscript> on every webpage and Google does not crawl my site automatically. Any solution?
-
On every page I have:

    <noscript>
      <meta http-equiv="refresh" content="0;url=errorPages/content-blocked.jsp?reason=js">
    </noscript>

Please tell me whether or not this affects SEO.
-
Here is some more information I can gather from your question:
- That <noscript> is telling non-JS users/bots to meta refresh to an error page.
- Google shouldn't be confused by that, but Screaming Frog would be (and potentially other search engines too).
- It is probably also not the best experience for non-JS users: you can display an error message without redirecting to another URL, as in the sketch below.
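For example, a minimal inline fallback, with placeholder wording:

    <noscript>
      <p>This page requires JavaScript. Please enable it in your browser settings.</p>
    </noscript>

This keeps non-JS visitors (and crawlers that don't execute JavaScript) on the original URL instead of bouncing them to a separate error page.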
Hope that's helpful...
-
Thanks for the question!
It sounds like you are concerned about Google being able to crawl your site, and you think the <noscript> tag on every page might be the cause. In your example, it looks like anyone who accesses your page with JavaScript disabled is redirected to an error page. Is there any way you can share your domain so I can better assist? Thanks!
-
Manual index
-
I got your site from your PM. I went to Google, searched site:yourdomain.com, and saw that Google reports over 400 pages from your site as indexed.
-
I sent the site to you in a private message.
-
Can you share your site?
-
I have content on every page, but Google can't crawl my site. I checked with a Screaming Frog crawl, but it can't find any pages.
-
So there is no content between the noscript tags?
Related Questions
-
Can I safely assume that links between subsites on a subdirectory-based multisite will be treated by Google as internal links within a single site?
I am building a multisite network based on subdirectories (of the mainsite.com/site1 kind), where the main site is like a company site and the subsites are focused on brands or projects of that company. There will be links back and forth between the main site and the subsites, as if the subsites were just categories or pages within the main site (they are hosted in subfolders of the main domain, after all).

Now, Google's John Mueller has said: "As far as their URL structure is concerned, subdirectories are no different from pages and subpages on your main site. Google will do its best to identify where sites are separate, but the URL structure is the same as for a single site, so you should assume that for SEO purposes the network will be treated as one site."

This sounds fine to me, except for the part "Google will do its best to identify where sites are separate". If Google establishes that my multisite structure is actually a collection of different sites, then links between the subsites and the main site would be considered backlinks between my own sites, which could be treated as a link wheel, a kind of linking structure Google doesn't like. How can I make sure that Google understands my multisite as a single site?

P.S. The reason I chose this multisite structure, instead of hosting brands in categories of the main site, is that the subdirectory-based multisite feature lets me map a TLD domain to any of my brands (subsites) whenever I choose to give that brand a more distinct profile, as if it really were a different website.
Web Design | Aug 7, 2020, 2:18 PM | PabloCulebras
-
Moving to a new site. Should I take old blog posts with me?
Our company website has needed a complete overhaul for some time now, and the new one is almost ready to go live. We also have a separate "news" site that houses around 800 blog posts and news items. (That news site will be thrown away because it's on a completely different domain and causes confusion.) So we have a main site with about 100 decent blog posts and a separate news site with 800 poor posts.

I plan on bringing all the main-site blog posts over to the new site (both are WordPress), but my question is whether or not to bring over the news-site posts: all, a handful, or none? Another issue is that the news site doesn't have Google Analytics, so I'm not sure whether any posts actually generate traffic, but I can see from the main site that we do get some referrals from it.

As far as quality of content goes, it's poor. I'm not sure who wrote it all, but it's mainly text press releases that aren't very interesting. Is it worth bringing the posts over for SEO purposes, or should I simply delete the site and create a mass redirect so all of those pages point to the new website's blog page? Any help is greatly appreciated.
Web Design | Sep 4, 2015, 11:25 AM | codyfrew
-
Best way to indicate multiple Lang/Locales for a site in the sitemap
So here is a question that may be obvious, but I'm wondering if there is some nuance here that I may be missing.

Question: Consider an ecommerce site that has multiple sites around the world which are all variations of the same thing, just in different languages. Now let's say some of these exist on a normal .com domain while others exist on different ccTLDs. When you build out the XML sitemap for these sites, especially the ones on the other ccTLDs, we want to ensure that using

    <loc>http://www.example.co.uk/en_GB/</loc>
    <xhtml:link rel="alternate"
                hreflang="en-AU"
                href="http://www.example.com.AU/en_AU/" />
    <xhtml:link rel="alternate"
                hreflang="en-NZ"
                href="http://www.example.co.NZ/en_NZ/" />

would be the correct way of doing this. I know I have to change this for each different ccTLD, but it just looks weird when you start putting about 10-15 different language/locale variations as alternate links. I guess I am just looking for a bit of reaffirmation that I am doing this right. Thanks!
Web Design | Mar 24, 2014, 6:03 PM | DRSearchEngOpt
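For context, a minimal sketch of how these elements sit inside a full sitemap <url> entry, per Google's hreflang-in-sitemaps format (the domains are the question's placeholders, and each variant should also list itself among the alternates):

    <url>
      <loc>http://www.example.co.uk/en_GB/</loc>
      <xhtml:link rel="alternate" hreflang="en-GB" href="http://www.example.co.uk/en_GB/" />
      <xhtml:link rel="alternate" hreflang="en-AU" href="http://www.example.com.AU/en_AU/" />
      <xhtml:link rel="alternate" hreflang="en-NZ" href="http://www.example.co.NZ/en_NZ/" />
    </url>

The enclosing <urlset> also needs xmlns:xhtml="http://www.w3.org/1999/xhtml" declared for the xhtml:link elements to validate.
-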
Does Google follow links inside a <noscript> tag?
I'm looking at making an embeddable calculator and asking users to embed it on their websites. I had the idea of using JavaScript to include the calculator, which would also contain a text link back to my site in order to gain some backlinks. If it's possible Google won't see the link (as they may not execute the JavaScript), is it safe to place the link in the <noscript> tag? If so, will it be indexed, and will PageRank be passed? Thanks in advance for your answers. Anthony
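For illustration, a sketch of the embed snippet being described; the URLs and file name are hypothetical:

    <script src="https://www.example.com/calculator.js"></script>
    <noscript>
      <a href="https://www.example.com/">Calculator by Example.com</a>
    </noscript>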
Web Design | Sep 28, 2012, 4:14 PM | BallyhooLtd
-
Infinite Scrolling vs. Pagination on an eCommerce Site
My company is looking at replacing our ecommerce site's paginated browsing with a JavaScript infinite-scroll function for when customers view internal search results, and possibly when they browse product categories also. Because our internal linking structure isn't very robust, I'm concerned that removing the pagination will make it harder to get the individual product pages to rank in the SERPs.

We have over 5,000 products, and most of them are internally linked to from the browsing results pages in the category structure: e.g. Blue Widgets, Widgets Under $250, etc. I'm not too worried about removing pagination from the internal search results pages, but I'm concerned that doing the same for these category pages will de-link the thousands of product pages that show up later in the browsing results, so they won't be crawlable as internal links by Googlebot.

Does anyone have any ideas on what to do here? I'm already arguing against the infinite scroll, but we're a fairly design-driven company and any ammunition or alternatives would really help. For example, would serving a different page to Googlebot in this case be a dangerous form of cloaking (if the only difference is the presence of the pagination links)? Or is there any way to make rel=next and rel=prev tags work with infinite scrolling?
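For reference, one common middle ground is to pair infinite scroll with real paginated URLs: each scroll segment corresponds to a crawlable page such as /widgets?page=2, and each of those pages declares its neighbours in the <head>. A minimal sketch with illustrative URLs:

    <!-- In the <head> of https://www.example.com/widgets?page=2 -->
    <link rel="prev" href="https://www.example.com/widgets?page=1">
    <link rel="next" href="https://www.example.com/widgets?page=3">

JavaScript can then load the next segment as the user scrolls while updating the URL, so crawlers that don't scroll still reach every product link through the paginated pages.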
Web Design | Oct 18, 2018, 6:06 AM | DownPour
-
Google penalty for links opening in new tab?
Our web services provider suggested that Google doesn't like in-text links that open in a new tab. Can anyone verify this? We often link to outside credible resources for our audience, and it seems smarter to open them in a new tab rather than risk that the person will not navigate back to our site after finding us. Thank you in advance!
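For reference, the pattern in question is just an anchor with target="_blank"; the rel="noopener" attribute shown here is a common security addition and, as far as I know, has no bearing on rankings:

    <a href="https://www.example.com/resource" target="_blank" rel="noopener">a credible resource</a>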
Web Design | May 19, 2012, 10:47 AM | jhamlin
-
How will it affect my site if I link to a site with adult content?
We are currently working on creating two sites for a company: one with no adult content, and one with adult content. Will it affect the non-adult-content site if I link to the other one, in terms of Google and of being blocked by some internet providers?
Web Design | May 8, 2012, 12:31 PM | MattWheatcroft
-
The use of foreign characters and capital letters in URLs?
Hello all,

We have 4 language domains for our website, and a number of our Spanish landing pages are written using Spanish characters, most notably ñ and ó. We have done our research around the web and realised that many of the top competitors for keywords such as Diseño Web (web design) and Aplicación iPhone (iPhone application) DO NOT use these special characters in their URL structure.

Here is an example of our URLs. EX: http://www.twago.es/expert/Diseño-Web/Diseño-Web

However, when I simply copy and paste a URL that contains a special character, it is automatically translated and encoded. EX: http://www.twago.es/expert/Aplicación-iPhone/Aplicación-iPhone (when written out longhand it appears in its percent-encoded form).

My first question: seeing how the overwhelming majority of website URLs DO NOT contain special characters (even for Spanish/German keywords, these are simply written using the standard Latin alphabet), is there a negative effect on our SEO rankings/efforts because we are using special characters? When we write anchor text for backlinks to these pages we USE the special characters in the anchor text (as do most other competitors). Does the anchor text have to match exactly?

I know most web browsers can understand the special characters, especially when returning search results to users who type the special characters within their search query (or not). But we keep thinking: if we were doing the right thing, then why does everyone else do it differently?

My second question is the same, but focusing on the use of capital letters in our URL structure.

NOTE: When we do a broken-link check with some link tools (such as Xenu), the URLs that contain the special Spanish characters are marked as "broken". Is this a related issue?

Any help anyone could give us would be greatly appreciated!

Thanks,
David from twago
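For reference, the percent-encoded forms of the URLs above (in UTF-8, ñ encodes to %C3%B1 and ó to %C3%B3):

    http://www.twago.es/expert/Dise%C3%B1o-Web/Dise%C3%B1o-Web
    http://www.twago.es/expert/Aplicaci%C3%B3n-iPhone/Aplicaci%C3%B3n-iPhone

Both forms identify the same resource; browsers simply display the decoded version, which is why copy-pasting can produce either one.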
Web Design | Oct 27, 2011, 8:40 AM | wdziedzic