Dynamically Inserting Noindex With JavaScript
-
Hello,
I have a broken plugin creating hundreds of wp-content directory pages that are being indexed by Google. I cannot access the source code of these pages to add a noindex tag to them, but the page URLs all contain the plugin name. To resolve the issue, I wrote a JavaScript solution that dynamically adds a noindex meta tag to any URL containing the plugin name. Would this noindex be respected by Google, and is there a way to immediately check that it is respected?
Currently, I cannot delete the plugin due to issues with its PHP.
If you would like to view the code: https://codepen.io/trodrick/pen/Gwwaej?editors=0010
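For readers who don't want to open the pen, a rough sketch of the general approach (this is not the actual CodePen; "broken-plugin" is a hypothetical stand-in for the real plugin slug):
// Rough sketch of the approach, not the actual CodePen code.
// "broken-plugin" is a placeholder for the real plugin name in the URL.
if (window.location.pathname.indexOf('broken-plugin') !== -1) {
  var robotsMeta = document.createElement('meta');
  robotsMeta.name = 'robots';
  robotsMeta.content = 'noindex';
  document.head.appendChild(robotsMeta);
}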
Thanks!
-
Perfect! Happy to help!
-
It seemed to work. Hopefully the noindex is respected, thank you!
-
You can! Via https://support.google.com/webmasters/answer/1663419?hl=en
- If you choose to hide a directory, then any file or directory starting with the prefix you supply will be blocked. So if you enter /folder, then /folder/somefile, /foldername/somefile, and /folder.html will all be blocked.
- To hide an entire site, leave the path empty.
-
It looks like it is active. Thanks, John! Can you no-index an entire directory in GSC? I thought it was only per URL.
-
Hey there! Do you have a link to a page where it's implemented live? As long as you have the noindex, nofollow in there, you should be okay. Other things you can do:
1. Use robots.txt to hide the directory of pages that the plugin is outputting
2. Sign into Google Search Console and no-index the same directory
I would do both 1 and 2 to help speed things up once the noindex, nofollow tag is in place (a sample robots.txt for step 1 is sketched below).
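For step 1, a minimal robots.txt sketch, assuming the plugin's pages sit under a wp-content directory named after the plugin ("broken-plugin" is a hypothetical stand-in for the real name):
# Hypothetical example -- replace "broken-plugin" with the plugin's actual directory
User-agent: *
Disallow: /wp-content/broken-plugin/
This is only a sketch of the directory-blocking idea; the exact path should match the URLs the plugin is actually generating.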
Related Questions
-
No: 'noindex' detected in 'robots' meta tag
I'm getting an error in Search Console: pages on my site show "No: 'noindex' detected in 'robots' meta tag." However, when I inspect the pages' HTML, it does not show noindex; in fact, it shows index, follow. The majority of pages show the error and are not indexed by Google, and I'm not sure why this is happening. Unfortunately I can't post images on here, but I've linked some URLs below. This page shows the error in Search Console: https://mixeddigitaleduconsulting.com/ As does this one: https://mixeddigitaleduconsulting.com/independent-school-marketing-communications/ However, this page does not have the error and is indexed by Google, even though its meta robots tag looks identical: https://mixeddigitaleduconsulting.com/blog/leadership-team/jill-goodman/ Any and all help is appreciated.
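One quick sanity check, offered here only as a rough sketch (not specific to this site): look at both the X-Robots-Tag response header and the robots meta tags in the raw HTML that is actually served, since a noindex can come from either place. Assuming Node 18+ with its built-in fetch:
// Rough sketch. Usage: node check-robots.js https://example.com/page
const url = process.argv[2];

(async () => {
  const res = await fetch(url);
  // noindex can also be sent as an HTTP header, not just a meta tag
  console.log('X-Robots-Tag header:', res.headers.get('x-robots-tag') || '(none)');

  const html = await res.text();
  const metas = html.match(/<meta[^>]+name=["']robots["'][^>]*>/gi) || [];
  console.log('robots meta tags in raw HTML:');
  console.log(metas.length ? metas.join('\n') : '(none found)');
})();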
Technical SEO | Sean_White_Consult
-
"noindex" internal search result urls
Hi, Would applying "noindex" to any page (say, internal search pages) or blocking them via robots.txt skew the internal site search stats in Google Analytics? Thanks,
Technical SEO | RaksG
-
301: Dynamic URL to Static Page
I've been going around trying to get this dynamic URL to redirect in the .htaccess file. I know I'm missing something but can't figure it out. Code:
RewriteEngine on
RewriteCond %{QUERY_STRING} ^/dynamic-url.php?id=43$
RewriteRule ^$ http://static/page/url/inserted/here? [R=301,L]
Suggestions? (One possible correction is sketched below.)
Technical SEO | ohlmanngroup
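A possible correction, offered as a hedged sketch (the static target below is a placeholder, since the real destination URL isn't given): %{QUERY_STRING} contains only the query string (e.g. id=43), not the path, and the path belongs in the RewriteRule pattern rather than an empty ^$.
RewriteEngine on
# Match only the query string here (no path, no leading "?")
RewriteCond %{QUERY_STRING} ^id=43$
# Match the script path here (no leading slash in .htaccess context);
# the trailing "?" on the target drops the old query string from the redirect
RewriteRule ^dynamic-url\.php$ /static-page/? [R=301,L]
-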
Hiding Duplicate Content using Javascript
We have an e-commerce site selling books. Besides basic information on the books, we have content for "About the book", "Editorial Reviews", "About the author", etc. But the content in all these sections is duplicated and is available on other sites selling the same books. Our questions are: 1. Should we worry about the content being duplicate? 2. If yes, would it be a good idea to hide this duplicate content using JavaScript or an iframe?
Technical SEO | CyrilWilson
-
Overly Dynamic URLs
I have a site that I use to time fitness events, and I like to post the results using query strings; I create a link to each event's results/gallery/etc. I don't need these pages crawled and I don't want them to hurt my SEO. Can I put a "do not crawl" meta on them, or will that hurt my overall positioning? What are my other options?
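For reference, there is no "do not crawl" robots meta directive as such; the closest meta-tag option is noindex, shown in the rough example below (the results URL is made up), while robots.txt is what actually keeps crawlers off parameterized URLs.
<!-- Hypothetical example: placed in the head of a results page such as /results.php?event=spring-5k -->
<meta name="robots" content="noindex">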
Technical SEO | bobbabuoy
-
JavaScript late-loaded content not read by Googlebot
Hi, We have a page with some good "keyword" content (a user-supplied comment widget), but a design choice was made previously to late-load it via JavaScript. This was to improve performance, and the overall functionality relies on JavaScript. Unfortunately, since it is loaded via JS, it isn't read by Googlebot, so we get no SEO value. I've read that Google doesn't weigh <noscript> content as much as regular content. Is this true? One option is just to load some of the content via <noscript> tags; I just want to make sure Google still reads this content. Another option is to load some of the content via simple HTML when the page loads. If JavaScript is enabled, we'd hide this "read-only" version via CSS and display the more dynamic, user-friendly version (a rough sketch of this is below). Would changing the display based on whether JS is enabled be deemed cloaking? Since non-JS users would see the same thing (and this provides a way for them to see some of the widget's functionality), it is an overall net gain for those users too. In the end, I want Google to read the content, but I'm trying to figure out the best way to do so. Thanks, Nic
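A rough sketch of that second option, with made-up IDs and placeholder content: the comments are rendered as plain HTML by the server, and the script hides that static block and reveals the dynamic container only when it actually runs, so crawlers and non-JS users still get the static version.
<!-- Hypothetical markup: static, crawlable fallback rendered by the server -->
<div id="comments-static">
  <p>(server-rendered comment text here)</p>
</div>
<div id="comments-dynamic" hidden></div>
<script>
  // Only runs when JavaScript is available: hide the static block
  // and reveal the container the late-loaded widget will render into.
  document.getElementById('comments-static').hidden = true;
  document.getElementById('comments-dynamic').hidden = false;
  // ...late-load the real widget into #comments-dynamic here...
</script>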
Technical SEO | NicB1
-
Dynamic URLs via Refinements
What is the best way to handle large product pages with many different refinement possibilities (e.g., hard drive, 40 GB, black case, etc.)? All of these refinements add to the length of the URL and potentially create crawling issues, as the URL becomes too dynamic. I have seen people canonical all refinement pages to the main category page (a sketch of that tag is below), and I have seen others nofollow certain refinements. Also, the SEOmoz crawling report tells me that more than two parameters is bad. What is the best way to handle this? Thanks
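For reference, a minimal sketch of that canonical approach, with made-up URLs: each refinement URL declares the main category page as its canonical.
<!-- Hypothetical example: placed in the head of /hard-drives?capacity=40gb&case=black -->
<link rel="canonical" href="https://www.example.com/hard-drives/">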
Technical SEO | Gordian
-
JavaScript bad for SEO?
If we utilize JavaScript to pull information from a database to display on a site, is that bad for SEO? Can search engines still see the data?
Technical SEO | nicole.healthline