How long until meta robots noindex takes effect?
-
I have a WordPress site with about 3,000 posts and over 1,000 tags. All of the tag archives are currently indexed in Google, and I don't want them to be.
I just set the meta robots tag to noindex on all the tag archives and was wondering how long it will take until they're out of the search engines?
Since there are close to 1,500 of these and they are duplicate content, it would be nice to have them gone ASAP.
I noticed Webmaster Tools allows me to resubmit my site to the index if my site has changed significantly... should I try that?
Any other advice would be greatly appreciated!
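For anyone in the same situation, it's worth verifying that the tag archives are actually serving the noindex directive before waiting on Google. Here's a minimal sketch of such a check in Python; the regex-based parsing is deliberately simplified and the sample markup is illustrative, not taken from any real site:

```python
import re

def has_noindex(html: str) -> bool:
    """Return True if the page has a <meta name="robots"> tag containing 'noindex'."""
    # Scan each meta tag individually, since attribute order varies in real markup.
    for tag in re.findall(r'<meta[^>]+>', html, flags=re.IGNORECASE):
        if re.search(r'name=["\']robots["\']', tag, re.IGNORECASE) and \
           re.search(r'content=["\'][^"\']*noindex', tag, re.IGNORECASE):
            return True
    return False

# Illustrative markup for a correctly noindexed tag archive.
page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(has_noindex(page))  # True
```

Looping this over a list of tag-archive URLs (fetched with any HTTP client) is a quick way to confirm the WordPress setting took effect everywhere before resubmitting anything.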
-
Thanks for the responses! The pages just got noindexed today... it took almost exactly a week.
-
I personally would submit a sitemap with the URLs in question via GWMT - this will get those pages recrawled very quickly (within 48 hours) so the noindex is seen and they drop out. Hold off on blocking them in robots.txt until they've actually been deindexed, though - if Googlebot can't crawl a URL, it can't see the noindex tag on it.
I would also look at using 301s on any of those URLs that have link juice pointing at them, or returning 404s/410s for the rest.
I personally found that resubmitting the site takes longer than pushing an updated sitemap.
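To push the URLs for recrawling as suggested, you can generate an XML sitemap listing just the tag archives and submit that file in GWMT. A minimal sketch follows; the domain and tag slugs are placeholders, not the asker's actual site:

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Build a minimal XML sitemap string for the given URLs."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

# Hypothetical tag-archive slugs standing in for the ~1,500 real ones.
tags = ["seo", "wordpress", "analytics"]
sitemap = build_sitemap(f"https://example.com/tag/{t}/" for t in tags)
print(sitemap)
```

In a real WordPress setup a sitemap plugin would normally produce this file, but a hand-built one scoped to only the URLs you want recrawled works the same way once uploaded and submitted.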
-
There's no set time. Typically it takes a week or two, though I've seen it take a month or longer to completely deindex all the desired results on bigger sites.
There's no harm in resubmitting the site; try the "Submit URL and all linked pages" option in Webmaster Tools - it may help the pages get crawled a little faster.
Related Questions
-
Dark Traffic & Long URLs - Clarification
Hi everyone, I've been reading the 2017 report by Groupon and the 2017 article by Neil Patel re: dark traffic. Both of these articles work on the assumption that most long URLs are not direct traffic, because people wouldn't type a long URL into their browser. However, what happens to me personally all the time is that I start typing a URL into the browser, the browser brings up a list of pages I've visited recently, and I click on some super long URL that I didn't bookmark but have visited in the past. That is legitimate direct traffic, but it's a long URL. I'm just wondering if there's something flawed in my reasoning or in the reasoning of Patel and Groupon. Maybe most people aren't relying on browsers like I am, or maybe things have changed a lot in the past 3 years. What do you think? And are there any more recent resources/articles that you would recommend re: trying to parse out dark traffic? https://neilpatel.com/blog/dark-traffic-stealing-data/ Thanks!
-
Long list or paginated pages
Hi peeps, I am just interested in this from a usability POV and to see what you would prefer when you are met with a page that has multiple options. Let's say that the page looks like a list of services, each clearly marked out in its own segment, but there are 50-60 options that match your requirements. Do you like to keep scrolling, or would you prefer to take what is there and then move on if you feel you want to dig deeper? Would you like to see a long list, or have the options loaded in as you get to them? -Andy
-
GWMT - "Tag Site For Child Directed Treatment" Effect On Search / Rankings?
Hi All, We have a client who has been directed to tag their site for "Child Directed Treatment" in Webmaster Tools to comply with AdExchange policies. The site is, generally speaking, directed at those between the ages of 13 and 16, along with their parents, but does NOT collect any data (no sign-in, login, signups, etc.). You can find out more about the specific tag here (unfortunately not much more about it): https://support.google.com/webmasters/answer/3221080?hl=en Our concern is that we have never heard of this specific tag, and our client is asking us to confirm that it will have no effect on search traffic or rankings. I can't find much in the way of anyone who HAS implemented this tag and the effects it has had on their site. They are ad supported, receive millions of unique hits a month, and the majority of their traffic is from branded keywords. Would love to hear from anyone with ANY experience or thoughts on this process and what to be aware of. Your assistance is much appreciated.
-
Meta description of homepage, changing to latest post
Here's something strange I noticed. The meta description for Engadget in a Google search is their latest blog entry. However, if you land on the homepage and view the page source, the meta description is a standard one for their homepage. My first impressions: Wha? How? and Wha? Could it be that because it is a "news" site, Google goes "go on, have custom meta descriptions of your latest entry"? Thoughts?
-
Disallow robots on a url effect?
Hello, I am using WordPress, and some of my top tag URLs, e.g. http://www.designzzz.com/tag/brushes/, have an average PageRank of 4. I was reading somewhere that the huge jump in the "not selected" count under Index Status in GWT is because of tag/category URLs, so I added a disallow for /tag in my robots.txt. What sort of effect will this have on my tag URLs' rankings or PR?
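As a side note on what a rule like this actually does, Python's standard `urllib.robotparser` can confirm which URLs it blocks. The robots.txt fragment below mirrors the rule described in the question; keep in mind that disallowing crawling only stops Googlebot from fetching the pages — it does not by itself remove already-indexed URLs from the index:

```python
from urllib.robotparser import RobotFileParser

# A robots.txt mirroring the rule described in the question (illustrative).
robots_txt = """
User-agent: *
Disallow: /tag/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Tag archives are blocked from crawling...
print(parser.can_fetch("*", "http://www.designzzz.com/tag/brushes/"))  # False
# ...while regular posts remain crawlable.
print(parser.can_fetch("*", "http://www.designzzz.com/some-post/"))    # True
```

Running a check like this against the live robots.txt is a quick sanity test that a new disallow rule matches exactly the URL paths you intended and nothing more.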
-
Recovering from a Hack: How long until Google reindexes changes?
In a previous post I made, I was able to determine that one of my sites, http://pokeronamac.com/, was hacked and was feeding spam prescription-drug content to search engines, then redirecting to another site when clicked on Google. I then contacted my web host, and, after they did a scan of our files, they determined that something within the wp-includes directory was compromised and malicious. They removed the file, though they weren't able to determine the source of the attack or how they got in (should we be scared?). Anyway, it's been several days now (~5), and if I do a site: search the spam pages still show up, but the redirect is no longer working. At this point I am at a standstill, because I'm losing about 90% of my site's traffic, and Google hasn't sent us any warnings of malware or the like. I know I was recommended against this before, but should I attempt to submit a reconsideration request, or should I just wait it out? Thanks for your help, Zach
-
Long page - good or bad?
Our attorney wrote a dozen articles that range from 300 to 700 words on various topics in a certain area of law. These articles are all placed on our FAQ page with an anchored table of contents. This page frequently comes up on the first page of Google when people search for the questions discussed in these articles. 90% of these visits are not local, therefore they are not potential clients. The attorney views it more as a community service than a marketing tool. However, I think there might be a problem. People read through the page and close it, because usually they can find what they were looking for right there; however, GA counts it as a bounce because they did not browse to another page. Would a large number of bounces hurt our standing with Google? Would it be better to separate the page into multiple pages, one for each article, to make visitors browse?
-
Google Location - Taking Away Our National Reach?
Hey, I was just noticing that we achieve a #2 ranking on Google for one of our customers for one of their primary keyword phrases. But then I noticed the traffic analytics were not matching what we should expect from that keyword phrase. Then I noticed, using Chrome's Incognito Window, that our location was automatically set to our main geographical city area. I then changed that location from Denver to San Diego and also Chicago, just to see what would happen, and I noticed we instantly dropped from #2 to #7 when changing our location. I don't know what my question is, but I guess I feel like that is preventing us from achieving the results we need to sell ecommerce products. Is there any info on this, or suggestions anyone has on how to tackle this issue? It feels like Google is pulling the rug out from underneath our feet and trying to spread rankings to more localized areas, rather than offering someone the opportunity to capitalize on good rankings for a national audience. I understand why they would do it, and I don't say I disagree. But it does seem to affect our work as SEOs, doesn't it? Since we can't be as effective for customers that have a global audience instead of strictly a localized one. I'm curious to see what people have to say about this issue. Thanks!