Does disallowing a directory also tell search engines to unindex it?
-
I have a bunch of duplicate pages/duplicate title issues because of Joomla's item/category/menu structures.
I want to tell search engines not to crawl those directories, and also to unindex anything already in them, in order to solve the duplicate issues.
I thought of disallowing them in robots.txt, but then I realized that might not remove URLs that have already been indexed.
Please help me figure this out.
-
Yes, this will remove them, but if you want to speed up the process you can use the Google URL removal tool: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=164734
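For reference, a minimal robots.txt sketch along those lines - the /component/ and /category/ paths here are hypothetical stand-ins for whatever Joomla directories are actually producing the duplicates:
User-agent: *
# Hypothetical paths - replace with the directories creating the duplicate pages
Disallow: /component/
Disallow: /category/
Once the directories are disallowed, they can also be submitted through the removal tool above to get already-indexed URLs dropped sooner.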
Related Questions
-
How to Improve Search Ranking from 10 to Top 5
Hi, I was just optimizing a page and have improved it from Page 2 to Page 1 (Position 15 to 10) for now with basic on-page SEO edits. I want to understand how to take it to the next level by getting it into the Top 5 or Top 3 results. My keyword and page are as follows (all checked in Google India (.in)):
Page - https://nirogam.com/ayurvedic-treatment-home-remedies-chikungunya/
Keyword - ayurvedic treatment for chikungunya
What steps can I take to go from 10 to the Top 3 or Top 5 results here? I was checking the Rank Tracker and Moz Grader for this and got all ticks except adding the keyword to the URL. Would changing the URL, after it's ranking so well, just to add this keyword be recommended?
Intermediate & Advanced SEO | pks333
-
Google Search Console
abc.com, www.abc.com, http://abc.com, http://www.abc.com, https://abc.com, https://www.abc.com
Intermediate & Advanced SEO | brianvest
-
How to Submit My New Website to All Search Engines
Hello everyone, can anybody suggest good software, or any other way, to easily submit my website to all search engines? Any expert help is appreciated. Thanks in advance.
Intermediate & Advanced SEO | falguniinnovative
-
Does anyone have a clue about my search problem?
After three years of destruction, my site still has a problem - or maybe more than one. OK, I understand I had - and probably still have - a Panda problem. The question is - does anyone know how to fix it, without destroying everything? If I had money, I'd gladly give it up to fix this, but all I have is me, a small dedicated promotions team, 120,000+ visitors per month and the ability to write, edit and proofread.
This is not an easy problem to fix. After completing more than 100 projects, I still haven't got it right; in fact, what I've done over the past 2 months has only made things worse - and I never thought I could do that. Everything has been measured, so as not to destroy our remaining ability to generate income, because without that, it's the end of the line.
If you can help me fix this, I will do anything for you in return - as long as it is legal, ethical and won't destroy my reputation or hurt others. Unless you are a master jedi guru, and I hope you are, this will NOT be easy, but it will prove that you really are a master, jedi, guru and time lord, and I will tell the world and generate leads for you.
I've been doing website and SEO stuff since 1996 and I've always been able to solve problems and fix anything I needed to work on. This has me beaten. So my question is: is there anyone here willing to take a shot at helping me fix this, without the usual response of "change domains", "delete everything and start over" or "you're screwed"?
Of course, it is possible that there is a different problem, nothing to do with algorithms, a hard-coded bias or some penalizing setting that I don't know about - a single needle in a haystack. This problem results in a few visible things:
1. Some pages are buried in supplemental results
2. Search bots pick up new stories within minutes, but they show up in search results many hours later
Here is the site: http://shar.es/EGaAC
On request, I can provide a list of all the things we've done or tried (actually, I have to finish writing it).
Some notes: There is no manual spam penalty. All outgoing links are nofollow, and have been for 2 years. We never paid for incoming links. We did sell text advertising links 3-4 years ago, using text-link-ads.com, but removed them all 2 1/2 years ago. We did receive payment for some stories, 3-4 years ago, but all have been removed.
One more thing: I don't write much - I'm a better editor than a writer, but I wrote a story that had 1 million readers. The massive percentage of 0.0016% came from you-know-who. Yes, 16 visitors. And this was an exclusive, unique story. And there was a similar story, with half a million readers - same result. Seems like there might be a problem!
Intermediate & Advanced SEO | loopyal
-
SEOMOZ crawler is still crawling a subdomain despite disallow
This is for our client with a subdomain. We only want to analyze their main website, as this is the one we want to SEO. The subdomain is not optimized, so we know it's bound to have lots of errors. We added the disallow code when we started and it was working fine. We only saw the errors for the main domain and we were able to fix them. However, just a month ago, the errors and warnings spiked up and the errors we saw were for the subdomain. As far as our web guys are concerned, the disallow code is still there and was not touched:
User-agent: rogerbot
Disallow: /
We would like to know if there's anything we might have unintentionally changed, or something we need to do, so that the SEOmoz crawler will stop going through the subdomain. Any help is greatly appreciated!
Intermediate & Advanced SEO | TheNorthernOffice79
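As an aside on the directive quoted above: robots.txt rules are scoped per host, so a sketch like the following would need to be served by the subdomain itself (the host name is a hypothetical placeholder, not taken from the question):
# Hypothetical: served at http://subdomain.example.com/robots.txt
User-agent: rogerbot
Disallow: /
The same two lines in the main domain's robots.txt would not, on their own, keep rogerbot off the subdomain.
-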
Should 301 Redirects be used only across domains or also internally?
In the following video, Matt Cutts explains a bit more about 301 redirects: http://youtu.be/r1lVPrYoBkA. However, he only talks about redirects across sites. What about redirecting internally, from a product that no longer exists in a store to a new, similar product that does?
Intermediate & Advanced SEO | BeytzNet
-
Canonical Tags & Search Bots
Does anyone know for sure whether search engine bots still crawl the links on a page whose canonical tag is set to a different page? In short, would it be similar to a noindex, follow? Thanks! -Margarita
Intermediate & Advanced SEO | MargaritaS