Does Disallowing a directory also tell search engines to unindex it?
-
I have a bunch of duplicate pages/duplicate title issues because of Joomla's item/category/menu structures.
I want to tell search engines not to crawl, and also to unindex anything in those directories in order to solve the duplicate issues.
I thought of disallowing in robots.txt, but then I realized that might not remove the URLs if they've already been indexed.
Please help me figure this out.
-
Disallowing in robots.txt stops the pages from being crawled, but it won't reliably remove URLs that are already in the index; Google can keep them listed (typically without a snippet) since it can no longer see the pages. To get them deindexed, either add a noindex robots meta tag to those pages and leave them crawlable until they drop out, or use the Google URL removal tool to speed up the process: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=164734
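For reference, a minimal robots.txt sketch of the Disallow approach (the directory names here are hypothetical placeholders, not taken from the question):

```text
# Block all crawlers from the duplicate-generating Joomla paths
User-agent: *
Disallow: /component/
Disallow: /index.php/
```

Keep in mind that a blocked page can no longer be recrawled, so a `<meta name="robots" content="noindex">` tag on it will never be seen by the crawler; if you want the meta tag to do the deindexing, leave those pages crawlable until they drop out.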
Related Questions
-
Orphan Duplicate is created as Subdomain in Google Search
We noticed that some of our blog results on Google also come up under a subdomain that is not linked from anywhere on the website. For example: SUBDOMAIN1.website.com/blog/content.html -> it redirects to website.com/blog/content.html. SUBDOMAIN1 is not linked anywhere on the website. How did Google find it in the first place? Why does it still keep it in the search results? How do you get rid of it?
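One common remedy for duplicate-host situations like this, alongside the existing 301, is a self-referencing canonical on the destination pages so any stray host variants consolidate (a general sketch, not advice given in the thread; the URL is the placeholder from the question):

```html
<link rel="canonical" href="http://website.com/blog/content.html" />
```

The stale subdomain entries usually age out of the results once Google recrawls them and follows the redirect.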
Intermediate & Advanced SEO
Is Building a Local Directory of Businesses on a Subdomain Good SEO?
Hello Fellow Moz'ers: I own a small digital shop in a major US city. We had a marketing idea which I'd like some input on the soundness of. We are creating a professional services directory of 'digital professional services providers' in our hometown. The directory's membership will only be open to firms located within our city limits. The directory will be curated and maintained, ongoing, by us. Our motivation is 75% selfish and 25% benevolent. The idea is that, by building the directory on our subdomain, we hopefully will collect links, which ultimately will enhance search visibility. But I'm concerned about the devaluation directories have incurred in recent years and I've even seen advice given to the effect that listings in some directories might be harmful to a site's link profile. It is not our intention to harm those who might list in our directory. Any thoughts on this matter would be greatly appreciated!
Intermediate & Advanced SEO
Getting too many links in Google search results, how do I fix this?
I'm a total newbie so I apologize for what I am sure is a dumb question. I recently followed Moz suggestions for increasing visibility on my site for a specific keyword by including that keyword in more verbose page descriptions for multiple pages. This worked TOO well, as that keyword now brings up too many results in Google for these different pages on my site... is there a way to compile them into one result with the subpages underneath, like you see, for instance, with a search on Apple? Do I need to change something in my robots.txt file to direct these to my main page? Basically, I am a photographer, and a search for my name now brings up each of my different photo gallery pages in multiple results; it's a little over the top. Thanks for any and all help!
Intermediate & Advanced SEO
Site: inurl: Search
I have a site that allows for multiple filter options, and some of these filtered URLs have been indexed. I am in the process of adding the noindex, nofollow meta tag to these pages, but I want an idea of how many of these URLs have been indexed so I can monitor when they have been recrawled and dropped. The structure for these URLs is: http://www.example.co.uk/category/women/shopby/brand1--brand2.html The unique identifier for the multiple-filtered URLs is --, however I've tried using site:example.co.uk inurl:-- but this doesn't seem to work. I have also tried using regex but still no success. I was wondering if there is a way around this so I can get a rough idea of how many of these URLs have been indexed? Thanks
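One workaround for the failing `inurl:--` query (Google tends to ignore punctuation inside `inurl:`, so a double hyphen can't be matched directly): export the site's URLs from a crawl, sitemap, or server log and count the `--` pattern locally. A minimal Python sketch with made-up URLs standing in for such an export:

```python
import re

# Hypothetical sample of URLs exported from a crawl or log file;
# the "--" separator marks multi-filter pages per the URL scheme above.
urls = [
    "http://www.example.co.uk/category/women/shopby/brand1--brand2.html",
    "http://www.example.co.uk/category/women/shopby/brand1.html",
    "http://www.example.co.uk/category/men/shopby/brand3--brand4--brand5.html",
]

# Filter for the multi-filter pattern locally instead of via search operators.
multi_filter = [u for u in urls if re.search(r"--", u)]
print(len(multi_filter))  # count of multi-filtered URLs to monitor
```

Cross-referencing that local list against what Search Console reports as indexed gives a rough count to watch as the noindexed pages drop out.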
Intermediate & Advanced SEO
Citation/Business Directory Question...
A company I work for has two numbers... one for the standard call centre and one for tracking SEO. Now, if local citation/business directory listings have the same address but different numbers, will this affect local/other SEO results? Any help is greatly appreciated! 🙂
Intermediate & Advanced SEO
Directory VS Article Directory
Which got hit harder in the Penguin update? I was looking at SEER Interactive's backlink profile (the SEO company that didn't rank for its main keyword phrases) and noticed a pretty big trend on why it might not rank for its domain name. SEER was in a majority of the anchor text, much of it coming from directories. I'm guessing THEY were affected because they matched the exact-match-domain link profile rule. I'm not an expert programmer, but if I was playing "Google Programmer" I would think the algo update went something like: If ((exact match domain) & (certain % anchor text == domain) & (certain % of anchor text == partial domain + services/company)) { tank the rankings } So back to the question: do you think that this update had a lot to do with directories, article directories, or neither? Are article directories still a legit way to get links? (not ezine)
Intermediate & Advanced SEO
Temporarily Delist Search Results
We have a client that we run campaign sites for. They have asked us to turn off our PPC and SEO in the short term so they can run some tests. PPC is no problem, a straightforward action, but it's not as straightforward to just turn off SEO. Our campaign site is on Page 1, Position 4, 3 places below our client's site. They have asked us to effectively disappear from the landscape for a period of 1-2 months. Has anyone encountered this before, the ability to delist a good SERP result for a period of time? Details: Very small site with only 17 pages indexed within Google, but the home page has a good SERP result. My issues are: How to approach this in the most effective manner? Once the delisting process is activated and the site/page disappears, then we reverse the process, will we get back to where we were? Anyone encountered this before? I realise this is a ridiculous question and goes against SEO logic, get to page 1 results only to remove them, but hey, clients are always presenting new challenges for us to address..... Thanks
Intermediate & Advanced SEO
Could Temporarily Linking New Directory Pages to my Homepage Help SEO?
Within my website we maintain a nationwide directory of auto repair shops. When we add or significantly update/modify a particular listing, would it help improve the individual search engine rankings, Google PageRank, and/or Page Authority of the new auto shop page if we linked to these pages from an area on the home page for "Our Newest Featured Shops" or "Latest Member Additions" or something of that nature? Each new shop profile would then be linked directly from the homepage for a period of time. I assume that it might be crawled and added to the indexes quicker, but would there be other benefits? If so, would those benefits be only temporary, ending once the new shop is no longer linked from the homepage? Would keeping all featured shops in rotational display on the homepage make any difference? Any input is appreciated. Thanks. Kelly Vaught
Intermediate & Advanced SEO