Does disallowing a directory in robots.txt also tell search engines to de-index it?
-
I have a bunch of duplicate pages/duplicate title issues because of Joomla's item/category/menu structures.
I want to tell search engines not to crawl, and also to unindex anything in those directories in order to solve the duplicate issues.
I thought of disallowing in robots.txt, but then I realized that might not remove the URLs if they've already been indexed.
Please help me figure this out.
-
Yes, this will remove them eventually, but if you want to speed up the process you can use Google's URL removal tool: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=164734
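For reference, blocking the duplicate-generating directories might look something like this in robots.txt (the directory names below are hypothetical; substitute whatever Joomla paths are producing your duplicates):

```txt
# Hypothetical robots.txt rules; replace the paths with the actual
# Joomla directories producing the duplicate pages.
User-agent: *
Disallow: /component/
Disallow: /category/
```

A Disallow rule stops compliant crawlers from fetching those paths; combined with the URL removal tool mentioned above, the already-indexed pages should drop out faster.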
Related Questions
-
Displaying Vanity URL in Google Search Result
Hi Moz! Not sure if this has been asked before, but is there any way to tell Google to display a vanity URL (that has been 301d) instead of the actual URL in the SERP? Example: www.domainA.com is a vanity URL (bought specifically for Brand Identity reasons) that redirects to www.domainB.com. Is it possible to have the domainA Url show up in Google for a Branded search query? Thanks in advance! Arjun
Intermediate & Advanced SEO | Lauriedechaseaux
-
SEO Best Practices regarding Robots.txt disallow
I cannot find hard-and-fast direction on the following issue: the robots.txt file on my server has been set up to disallow "account" and "search" pages within my site (Disallow: /Account/ and Disallow: /?search=), so I am receiving warnings from Google Search Console that URLs are being blocked by robots.txt. Do you recommend unblocking these URLs? I'm getting a warning that over 18,000 URLs are blocked ("Sitemap contains urls which are blocked by robots.txt"). It seems that I wouldn't want that many URLs blocked. Thank you!!
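As a quick sanity check before unblocking anything, Python's standard-library robots.txt parser can show which URLs the current rules actually block. The rules below are the two from the question; the site and URLs are made-up examples:

```python
from urllib.robotparser import RobotFileParser

# The two Disallow rules described in the question (site is hypothetical).
rules = """
User-agent: *
Disallow: /Account/
Disallow: /?search=
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check a few representative URLs, e.g. pulled from your sitemap.
urls = [
    "https://example.com/Account/login",
    "https://example.com/?search=widgets",
    "https://example.com/products/widget",
]
for url in urls:
    status = "blocked" if not parser.can_fetch("*", url) else "allowed"
    print(url, status)
```

Running your sitemap URLs through a check like this tells you whether the 18,000 blocked URLs are all account/search pages (probably fine to leave blocked, but remove them from the sitemap) or whether real content is being caught.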
Intermediate & Advanced SEO | jamiegriz
-
Natural Fluctuation in Search Traffic
This is going to sound like a weird question... I'm curious to know whether there is a natural fluctuation in the actual number of searches being made online each week. It would be great to relate this to the performance of my own organic traffic each week. For example, if organic search traffic is down 10% week on week, is that because search in general is down 10%? Has anybody ever looked into this?
Intermediate & Advanced SEO | ausmed
-
What are the search pros and cons of putting a video transcript in a scroll box?
This is being considered for fairly templated pages as a means of adding unique text to the page without giving up a ton of real estate. The text is all visible in view source, but only the first few lines show in the scroll box. The box itself is just above the fold. How do you think this would affect search? Any other ideas for handling this? I'm limited by what the client will accept. Thanks... Darcy
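For what it's worth, the pattern being described usually looks something like the markup below (class name and dimensions are hypothetical): the full transcript sits in the HTML source, so crawlers see all of it, while the CSS only limits how much is visible on screen.

```html
<!-- The complete transcript is present in the page source; the CSS
     max-height/overflow only constrains the on-screen viewport.
     Class name is hypothetical. -->
<div class="video-transcript" style="max-height: 160px; overflow-y: auto;">
  <p>Complete transcript text of the video goes here…</p>
</div>
```

Since the text is in the initial HTML rather than loaded on interaction, search engines can read the whole transcript even though users only see a few lines at a time.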
Intermediate & Advanced SEO | 94501
-
Subdomains vs directories on existing website with good search traffic
Hello everyone, I operate a website called Icy Veins (www.icy-veins.com), which gives gaming advice for World of Warcraft and Hearthstone, two titles from Blizzard Entertainment. Up until recently, we had articles for both games on the main subdomain (www.icy-veins.com), without a directory structure. The articles for World of Warcraft ended in -wow and those for Hearthstone ended in -hearthstone, and that was it. We are planning to cover more games from Blizzard Entertainment soon, so we hired an SEO consultant to figure out whether we should use directories (www.icy-veins.com/wow/, www.icy-veins.com/hearthstone/, etc.) or subdomains (www.icy-veins.com, wow.icy-veins.com, hearthstone.icy-veins.com). For a number of reasons, the consultant was adamant that subdomains were the way to go. So, I implemented subdomains with 301 redirects from all the old URLs to the new ones, and after 2 weeks, the amount of search traffic we get has been slowly decreasing as the new URLs were getting indexed. Now, we are getting about 20%-25% less search traffic. For example, the week before the subdomains went live we received 900,000 visits from search engines (11-17 May). This week, we only received 700,000 visits. All our new URLs are indexed, but they rank slightly lower than the old URLs used to, so I was wondering if this was something that was to be expected and will improve in time, or if I should switch back to directories. Thank you in advance.
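For context, the kind of 301 mapping described (flat URLs ending in -wow or -hearthstone redirected to per-game subdomains) could be expressed in Apache roughly as follows. The site's actual redirect configuration is not shown in the question, so treat these rules as illustrative only:

```apache
# Illustrative mod_rewrite rules mapping the old flat URLs to the new
# subdomains; the site's real configuration may differ.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.icy-veins\.com$ [NC]
RewriteRule ^(.+)-wow$ https://wow.icy-veins.com/$1 [R=301,L]
RewriteCond %{HTTP_HOST} ^www\.icy-veins\.com$ [NC]
RewriteRule ^(.+)-hearthstone$ https://hearthstone.icy-veins.com/$1 [R=301,L]
```

The important part is that every old URL returns a single 301 hop to exactly one new URL; redirect chains or pages that fall through to a 404 would compound the temporary ranking dip described above.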
Intermediate & Advanced SEO | damienthivolle
-
Title displays differently depending on search query?
Hi, I have seen this a few times, but maybe someone can shed some light on why this happens? If I search for a generic keyword I'm targeting in the title tag, it shows the actual title tag placed in the code. But if I search for the brand name, the title changes to show just the brand name, so completely different to the default title tag. Any ideas why it does this? And is this bad? Is Google saying the content on the site is not relevant and therefore deciding to change it? Cheers
Intermediate & Advanced SEO | activitysuper
-
How to tell if you have been penalised?
Hey guys, something is seriously going on with the SEO on my website, and it's become business-critical. We were ranked first on Google for phrases like 'advertising agency Cambridge', and all of a sudden we have been dropped and are not even ranking. To me this screams penalisation, but I can't understand why. I have never paid for links or done anything out of the ordinary that would have such a negative impact. Is there a way to check if a site has actually been penalised and find the reason? What would be the best course of action to appeal? Has anyone experienced something like this, and do you have a timeframe for resolution? Thanks, Gareth
Intermediate & Advanced SEO | digitalclubb
-
SEO Correlation Between Code and Search Engine Rankings
I posted this on my blog and wanted to get everyone's opinion on it (http://palatnikfactor.com/2011/06/07/seo-correlation-between-code-and-search-engine-rankings/). I'm always looking to see what top-ranking websites may be doing to get the rankings they do. One of the tasks of any SEO is to analyze competitors, right? I want to stress that what I am writing here is completely opinion-based; I have not (due to time) validated this correlation enough, but I would like to get the discussion started. Nevertheless, I did enough research to see that there may be a correlation between code validation and top-ranking websites, at least for certain queries where the number of real big players/brands is limited or non-existent. So, what do I mean? http://validator.w3.org/ validates the code on websites. This tool shows you errors and warnings that may be making it harder for search engines to crawl your website. Looking at top competitors in certain niches, I was surprised to find that the top sites had very few errors compared to sites ranking on page 2 and beyond. That's not to say that all the sites on the first page had fewer errors (cleaner code) than websites on the 2nd page and beyond. However, the top-ranking websites for the keywords I was looking at did have cleaner code, which may correlate with organic rankings. What's your take? Does this have any effect on SEO?
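If you want to quantify "cleaner code" when comparing competitors, the W3C Nu validator can return machine-readable results (its JSON output is, to my understanding, a "messages" list whose items carry a "type" field, with errors marked "error"); the sample report below is made up to show the shape of a simple tally:

```python
from collections import Counter

def count_by_type(report):
    """Tally validator messages by severity ("error", "info", ...)."""
    return Counter(msg["type"] for msg in report.get("messages", []))

# Made-up sample mimicking the validator's assumed JSON output shape.
sample_report = {
    "messages": [
        {"type": "error", "message": "Stray end tag div."},
        {"type": "error", "message": "Duplicate ID nav."},
        {"type": "info", "message": "Trailing slash on void elements."},
    ]
}

counts = count_by_type(sample_report)
print(counts["error"], "errors,", counts["info"], "warnings/info")
```

A per-site error count like this would make it straightforward to line up validation results against rankings across a set of competitors, rather than eyeballing the validator's HTML report.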
Intermediate & Advanced SEO | PaulDylan