Quickest way to deindex large parts of a website
-
Hey there,
my client's website was set up with subdirectories for almost every country in the world, plus multiple languages in each country. The content in each subfolder is (almost) identical. So, no surprise: they have a big problem with duplicate content and ranking fluctuations.
Since they don't want to change the site's structure, I recommended limiting the languages available in each subfolder with robots.txt. However, before doing this we marked the contents to be excluded with noindex, nofollow. It's only been 2 days now, but I hardly notice any decline in the number of indexed pages.
I was therefore wondering if it would speed things up if I marked the pages with just noindex instead of noindex and nofollow.
It would be great if you could share your thoughts on that.
Cheers,
Jochen
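For reference, the two directives involved look like this (the /fr/ path is just a placeholder). One caveat worth noting: Googlebot can only see a noindex tag on pages it is allowed to crawl, so the robots.txt block should go live only after the pages have actually dropped out of the index.

```text
<!-- on every page to be removed; keep the page crawlable until it is deindexed -->
<meta name="robots" content="noindex">

# robots.txt -- add this only AFTER the pages have dropped out of the index
User-agent: *
Disallow: /fr/
```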
-
Thanks for the hint, Dirk! I've used the tool and it works great. I even found a handy Chrome extension ("WebMaster Tools - Bulk URL removal") that made the removal of my 3,000 subdirectories very smooth and saved me about 25 hours of manual work!
-
Hi,
There was a similar question a few days ago: https://moz.com/community/q/is-there-a-limit-to-how-many-urls-you-can-put-in-a-robots-txt-file
Quote: Google Webmaster Tools has a great tool for this. If you go into WMT and select "Google Index", then "Remove URLs", you can use regex to remove a large batch of URLs, then block them in robots.txt to make sure they stay out of the index.
Dirk
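To build on the quote above, a robots.txt sketch for blocking many language subfolders at once (the paths are hypothetical; note that the * wildcard is a Google/Bing extension, not part of the original robots.txt standard):

```text
User-agent: *
# Block a given language folder under every country subdirectory,
# e.g. /us/fr/, /de/fr/, /uk/es/ ...
Disallow: /*/fr/
Disallow: /*/es/
```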
Related Questions
-
My site is being deindexed for an unknown reason
A few days ago I noticed that my site gusty.se was not showing up in Google, only the subpages. There is no message in Google Search Console. I requested the site to be reindexed, and about a day later the site was showing up in Google again. Now another day has passed and the site is again not indexed in Google. The question is: why is the site being deindexed? I have worked a bit on getting backlinks to the site, and I recently gained 3 backlinks within a few days (about a week has passed since I gained these links). Still, I can't believe Google would count this as unnatural link building, especially since I guess it will take some time for Google to detect new incoming links. Another thing I've noticed, though, is that about two weeks ago my site got a high number of incoming links from different spam sites with .gq TLDs (see the attached screenshot). The majority of these sites have, however, not linked to my main page but to a subpage, which is still indexed by Google. Can all these spam links be the reason why Google has deindexed the main page of my site? I've read that Google in general ignores links from spam sites; still, I have taken action against them by submitting a disavow text file containing all these spam domains. I submitted this file about 2 days ago. I have now again requested the site to be reindexed, so perhaps it will soon be listed again. Still, I can't keep having my site deindexed and reindexing it every second day. I would really appreciate it if someone could give me some insight into this problem.
Intermediate & Advanced SEO | Grodan21 -
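Since the post above mentions submitting a disavow file, a minimal sketch of the format Google expects (the domains are made-up examples): lines starting with # are comments, and a domain: prefix disavows every link from that domain.

```text
# Spammy .gq domains pointing at the site (hypothetical examples)
domain:spam-example-one.gq
domain:spam-example-two.gq
```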
International website. Do I need a new website?
I am looking to expand from the UK and open a location in the US. I currently have a .co.uk domain. What would you recommend I do with the website? Create a new one with a .com domain?
Intermediate & Advanced SEO | Caffeine_Marketing0 -
What is the fastest way to deindex content from Google?
Yesterday we had a client discover that our staging URLs were being indexed in Google. This was due to a technical oversight from our development team (forgot to upload meta robots tags). We are trying to remove this content as quickly as possible. Are there any methods in the Google Search Console to expedite this process? Thanks
Intermediate & Advanced SEO | RosemaryB0 -
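Beyond the Remove URLs tool in Search Console, a server-level header can cover a whole staging site without editing any templates. A sketch assuming Apache with mod_headers enabled; this must only go in the staging host's config, never production:

```apache
# Staging vhost only -- tells crawlers to drop every response from the index
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```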
Htaccess Question - Is this the way to go to consolidate?
Hi all, My site seems to have www.xyz.com, http://www.xyz.com, http://xyz.com and other variations, left over from an old agency's setup, all showing differing backlinks etc. So I want to merge them so I can just look at one analytics account, with everything combined. I want to consolidate everything to https://www.xyz.com as the client wants. How do I do this? Does it take long to take effect? Also, I presume I'll have to set up the preferred domain in Webmaster Tools? Thanks very much for any advice 🙂
Intermediate & Advanced SEO | VMLQais0 -
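A sketch of the .htaccess rule the question describes, assuming Apache with mod_rewrite and using xyz.com as the placeholder domain: any request that is not already on https://www.xyz.com gets a single 301 to the canonical form.

```apache
RewriteEngine On
# Redirect http://, bare-domain, or mixed variants to https://www.xyz.com
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\.xyz\.com$ [NC]
RewriteRule ^ https://www.xyz.com%{REQUEST_URI} [R=301,L]
```

The redirects take effect immediately; how long search engines take to consolidate the variants varies, typically weeks.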
URLs in bilingual websites
1- I have a bilingual website. Suppose that I am targeting a page for the keyword "book" and I have included it in that page's URL for the English version: www.abc.com/book. Can I use the translation of "book" in the second-language URL instead of "book"? Please let me know which of the following URLs is right for the French version: www.abc.com/fr/book or www.abc.com/fr/livre (livre = book in French). 2- Does Google have any tool to check if the second-language page of the website has exactly the same content as the English version? What I want to do is, for example: for a certain page in the English version my targeted keyword is "book", so my content would be about books. But in the French version of this page, I want to focus on the keyword "pencil" in French instead of "book". Is this wrong, or are there any consequences? That was the main reason for question number one. Because if it is OK to do what I explained in item 2, then I will set my URLs like: English: www.abc.com/book; French: www.abc.com/fr/crayon (crayon = pencil in French).
Intermediate & Advanced SEO | AlirezaHamidian0 -
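On question 1, either slug can work as long as the language versions reference each other with hreflang annotations, so Google treats them as alternates rather than duplicates. A sketch using the URLs from the post:

```html
<!-- in the <head> of both language versions -->
<link rel="alternate" hreflang="en" href="http://www.abc.com/book" />
<link rel="alternate" hreflang="fr" href="http://www.abc.com/fr/livre" />
```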
Redirecting Canonical 301s and Magento Website
I have an issue with a client's website where it has 3,700+ pages, but roughly half of them are duplicates. Thankfully, the only difference between the originals and the duplicates is the "?print" at the end of each URL (I suppose this is Magento's way of making a printable version of the same page. I don't know, I didn't build it.) My question is, how can I get all the pages like this http://www.mycompany.com/blah.html?print to redirect to pages like this... http://www.mycompany.com/blah.html Also, do they NEED to be canonical, or will a 301 redirect be sufficient? Also, after having done this, if anybody knows, is there a way I can turn that feature off in Magento? Because we're expanding our product line, and I don't want to have to keep chasing after these "?print" pages after the fact.
Intermediate & Advanced SEO | ClifThompson0 -
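A sketch of the redirect the question asks for, assuming the site runs on Apache with mod_rewrite: it 301s any URL whose query string is exactly "print" back to the clean URL (the trailing ? in the rule drops the query string).

```apache
RewriteEngine On
RewriteCond %{QUERY_STRING} ^print$ [NC]
RewriteRule ^(.*)$ /$1? [R=301,L]
```

A rel=canonical on the ?print version would also consolidate the signals, but the 301 removes the duplicate URLs outright.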
URL parameters not present on the website are being monitored by Google
I was checking the URL Parameters section in Google Webmaster Tools. Google has detected the following parameters and excluded them from crawling: utm_campaign, utm_medium, utm_source. I have built URLs with the following tool to track visits from vertical search engines like Google Shopping and other comparison shopping engines: http://www.google.com/support/analytics/bin/answer.py?answer=55578 So I am quite confused by this data. Will Google consider external URLs that carry the above parameters, or do the parameters need to exist on the live website? Note: I am asking about my eCommerce website, http://www.lampslightingandmore.com/
Intermediate & Advanced SEO | CommercePundit0 -
Best way to find all url parameters?
In reference to http://googlewebmastercentral.blogspot.com/2011/07/improved-handling-of-urls-with.html, what is the best way to find all of the parameters that need to be addressed? Thanks!
Intermediate & Advanced SEO | nicole.healthline0
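One hedged approach, not taken from the linked post: collect every URL the crawler has requested (from an access log or a crawl export) and extract the distinct query-parameter names with the standard library. A minimal sketch:

```python
from urllib.parse import urlsplit, parse_qsl

def collect_params(urls):
    """Return the sorted set of query-parameter names seen across a list of URLs."""
    params = set()
    for url in urls:
        # keep_blank_values catches valueless parameters like "?print"
        for name, _value in parse_qsl(urlsplit(url).query, keep_blank_values=True):
            params.add(name)
    return sorted(params)

if __name__ == "__main__":
    sample = [
        "http://example.com/page?id=1&sort=asc",
        "http://example.com/page?id=2&utm_source=feed",
        "http://example.com/page.html?print",
    ]
    print(collect_params(sample))  # ['id', 'print', 'sort', 'utm_source']
```

The resulting list is exactly what needs reviewing in the URL Parameters tool.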