Quickest way to deindex large parts of a website
-
Hey there,
My client's website was set up with subdirectories for almost every country in the world, plus multiple languages within each country. The content in each subfolder is (almost) identical, so no surprise: they have a big problem with duplicate content and ranking fluctuations.
Since they don't want to change the site's structure, I recommended limiting the languages available in each subfolder with robots.txt. Before doing this, however, we marked the content to be excluded with noindex, nofollow. It's only been 2 days, but I hardly notice any decline in the number of indexed pages.
I was therefore wondering whether it would speed things up if I marked the pages with just noindex instead of noindex, nofollow.
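For reference, these are the standard meta robots tags we're talking about, placed in the <head> of each page to be excluded - roughly:

```html
<!-- keeps the page out of the index, but still lets crawlers follow its links -->
<meta name="robots" content="noindex">

<!-- keeps the page out of the index and tells crawlers not to follow its links -->
<meta name="robots" content="noindex, nofollow">
```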
It would be great if you could share your thoughts on that.
Cheers,
Jochen
-
Thanks for the hint, Dirk! I've used the tool and it works great. I even found a handy Chrome extension ("WebMaster Tools - Bulk URL removal") that made the removal of my 3,000 subdirectories very smooth and saved me about 25 hours of manual work!
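For anyone facing the same task: if your removal tool accepts a plain list of URLs, a tiny script can generate the subfolder prefixes for you. A minimal sketch - the domain and folder names below are made up:

```python
# Minimal sketch (hypothetical domain and folder structure) for generating
# the list of subdirectory URLs to feed into a bulk URL-removal tool.
from itertools import product

BASE = "https://www.example.com"            # placeholder domain
countries = ["us", "de", "fr", "es", "it"]  # in reality ~190 country folders
languages = ["en", "de", "fr", "es", "it"]  # languages duplicated in each country

with open("urls_to_remove.txt", "w") as out:
    for country, lang in product(countries, languages):
        # one entry per country/language subfolder, e.g. https://www.example.com/us/de/
        out.write(f"{BASE}/{country}/{lang}/\n")
```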
-
Hi,
There was a similar question a few days ago: https://moz.com/community/q/is-there-a-limit-to-how-many-urls-you-can-put-in-a-robots-txt-file
Quote: Google Webmaster Tools has a great tool for this: if you go into WMT and select "Google Index", then "Remove URLs", you can use regex to remove a large batch of URLs, then block them in robots.txt to make sure they stay out of the index.
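For the robots.txt part, something along these lines - the folder names are just placeholders for whatever country/language combinations you're dropping:

```
User-agent: *
# block the unwanted language folders within each country directory
Disallow: /us/de/
Disallow: /us/fr/
Disallow: /de/en/
```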
Dirk
-
Related Questions
-
Large robots.txt file
We're looking at potentially creating a robots.txt with 1,450 lines in it. This would remove 100k+ pages from the crawl, all of them old pages (I know the ideal would be to delete/noindex them, but unfortunately that's not viable). The issue I'm thinking of is that a robots.txt that large will either stop the robots.txt from being followed or will slow our crawl rate down. Does anybody have any experience with a robots.txt of that size?
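If it helps, Google and Bing support * and $ pattern matching in robots.txt, so where the old URLs share patterns a few wildcard rules could stand in for hundreds of literal lines - a sketch with made-up paths:

```
User-agent: *
# one pattern can cover many individual old-page entries
Disallow: /archive/*/print/
Disallow: /*?sessionid=
Disallow: /old-catalogue/*.pdf$
```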
Intermediate & Advanced SEO | ThomasHarvey | 0 -
Two websites vs each other owned by same company
My client owns a brand and came to me with two ecommerce websites. One website sells his specific branded product and the other sells general products in his niche (including his branded product). The question is that my client wants to rank each website for basically the same set of keywords. We have two choices I'd like feedback on. Choice 1 is to rank both websites for the same keyword groupings, so that even if they are both on page 1 of the SERPs they take up more real estate and share of voice - are there any negative possibilities here? Choice 2 is to recommend a shift in the positioning of the general industry website, moving it further away from the industry niche by focusing on different keywords so the two sites don't compete with each other in the SERPs. I'm for choice 1 - what about you?
Intermediate & Advanced SEO | Rich_Coffman | 0 -
University website outbound links issue
Hi - I'm working on a university website and have found a load of (1) outbound links to companies that have commercial tie-ups with the university and, beyond that, loads of (2) outbound links to companies set up by alumni and (3) outbound links to commercial clients of the university. Your opinions on whether I should nofollow these, or not, would be welcome. At the moment I'm tempted to nofollow (1) yet leave (2) and (3) as they are - quite simply because the (1) links may have been negotiated as part of a package (nobody at the university can actually remember!), while (2) and (3) were given freely by the university. Your thoughts would be welcome!
Intermediate & Advanced SEO | McTaggart | 0 -
Need help with a new website!
I want to make a new website. Can you please advise me on everything I should keep in mind before and during the site's preparation - for example, how to make pages, what to include on the website, and the best way to create pages? Please point me to a link where I can study all of the above. I am planning to create a global printing website.
Intermediate & Advanced SEO | AlexanderWhite | 0 -
Which part of SEO takes the most time?
Which part of SEO do you think will take the most time, and why?
Intermediate & Advanced SEO | marknorman | 0 -
Purpose of a blog on a website
How is an internal or external blog helpful for SEO? Why is it good to have a site with a blog?
Intermediate & Advanced SEO | Alick300 | 0 -
Google does not target my website properly!
Hello everyone, My website, www.pentrucadouri.ro, despite being a .ro domain with Romanian content and hosted in Romania, appears to google.ro as an English-targeted website. Google sees the internal pages as Romanian ones, but the main page as English. In order to change this, I added: Also, a few days ago I uploaded a geo sitemap and submitted it to Google. Do you have suggestions? The website ranks 2nd for "cosuri cadou" on google.com and 3rd on Bing, but on google.ro it ranks 11th.
Intermediate & Advanced SEO | VertiStudio | 0 -
Why would the PageRank for all of our websites show the same?
The last time I checked (early this year), the PageRank of the sites I manage varied, with the highest showing as 6. That made sense, as the PR 6 site has loads of links and has been around for a long time, whereas the other sites haven't. Now all of our websites are showing the same PageRank - 6 - even one that has recently launched and another that has barely any links or traffic pointing to it. I didn't check the PR of that one last time (I'd be surprised if it was 2), but the sites now showing as 6 ranged from PR 3 to PR 6 back then. We changed servers in February... so could this issue be something to do with all of the sites being hosted on the same server? It doesn't seem right, but it's the only thing I can think of. At the moment, the Domain Authority for these six websites ranges from 27 to 62.
Intermediate & Advanced SEO | Alex-Harford | 0