Quickest way to deindex large parts of a website
-
Hey there,
My client's website was set up with subdirectories for almost every country in the world, plus multiple languages within each country. The content in each subfolder is (almost) identical, so, no surprise: they have a big problem with duplicate content and ranking fluctuations.
Since they don't want to change the site's structure, I recommended limiting the languages available in each subfolder via robots.txt. Before doing that, however, we marked the content to be excluded with noindex, nofollow. It's only been two days, but I've hardly noticed any decline in the number of indexed pages.
I was therefore wondering whether things would speed up if I marked the pages with just noindex instead of noindex, nofollow.
It would be great if you could share your thoughts on that.
Cheers,
Jochen
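For reference, here is a minimal sketch of the two on-page directive variants being weighed in this question (shown as standard robots meta tags; the X-Robots-Tag header in the comment is the equivalent for non-HTML responses):

```html
<!-- What the pages currently carry: excluded from the index, links not followed -->
<meta name="robots" content="noindex, nofollow">

<!-- The variant being considered: excluded from the index, but links still crawled -->
<meta name="robots" content="noindex">

<!-- Equivalent directive sent as an HTTP response header (useful for non-HTML files):
     X-Robots-Tag: noindex -->
```

Either way, a crawler can only act on a noindex directive on pages it is still allowed to fetch, so the planned robots.txt block is best added only after the pages have actually dropped out of the index.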
-
Thanks for the hint, Dirk! I've used the tool and it works great. I even found a handy Chrome extension ("WebMaster Tools - Bulk URL removal") that made the removal of my 3,000 subdirectories very smooth and saved me about 25 hours of manual work!
-
Hi,
There was a similar question a few days ago: https://moz.com/community/q/is-there-a-limit-to-how-many-urls-you-can-put-in-a-robots-txt-file
Quote: Google Webmaster Tools has a great tool for this. Go into WMT and select "Google Index", then "Remove URLs". You can use regex to remove a large batch of URLs, then block them in robots.txt to make sure they stay out of the index.
Dirk
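To illustrate the "block them in robots.txt" step from that quote, here is a minimal sketch for the scenario in the question above. The /country/language/ folder pattern and the language codes are hypothetical placeholders, not the client's actual structure:

```
# robots.txt sketch (hypothetical paths): block the duplicate language variants
# inside every country folder with wildcard rules (which Googlebot supports),
# instead of listing thousands of subdirectories one by one.
User-agent: *
Disallow: /*/de/
Disallow: /*/fr/
Disallow: /*/es/
# The default language of each country folder stays crawlable.
```

Note that the Remove URLs tool only hides pages temporarily, which is why the quoted answer pairs it with a robots.txt block to keep the URLs out of the index afterwards.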
Related Questions
-
What is the best way to structure website URLs?
Hi, can anyone help me understand whether having a category folder in the URL matters or not? How does Google treat a URL? For example, I have the URL www.protoexpress.com/pcb/certification, but I'm not sure whether Google will treat it as a whole or in separate parts. If it's treated in separate parts, is it safe to use pcb/pcb-certification, or would that be considered keyword stuffing? Thank you in anticipation,
Intermediate & Advanced SEO | SierraPCB -
Change of URLs - Part of Migration
We are looking to change our URLs to the format /SKU/TITLE/COLOUR as part of our SEO migration, e.g. https://example.com.au/ac-rck-b/rolla-crew-knit/berry.html. At the moment, our URLs are TITLE/NO, e.g. https://example.com.au/rolla-crew-knit/6562563.html (Shopify is creating a random number at the end of the URL to represent a different colour). Is this fine SEO-wise? Will this affect rankings and user experience?
Intermediate & Advanced SEO | christwix -
Website copying in Tweets from Twitter
Just noticed a web developer I work with has been copying tweets into the website - and these are displayed (and saved) one page at a time across hundreds of pages (this is so they can populate a Twitter feed, I am told). How would you tackle this, now that the deed's been done? This is in Drupal. Your thoughts would be welcome, as this is a new one to me. Thanks, Luke
Intermediate & Advanced SEO | McTaggart -
University website outbound links issue
Hi - I'm working on a university website and have found a load of (1) outbound links to companies that have commercial tie-ups with the university and, beyond that, loads of (2) outbound links to companies set up by alumni and (3) outbound links to commercial clients of the university. Your opinions on whether or not I should nofollow these would be welcome. At the moment I'm tempted to nofollow (1) yet leave (2) and (3) - quite simply because the (1) links may have been negotiated as part of a package (nobody at the university can actually remember!), yet (2) and (3) were freely given by the university. Your thoughts would be welcome!
Intermediate & Advanced SEO | McTaggart -
Looking for help with my website
Hi, does anyone know of a good SEO company that will get results, i.e. fix site issues and get the site improving in the SERPs?
Intermediate & Advanced SEO | Taiger -
Best way to link 150 websites together
Fellow mozzers, today I got an interesting question from an entrepreneur who plans to start about 100-200 webshops on a variety of subjects. His question was how he should link them together. He was scared that if he simply made a page on every website, such as www.domain.com/our-webshops/, listing all of the webshops, he would get penalised for building a link farm. I wasn't 100% sure what advice to give him, so I told him I needed to do some research on the subject to make sure I'm right. I had a couple of suggestions myself. 1. Split the webshops into three columns: column A links to B, B links to C, and C links to A. I realise this is far from ideal, but it was one of the thoughts that came up. 2. Divide all the webshops into different categories, for example webshops aimed at different holidays, webshops aimed at mobile devices, etcetera. This way you link the relevant webshops together instead of all of them. Still not perfect. 3. Create a page on a separate website (such as a company website) where the /our-webshops/ page lives, so each webshop only has to link back to that one page. I've seen lots of webshops using this technique and I can see why they choose to do so. Still not ideal in my opinion. Those are basically my first thoughts on the subject. I would appreciate any feedback on the methods described above or, even better, a completely different strategy for handling this. For some reason I keep thinking I'm missing the most obvious and best method. 🙂
Intermediate & Advanced SEO | WesleySmits -
Websites with same content
Hi, both my .co.uk and .ie websites have exactly the same content, which consists of hundreds of pages. Is this going to cause an issue? I have hreflang on both websites, plus Google Webmaster Tools is picking up that both websites are targeting different countries. Thanks
Intermediate & Advanced SEO | Paul78 -
Export Website into XML File
Hi, I am having an agency optimize the content on my sites. I need to create an XML schema before I export the content into XML. What is the best way to export content, including meta tags, for an entire site, and what are the steps for doing so?
Intermediate & Advanced SEO | Melia