Removing URLs in bulk when directory exclusion isn't an option?
-
I had a bunch of URLs on my site that followed the form:
http://www.example.com/abcdefg?q=&site_id=0000000048zfkf&l=
There were several million pages, each associated with a different site_id. They weren't very useful, so we've removed them entirely and they now return a 404.
The problem is, they're still stuck in Google's index. I'd like to remove them manually, but how? There's no proper directory (i.e. /abcdefg/) to remove, since there's no trailing slash, and removing them one by one isn't an option. Is there any other way to approach the problem, or to specify URLs in bulk?
Any insights are much appreciated.
Kurus
-
I'd go into Google Webmaster Tools' parameter settings and tell Google to ignore the site_id parameter.
I'd need to look up the exact syntax, but Google does accept wildcard patterns and parameter matching in robots.txt, so you may be able to block these URLs there and then use the URL removal tool.
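For what it's worth, here's a minimal robots.txt sketch of that idea, assuming the /abcdefg path and site_id parameter from your example. Googlebot supports the * wildcard, but this exact rule is an assumption about your URL structure, so test it with the robots.txt checker in Webmaster Tools before relying on it:

    User-agent: Googlebot
    # Block every variant of /abcdefg that carries a site_id parameter
    Disallow: /abcdefg?*site_id=

One caveat: once the pattern is blocked, Googlebot stops recrawling those URLs and never sees your 404s, so you'd be relying on the robots.txt block plus the removal tool, rather than the 404s, to get them out of the index.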
-
There are no links to these pages, so no juice. There are also no 'new' replacement pages. We just want them out of the index ASAP by any means necessary.
-
You should have 301'd your most important pages to the new URLs, so that you'd have kept your link juice.
-
Thanks, but the goal is to expedite the removal process via the URL removal tool. We've already 404'd the pages, so they'll drop out of the index eventually. It's a question of timing, since the pages in question are low quality and are hurting us in the context of Panda.
-
Try a 301 redirect for the most important links: http://www.seomoz.org/learn-seo/redirection
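For example, a pattern-based redirect can handle all of these URLs in one rule, so you'd never do it page by page. A hypothetical Apache .htaccess sketch (mod_rewrite assumed, and /replacement-page/ is a placeholder, since Kurus noted there are no actual replacement pages):

    RewriteEngine On
    # Match any request for /abcdefg whose query string contains a site_id parameter
    RewriteCond %{QUERY_STRING} (^|&)site_id= [NC]
    # 301 it to a single stand-in page; the trailing ? drops the old query string
    RewriteRule ^abcdefg$ /replacement-page/? [R=301,L]

That said, if there are genuinely no links to these pages and no replacement pages, the 404s plus removal requests are the more direct route.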