How to properly remove pages and a category from Google's index
-
I want to remove this category, http://www.webdesign.org/web-design-news-all/, and all the pages in it (e.g. http://www.webdesign.org/web-design-news-all/7386.html) from Google's index.
I used the following string in the "Remove URLs" section of Google Webmaster Tools:
http://www.webdesign.org/web-design-news-all/*
Is that correct, or would I be better off using http://www.webdesign.org/web-design-news-all/ ?
Thanks in advance.
-
Thanks for your replies, guys. Now I know how to proceed.
-
In addition, I would add the "noindex" robots meta tag to the pages.
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93710
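For reference, that tag is a one-liner in the <head> of each page you want dropped (a minimal sketch, to be adapted to your own templates):
<meta name="robots" content="noindex">
Note that Googlebot has to be able to crawl a page to see the tag, so don't also block those URLs in robots.txt if you're relying on noindex to do the de-indexing.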
-
It sounds like you want to de-index them.
Go into your robots.txt and disallow Google from crawling the category.
It will look something like this:
User-agent: Googlebot
Disallow: /web-design-news-all/

(Use User-agent: * instead if you want the rule to apply to all crawlers, not just Google.)
-
Are you removing the pages or just trying to de-index them? If you're removing them, make sure the old URLs return a 404 so they can't be restored to the index. If you're just trying to de-index them, make sure they are excluded by robots.txt or a noindex meta tag.
Here's the WMT Help page on the topic:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663427
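If you do go the removal route, the 404s can be set up at the server level. On an Apache server, for example, a single rule along these lines would make the whole category return a 404 (a rough sketch only, since the actual server setup isn't known from the question):
RedirectMatch 404 ^/web-design-news-all/
On other stacks (nginx, IIS, a CMS) the equivalent rule will look different, but the idea is the same: every URL under the category returns a 404 or 410 instead of content.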
Related Questions
-
Number of internal links and passing 'link juice' down to key pages.
Howdy Moz friends. I've just been checking out this post on Moz from 2011 and wanted to know how relevant it is today. I'm particularly interested in whether the number of links we have on our homepage could potentially be harming important landing page rankings because not enough 'link juice' is getting to them, i.e. are they being diluted by all the many other links on the page (deeper pages, FAQs, etc.)? It seems strange to me that, as Google has got more sophisticated, this would still be that relevant (thus the reason for posting). Anyway, I thought it was definitely worth asking. If we can leverage more out of our on-page efforts then great 🙂
On-Page Optimization | isaac6630
-
How to Structure URLs for Multiple Locations
We are currently undergoing a site redesign and are trying to figure out the best way to structure the URLs and breadcrumbs for our many locations. We currently have 60 locations nationwide and our URL structure is as follows: www.mydomain.com/locations/{location}, where {location} is the specific street the location is on or the neighborhood the location is in (i.e. www.mydomain.com/locations/waterford-lakes). The issue is that {location} is usually too specific and is not a broad enough keyword. The location "Waterford Lakes" is in Orlando, and "Orlando" is the important keyword, not "Waterford Lakes". To address this, we want to introduce state and city pages. Each state and city page would link to each location within that state or city (i.e. an Orlando page with links to "Waterford Lakes", "Lake Nona", "South Orlando", etc.). The question is how to structure this.
Option 1: Keep our existing URL and breadcrumb structure (www.mydomain.com/locations/{location}) and add state and city pages outside that path: www.mydomain.com/{area} and www.mydomain.com/{state}
Option 2: Build the city and state pages into the URL and breadcrumb path: www.mydomain.com/locations/{state}/{area}/{location} (i.e. www.mydomain.com/locations/fl/orlando/waterford-lakes)
Any insight is much appreciated. Thanks!
On-Page Optimization | uBreakiFix0
-
I have more pages in my sitemap being blocked by the robots.txt file than I have being allowed to be crawled. Is Google going to hate me for this?
We're using some rules to block all pages that start with "copy-of" on our website, because people have a bad habit of duplicating new product listings to create our refurbished, surplus, etc. listings for those products. To avoid Google seeing these as duplicate pages I've blocked them in the robots.txt file, but of course they are still automatically generated in our sitemap. How bad is this?
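For anyone following along, the kind of rule described would look roughly like this in robots.txt (assuming the duplicated listings sit directly under the site root, which the question doesn't confirm):
User-agent: *
Disallow: /copy-of
Disallow works as a prefix match, so this blocks every path that begins with /copy-of.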
On-Page Optimization | absoauto0
-
Long URLs
So I'm super new at SEO and learning a lot. I'm a small business owner and enjoy doing it myself. Are long URLs good or bad? Like this: http://www.farnorthkennel.com/german-shepherd-puppies-the-girls/long-haired-german-shepherd-puppies-lava Is that too long? The german-shepherd-puppies-the-girls part is an actual page with actual content. Do those hurt me?
On-Page Optimization | Joshlaska0
-
Why aren't on-page reports generated for all of my keywords?
I have 39 targeted keywords, yet only 10 on-page reports are generated. My site has about 100 pages. Why don't I see reports for all of my pages?
On-Page Optimization | mynton0
-
Would removing highly dynamic pages through nofollow help or hurt?
We have a sub-domain that is hosted by a third party. These pages are highly dynamic (they change daily or more often) as they are product search results. Unfortunately they are raising several errors and warnings, including duplicate page content, missing or empty titles, long URLs, and overly dynamic URLs. Would putting nofollow on the links to this sub-domain help, hurt, or not affect PageRank? As an example, links in the middle of this page (prices) http://targetvacations.ca go to a page such as http://travel.targetvacations.ca/cgi-bin/resultadv.cgi?id=16294922&code_ag=tgv&alias=tgv which is then redirected to a dynamic URL and presents the results.
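For context, the change being weighed there is just a rel attribute on those price links. A generic sketch (the anchor text here is made up, not the site's actual markup):
<a href="http://travel.targetvacations.ca/cgi-bin/resultadv.cgi?id=16294922&code_ag=tgv&alias=tgv" rel="nofollow">View prices</a>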
On-Page Optimization | TSDigital0
-
What URL should I use in a Google Place page?
Alright, I have a client that has one website and 14 locations. We want to create Place pages for each of their locations, but my question is which URL should I put in each Place page, and why? I could put the root domain into each Place page, or should I put in the URL that lands on the actual location page within the root domain? Example: domain.com/location1. Thanks!
On-Page Optimization | tcseopro0
-
Another SEO's point of view
Hiya fellow SEOs. I have been working on a site, www.hplmotors.co.uk, and I must say it has become difficult due to flaws with the content management system. We are speaking with the website makers about being able to add a unique title and description to all pages. I know what is wrong, but I would also like some second opinions and welcome any suggestions for the site. A burnt-out SEO 🙂 thanks
On-Page Optimization | onlinemediadirect0