Removing a Page From Google's Index
-
We accidentally generated some pages on our site that ended up getting indexed by Google. We have corrected the issue on the site, and those pages now return a 404. Should we manually delete the extra pages from Google's index, or should we just let Google figure out that they are 404'd? What's the best practice here?
-
Thanks Ryan. I fully understand what you are saying and will be careful while making the change.
-
Hi Atul,
Generally speaking, I am uncomfortable advising others on specific changes to the .htaccess file. If you make even a slight error while working with the file, your site security can be compromised, not to mention your SEO. There are also many factors to consider, such as which mods are enabled on your particular server, along with other configuration issues. Lastly, the order in which your code is placed in the file can affect its operation, so it's not like adding a meta tag to the head section of an HTML document.
If you are on managed hosting, my recommendation is to ask your web host to make the change. If you are not on managed hosting, I recommend asking the developer who manages the site to make the change.
If you still insist on making the change yourself, try
Redirect gone /ABC/xyz.html
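To cover both of your examples, here is a minimal sketch, assuming mod_alias is enabled on your server and treating the filenames as placeholders. The path must start with a slash and is written relative to the document root, so "Redirect gone xyz.html" without a leading slash will not work, and "/xyz/" would address a directory rather than the page:
# xyz.html sitting in the site root
Redirect gone /xyz.html
# xyz.html inside a folder named ABC
Redirect gone /ABC/xyz.html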
-
After reading your answer, I searched for methods on how to generate a 410 error.
Let's say I want to remove a page named xyz.html.
Which of the following entries in .htaccess is correct:
Redirect gone xyz.html
or
Redirect gone /xyz/
If xyz.html were in a folder named ABC, would this be correct:
Redirect gone /ABC/xyz.html
Thanks
-
Thanks everyone! We are just going to leave it as is. Google will eventually flush it out. Ryan - because of the 90 days we can't remove the URLs; I will need them back sooner than that, when we actually put products in those states. Thanks again! Helpful, as usual!
-
Remove URL tool will just expedite the inevitable. There is no downside in doing so.
I agree with everything you shared, Esko, up to this point. Aside from the time spent to remove the page, there is another downside: the URL you remove will not appear in SERPs again for 90 days after being manually removed.
If your URL was mysite.com/blue-widgets, then your site will not have another /blue-widgets page listed again for 90 days. I can share that it is a headache as an SEO trying to figure out why a page is not being indexed, only to learn later that I did not ask all the right questions, e.g. "Prior to hiring my services, have you or anyone with access to your WMT account used the URL Removal tool within the past 90 days?". That otherwise obscure question is now asked regularly of my clients. Painful lesson.
Also, I wanted to share another helpful link I located from Google: When NOT to use the URL Removal tool.
-
Google will completely drop the page from the index after the next time they crawl it. Using the Remove URLs tool in Google Webmaster Tools will only expedite removal.
Best practice is to 404 (Not Found) or 410 (Gone) the page first of all.
Remove URL tool will just expedite the inevitable. There is no downside in doing so.
-
Should we manually delete the extra pages from Google's index or should we just let Google figure out that they are 404'd? What's the best practice here?
The best practice would be to generate a 410 error (GONE) for the pages and Google will remove them from their index fairly quickly.
The next best practice would be to leave the pages as 404s and Google will still remove them from their index but it will take a bit longer.
A 410 is used to inform Google and others the page is definitely gone. A 404 merely states the page is unavailable now. It could be available later.
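If the removed pages share a common path, one way to return that 410 is a pattern rule in .htaccess. This is only a hedged sketch, assuming Apache with mod_alias enabled and using /old-section/ as a placeholder path:
# Everything under the removed section returns 410 Gone
RedirectMatch gone ^/old-section/
Anything with no rule and no matching file keeps returning the server's default 404, which Google will also act on, just more slowly.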
The removal tool should only be used if it is a major concern for the search result to appear in SERPs. An example would be if confidential information was leaked.
-
I think it's always good to let Google know, as they might remove it sooner. But there's no guarantee either way. Though if you can, you should 301 your content to a new/similar page rather than just letting it 404.
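For example, a minimal .htaccess sketch of such a 301, assuming mod_alias is enabled and with placeholder paths:
# Point the removed page at its closest replacement
Redirect 301 /old-page.html /new-similar-page.html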
-
I understand HOW to remove a page. I want to know whether it's better for me to manually remove it or let Google remove it on its own.
-
Remove a page from Google's Index
Use Google Webmaster Tools www.google.com/webmasters/tools/
http://www.google.com/support/webmasters/bin/answer.py?answer=1663419
Related Questions
-
How long to re-index a page after being blocked
Morning all! I am doing some research at the moment and am trying to find out, just roughly, how long you have ever had to wait to have a page re-indexed by Google. For this purpose, say you had blocked a page via meta noindex or disallowed access by robots.txt, and then opened it back up. No right or wrong answers, just after a few numbers 🙂 Cheers, -Andy
Intermediate & Advanced SEO | Andy.Drinkwater
-
Old pages STILL indexed...
Our new website has been live for around 3 months and the URL structure has completely changed. We weren't able to dynamically create 301 redirects for over 5,000 of our products because of how different the URLs were, so we've been redirecting them as and when. 3 months on and we're still getting hundreds of 404 errors daily in our Webmaster Tools account. I've checked the server logs and it looks like Bing Bot still seems to want to crawl our old /product/ URLs. Also, if I perform a "site:example.co.uk/product" on Google or Bing, lots of results are still returned, indicating that both still haven't dropped them from their index. Should I ignore the 404 errors and continue to wait for them to drop off, or should I just block /product/ in my robots.txt? After 3 months I'd have thought they'd have naturally dropped off by now! I'm half-debating this:
User-agent: *
Disallow: /some-directory-for-all/*

User-agent: Bingbot
User-agent: MSNBot
Disallow: /product/

Sitemap: http://www.example.co.uk/sitemap.xml
Intermediate & Advanced SEO | LiamMcArthur
-
Google Indexing of Images
Our site is experiencing an issue with indexation of images. The site is real estate oriented. It has 238 listings with about 1,190 images. The site submits two versions (different sizes) of each image to Google, so there are about 2,400 images. Only several hundred are indexed. Can adding Microdata improve the indexation of the images? Our site map is submitting images that are on no-index listing pages to Google. As a result, more than 2,000 images have been submitted but only a few hundred have been indexed. How should the site map deal with images that reside on no-index pages? Do images that are part of pages that are set up as "no-index" need a special "no-index" label or special treatment? My concern is that so many images not being indexed could be a red flag suggesting poor quality content to Google. Is it worth investing in correcting this issue, or will correcting it result in little to no improvement in SEO? Thanks, Alan
Intermediate & Advanced SEO | Kingalan1
-
Google + pages and SEO results...
Hi, Can anyone give me insight into how people are getting away with naming their business after the SEO search term, creating a BS Google + page, then having that page rank high in the search results? I am speaking specifically about the results you get when you Google "Los Angeles DUI Lawyer". As you can see from my attached screenshot (I'm doing the search in Los Angeles), the FIRST listing is a Google + business. Strangely, the phone number listed doesn't actually take you to a DUI attorney, but rather to some marketing group that never answers the phone. Can anyone give me insight into why Google even allows this? I just find it odd that Google cares so much about the user experience, but allows the first result to be something completely misleading. I know it sounds like I'm just jealous (which I am, a little), but I find it disheartening that we work so hard on SEO, and someone takes the top spot with an obvious BS page.
Intermediate & Advanced SEO | mrodriguez1440
-
Are links that are disavowed with Google Webmaster Tools removed from the Google Webmaster Profile for the domain?
Hi, Two-part question - First, are links that you disavow using Google Webmaster Tools ever removed from the Webmaster Tools account profile? Second, when you upload a file to disavow links, they ask if you'd like to replace the previously uploaded file. Does that mean if you don't replace the file with a new file that contains the previously uploaded URLs, those URLs are no longer considered disavowed? So, should we download the previous disavow file first, then append the new disavow URLs to the file before uploading, or should we just upload a new file that contains only the new disavow URLs? Thanks
Intermediate & Advanced SEO | bgs
-
Google+ Local pages under review. How long does this take?
I have a couple Google+ Local pages that have been placed under review. Does anyone have experience regarding the time frame of this review process? Google says to give it a few weeks, but one page has been under review for four weeks now. How long should I wait for Google to review them before I delete the page and start over?
Intermediate & Advanced SEO | VentaMarketing
-
How to have pages re-indexed
Hi, my hosting company blocked one of my web sites because it had a performance problem. As a result of that, it is now reactivated but my pages had to be reindexed. I have added my web site to Google Webmaster Tools and I have submitted my site map. After a few days it is saying: 103 URLs provided, 39 URLs indexed. I know Google doesn't promise to index every page, but do you know any way to increase my chance of getting all my pages indexed? By the way, that site includes pages and posts (blog). Thanks for your help! Nancy
Intermediate & Advanced SEO | EnigmaSolution
-
Will Google Revisit a 403 Page
Hi, We've got some pretty strict anti-scraping logic in our website, and it seems we accidentally snared a Googlebot with it. About 100 URL requests were responded to with a 403 Forbidden error. The logic has since been updated, so this should not happen again. I was just wondering if/when Googlebot will come back and try those URLs again. They are linked from other pages on the site, and they are also in our sitemap. Thanks in advance for any assistance.
Intermediate & Advanced SEO | dbuckles