Removing a Page From Google's Index
-
We accidentally generated some pages on our site that ended up getting indexed by Google. We have corrected the issue on the site, and we now return a 404 for all of those pages. Should we manually delete the extra pages from Google's index, or should we just let Google figure out that they are 404'd? What's the best practice here?
-
Thanks Ryan. I fully understand what you are saying and will be careful while making the change.
-
Hi Atul,
Generally speaking, I am uncomfortable advising others on specific changes to the .htaccess file. If you make even a slight error while working with the file, your site security can be compromised, not to mention your SEO. There are also many factors to consider, such as which mods are enabled on your particular server, along with other configuration issues. Lastly, the order in which your code is placed in the file can affect its operation, so it's not like adding a meta tag to the head section of an HTML document.
If you are on managed hosting, my recommendation is to ask your web host to make the change. If you are not on managed hosting, I recommend asking the developer who manages the site to make the change.
If you still insist on making the change yourself, try:
Redirect gone /ABC/xyz.html
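That one line is a minimal sketch of the idea. For completeness, here are the common variations, assuming Apache with mod_alias enabled and using the hypothetical ABC/xyz.html paths from your question:
# return 410 (Gone) for a single file at the site root
Redirect gone /xyz.html
# return 410 for a single file inside the folder ABC
Redirect gone /ABC/xyz.html
# return 410 for everything under the folder ABC (RedirectMatch takes a regular expression)
RedirectMatch gone ^/ABC/
Note that the path begins with a slash and is given relative to the site root, which is why "Redirect gone xyz.html" without the leading slash is not the form you want.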
-
After reading your answer, I searched for methods on how to generate a 410 error.
Let's say I want to remove a page named xyz.html.
Which of the following entries in .htaccess is correct:
Redirect gone xyz.html
or
Redirect gone /xyz/
If xyz.html were in a folder named ABC,
would the correct entry be:
Redirect gone /ABC/xyz.html
Thanks
-
Thanks everyone! We are just going to leave it as is; Google will eventually flush it out. Ryan - because of the 90 days, we can't remove the URLs. I will need them back sooner than that, when we actually put products in those states. Thanks again! Helpful, as usual!
-
The Remove URLs tool will just expedite the inevitable. There is no downside in doing so.
I agree with everything you shared, Esko, up to this point. Aside from the time spent to remove the page, there is another downside: the URL you remove will not appear in SERPs again for 90 days after being manually removed.
If your URL was mysite.com/blue-widgets, then your site will not have another /blue-widgets page listed again for 90 days. I can tell you it is a headache as an SEO trying to figure out why a page is not being indexed, only to learn later that I did not ask all the right questions, e.g. "Prior to hiring my services, have you or anyone with access to your WMT account used the URL Removal tool within the past 90 days?" That otherwise obscure question is now asked regularly of my clients. Painful lesson.
Also, I wanted to share another helpful link I located from Google: When NOT to use the URL Removal tool.
-
Google will completely drop the page from the index after the next time they crawl it. Using the Remove URLs tool in Google Webmaster Tools will only expedite removal.
Best practice is to 404 (Not Found) or 410 (Gone) the page first of all.
The Remove URLs tool will just expedite the inevitable. There is no downside in doing so.
-
Should we manually delete the extra pages from Google's index, or should we just let Google figure out that they are 404'd? What's the best practice here?
The best practice would be to return a 410 (Gone) response for the pages; Google will remove them from its index fairly quickly.
The next best practice would be to leave the pages as 404s; Google will still remove them from its index, but it will take a bit longer.
A 410 is used to inform Google and others that the page is definitely gone. A 404 merely states that the page is unavailable now; it could be available again later.
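To illustrate with a rough .htaccess sketch (the paths here are hypothetical, and this assumes Apache with mod_alias available), the 410 is something you configure explicitly, while the 404 is simply what the server returns for a missing page by default:
# explicitly tell crawlers these removed pages are permanently gone (HTTP 410)
Redirect gone /discontinued-product.html
Redirect gone /old-category/page.html
# a deleted page with no rule at all just returns 404 (Not Found) by default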
The removal tool should only be used when it is a major concern that the search result keeps appearing in SERPs. An example would be if confidential information was leaked.
-
I think it's always good to let Google know, as they might remove it sooner, but there's no guarantee either way. If you can, though, you should 301 your content to a new or similar page rather than just let it 404.
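If a suitable replacement page exists, that 301 is a one-line rule in .htaccess (hypothetical paths, and again assuming Apache with mod_alias):
# permanently redirect the removed page to its closest equivalent
Redirect 301 /old-blue-widgets.html /blue-widgets.html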
-
I understand HOW to remove a page. I want to know whether it's better for me to manually remove it or let Google remove it on its own.
-
To remove a page from Google's index, use Google Webmaster Tools: www.google.com/webmasters/tools/
Google's help article on removing URLs: http://www.google.com/support/webmasters/bin/answer.py?answer=1663419
-
Related Questions
-
How to check whether a page is indexable by search engines?
Hi, I'm building an extension for Chrome which should show the indexability status of the page I'm on, so I need to know all the methods for checking whether a page can be crawled and indexed by search engines. I've come up with a few: check the URL against the robots.txt file (that it is not disallowed); check the page metas (that there is no noindex meta); check whether the page is the same for unregistered users (for pages only available to registered users of the site). Are there any more methods for checking whether a particular page is indexable (or not closed to indexation) by search engines? Thanks in advance!
Intermediate & Advanced SEO | boostaman -
Google Frequently Indexing - Good or Bad?
Hi, my website is only 4 months old and receives about 40 to 50 organic visits every day. It currently has about 100 pages, of which only 3-4 rank in the top 10 for the target KWs. I usually try to publish at least 1 article a day, but certain articles are more than 2000 words long with a few infographics and hence take way more time (maybe even 3 days to publish one). Only over the last week, I am observing that every time I publish a page (usually daily), Google indexes it the same day. This, I have heard, happens for moderately big sites, but my site is really small at this stage. Note: for the first 80 pages, I used to "fetch as Googlebot" in Webmaster Tools, as otherwise my site would be crawled once in 2 weeks, but over the last 3-4 weeks I rely on Google's scheduled visits. Is this a good or bad sign? I would like to assume it's good because of my engagement: for organic visits, my Google Analytics bounce rate is 65%, and for the remaining 35%, the avg time on site is >7 mins. That means if someone sticks to my site, they consume a lot of my content. Also, since Analytics' bounce rate is not the same as the search bounce (back button), I would like to assume the actual bounce is lower than that.
Intermediate & Advanced SEO | dwautism -
Why would one of our section pages NOT be indexed by Google?
One of our higher-traffic section pages is not being indexed by Google. The products that reside on this section page ARE indexed by Google and are on page 1. So why wouldn't the section page even be listed and indexed? The meta title is accurate, and the meta description is good. I haven't received any notices in Webmaster Tools. Is there a way to check whether OTHER pages might also not be indexed? What should a small ecom site do to get it listed? SOS in Modesto. Ron
Intermediate & Advanced SEO | yatesandcojewelers -
Does Google only look at LSI per page, or at the context of the site?
From what I have read, I should optimise each page for a keyword/phrase; however, I read recently that Google may also look at the context of the site to see if there are other similar words. For example, I have different pages optimised for funeral planning, funeral plans, funeral plan costs, compare funeral plans, why buy a funeral plan, paying for a funeral, and prepaid funeral plans. Is this the best strategy when the words/phrases are so close, or should I go for longer pages with the variations on one page, or at least fewer pages? Thanks, Ash
Intermediate & Advanced SEO | AshShep1 -
Is it better to not allow Google to index my Tumblr Blog?
Currently using a subdomain for my blog via Tumblr. In my SEO reports I see a lot of errors, mostly from the Tumblr blog. I made changes so there are unique titles and tags, but there are still too many errors, and I am wondering if it is best to just not allow it to be indexed via the Tumblr control panel. It certainly is doing a great job with engagement and social network follows, but I'm starting to wonder if, and how much, it is penalizing my domain. Appreciate your input. By the way, this theme is not Flash; for the content, it is a very basic, simple theme.
Intermediate & Advanced SEO | wickerparadise -
Google+ Local Pages
Hi, if I have a company with multiple addresses, do I create a separate Google+ page for each area?
Intermediate & Advanced SEO | Bryan_Loconto -
How does Google count a menu on each page?
Hello, just wondering how Google treats the top and bottom menus that you see on each page of a website. Does it count them on every page in terms of link juice, or are they just there for user experience, with only the links in the content of a page (or on the side) counting? Thank you,
Intermediate & Advanced SEO | seoanalytics -
Thousands of 404 Pages Indexed - Recommendations?
Background: I have a newly acquired client who has had a lot of issues over the past few months. What happened is he had a major issue with broken dynamic URLs that would start infinite loops due to redirects and relative links. His previous SEO didn't pay attention to the sitemaps created by a backend generator, and it caused hundreds of thousands of pages to be indexed. Useless pages. These useless pages all brought up a 404 page that didn't have a 404 server response (it had a 200 response), which created a ton of duplicate content and bad links (relative linking). Now here I am, cleaning up this mess. I've fixed the 404 page so it returns a 404 server response. Google Webmaster Tools is now returning thousands of "not found" errors; a great start. I fixed all site errors that caused infinite redirects, cleaned up the sitemap, and submitted it. When I search site:www.(domainname).com, I am still getting an insane number of pages that no longer exist. My question: how does Google handle all of these 404s? My client wants all the bad pages removed now, but I don't have that much control over that. It's a slow process getting Google to remove pages that return a 404, and he is still continuously dropping in rankings. Is there a way of speeding up the process? It's not reasonable to enter tens of thousands of pages into the URL Removal Tool. I want to clean house and have Google index only the pages in the sitemap.
Intermediate & Advanced SEO | BeTheBoss