Removing a Page From Google's Index
-
We accidentally generated some pages on our site that ended up getting indexed by Google. We have corrected the issue on the site, and all of those pages now return a 404. Should we manually delete the extra pages from Google's index, or should we just let Google figure out that they are 404'd? What's the best practice here?
-
Thanks, Ryan. I fully understand what you are saying and will be careful when making the change.
-
Hi Atul,
Generally speaking, I am uncomfortable advising others on specific changes to the .htaccess file. If you make even a slight error while working with the file, your site security can be compromised, not to mention your SEO. There are also many factors to consider, such as which mods are enabled on your particular server, along with other configuration issues. Lastly, the order in which your code is placed in the file can affect its operation, so it's not like adding a meta tag to the <head> section of an HTML document.
If you are on managed hosting, my recommendation is to ask your web host to make the change. If you are not on managed hosting, I recommend asking the developer who manages the site to make the change.
If you still insist on making the change yourself, try
Redirect gone /ABC/xyz.html
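For context, here is a minimal sketch of what that looks like inside an .htaccess file, assuming mod_alias is enabled and the file sits in the document root (the paths are taken from Atul's example above):

```apache
# .htaccess — requires mod_alias; URL paths are relative to the site root
# and must begin with a slash.

# Return "410 Gone" for a page inside the ABC folder:
Redirect gone /ABC/xyz.html

# For a page in the document root itself, the leading slash is still required:
Redirect gone /xyz.html
```

Note that `Redirect gone xyz.html` (no leading slash) would not work, because mod_alias expects the old URL path to begin with `/`.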
-
After reading your answer, I searched for methods of generating a 410 error.
Let's say I want to remove a page named xyz.html.
Which of the following entries in .htaccess is correct:
Redirect gone xyz.html
or
Redirect gone /xyz/
If xyz.html were in a folder named ABC,
would the correct entry be
Redirect gone /ABC/xyz.html
Thanks
-
Thanks everyone! We are just going to leave it as is; Google will eventually flush it out. Ryan - because of the 90 days, we can't use the removal tool. I will need those URLs back sooner than that, when we actually put products in those states. Thanks again! Helpful, as usual!
-
The Remove URLs tool will just expedite the inevitable. There is no downside to doing so.
I agree with everything you shared Esko up to this point. Aside from the time spent to remove the page, there is another downside. The URL you remove will not appear in SERPs again for 90 days after being manually removed.
If your URL was mysite.com/blue-widgets, then your site will not have another /blue-widgets page listed again for 90 days. I can share that it is a headache, as an SEO, trying to figure out why a page is not being indexed, only to later learn I did not ask all the right questions, e.g. "Prior to hiring my services, have you or anyone with access to your WMT account used the URL Removal tool within the past 90 days?" That otherwise obscure question is now asked regularly of my clients. Painful lesson.
Also, I wanted to share another helpful link I located from Google: When NOT to use the URL Removal tool.
-
Google will completely drop the page from the index after the next time they crawl it. Using the Remove URLs tool in Google Webmaster Tools will only expedite removal.
Best practice is to 404 (Not Found) or 410 (Gone) the page first of all.
The Remove URLs tool will just expedite the inevitable. There is no downside to doing so.
-
Should we manually delete the extra pages from Google's index or should we just let Google figure out that they are 404'd? What's the best practice here?
The best practice would be to generate a 410 (Gone) response for the pages, and Google will remove them from their index fairly quickly.
The next best practice would be to leave the pages as 404s; Google will still remove them from their index, but it will take a bit longer.
A 410 is used to inform Google and others that the page is definitely gone. A 404 merely states that the page is unavailable right now; it could be available again later.
The removal tool should only be used if it is a major concern for the search result to appear in SERPs. An example would be if confidential information was leaked.
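If the site runs on Apache and the accidental pages share a common path, one way to turn those 404s into 410s is a single pattern rule in .htaccess. This is only a sketch: `/generated/` is a hypothetical path standing in for wherever the accidental pages live, and mod_alias must be enabled.

```apache
# Hypothetical example: return "410 Gone" for every URL under /generated/.
# Requires mod_alias; adjust the pattern to match your accidental pages.
RedirectMatch gone ^/generated/
```

A pattern rule like this avoids listing every removed URL individually, which matters when the bad pages were machine-generated in bulk.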
-
I think it's always good to let Google know, as they might remove it sooner. But there's no guarantee either way. Though if you can, you should 301 your content to a new or similar page rather than just letting it 404.
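In .htaccess terms, that per-page 301 is a one-liner (a sketch only; both paths are hypothetical, and mod_alias is assumed to be enabled):

```apache
# Hypothetical example: permanently redirect a removed page to its
# closest surviving equivalent, passing its link equity along.
Redirect 301 /old-page.html /similar-page.html
```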
-
I understand HOW to remove a page. I want to know whether it's better for me to manually remove it or let Google remove it on its own.
-
Remove a page from Google's Index
Use Google Webmaster Tools www.google.com/webmasters/tools/
http://www.google.com/support/webmasters/bin/answer.py?answer=1663419