Trying to get Google to stop indexing an old site!
-
Howdy,
I have a small dilemma. We built a new site for a client, but the old site is still ranking/indexed and we can't seem to get rid of it.
We set up a 301 redirect from the old site to the new one, as we have done many times before, but even though the old site is no longer live and the hosting package has been cancelled, the old site is still indexed. (The new site is on a completely different host.)
We never had access to the old site, so we weren't able to request URL removal through GSC.
Any guidance on how to get rid of the old site would be very appreciated.
BTW, it's been about 60 days since we took these steps.
Thanks, Kirk
-
No worries, let us know if it changes anything.
-
Thanks for the tip Martijn,
I will give it a try and let you know how it goes.
(By the way, sorry for the slow response. I did not get a notification that I had any replies.)
Kirk
-
Thanks EGOL, that seems to be exactly what is happening to us!
-
For the past year, Google has had a very hard time forgetting pages. You can use a 301 redirect and take the files off of the server, and Google will still list the old URL in results; clicking it just takes you through to the 301 destination.
-
Hi Kirk,
Try pinging the URLs of these old pages to Google (http://www.google.com/ping?sitemap=URL/of/file); if you have a list of the pages on the old site, that's something I would try. What could be causing this is that these old pages were barely visited by the crawlers, so they haven't been picked up as redirected yet. Basically, by pinging them to Google (a bit of an old-school technique) you can trigger a crawl of them, and hopefully that will help.
Martijn.
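A minimal Python sketch of that ping technique (the sitemap URL below is hypothetical, and whether Google still honours the ping endpoint may change over time):

```python
from urllib.parse import urlencode
import urllib.request

GOOGLE_PING = "http://www.google.com/ping"

def build_ping_url(sitemap_url):
    """Build the URL that asks Google to (re)crawl the given sitemap."""
    return GOOGLE_PING + "?" + urlencode({"sitemap": sitemap_url})

def ping_sitemaps(sitemap_urls):
    """Ping each sitemap; a 200 response means the ping was received."""
    for sitemap in sitemap_urls:
        with urllib.request.urlopen(build_ping_url(sitemap)) as resp:
            print(sitemap, "->", resp.status)

# Hypothetical example: a sitemap listing the old site's redirected URLs.
# ping_sitemaps(["http://old-site.example/sitemap.xml"])
```

The point is simply to give the crawler an explicit list of the old URLs so it revisits them and sees the 301s.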
Related Questions
-
Is the robots meta tag more reliable than robots.txt at preventing indexing by Google?
What's your experience of using the robots meta tag vs. robots.txt as a stand-alone solution to prevent Google indexing? I am pretty sure the robots meta tag is more reliable; going on my own experience, I have never had any problems with robots meta tags, but plenty with robots.txt as a stand-alone solution. Thanks in advance, Luke
Intermediate & Advanced SEO | McTaggart
-
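To make the distinction concrete: robots.txt only controls crawling (a disallowed URL can still be indexed from external links), while a robots meta tag actually removes a page from the index, but only if crawlers are allowed to fetch the page and see the tag. A small Python sketch with hypothetical rules and markup:

```python
import re
from urllib import robotparser

def crawl_allowed(robots_txt_lines, page_url, agent="Googlebot"):
    """robots.txt governs crawling only; a disallowed URL can still
    end up indexed if other sites link to it."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt_lines)
    return rp.can_fetch(agent, page_url)

# Crude check for a noindex robots meta tag; assumes the name
# attribute appears before content, which is typical but not required.
NOINDEX_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
    re.IGNORECASE,
)

def has_noindex(html):
    """The meta tag removes a page from the index, but the crawler
    must be able to fetch the page to see it."""
    return NOINDEX_RE.search(html) is not None
```

Note the interaction: blocking a page in robots.txt can prevent Google from ever seeing its noindex tag, which is one reason the meta tag alone often behaves more predictably.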
Huge Google index with irrelevant pages
Hi, I run a site about sport matches. Every match has a page, and the pages are generated automatically from the DB. Pages are not duplicated, but over time some come to look a little similar. After a match finishes, its page has no internal links or sitemap entry, but it's still reachable by direct URL and stays in Google's index, so over time we have ended up with more than 100,000 indexed pages. Since past matches have no significance, aren't linked, and (because a match can repeat) may look like duplicate content, what do you suggest we do when a match is finished but its page still appears in the index and SERPs: 301 redirect the match page to the match category, which is one level up in the hierarchy and always relevant; use rel=canonical pointing to the match category; or do nothing? A 301 redirect will shrink my index status, and some say a high index count is good. Also, is it safe to 301 redirect 100,000 pages at once, or would that look strange to Google? And would the canonical remove the past match pages from the index? What do you think? Thanks, Assaf.
Intermediate & Advanced SEO | stassaf
-
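A sketch of the 301 option from the question above, assuming a hypothetical URL scheme of /matches/<category>/<id>; live matches are served normally and finished ones are permanently redirected to their category page:

```python
import re

MATCH_PATH = re.compile(r"^/matches/([^/]+)/(\d+)$")

def response_for(path, is_finished):
    """Pick a response for a match URL: 301 finished matches to the
    always-relevant category page, serve live matches normally."""
    m = MATCH_PATH.match(path)
    if not m:
        return ("404 Not Found", None)
    category, match_id = m.groups()
    if is_finished(match_id):
        # Permanent redirect consolidates the old page's signals
        # into the category page and drops it from the index.
        return ("301 Moved Permanently", f"/matches/{category}/")
    return ("200 OK", None)
```

Because the decision is made per request, the 100,000 pages don't all need to be redirected "at once"; each returns a 301 whenever it is next crawled.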
What is the best approach for getting comments indexed, but also providing a great UX?
The way our in-house comments system was built, it uses AJAX to load comments as the page loads. I'm working on a set of requirements to make the system more SEO-friendly. Today, we show the first 20 comments with a "load more comments" link, which then calls the server and loads more comments. This is what I'm trying to figure out: should we load all the comments into the page behind the scenes and then lazy-load them, or keep the same "load more" link and just reveal what was already loaded behind the scenes? Or does anyone have a better suggestion for making the comments crawlable by Google?
Intermediate & Advanced SEO | JDatSB
-
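One crawlable middle ground (a hypothetical sketch, not the asker's actual stack): server-render the first page of comments and make "load more" a real paginated link that JavaScript can progressively enhance:

```python
def render_comments(comments, page=1, per_page=20):
    """Server-render one page of comments plus a crawlable
    'load more' link; JS can intercept the link for a smooth UX."""
    start = (page - 1) * per_page
    chunk = comments[start:start + per_page]
    parts = [f'<div class="comment">{c}</div>' for c in chunk]
    if start + per_page < len(comments):
        # A plain href gives crawlers a path to every comment even
        # with JavaScript disabled.
        parts.append(
            f'<a href="?comments_page={page + 1}">Load more comments</a>'
        )
    return "\n".join(parts)
```

Each page of comments gets its own fetchable URL, so nothing depends on AJAX for discoverability, while users still get the incremental "load more" behaviour.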
Google+ Pages on Google SERP
Do you think that a Google+ Page (not a profile) could appear in the Google SERPs as a rich snippet author? Thanks
Intermediate & Advanced SEO | overalia
-
Why are indented listings coming up from our old site?
We recently redesigned our e-commerce site, and we've submitted the new sitemap and fetched as Googlebot, but old site pages are still coming up as indented results under our homepage in Google. The new meta description is coming up for the homepage, but the Quilt Guard page is showing up in the indented results, and it leads to a 404 error because that page is no longer on the new site. Is there any way to control the indented results/pages that show up? http://www.google.com/webhp?source=search_app#hl=en&output=search&sclient=psy-ab&q=sleep+city&oq=sleep+city&gs_l=hp.3..0l4.3702.1178235.0.1178474.21.19.1.1.1.1.613.3532.0j11j3j0j1j1.16.0...0.0...1c.tBBERBg0aIo&pbx=1&bav=on.2,or.r_gc.r_pw.r_qf.,cf.osb&fp=dddf8a6dafb67f55&biw=1280&bih=923
Intermediate & Advanced SEO | mmgmontana
-
Google recognising regional canadian site as primary instead .com
Hi, we updated our corporate site, salvagedata.com, to a new design, but as a migration test we did it on our Canadian site, salvagedata.ca, first. A few days later we migrated salvagedata.com. In that time Google indexed the salvagedata.ca content, and now it looks like Google recognizes it as the primary site and shows it higher in search results, for example for "hard drive data recovery". Can a 301 redirect from .ca to .com resolve the problem?
Intermediate & Advanced SEO | markgray
-
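A sketch of the .ca-to-.com 301 the asker proposes, preserving path and query so each page redirects to its exact counterpart (the helper name and URL handling here are illustrative, not a specific server config):

```python
from urllib.parse import urlsplit, urlunsplit

def ca_to_com(url):
    """Return the 301 target on the .com host for a .ca URL,
    keeping path, query, and fragment intact."""
    parts = urlsplit(url)
    if parts.netloc != "salvagedata.ca":
        return None  # not a .ca URL; no redirect needed
    return urlunsplit((parts.scheme, "salvagedata.com") + tuple(parts)[2:])
```

Page-to-page redirects (rather than everything to the .com homepage) are what let the .com pages inherit the signals the .ca pages accumulated.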
Pages un-indexed in my site
My current website www.energyacuity.com has had most pages indexed for more than a year. However, I tried to view the cache of a few of the pages, and it looks like the only one still indexed by Google is the homepage. Any thoughts on why this is happening?
Intermediate & Advanced SEO | abernatj
-
Push for site-wide https, but all pages in index are http. Should I fight the tide?
Hi there,
First Q&A question 🙂 So I understand the problems caused by having a few secure pages on a site: a few links to the https version of a page and you have duplicate content issues. While there are several posts here at SEOmoz that talk about the different ways of dealing with this issue with respect to secure pages, the majority of this content assumes that the goal of the SEO is to make sure no duplicate https pages end up in the index. The posts also suggest that https should only be used on login pages, contact forms, shopping carts, etc. That's the root of my problem: I'm facing the prospect of switching to https across an entire site. In light of the other https-related content I've read, this might seem unnecessary or overkill, but there's a valid reason behind it. I work for a certificate authority, a company that issues SSL certificates, the cryptographic files that make the https protocol work. So there's an obvious need for our site to "appear" protected, even if no sensitive data is being moved through the pages. The stronger push, however, stems from our membership of the Online Trust Alliance (https://otalliance.org/). Essentially, in the parts of the internet that deal with SSL and security, there's a push for all sites to utilize HSTS headers and force site-wide https. PayPal and Bank of America are leading the way in this initiative, and other large retailers/banks/etc. will no doubt follow suit. Regardless of what you feel about all that, the reality is that we're looking at a future that involves more privacy protection, more SSL, and more https. The bottom line for me is: I have a site of ~800 pages that I will need to switch to https, and I'm finding it difficult to map the tips and tricks for keeping the odd pesky https page out of the index onto what amounts to a site-wide migration. So, here are a few general questions. What are the major considerations for such a switch? Are there any less obvious pitfalls lurking? Should I even consider trying to maintain an index of http pages, or should I start work on replacing (or having Googlebot replace) the old pages with https versions? Is that something that can be done with canonicalization, or would something at the server level be necessary? How is that going to affect my page authority in general? What obvious questions am I not asking? Sorry to be so long-winded, but this is a tricky one for me, and I want to be sure I'm giving as much pertinent information as possible. Any input will be very much appreciated. Thanks, Dennis
Intermediate & Advanced SEO | dennis.globalsign
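For a site-wide https migration like the one described above, the usual mechanics are a permanent redirect from every http URL to its https twin, plus a Strict-Transport-Security header. A minimal sketch; the max-age value is one common choice, not a recommendation:

```python
from urllib.parse import urlsplit, urlunsplit

# One common HSTS policy; tune max-age carefully before committing,
# since browsers cache it for that long.
HSTS_HEADER = ("Strict-Transport-Security",
               "max-age=31536000; includeSubDomains")

def https_redirect_target(url):
    """Return the https twin of an http URL (same host, path, and
    query), or None if the URL is already https."""
    parts = urlsplit(url)
    if parts.scheme == "https":
        return None
    return urlunsplit(("https",) + tuple(parts)[1:])
```

Because each http URL 301s to an identical https URL, Googlebot can swap the indexed pages one by one as it recrawls, rather than anything needing to happen in a single step.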