To remove or not to remove a redirected page from the index
-
We have a promotion landing page that earned some valuable inbound links. Now that the promotion is over, we have redirected this page to a current "evergreen" page. But in Google's search results, the original promo landing page is still showing as a top result. When clicked, it properly redirects to the newer evergreen page. Still, it's a bit problematic for the original promo page to appear in the search results, because the snippet mentions specifics of a promo that is no longer active. So I'm wondering what the net impact would be of using the "removal request" tool in GSC for the original page.
If we don't use that tool, what kind of timing might we expect before the original page drops out of the results in favor of the new redirected page?
And if we do use the removal tool on the original page, will that negate what we are attempting to do by redirecting to the new page, with regard to preserving inbound link equity?
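For context on the redirect side of the question: a single-page move like the one described is typically just one permanent (301) rule at the web server, and the 301 status is what signals to Google that the move is permanent, allowing inbound-link equity to consolidate onto the target over time. A minimal sketch in nginx (the paths /promo-2020 and /evergreen-guide are hypothetical examples, not the actual URLs):

```nginx
# Permanent (301) redirect from the retired promo page to the
# evergreen page. The 301 status tells search engines the move
# is permanent, so link signals can transfer to the target.
location = /promo-2020 {
    return 301 /evergreen-guide;
}
```

The rough Apache .htaccess equivalent would be `Redirect 301 /promo-2020 /evergreen-guide`.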
-
-
Thanks so much for the update on this. So, so glad to hear that this was straightened out. Also good to know that it wasn't a linear / logical route on Google's part, and that it took 4-5 days. (Some people on the forum expect an immediate turnaround on this, I've noticed.)
Have a great day,
Zack -
Zack,
All is good with this now. Here's how it actually went:
On January 6, I used GSC to request that the cache be removed.
On January 7, the page still showed for keyword searches with its page title, but without any snippet.
On January 7, I requested a re-index.
From January 7-10, the page still showed in search results with only a title.
On January 11, the redirect target finally showed in the keyword search results in place of the original page (the desired end result).
So, in summary, it was fairly quick (4-5 days): in the end the redirect target took the place of the original page, and in the interim the original page showed up with its title but no snippet.
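One thing worth doing before (and after) requesting a recrawl like this is confirming that the 301 is actually in place and resolving where you expect. Here's a minimal, self-contained sketch of that check; the local test server below stands in for the real site, and the paths /promo-landing and /evergreen-page are hypothetical examples:

```python
import http.server
import threading
import urllib.request

OLD_PATH = "/promo-landing"   # hypothetical retired promo URL
NEW_PATH = "/evergreen-page"  # hypothetical evergreen target

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    """Stand-in for the real site: 301s the old path to the new one."""

    def do_GET(self):
        if self.path == OLD_PATH:
            # Permanent redirect: the signal that lets Google
            # consolidate the old page's equity onto the target.
            self.send_response(301)
            self.send_header("Location", NEW_PATH)
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<title>Evergreen page</title>")

    def log_message(self, *args):
        pass  # keep output quiet

# Start the stand-in server on an ephemeral port.
server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# urllib follows the 301 automatically; inspect where we land.
resp = urllib.request.urlopen(f"http://127.0.0.1:{port}{OLD_PATH}")
print(resp.status)                   # 200 after following the redirect
print(resp.url.endswith(NEW_PATH))   # True: we ended up on the target

server.shutdown()
```

Against a live site you would point `urlopen` at the real old URL instead and check `resp.url` to make sure the chain ends at the intended evergreen page.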
-
My pleasure. I agree, and honestly think it may be a newer feature in GSC, as I just noticed it for one of our clients' needs today and thought of your open question. Hope to hear how this goes -- no pressure of course.
-Zack
-
Thanks Zack. That option is a bit hidden, and I hadn't noticed it. I'm trying it now to see what just clearing the cached URL until the next recrawl might do. I'm also curious what it will do with the page title, which currently includes details about the offer.
-
Hi there,
In GSC, are you aware that there is a "Clear Cached URL" option in Removals, that allows one to "Keep URL(s) in Google Search results, just clear current snippet and cached version until the next crawl."? Here is more information, just in case it's helpful:
https://support.google.com/webmasters/answer/9689846#clear_cache_request
While I don't know for sure, I would hazard a guess that changing the snippet's title and description would change its clickthrough rate, and could possibly alter its positioning as well, if the change is significant enough relative to other ranking factors.
Does this help at all?
Best,
Zack