Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
How long for Google to de-index old pages on my site?
-
I launched my redesigned website 4 days ago. I submitted a new sitemap and also requested indexing in Search Console (Google Webmasters).
I see that when I google my site, my new Open Graph settings are coming up correctly.
Still, a lot of my old site pages are definitely still indexed in Google. How long will it take for Google to drop or "de-index" my old pages?
Due to the way I restructured my website, a lot of the items are no longer available on my site. This is on purpose. I'm a graphic designer, and with the new change I removed many old portfolio items, as well as any references to web design, since I will no longer be offering that service.
My site is the following:
http://studio35design.com -
Awesome! Thanks Bas. That's a great idea. I'll give it a shot.
-
Hi Ruben,
Have you tried deleting these old pages from the index at Google Webmaster Tools?
https://www.google.com/webmasters/tools/url-removal
You can only remove them temporarily, but that can bridge the gap while the permanent removal you already set in motion by uploading a new sitemap takes effect.
I did that about a week ago and the effect was noticeable within a couple of days.
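On the sitemap side, the idea is just to list only the pages that should stay indexed, so Google can reconcile the list against the old URLs. A minimal sketch using Python's standard library (the URLs below are placeholders, not the actual site structure):

```python
# Minimal sketch: build a sitemap containing only the pages that should
# remain indexed. Paths other than the homepage are hypothetical.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "http://studio35design.com/",
    "http://studio35design.com/portfolio/",  # placeholder path
])
print(sitemap)
```

Any URL you leave out of this file isn't removed by the sitemap alone, which is why the temporary removal tool is a useful complement.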
Bas
-
Hi Martijn. Thanks for your response. My primary concern is the links that appear below my main link in the SERP (see screenshot). Half of those no longer work. Sure, they 301 redirect, but it's still messy.
-
Hi Mark. Thanks for your response. As far as I can tell, all links now have 301s. There might be the odd page I forgot, but I'll be monitoring Search Console for errors.
Your suggestion about the specific page to redirect web design traffic is a good one. I'll think about it.
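Besides Search Console, the server's own access log can surface forgotten pages. A rough sketch, assuming a combined-format access log (the log lines and paths below are made up for illustration):

```python
# Rough sketch: tally 404s from an access log to catch old URLs that
# slipped through the redirect mapping. The log format and paths are
# assumptions, not taken from any actual server.
import re
from collections import Counter

LOG_LINE = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+" (\d{3})')

def count_404s(log_lines):
    misses = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m and m.group(2) == "404":
            misses[m.group(1)] += 1
    return misses

sample = [
    '1.2.3.4 - - [01/Jan/2016:00:00:01] "GET /old-web-design.html HTTP/1.1" 404 512',
    '1.2.3.4 - - [01/Jan/2016:00:00:02] "GET /portfolio/ HTTP/1.1" 200 2048',
]
print(count_404s(sample))
```

The paths that 404 most often are the first candidates for a 301.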
-
Hi,
Yes, this really depends on how frequently Google crawls your site. Do these pages now return a 404 error? If so, I would suggest 301 redirecting them to other pages on your site. See this useful Moz blog post about 301 redirects: https://moz.com/blog/heres-how-to-keep-301-redirects-from-ruining-your-seo
You also mentioned that you no longer offer web design. If those pages still get some traffic, you could make a specific page stating that you don't offer web design anymore but pointing to other relevant services.
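The redirect logic being described boils down to a small mapping: each retired URL 301s to the closest relevant page, and the discontinued web design pages all land on one explanatory page. An illustrative sketch (all paths here are hypothetical):

```python
# Illustrative redirect map: retired URLs 301 to the closest relevant
# live page. Every path in this sketch is hypothetical.
RETIRED = {
    "/web-design.html": "/services.html",      # page explaining the service change
    "/old-portfolio-item.html": "/portfolio/",
}

def resolve(path):
    """Return the (status, location) pair the server would answer with."""
    if path in RETIRED:
        return 301, RETIRED[path]
    return 200, path  # anything else is served as-is in this sketch

print(resolve("/web-design.html"))  # → (301, '/services.html')
```

In practice this mapping would live in the server config (e.g. .htaccess or nginx rewrite rules) rather than application code; the sketch just shows the intended behavior.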
-
This can sometimes take a very long time. For bigger sites I could see it taking months; for smaller sites it depends on how frequently Google crawls the site. If Google is not very active on your site because the content doesn't relate to something that is updated often, it might decide not to come back as often, saving its own resources for finding other content elsewhere on the web.
In your case I would focus on making sure the new site and structure work flawlessly, and less on de-indexing the old pages. I can't imagine they still receive a ton of traffic. Without any doubt, 4 days is still very early for Google to pick up the changes.
Hope this helps!