How long does it take before URLs are removed from Google?
-
Hello,
I recently changed our website's URL structure, removing the .html at the end. I had about 55 301s set up from the old URLs to the new ones. Within a day all the new URLs were listed in Google, but the old .html ones still have not been removed a week later.
Is there something I am missing? Or will it just take time for them to get de-indexed?
As well, so far the Page Authority hasn't transferred from the old pages to the new. Is this typical?
Thanks!
-
I assume the new URL format/structure is also the new internal link structure (all links on the site are updated to the new format). If that is the case, indexation is based on those new links, not on old URLs followed through the 301s.
As for testing them: did you check what response code you get when accessing the old URLs?
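To make that check repeatable, a short script can request each old URL and report the raw status code without following the redirect. A minimal sketch (it spins up a throwaway local server purely to demonstrate the check; in practice you would point `check_redirect()` at your own old .html URLs):

```python
# Sketch: verify that an old .html URL answers with a 301 (not a 302 or 200).
# A tiny local server stands in for the real site here.
import threading
import urllib.request
import urllib.error
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectDemoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.endswith(".html"):
            self.send_response(301)
            self.send_header("Location", self.path[: -len(".html")])
            self.end_headers()
        else:
            body = b"ok"
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # surface the 3xx instead of silently following it

def check_redirect(url):
    """Return (status, Location header) for url without following redirects."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(url)
        return resp.status, resp.headers.get("Location")
    except urllib.error.HTTPError as e:
        # unfollowed 3xx responses surface here as HTTPError
        return e.code, e.headers.get("Location")

server = HTTPServer(("127.0.0.1", 0), RedirectDemoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

status_old, target = check_redirect(f"{base}/old-page.html")
status_new, _ = check_redirect(f"{base}/old-page")
print(status_old, target)  # expect a 301 pointing at /old-page
print(status_new)          # expect a 200 on the new URL
server.shutdown()
```

If any old URL returns a 302 or a 200 instead of a 301, that is the first thing to fix, since a 302 does not tell Google the move is permanent.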
Though I don't understand why you'd submit a sitemap with the old URLs?
** To send Google's bot to crawl those URLs, see the 301s, and delist them from the index.
How do you ping it?
** There are a lot of free ping services available; run a search and you will have several to choose from.
-
Thanks for your response! I would assume the 301s are set up correctly, since Google is indexing the new URLs and of course they work when I test them.
Though I don't understand why you'd submit a sitemap with the old URLs? How do you ping it?
Any thoughts on the Page Authority?
-
Hi Sean,
For this small number of URLs, you can help Google's bot de-index them by creating a separate XML sitemap in your Webmaster Tools account containing only the old URLs. Submit the sitemap and ping it; this will help speed up the process a little.
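For illustration, here is a minimal sketch of generating such a one-off sitemap of old URLs (the example.com URLs are placeholders). Note that Google has since retired the sitemap ping endpoint, so submitting the file in Webmaster Tools / Search Console is now the reliable route:

```python
# Sketch: build a sitemap containing ONLY the old .html URLs so Googlebot
# recrawls them, sees the 301s, and drops them. URLs are placeholders.
from xml.etree import ElementTree as ET
from urllib.parse import quote

old_urls = [
    "https://www.example.com/about.html",
    "https://www.example.com/contact.html",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for u in old_urls:
    ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u

ET.ElementTree(urlset).write(
    "old-urls-sitemap.xml", encoding="utf-8", xml_declaration=True
)

# The classic "ping" was just a GET request with your sitemap URL.
# Note: Google retired this endpoint in 2023; submitting the sitemap in
# Webmaster Tools / Search Console achieves the same thing today.
sitemap_url = "https://www.example.com/old-urls-sitemap.xml"
ping = "https://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")
print(ping)
```

Upload the generated file to your site root and submit it alongside your normal sitemap; once the old URLs are gone from the index, you can delete it.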
However, since the number of URLs is small (55), Google will delist those old URLs based on the 301 redirects (if the 301 setup is correct) at the next crawl cycle for your website.
You can also check the crawl rates in Webmaster Tools to make a prediction of how fast and how often Google is "visiting" your site.
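If you also have access to your server logs, you can estimate crawl frequency yourself by counting Googlebot hits per day. A rough sketch, using made-up Apache-style log lines as stand-ins for a real access log:

```python
# Sketch: count Googlebot requests per day from access-log lines.
# The log lines below are invented examples in common Apache format.
import re
from collections import Counter

log_lines = [
    '66.249.66.1 - - [12/Mar/2013:06:25:24 +0000] "GET /men HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [12/Mar/2013:09:11:02 +0000] "GET /old.html HTTP/1.1" 301 0 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [13/Mar/2013:07:40:19 +0000] "GET /women HTTP/1.1" 200 498 "-" "Googlebot/2.1"',
]

hits_per_day = Counter()
for line in log_lines:
    if "Googlebot" in line:
        m = re.search(r"\[(\d{2}/\w{3}/\d{4})", line)  # e.g. 12/Mar/2013
        if m:
            hits_per_day[m.group(1)] += 1

for day, hits in sorted(hits_per_day.items()):
    print(day, hits)
```

Watching whether the 301-ing old URLs show up in those Googlebot hits tells you directly when the bot has seen the redirects.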
There is no fixed timeframe for this. I have a large account with several million pages in the index, with 410s set on several thousand pages that were removed, and those are still in the index after four months. It's related to the size of the website, the crawl rate, the freshness of those pages, etc.
Hope it helps.
Related Questions
-
Canonicals, Social Signals and Multi-Regional Website
Hi all, I have a website that is set up to target different countries using subfolders, e.g. /aus/, /us/, /nz/. The homepage itself is just a landing page that redirects to whichever country the user belongs to: somebody accesses https://domain/ and is redirected to one of the country-specific subfolders. The default subfolder is /us/, so all users are redirected there if their country has not been set up on the website. The content is mostly the same on each country site, apart from localisation and, in some cases, content specific to that country.
I have set up each country subfolder as a separate site in Search Console and targeted /aus/ to AU users and /nz/ to NZ users. I've left the /us/ version un-targeted to any specific geographical region. In addition, I've set up hreflang tags for each page on the site which link to the same content on the other country subfolders: /aus/ and /nz/ are targeted to en-au and en-nz respectively, and /us/ is targeted to en-us and x-default, as per various articles around the web.
We generally advertise our links without a country-code prefix, and the system automatically redirects the user to the correct country when they hit that URL. For example, somebody accesses https://domain/blog/my-post/ and a 302 is issued for https://domain/aus/blog/my-post/ or https://domain/us/blog/my-post/, etc. These country-less links are advertised on Facebook and in all our marketing campaigns.
Overall, I feel our website is ranking quite poorly, and I'm wondering if poor social signals are part of it. We have a decent social following on Facebook (65k) and post regular blog posts to our Facebook page that tend to pique quite a bit of interest. I would have expected this to contribute to our ranking at least somewhat. I am wondering whether the country-less links we advertise on Facebook cause Googlebot to ignore them as a social signal for the country-specific pages on our website.
For example, Googlebot indexes https://domain/us/blog/my-post/ and looks for social signals for that URL specifically, but it doesn't pick up anything because the campaign URL we use is https://domain/blog/my-post/. If that is the case, I am wondering how I would fix it to receive the appropriate social signals for /us/blog/my-post/, /aus/blog/my-post/ and /nz/blog/my-post/. I am wondering if changing the canonical URL of each page to the country-less URL would improve my social signals and performance in the search engines overall. I would be interested to hear your feedback. Thanks
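For reference, here is a hedged sketch of what the hreflang set described in the question would look like when generated programmatically. The domain, path and folder-to-locale mapping are taken from the question and are placeholders, not a working configuration:

```python
# Sketch: emit the hreflang <link> tags for one page across the three
# country subfolders described above. Values are placeholders.
DOMAIN = "https://domain"
LOCALES = {"aus": "en-au", "nz": "en-nz", "us": "en-us"}
X_DEFAULT = "us"  # the un-targeted /us/ folder doubles as x-default

def hreflang_tags(path):
    """Return the alternate <link> tags to place in <head> on EVERY variant."""
    tags = []
    for folder, lang in LOCALES.items():
        tags.append(
            f'<link rel="alternate" hreflang="{lang}" '
            f'href="{DOMAIN}/{folder}{path}" />'
        )
    tags.append(
        f'<link rel="alternate" hreflang="x-default" '
        f'href="{DOMAIN}/{X_DEFAULT}{path}" />'
    )
    return tags

for tag in hreflang_tags("/blog/my-post/"):
    print(tag)
```

The key invariant is that the identical set of tags (all three locales plus x-default) appears on every country variant of the page, each tag pointing at the variant's full subfolder URL.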
Intermediate & Advanced SEO | destinyrescue
-
What was your experience with changing site URLs?
I work with a company that is about to move to a new platform. Because the category and page structure is different, almost every URL but the home page will need to be 301 redirected. I know how to do this and am pretty sure I will find and fix 99% ahead of time, so there should not be too many 404s showing up in Webmaster Tools to clean up. My question: has anyone reading this post done this before, and what was your experience with organic traffic after you made the switch? I am predicting that even if I successfully redirect 100% of the URLs, there will be some loss for a couple of months just because we are making a major change. My bosses are asking if there will be any loss, and I need to tell them what to expect.
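As a sketch of the preparation step, you can build the old-to-new map programmatically and check that nothing falls through before launch. The transform rules below (dropping .html, renaming a category prefix) are invented for illustration; substitute the real mapping between your two platforms:

```python
# Sketch: build and sanity-check a 301 map before a platform migration.
# The mapping rules here are made up -- replace with your real ones.
def map_old_to_new(old_path):
    """Translate an old-platform path to its new-platform equivalent."""
    new = old_path
    if new.endswith(".html"):
        new = new[: -len(".html")]
    if new.startswith("/shop/"):
        new = "/products/" + new[len("/shop/"):]
    return new

old_paths = ["/shop/widgets.html", "/shop/gadgets.html", "/about.html"]

redirects = {old: map_old_to_new(old) for old in old_paths}

# any old URL that maps to itself has no rule and will 404 after launch
unmapped = [old for old, new in redirects.items() if new == old]

print(redirects)
print("unmapped:", unmapped)
```

Running the full old-platform URL list (from a crawl or the old sitemap) through this kind of check is how you get to that "99% found ahead of time" with confidence.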
Intermediate & Advanced SEO | KentH
-
Will a canonical tag on parameter URLs remove those URLs from the index and preserve link juice?
My website has 43,000 pages indexed by Google. Almost all of these pages are URLs that have parameters in them, creating duplicate content. I have external links pointing to those parameterised URLs. If I add the canonical tag to these parameter URLs, will that remove those pages from the Google index, or do I need to do something more to remove them from the index?
Ex: www.website.com/boats/show/tuna-fishing/?TID=shkfsvdi_dc%ficol (has a link pointing here)
www.website.com/boats/show/tuna-fishing/ (canonical URL)
Thanks for your help. Rob
Intermediate & Advanced SEO | partnerf
-
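As an aside, deriving the canonical (parameter-free) form of such URLs, e.g. to populate the rel=canonical tag in a template, is straightforward. A small sketch using the URL from the question above:

```python
# Sketch: derive the canonical URL by stripping the query string (and any
# fragment) from a parameterised URL.
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url):
    """Return url with its query string and fragment removed."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

url = "http://www.website.com/boats/show/tuna-fishing/?TID=shkfsvdi_dc%ficol"
print(canonical_url(url))
# -> http://www.website.com/boats/show/tuna-fishing/
```

Every parameter variant of the page would then emit the same rel=canonical value, consolidating the external links pointing at the parameterised URLs.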
How do you find old linking URLs that contain uppercase letters?
We have recently moved our back-office systems. On the old system we had the ability to use upper- and lower-case letters in the URLs. On the new system we can only use lower case, which we are happy with. However, any old URLs used by external sites to link to us that still have uppercase lettering now hit the 404 error page. So, how do we find them, and are there any solutions?
Example:
http://www.christopherward.co.uk/men.html - works
http://www.christopherward.co.uk/Men.html - fails
Kind regards, Mark
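One approach: export the inbound/old URLs (from your server logs or a backlink tool), flag any whose path contains uppercase letters, and emit the lowercase target each should 301 to. A sketch using the example URLs above:

```python
# Sketch: find URLs whose path contains uppercase letters and pair each
# with the lowercase URL it should be 301-redirected to.
from urllib.parse import urlsplit

def uppercase_redirects(urls):
    """Return {mixed-case URL: lowercased URL} for paths with uppercase letters."""
    out = {}
    for url in urls:
        parts = urlsplit(url)
        if any(c.isupper() for c in parts.path):
            out[url] = parts._replace(path=parts.path.lower()).geturl()
    return out

urls = [
    "http://www.christopherward.co.uk/men.html",
    "http://www.christopherward.co.uk/Men.html",
]
print(uppercase_redirects(urls))
```

The resulting pairs can feed a redirect map, or you can handle the whole class at once with a server-level rewrite rule that lowercases incoming paths and 301s to the result.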
Intermediate & Advanced SEO | Duncan_Moss
-
How to remove an entire site from Google?
Hi people, I have a site with around 2,000 URLs indexed in Google, plus 10 subdomains indexed too, all of which I want to remove entirely in order to set up a new site. What is the best way to do it? Regards!
Intermediate & Advanced SEO | SeoExpertos
-
Google+ Pages on Google SERP
Do you think that a Google+ Page (not a profile) could appear on the Google SERP as a rich-snippet author? Thanks
Intermediate & Advanced SEO | overalia
-
How to check a website's architecture?
Hello everyone, I am an SEO analyst - a good one - but I am weak on the technical side. I do not know any programming and only a little HTML. I know this is a major weakness for an SEO, so my first request to you all is to guide me on how to learn HTML and some basic PHP programming. Secondly, about the topic of this particular question: I know that a website should have a flat architecture, but I do not know how to find out whether a website's architecture is flat or not, good or bad. Please help me out on this; I would be obliged. Eagerly awaiting your responses, Best Regards, Talha
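One concrete way to measure "flatness" is click depth from the homepage, computed by breadth-first search over the internal-link graph. A sketch with a made-up link graph; in practice you would build the graph from a crawl export (e.g. from a desktop crawler):

```python
# Sketch: compute click depth of every page from the homepage via BFS.
# The link graph below is invented for illustration.
from collections import deque

links = {
    "/": ["/category-a", "/category-b"],
    "/category-a": ["/product-1", "/product-2"],
    "/category-b": ["/product-3"],
    "/product-1": [], "/product-2": [], "/product-3": [],
}

def click_depths(links, start="/"):
    """Return {page: minimum number of clicks from start}."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for nxt in links.get(page, []):
            if nxt not in depth:
                depth[nxt] = depth[page] + 1
                queue.append(nxt)
    return depth

depths = click_depths(links)
print(depths)
print("max depth:", max(depths.values()))
```

A flat architecture keeps the maximum depth small (commonly cited as 2-3 clicks for important pages); pages at depth 5+ or missing from the graph entirely are the ones to investigate.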
Intermediate & Advanced SEO | MTalhaImtiaz
-
What if you can't navigate naturally to your canonicalized URL?
Assume this situation for a second... Let's say you place a rel=canonical tag on a page and point it to the original/authentic URL. Now, let's say that the original/authentic URL is also populated into your XML sitemap. So, here's my question: since you can't actually navigate to that original/authentic URL (it still loads with a 200, it's just not actually linked to from within the site itself), does that create an issue for search engines? Last consideration: the bots can still access those pages via the canonical tag and the XML sitemap; it's just that the user wouldn't be able to reach those original/authentic pages in their natural site navigation. Thanks, Rodrigo
Intermediate & Advanced SEO | AlgoFreaks