How long does it take before URLs are removed from Google?
-
Hello,
I recently changed our website's URL structure, removing the .html at the end. I had about 55 301s set up from the old URLs to the new ones. Within a day all the new URLs were listed in Google, but the old .html ones still have not been removed a week later.
Is there something I am missing? Or will it just take time for them to get de-indexed?
Also, so far the Page Authority hasn't transferred from the old pages to the new ones. Is this typical?
Thanks!
-
I assume the new URL format/structure is also the new internal link structure (all links in the site are updated to the new format). If that is the case, indexation is driven by those internal links, not by Google following the 301s from the old URLs.
As far as testing them goes: did you check what response code you get when accessing the old URLs?
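If you want to check all 55 in one go rather than one by one, a small script along these lines will do it; this is only a sketch, and the example.com addresses are placeholders for your own list. It requests each old URL without following the redirect and prints the status code plus the Location header it points at:

```python
# Sketch only: confirm each old .html URL answers with a 301 and the right
# Location header. The example.com URLs below are placeholders.
import urllib.error
import urllib.request

OLD_URLS = [
    "http://www.example.com/about-us.html",
    "http://www.example.com/services.html",
]

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so the 301 itself is visible."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)

for url in OLD_URLS:
    try:
        response = opener.open(url)
        print(url, "->", response.status, "(no redirect)")
    except urllib.error.HTTPError as err:
        # A 301/302 surfaces here as an HTTPError because we refused to follow it.
        print(url, "->", err.code, err.headers.get("Location"))
```

Every old URL should come back as a 301 with a Location header pointing at the matching new URL; anything answering 200 or 302 is worth fixing before worrying about de-indexation.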
Though I don't understand why you'd submit a sitemap with the old URLs?
** To send Googlebot to crawl those old URLs, see the 301s, and drop them from the index.
How do you ping it?
** There are a lot of free services available that you can use - just run a search and you will have several to choose from.
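If you'd rather not depend on a third-party service, Google also accepts a direct ping with the sitemap's address. A minimal sketch, assuming the old-URL sitemap lives at the placeholder location below:

```python
# Sketch only: ping Google directly with the sitemap's URL. The sitemap
# address below is a placeholder for wherever you host the old-URL sitemap.
import urllib.parse
import urllib.request

sitemap_url = "http://www.example.com/old-urls-sitemap.xml"
ping_url = "http://www.google.com/ping?sitemap=" + urllib.parse.quote_plus(sitemap_url)

with urllib.request.urlopen(ping_url) as response:
    print("Ping response:", response.status)  # 200 only means the ping was received
```

A 200 response just means the ping was accepted; it says nothing about when the URLs will actually be recrawled.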
-
Thanks for your response! I would assume the 301s are set up correctly, since Google is indexing the new URLs and of course they work when I test them.
Though I don't understand why you'd submit a sitemap with the old URLs? How do you ping it?
Any thoughts on the Page Authority?
-
Hi Sean,
For this small number of URLs, you can help Googlebot de-index them by creating a separate XML sitemap in your Webmaster Tools containing only the old URLs, then submitting the sitemap and pinging it. This will help speed up the process a little.
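If it helps, here is a rough sketch of generating that one-off sitemap; the URLs shown are placeholders for your 55 old .html addresses:

```python
# Sketch only: write a one-off sitemap listing just the old .html URLs so
# Googlebot recrawls them and sees the 301s. The URLs shown are placeholders.
OLD_URLS = [
    "http://www.example.com/about-us.html",
    "http://www.example.com/services.html",
]

lines = [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
]
for url in OLD_URLS:
    lines.append("  <url><loc>{}</loc></url>".format(url))
lines.append("</urlset>")

with open("old-urls-sitemap.xml", "w") as handle:
    handle.write("\n".join(lines))
```

Upload the resulting file somewhere on the site, submit it in Webmaster Tools, and ping it; once the old URLs have dropped out of the index you can remove that sitemap again.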
However, since the number of URLs is small (55), Google will delist the old URLs based on the 301 redirects (assuming the 301 setup is correct) at the next crawl cycle for your website.
You can also check the crawl stats in Webmaster Tools to get a sense of how fast and how often Google is "visiting" your site.
There is no fixed timeframe for this. I have a large account with several million pages in the index, with a 410 set on several thousand pages that were removed, and those are still in the index after 4 months. It's related to the size of the website, the crawl rate, the freshness of those pages, and so on.
Hope it helps.