How long does it take before URLs are removed from Google?
-
Hello,
I recently changed our website's URL structure, removing the .html at the end. I had about 55 301s set up from the old URLs to the new ones. Within a day all the new URLs were listed in Google, but the old .html ones still have not been removed a week later.
Is there something I am missing? Or will it just take time for them to get de-indexed?
Also, so far the Page Authority hasn't transferred from the old pages to the new ones. Is this typical?
Thanks!
-
I assume the new URL format/structure is also used for the internal links (i.e. all links in the site have been updated to the new format). If that is the case, the indexation of the new URLs is based on those internal links, not on Google following the 301s from the old URLs.
As far as testing them goes: did you check what response code you get when accessing the old URLs?
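For example, you could verify the response codes with a small Python script along these lines (the URLs are placeholders for your own old/new paths; a clean setup returns a single 301 hop pointing straight at the new URL):

```python
import http.client
from urllib.parse import urlsplit

def check_redirect(old_url):
    """Fetch old_url WITHOUT following redirects; return (status, Location header)."""
    parts = urlsplit(old_url)
    conn_cls = http.client.HTTPSConnection if parts.scheme == "https" else http.client.HTTPConnection
    conn = conn_cls(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    location = resp.getheader("Location")
    conn.close()
    return resp.status, location

def is_clean_301(status, location, expected_new_url):
    """True only for a single 301 hop that points at the expected new URL."""
    return status == 301 and location == expected_new_url

# Hypothetical usage:
# status, loc = check_redirect("https://example.com/old-page.html")
# print(is_clean_301(status, loc, "https://example.com/old-page"))
```

Anything other than a 301 (a 302, or a chain of redirects) would explain the old URLs lingering in the index.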
Though I don't understand why you'd submit a sitemap with the old URLs?
** To send Google's bot to crawl those URLs, see the 301s, and delist them from the index.
How do you ping it?
** There are a lot of free ping services available - just run a search and you will have several to choose from.
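Google has also historically offered its own ping endpoint, which a script can hit directly. A minimal sketch (the sitemap URL is hypothetical, and the endpoint's continued availability is an assumption):

```python
from urllib import request
from urllib.parse import urlencode

GOOGLE_PING = "https://www.google.com/ping"  # historical Google sitemap ping endpoint

def build_ping_url(sitemap_url):
    """Build the GET URL that notifies Google of a (re)submitted sitemap."""
    return GOOGLE_PING + "?" + urlencode({"sitemap": sitemap_url})

def ping_sitemap(sitemap_url):
    """Send the ping; returns the HTTP status code (200 means it was accepted)."""
    with request.urlopen(build_ping_url(sitemap_url)) as resp:
        return resp.status

# Hypothetical usage:
# ping_sitemap("https://example.com/old-urls-sitemap.xml")
```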
-
Thanks for your response! I would assume the 301s are set up correctly, since Google is indexing the new URLs and of course the redirects work when I test them.
Though I don't understand why you'd submit a sitemap with the old URLs? And how do you ping it?
Any thoughts on the Page Authority?
-
Hi Sean,
For this small number of URLs, you can help Google's bot de-index them by submitting a separate XML sitemap in Webmaster Tools containing the old URLs only - submit the sitemap and ping it. This will help speed up the process a little bit.
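Such an old-URLs-only sitemap is simple enough to generate by hand or with a short script. A sketch (the URLs listed are purely illustrative):

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Render a minimal XML sitemap listing only the given (old) URLs."""
    entries = "\n".join(
        "  <url><loc>{}</loc></url>".format(escape(u)) for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries + "\n</urlset>\n"
    )

# Hypothetical old URLs that now 301 to their extension-less versions:
old_urls = ["https://example.com/about.html", "https://example.com/contact.html"]
# open("old-urls-sitemap.xml", "w").write(build_sitemap(old_urls))
```

Upload the resulting file somewhere crawlable and submit it in Webmaster Tools alongside your regular sitemap.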
However, since the number of URLs is small - 55 - Google will delist those old URLs based on the 301 redirects (if the 301 setup is correct) at the next crawl cycle for your website.
You can also check the crawl rates in Webmaster Tools to get some idea of how fast and how often Google is "visiting" your site.
There is no fixed timeframe for this. I have a large account with several million pages in the index, with a 410 set on several thousand pages that were removed, and those are still in the index after 4 months - it's related to the size of the website, the crawl rate, the freshness of those pages, and so on.
Hope it helps.