Redirecting HTTP to HTTPS - How long does it take Google to re-index the site?
-
Hello Moz,
We know that this year Moz changed its domain from www.seomoz.org to moz.com. However, when you type "site:seomoz.org" you can still find old URLs indexed on Google (on page 7 and beyond).
We also changed our site from http://www.example.com to https://www.example.com.
Google is now indexing both versions, even though we set up proper 301 redirects via htaccess. How long will it take Google to refresh its index? Should we just not worry about it?
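For context, the kind of htaccess redirect being described usually looks something like this minimal sketch (assuming Apache with mod_rewrite; example.com is the placeholder domain from the question, so the real rules may differ):

```apache
# Force every http request to its https counterpart with a single 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

Spot-checking a few URLs with curl -I (or a crawler) should show a single 301 response whose Location header points at the https URL.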
Also, now that we have redirected our entire site, what happens to the websites that copied and pasted our content? We have already filed DMCA complaints against their pages, but does making our site https mean that their pages now look more original than ours, so that Google assumes we copied them? (Google is very slow to respond to our DMCA complaints.)
Thank you in advance for your reply.
-
Unfortunately, the answer is "it depends".
I do have some recent experience with this for two very small sites (one has around 300 indexed URLs, the other around 70), which you may find useful.
In each case, it took just a day or two to get the most important URLs (best rankings, traffic, link authority, etc.) swapped in for their non-https counterparts. However, deeper URLs with little link authority took up to 90 days to be swapped out.
If your most important URLs don't get swapped out in a week or so, I would check these things:
- Make sure you've updated internal links so that they point to the https URLs. You don't want to pass your link authority through 301s anyway.
- Make sure all versions of the site are verified in Google Webmaster Tools, with the https version set as the preferred version.
- Make sure your sitemaps (XML and HTML) contain the https versions of your URLs.
- Make sure the https URLs do not have the non-https URLs set as the canonical version.
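As a rough illustration of those last two checks (the page URL below is just a placeholder), both the on-page canonical and the sitemap entries should reference the https versions:

```html
<!-- In the <head> of https://www.example.com/some-page/ : the canonical is the https URL itself -->
<link rel="canonical" href="https://www.example.com/some-page/" />

<!-- In the XML sitemap: list only the https URLs -->
<url>
  <loc>https://www.example.com/some-page/</loc>
</url>
```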
Hope this helps and good luck!
-
Google is super fast when it comes to the main, most important stuff on your domain. It's still indexing stuff from the old SEOmoz.org domain because we have a ton of pages! And frankly, some of them aren't very popular. We also made the decision not to redirect every single page and killed a ton of them. The less popular pages are still lingering on SEOmoz.org (though with the right 301 redirects, we're still getting that traffic to the pages that are still important to us), either waiting to be re-indexed at Moz.com or tossed out because they no longer exist.
For dealing with people who are scraping your site, make sure canonical tags are implemented on the pages of your shiny new https site. Most scrapers steal the code wholesale, so they grab those too.
-
Hi there,
Google says in their site move guidelines: "The time it takes Googlebot and our systems to discover and process all URLs in the site move depends on how fast your servers are and how many URLs are involved. As a general rule, a medium-sized website can take a few weeks for most pages to move, and larger sites take longer."
You can find all the details here: https://support.google.com/webmasters/answer/6033080?hl=en
Hope it helps.
Related Questions
-
Should I "no-index" two exact pages on Google results?
Hello everyone, I recently started a new WordPress website and created a static homepage. I noticed that in Google search results there are two different URLs landing on the same content page. I've attached an image (Google url.JPG) to show what I see: the first result is the homepage, which is the page I'm trying to rank, and the last result lands on the same content under a different URL. Should I "noindex" that last result, as shown in the image?
Technical SEO | | amanda59640 -
Question re: spammy internal links on site
Hi all, I have a blog (managed via WordPress) that seems to have had spammy internal links built on it that were not created by us. See "site:blog.execu-search.com" in Google search results. It appears to be a pharma hack that is creating spammy links on our blog to random offers for viagra, paxil, xenical, etc. When viewing "Security Issues", GSC doesn't state that the site has been infected, and the site seems to be in good health according to Google. Can anyone provide insight on the steps needed to remove these links and on how to check whether the blog is in fact infected? Should all spammy internal links be disavowed? Here are a couple of my findings: when looking at "internal links" in GSC, I see a few mentions of these spammy links; when running a site crawl in Moz, I don't see any mention of them; and the spammy links lead to a 404 page, although some of the cached versions in Google still display the page. Please let me know. Any insight would be much appreciated. Thanks all! Best, Sung
Technical SEO | | hdeg0 -
Http -> https redirections / 301 the right way
Dear mozers, Thank you for taking the time to read this and wanting to help! We have moved our WordPress site to https and redirected all the content successfully via the htaccess file. We also use a simple 301 redirect plugin to redirect old URLs to the new ones. The problem is that the plugin's redirections are not working for the http version. Here is an example: htaccess redirect: http --> https; plugin redirect: domain.com/old --> domain.com/new. But the URL http://domain.com/old is not redirecting to https://domain.com/new, while https://domain.com/old does redirect to https://domain.com/new. What can you suggest as a solution? Thank you in advance! P.S. I don't think having two redirects for each version of the URL is the smartest solution. Best wishes, Dusan
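One common way around the double redirect (sketched here on the assumption of Apache mod_rewrite and the placeholder paths above, so treat it as an illustration rather than a tested fix) is to map the renamed URL in htaccess before the blanket http-to-https rule, so both changes happen in a single 301:

```apache
RewriteEngine On

# Renamed page: send any request for /old (http or https) straight to the new https URL
RewriteRule ^old/?$ https://domain.com/new [R=301,L]

# Everything else: just force https
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://domain.com/$1 [R=301,L]
```

With the specific rule listed first, http://domain.com/old goes straight to https://domain.com/new in one hop, and the plugin rule becomes redundant for that URL.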
Technical SEO | | Chemometec0 -
Google Indexing Desktop & Mobile Versions
We have a relatively new site and I have noticed recently that Google seems to be indexing both the mobile and the desktop version of our site. There are some queries where the mobile version will show up and sometimes both mobile and desktop show up. This can't be good. I would imagine that what is supposed to happen is that the desktop version is the one that should be indexed (always) and browser detection will load the mobile version where appropriate once the user is on the site. Do you have any advice on what we should do to solve this problem as we are a bit stuck?
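If the mobile pages live on separate URLs (for example an m. subdomain, which is an assumption here rather than something stated above), the usual pattern is a pair of annotations so that only the desktop URL is treated as canonical, roughly like this:

```html
<!-- On the desktop page, e.g. https://www.example.com/page/ -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page/" />

<!-- On the corresponding mobile page -->
<link rel="canonical" href="https://www.example.com/page/" />
```

If instead the same URL serves different HTML based on user-agent detection (dynamic serving), the Vary: User-Agent response header is the relevant signal rather than these tags.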
Technical SEO | | simonukss0 -
Changed site to https, now GWT and Analytics have stopped - do I now have to re-add the site?
Hi, I had the previous (WordPress) version of the site in GWT working just fine - now everything seems to have stopped. Do I have to treat this as an entirely new site and add a new account for the https version? Many thanks,
Technical SEO | | AndreavanEugen0 -
How bad is it to have duplicate content across http:// and https:// versions of the site?
A lot of pages on our website are currently indexed on both their http:// and https:// URLs. I realise that this is a duplicate content problem, but how major an issue is this in practice? Also, am I right in saying that the best solution would be to use rel canonical tags to highlight the https pages as the canonical versions?
Technical SEO | | RG_SEO0 -
I am posting an article on my site and another site has asked to use the same article - is this a duplicate content issue with Google if I am the creator of the content, and will it penalize our sites, or one more than the other?
I operate an ecommerce site for outdoor gear and was invited to guest post on a popular blog (not my site) about a trip I had been on. I wrote the article for them, and I will also post this same article on my website. Is this a duplicate content problem with Google for my site and/or the other site? Any help is appreciated. Also, what if I wanted to post this same article to one or two other blogs, as long as they link back to me as the author of the article?
Technical SEO | | isle_surf0 -
Will using http ping, lastmod increase our indexation with Google?
If Google knows about our sitemaps and they're being crawled on a daily basis, why should we use the http ping and/or list the index files in our robots.txt? Is there a benefit (i.e. improved indexability) to both pinging and listing index files in robots.txt? Is there any benefit to listing the index sitemaps in robots.txt if we're pinging? If we provide a decent <lastmod> date, is there going to be any difference in indexing rates between pinging and the normal crawl that they do today? Do we need to do all of these to cover our bases? Thanks, Marika
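For what it's worth, the <lastmod> value in question lives in the sitemap index roughly like this (file names and the date are placeholders), and the "http ping" is simply a GET request to Google's sitemap ping endpoint with the sitemap URL appended:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sitemap index with lastmod dates (file names are placeholders) -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-products.xml</loc>
    <lastmod>2015-06-01</lastmod>
  </sitemap>
</sitemapindex>
<!-- The "http ping" is a request such as:
     http://www.google.com/ping?sitemap=https://www.example.com/sitemap-index.xml -->
```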
Technical SEO | | marika-1786190