Redirecting HTTP to HTTPS - How long does it take Google to re-index the site?
-
Hello Moz,
We know that Moz changed its domain from www.seomoz.org to moz.com this year. However, when you search "site:seomoz.org" you can still find old URLs indexed in Google (on page 7 and beyond).
We also changed our site from http://www.example.com to https://www.example.com, and Google is indexing both versions even though we set up proper 301 redirects via htaccess (a sketch of the kind of rule we mean is below).
- How long will it take Google to refresh its index? Should we just not worry about it?
- Say we redirect our entire site: what happens to the websites that copied and pasted our content? We have already filed DMCA complaints against their pages, but does making our site https mean their pages now look more original than ours, so that Google assumes we copied them? (Google is very slow to respond to our DMCA complaints.)
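For reference, here is a minimal sketch of the kind of htaccess rule we mean (Apache mod_rewrite; example.com is the placeholder domain from above):

# Force HTTPS with a permanent (301) redirect.
# Assumes mod_rewrite is enabled on the server.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

A quick sanity check is running curl -I http://www.example.com/ and confirming you get a single 301 response whose Location header points at the https URL.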
Thank you in advance for your reply.
-
Unfortunately, the answer is "it depends".
I do have some recent experience with this for two very small sites (one has around 300 indexed URLs, the other around 70), which you may find useful.
In each case, it took just a day or two for the most important URLs (best rankings, traffic, link authority, etc.) to be swapped in for their non-https counterparts. However, deeper URLs with little link authority took up to 90 days to be swapped out.
If your most important URLs don't get swapped out in a week or so, I would check these things:
- Make sure you've updated internal links so that they point to the https URLs. You don't want to pass your link authority through 301s anyway.
- Make sure all versions of the site are verified in Google Webmaster Tools (GWT), with the https version set as the preferred version.
- Make sure your sitemaps (XML and HTML) contain the https versions of your URLs (see the example entry after this list).
- Make sure that the https URLs do not have the non-https URLs set as the canonical version.
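To illustrate the sitemap point, a minimal XML sitemap entry for the https version would look something like this (example.com is a placeholder):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Every <loc> should list the https version of the URL. -->
  <url>
    <loc>https://www.example.com/</loc>
  </url>
</urlset>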
Hope this helps and good luck!
-
Google is super fast when it comes to the main, most important stuff on your domain. It's still indexing pages from the old SEOmoz.org domain because we have a ton of pages, and frankly, some of them aren't very popular. We also made the decision not to redirect every single page, and killed a ton of them. The less popular SEOmoz.org pages are lingering, either waiting to be re-indexed at Moz.com or tossed out because they no longer exist (though with the right 301 redirects in place, we're still getting that traffic to the pages that still matter to us).
For dealing with people who are scraping your site, make sure you have canonical tags implemented on the pages of your shiny new https site. Most scrapers steal your code wholesale, so they grab those tags too.
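For example, the <head> of each https page would carry something like this (the URL is a placeholder; use an absolute https URL so the tag still points at your site when a scraper copies your markup verbatim):

<!-- In the <head> of every page on the https site. -->
<link rel="canonical" href="https://www.example.com/your-page/" />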
-
Hi there,
Google says in their site move guidelines: "The time it takes Googlebot and our systems to discover and process all URLs in the site move depends on how fast your servers are and how many URLs are involved. As a general rule, a medium-sized website can take a few weeks for most pages to move, and larger sites take longer."
You can find all the details here: https://support.google.com/webmasters/answer/6033080?hl=en
Hope it helps you.