Why does Google stubbornly keep indexing my HTTP URLs instead of the HTTPS ones?
-
I moved everything to HTTPS in November, but plenty of pages are still indexed by Google as HTTP instead of HTTPS, and I am wondering why.
Example: http://www.gomme-auto.it/pneumatici/barum correctly redirects permanently (301) to https://www.gomme-auto.it/pneumatici/barum
Nevertheless, if you search for pneumatici barum: https://www.google.it/search?q=pneumatici+barum&oq=pneumatici+barum
The third organic result listed is still HTTP.
Since we moved to HTTPS, Googlebot has visited that page tens of times, most recently two days ago, but it doesn't seem to bother updating the protocol in the index.
Does anyone know why?
My concern is that when I use APIs like SEMrush and Ahrefs I have to query everything twice, trying both HTTP and HTTPS; for a total of around 65k URLs, that wastes a lot of my quota.
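(For what it's worth, one way to avoid burning quota on both schemes is to resolve each URL to its final HTTPS version locally, once, before spending API calls. A minimal sketch using Python's `requests` library - the `urls.txt` filename is just a placeholder:)

```python
import requests

def canonical_https(url, timeout=10):
    """Return the HTTPS version of a URL if its HTTP version 301-redirects there."""
    https_url = url.replace("http://", "https://", 1)
    response = requests.head(url, allow_redirects=False, timeout=timeout)
    # A clean migration answers the HTTP URL with a permanent redirect to HTTPS.
    if response.status_code == 301 and response.headers.get("Location") == https_url:
        return https_url
    return url  # redirect missing or unexpected: keep the original for inspection

# Placeholder file: one http:// URL per line.
with open("urls.txt") as f:
    urls = [canonical_https(line.strip()) for line in f if line.strip()]

# Then query SEMrush/Ahrefs once per URL, with the HTTPS scheme only.
```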
-
Thanks again, Dirk! In the end I used Xenu Link Sleuth and I am happy with the result.
-
Hi Massimiliano,
In Screaming Frog there is the option Bulk Export > All Inlinks - this generates the full list of all your internal links with both source & destination. In Excel you just have to put a filter on the "Destination" column to show only the URLs starting with "http://", and you get all the info you need (a scripted version of the same filter is sketched below). This will probably not solve the issues with the images; for those, the manual option further down could be used.
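(If the export is too large for Excel, the same filter can be scripted. A minimal sketch in Python, assuming the export was saved as `all_inlinks.csv` with "Source" and "Destination" column headers - names may vary between Screaming Frog versions:)

```python
import csv
from collections import defaultdict

# Group every http:// destination by the pages linking to it.
linked_from = defaultdict(list)

with open("all_inlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if row["Destination"].startswith("http://"):
            linked_from[row["Destination"]].append(row["Source"])

for destination, sources in linked_from.items():
    print(destination)
    for source in sources:
        print("  linked from:", source)
```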
The list can be quite long depending on the total number of URLs on your site. An alternative would be to add a custom filter under Configuration > Custom, only including URLs that contain "http://www.gomme-auto.it" or "http://blog.gomme-auto.it" in the source. In your case this wouldn't be very helpful, though, as all the pages on your site contain this URL in the JavaScript part. If you change the URLs in the JavaScript to HTTPS, this filter could then be used to find references to non-HTTPS images.
If you want to do it manually, that's also an option: in the "Internal" view of the crawler, put "http://" in the search field - this shows you the list of all the http:// URLs. You have to select the HTTP URLs one by one. For each of them you can select "Inlinks" at the bottom of the screen, and then you see all the URLs linking to the HTTP version. This works for both the HTML & the images.
Hope this helps,
rgds
Dirk
-
Forgot to mention: yes, I checked the scheme of the SERP results for those pages. It's not just Google failing to display it; it really does still have the HTTP version indexed.
-
Hi DC,
in Screaming Frog I can see the old HTTP links. Usually they are manually inserted links and images in WordPress posts. I am more than eager to edit them; my problem is how to find all the pages containing them. In Screaming Frog I can see the links, but I don't see the referrer - the page they are contained in. Is there a way to see that in Screaming Frog, or in some other crawling software?
-
Hi,
First of all, are you sure that Google didn't take the migration into account? I just did a quick check on other HTTPS sites. Example: when I look for "Google Analytics" in Google, the first 3 results all point to the Google Analytics site, yet only for the 3rd result is the https shown, even though all three are HTTPS. So it's possible it is just a display issue rather than a real issue.
Second, I did a quick crawl of your site and noticed that on some pages you still have links to the HTTP version of your site (they are redirected, but it's better to keep your internal links clean - without redirections).
When I checked one of these pages (https://www.gomme-auto.it/pneumatici/pneumatici-cinesi) I noticed that it has some issues, as it seems to load elements which are not HTTPS - possibly there are others as well.
Example: /pneumatici/pneumatici-cinesi:1395 Mixed Content: The page at 'https://www.gomme-auto.it/pneumatici/pneumatici-cinesi' was loaded over HTTPS, but requested an insecure image 'http://www.gomme-auto.it/i/pneumatici-cinesi.jpg'. This content should also be served over HTTPS.
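(To hunt these down without clicking through every page, a rough mixed-content check can be scripted. A sketch using Python's `requests` and `BeautifulSoup`; it only inspects src/href attributes in the HTML, so resources loaded via JavaScript or CSS would still need the browser console:)

```python
import requests
from bs4 import BeautifulSoup

def find_mixed_content(page_url):
    """List resources that an HTTPS page references over plain HTTP."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    insecure = []
    # img/script/iframe src and link href are the usual offenders.
    for tag in soup.find_all(["img", "script", "iframe", "link"]):
        resource = tag.get("src") or tag.get("href")
        if resource and resource.startswith("http://"):
            insecure.append(resource)
    return insecure

for resource in find_mixed_content("https://www.gomme-auto.it/pneumatici/pneumatici-cinesi"):
    print("insecure resource:", resource)
```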
The page you mention as an example: the HTTP version still receives two internal links, from https://www.gomme-auto.it/blog/pneumatici-barum-gli-economici-che-assicurano-ottime-prestazioni and https://www.gomme-auto.it/pneumatici/continental, with anchor texts 'pneumatici Barmum' & 'Barum'.
I guess Google reasons: if the owner of the site isn't updating his internal links, I'm not going to update my index.
On all your pages there is a part of the source which contains calls to the HTTP version. It's inside a script, so I'm not sure it really matters, but you could try to change it to HTTPS as well.
My advice would be to crawl your site with Screaming Frog, check where links to HTTP versions exist, and update these links to HTTPS (or use relative links, which is advised by Google - see the 'common pitfalls' part of https://support.google.com/webmasters/answer/6073543?hl=en).
rgds
Dirk
-
Mhhh, you are right, theoretically it could be the crawl budget. But if that were the case I should see it in the logs - I should be missing crawler visits on those pages. Instead the crawler is happily visiting them.
By the way, how would you "force" the crawler to parse these pages?
I am going to check the sitemap now to remove that port number and try splitting it into several sitemaps. Thanks.
-
Darn it, you are right: we added a new site, we didn't do a change of address. Sorry about that - apparently my coffee is no longer effective!
-
As far as I know, a change of address from HTTP to HTTPS doesn't work; the protocol is not accepted when you do a change of address. And somewhere I read Google itself saying that when moving to HTTPS you should not do a change of address.
But they do suggest adding a new site for the HTTPS version in GWT, which I did, and in fact traffic slowly transitioned from the HTTP site to the HTTPS site in GWT in the weeks following the move.
-
Are you sure? On https://support.google.com/webmasters/answer/6033080?hl=en&ref_topic=6033084 it says: "No need to submit a change of address if you are only moving your site from HTTP to HTTPS."
I don't think you are even given the option to select the same domain for a change of address in GWT.
-
Looks like you are doing everything right (301 redirects set up, all links on the site updated, canonical URLs updated) - you just need to get the crawlers to parse those pages more. Perhaps the crawler is hitting its budget before it gets to recrawl all of your new URLs?
You should also update your sitemap, as it contains a bunch of URLs that look like this: https://www.gomme-auto.it:443/pneumatici/estivi/pirelli/cinturato-p1-verde/145/65/15/h/72
I recommend creating several sitemaps for different sections of the site and seeing how they are indexed via GWT.
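(If the sitemap is a static file, or can be post-processed after generation, stripping the redundant :443 port is a quick script. A minimal sketch in Python - the file names are placeholders:)

```python
import re

# Drop the explicit :443 port, which is redundant on https:// URLs.
with open("sitemap.xml", encoding="utf-8") as f:
    cleaned = re.sub(r"(https://[^/<\s]+):443", r"\1", f.read())

with open("sitemap_clean.xml", "w", encoding="utf-8") as f:
    f.write(cleaned)
```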
-
Did you do a change of address in Google Webmaster Tools? HTTP and HTTPS URLs are considered different, and you will have to do a change of address if you switched to a fully HTTPS site.