Why does Google stubbornly keep indexing my HTTP URLs instead of the HTTPS ones?
-
I moved everything to HTTPS in November, but there are plenty of pages which Google still indexes as HTTP instead of HTTPS, and I am wondering why.
Example: http://www.gomme-auto.it/pneumatici/barum correctly redirects permanently (301) to https://www.gomme-auto.it/pneumatici/barum
Nevertheless, if you search for pneumatici barum: https://www.google.it/search?q=pneumatici+barum&oq=pneumatici+barum
The third organic result listed is still HTTP.
Since we moved to HTTPS, Google's crawler has visited that page tens of times, most recently two days ago, but it doesn't seem to care to update the protocol in Google's index.
Does anyone know why?
My concern is that when I use APIs like SEMrush and Ahrefs I have to query both the HTTP and HTTPS versions; for a total of around 65k URLs, that wastes a lot of my quota.
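A side note on the quota problem: since every http:// URL 301s to its https:// twin, one option is to normalize the whole list to HTTPS before hitting the APIs, so each page is queried once instead of twice. A minimal sketch (the URL list here is just an example, not the real 65k set):

```python
# Sketch: normalize http:// URLs to https:// and deduplicate, so each
# page costs one API call instead of two. Assumes every http URL
# permanently redirects to the same path on https, as described above.
from urllib.parse import urlsplit, urlunsplit

def https_variant(url: str) -> str:
    """Return the https:// version of a URL, leaving all other parts untouched."""
    parts = urlsplit(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)

urls = [
    "http://www.gomme-auto.it/pneumatici/barum",
    "https://www.gomme-auto.it/pneumatici/continental",
]
# Deduplicate on the https form so each page is queried once, not twice.
unique = sorted({https_variant(u) for u in urls})
```

You would then feed `unique` to the SEMrush/Ahrefs calls instead of the raw list.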
-
Thanks again, Dirk! In the end I used Xenu Link Sleuth and I am happy with the result.
-
Hi Massimiliano,
In Screaming Frog there is the option Bulk Export > All Inlinks. This generates the full list of all your internal links, with both source and destination. In Excel you just have to put a filter on the "Destination" column to show only the URLs starting with "http://", and you get all the info you need. This will probably not solve the issues with the images; for those, the manual approach below can be used.
The list can be quite long, depending on the total number of URLs on your site. An alternative would be to add a custom filter under Configuration > Custom, only including URLs that contain "http://www.gomme-auto.it" or "http://blog.gomme-auto.it" in the source. In your case this wouldn't be very helpful, though, as all the pages on your site contain this URL in the JavaScript part. If you change the URLs in the JavaScript to HTTPS, this filter could then be used to find references to non-HTTPS images.
Doing it manually is also an option: in the "Internal" view of the crawler, put "http://" in the search field; this shows you the list of all the http:// URLs. You have to select the HTTP URLs one by one. For each of them you can select "Inlinks" at the bottom of the screen, and then you see all the URLs linking to the HTTP version. This works for both the HTML and the images.
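For a large export, the Excel filter step can also be scripted. A rough sketch, assuming the All Inlinks export is a CSV with "Source" and "Destination" columns (the exact column names may vary slightly between Screaming Frog versions):

```python
# Sketch: replicate the Excel "Destination starts with http://" filter on
# the Screaming Frog "All Inlinks" CSV export. Column names are assumed.
import csv

def http_inlinks(path: str):
    """Yield (source, destination) pairs where the destination is still http://."""
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row["Destination"].startswith("http://"):
                yield row["Source"], row["Destination"]
```

Each pair tells you which page (source) still links to which http:// URL (destination), which is exactly the referrer information needed to fix the posts.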
Hope this helps,
rgds
Dirk
-
Forgot to mention: yes, I checked the scheme of the SERP results for those pages. It is not just Google not displaying it; it really still has the HTTP version indexed.
-
Hi DC,
in Screaming Frog I can see the old HTTP links. They are usually manually inserted links and images in WordPress posts, and I am more than eager to edit them; my problem is how to find all the pages containing them. In Screaming Frog I can see the links, but I don't see the referrer, i.e. which page they are contained in. Is there a way to see that in Screaming Frog, or in some other crawling software?
-
Hi,
First of all, are you sure that Google didn't take the migration into account? I just did a quick check on other HTTPS sites. Example: when I search for "Google Analytics" in Google, the first 3 results all point to the Google Analytics site, yet only for the 3rd result is the https shown, even though all three are in HTTPS. So it's possible it is just a display issue rather than a real issue.
Second, I did a quick crawl of your site and noticed that on some pages you still have links to the HTTP version of your site (they are redirected, but it's better to keep your internal links clean, without redirections).
When I checked one of these pages (https://www.gomme-auto.it/pneumatici/pneumatici-cinesi) I noticed that it has some issues, as it seems to load elements which are not in HTTPS; possibly there are others as well.
example: /pneumatici/pneumatici-cinesi:1395 Mixed Content: The page at 'https://www.gomme-auto.it/pneumatici/pneumatici-cinesi' was loaded over HTTPS, but requested an insecure image 'http://www.gomme-auto.it/i/pneumatici-cinesi.jpg'. This content should also be served over HTTPS.
As for the page you mention as an example: the HTTP version still receives two internal links, from https://www.gomme-auto.it/blog/pneumatici-barum-gli-economici-che-assicurano-ottime-prestazioni and https://www.gomme-auto.it/pneumatici/continental, with anchor texts 'pneumatici Barmum' and 'Barum'.
I guess Google reasons: "If the owner of the site is not updating his internal links, I'm not going to update my index."
On all your pages there is also a part of the source which contains calls to the HTTP version. It's inside a script, so I'm not sure if it's really important, but you could try to change it to HTTPS as well.
My advice would be to crawl your site with Screaming Frog, check where links to HTTP versions exist, and update these links to HTTPS (or use relative links, which is advised by Google; see the part 'common pitfalls' at https://support.google.com/webmasters/answer/6073543?hl=en).
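If you prefer to script the check, a quick and dirty way to spot both kinds of problem (internal links and mixed-content images) is to scan the HTML of each page for http:// values in href/src attributes. A minimal sketch; it assumes you already have the page HTML from your own crawl, and a real HTML parser would be more robust than this regex:

```python
# Sketch: list every http:// reference in href/src attributes of a page,
# to surface internal links and mixed-content images/scripts like the
# pneumatici-cinesi.jpg example above. Regex-based, so treat it as a
# rough check rather than a full parser.
import re

HTTP_REF = re.compile(r'(?:href|src)=["\'](http://[^"\']+)["\']', re.IGNORECASE)

def find_http_references(html: str) -> list:
    """Return all http:// URLs referenced by links, images, or scripts."""
    return HTTP_REF.findall(html)
```

Run it over every crawled page and any non-empty result is a page that still needs cleanup.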
rgds
Dirk
-
Mhhh, you are right, theoretically it could be the crawl budget. But if that were the case I should see it in the logs: I should be missing crawler visits on those pages. Instead, the crawler is happily visiting them.
By the way, how would you "force" the crawler to parse these pages?
I am going to check the sitemap now to remove that port number and try to split them. Thanks.
-
Darn it, you are right: we added a new site, we didn't do a change of address. Sorry about that. Apparently my coffee is no longer effective!
-
As far as I know, a change of address from HTTP to HTTPS doesn't work; the protocol is not accepted when you do a change of address. And somewhere I read Google itself saying that when moving to HTTPS you should not do a change of address.
They do suggest adding a new site for the HTTPS version in GWT, which I did; and in fact the traffic slowly transitioned from the HTTP site to the HTTPS site in GWT in the weeks following the move.
-
Are you sure? On https://support.google.com/webmasters/answer/6033080?hl=en&ref_topic=6033084 it says: "No need to submit a change of address if you are only moving your site from HTTP to HTTPS."
I don't think you are even given the option to select the same domain for a change of address in GWT.
-
Looks like you are doing everything right (set up 301 redirects, updated all links on the site, updated canonical URLs); you just need to force the crawlers to parse those pages more. Perhaps the crawler is hitting its budget before it gets to recrawl all of your new URLs?
You should also update your sitemap, as it contains a bunch of links that look like: https://www.gomme-auto.it:443/pneumatici/estivi/pirelli/cinturato-p1-verde/145/65/15/h/72
I recommend creating several sitemaps for different sections of the site and seeing how each is indexed via GWT.
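Those :443 URLs can be cleaned up in bulk before resubmitting the sitemap; the explicit default HTTPS port is redundant. A small sketch of the normalization step, to be applied to each `<loc>` value in the sitemap:

```python
# Sketch: drop the redundant default port (:443) from https URLs, as seen
# in the sitemap entries above. URLs without an explicit port pass through.
from urllib.parse import urlsplit, urlunsplit

def strip_default_port(url: str) -> str:
    """Remove an explicit :443 from an https URL, leaving everything else intact."""
    parts = urlsplit(url)
    if parts.scheme == "https" and parts.port == 443:
        parts = parts._replace(netloc=parts.hostname)
    return urlunsplit(parts)
```

Run the sitemap URLs through this before splitting them into per-section sitemaps.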
-
Did you do a change of address in Google Webmaster Tools? HTTP and HTTPS are considered different URLs, and you will have to do a change of address if you switched to a fully HTTPS site.