Authority site: drastic ranking drop after Google HTTPS switch. Please help!
-
Hi Mozers.
Since Google switched to the HTTPS version, the indexing of our company website (http://we.register.it) has switched to the HTTPS version (https://www.register.it).
After that, our Google rankings dropped for almost every keyword.
The site is very old, has great authority and a PR of 7, and has ranked for the same keywords for a very long time.
Every page has had the correct rel=canonical meta tag for years. No spam, and WMT is OK.
Could you please help?
- The internal links are all relative, so they resolve to HTTP on HTTP pages and to HTTPS on HTTPS pages.
- No changes have been made, and the subdomain has been set up this way for 8 years: the main URL has always been http://we.register.it
- Google started this indexing switch around 15 October.
-
Hey Luca
Weird, a similar issue just happened to a friend's site here locally.
To clarify, HTTP pages are still indexed if you search site:www.register.it -inurl:https -- it's just that both HTTP AND HTTPS are indexed now.
Also, I don't see we.register.it loading at all, only www.register.it. Was this recently changed? I am seeing both we and www indexed in Google, though. Did we always redirect to www?
Since HTTPS is now redirecting to HTTP, Google will not be able to see the canonical. Here's what I think you should try:
- Undo the 301 redirect from https to http
- In your HTTPS robots.txt file, block crawling
- Register HTTPS in webmaster tools (if not already) and do a URL removal on all the pages you don't want indexed from HTTPS (probably all of them)
- I would strongly recommend sorting out your internal linking using absolute URL paths, and ONLY link to pages with HTTPS when they actually are HTTPS pages. Otherwise, link to HTTP.
- Sort out your we vs. www subdomain issue.
This is honestly a slightly involved situation where we can give cursory advice here, but I just want to be transparent: if you're not completely comfortable with what to do, you might want to find someone who can peek behind the curtain (unfortunately, as Moz Associates we're not able to log into your accounts or anything).
EDIT: just want to add that you should be sure you have updated and working XML sitemaps for the protocol and subdomain you want indexed, and that they are submitted to WMT.
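For the robots.txt point above: a minimal file that blocks all crawling would look like the sketch below. Note this assumes your server can return a different robots.txt per protocol (e.g. via a rewrite rule); a single file shared between HTTP and HTTPS would block crawling of the HTTP pages too, which is not what you want.

```
User-agent: *
Disallow: /
```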
-
No problem, Federico.
I have checked, and there aren't any warnings in WMT. Everything looks right! It's very weird that Google ranked http://we.register.it in top positions for top keywords, and just a few days ago started to index the HTTPS version (ignored until then). Google first moved all rankings to the HTTPS version, keeping the exact same positions, and then started to drop everything.
Any idea?
-
Oh yes, sorry Luca, I thought you were serving a different canonical in each version, each pointing to itself. The way you have them set is right.
I don't think this is the reason you are seeing the ranking or traffic drop, as using canonicals the right way (the way you are) should avoid any duplicate content issue. But you'd better cover all the bases and fix everything that could be the cause.
By the way, did you check the manual actions section in WMT?
-
Hi Federico,
What you are saying about the duplicate issue is definitely right; in fact, we have already planned to 301 redirect one protocol version to the other (HTTP 301 to HTTPS, or vice versa). Anyway, it's very weird that Google is only realizing this after about 10 years.
It's also not clear to me how canonicalization makes it worse.
On the https://www.register.it pages the canonical points to http://we.register.it/. This means we are telling Google that the right page to index is http://we.register.it/, and that https://www.register.it is a duplicate of http://we.register.it (which is the page that was actually ranking until a few days ago).
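For reference, a canonical of that shape would sit in the head of the HTTPS pages and look something like this (reconstructed from the description, since the original markup didn't come through):

```html
<link rel="canonical" href="http://we.register.it/" />
```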
Am I wrong? Thanks
Luca -
As far as I understand, having both HTTP and HTTPS versions of a page can cause duplicate content issues. Serving pages under both protocols creates two versions of each page for search engines, even if the content is the same.
Canonicalization here even makes it worse, as you are actually telling engines to index both as different pages while they are actually the same.
You need to go with one or the other. Given the common choice and your type of business, I would choose HTTPS and 301 redirect ALL HTTP pages to their HTTPS equivalents. That can be done with a single line of PHP (if you are running PHP).
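A minimal sketch of that redirect, assuming a typical PHP setup where $_SERVER['HTTPS'] reflects the protocol (you may need to check a proxy header like X-Forwarded-Proto instead if you're behind a load balancer):

```php
<?php
// Force HTTPS: permanently (301) redirect any plain-HTTP request to its
// HTTPS equivalent. Place at the very top of the front controller,
// before any output is sent.
if (empty($_SERVER['HTTPS']) || $_SERVER['HTTPS'] === 'off') {
    header('Location: https://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'], true, 301);
    exit;
}
```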