Site deindexed after HTTPS migration + possible penalty due to spammy links
-
Hi all, we've recently migrated a site from http to https and saw the majority of pages drop out of the index.
One of the most extreme deindexation problems I've ever seen, but there doesn't appear to be anything obvious on-page which is causing the issue. (Unless I've missed something - please tell me if I have!)
I had initially discounted any off-page issues due to the lack of a manual action in Search Console; however, after looking into the link profile I spotted 100 spammy porn .xyz sites all linking to us (see example image).
There didn't appear to be any historic disavow files uploaded in the non-https Search Console accounts.
Any on-page suggestions, or just play the waiting game with the new disavow file?
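For anyone following along, the disavow file we've submitted uses Google's standard plain-text format - one domain: or URL entry per line, with # for comments. A minimal sketch with placeholder domains (not the real ones) looks like this:
```
# Spammy .xyz referring domains (placeholder names, not the actual domains)
domain:spam-example-1.xyz
domain:spam-example-2.xyz
# Individual URLs can also be disavowed:
https://spam-example-3.xyz/links-page/
```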
-
Thanks for answering all of my questions!
It's interesting that when I do a simple site: search in Google, none of the main pages of your website appear. Most of the search results are either archive or comment pages. Typically, I've seen this kind of thing happen when something goes wrong with the redirects or a site is penalized.
It looks like the big dip in indexation didn't occur until about August. I would think that if you pulled the trigger in June, pages would start dropping out of the index much sooner.
In this case, your theory about a possible penalization might be right. I'd be interested to see what happens once Google considers the disavow file (unfortunately, that will take some time).
Does anyone else have any input or possible reasons why pages on this site have dropped out of the index so quickly?
-
Hi Serge,
Thanks for your input. I've answered your questions below.
- How long ago did you switch to https? - 21st June
- Have you submitted both non-www and www versions of the https site to Google Search Console (GSC)? - Yes
- Have you kept the http versions of your website in GSC? - Yes
- From the looks of it, your sitemap has been updated to reflect the https pages. Have you submitted the updated sitemap to GSC? - Yes, although the number of submitted pages doesn't match the number of indexed pages.
- Are there any sitemap errors appearing in GSC? Any other errors? - No sitemap errors, but some pages are returning 404s.
- Could you attach a screenshot of the indexation rate on both https and http versions of the site from GSC?
- Could you confirm that all redirects were done 1-to-1 and properly redirected? (301s and not 302s) - Confirmed - all tools report a 200 status at the final URL after following the 301 (a rough sketch of this kind of check is below).
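Here's that sketch - a hedged example of checking the redirect hops with Python's requests library, using placeholder URLs rather than the real site:
```python
# Rough redirect spot-check (placeholder URLs, not the actual site).
# Requires: pip install requests
import requests

OLD_HTTP_URLS = [
    "http://www.example.com/",
    "http://www.example.com/some-page/",
]

for url in OLD_HTTP_URLS:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    # resp.history holds each redirect hop in order; resp is the final response.
    for hop in resp.history:
        print(hop.status_code, hop.url)
    print(resp.status_code, resp.url)
    # Ideally: a single 301 hop straight to the https:// URL, ending in a 200.
    clean = (
        len(resp.history) == 1
        and resp.history[0].status_code == 301
        and resp.url.startswith("https://")
        and resp.status_code == 200
    )
    print("OK" if clean else "CHECK THIS ONE")
    print()
```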
We are still waiting to see some results from submitting the disavow file. So far, no positive movement.
Thanks for your help!
-
Hi there,
There could be a lot of reasons why pages of your website are dropping out of the index. Could you answer the following questions to help us narrow down the possible cause?
- How long ago did you switch to https?
- Have you submitted both non-www and www versions of the https site to Google Search Console (GSC)?
- Have you kept the http versions of your website in GSC?
- From the looks of it, your sitemap has been updated to reflect the https pages. Have you submitted the updated sitemap to GSC?
- Are there any sitemap errors appearing in GSC? Any other errors?
- Could you attach a screenshot of the indexation rate on both https and http versions of the site from GSC?
- Could you confirm that all redirects were done 1-to-1 and properly redirected? (301s and not 302s)
Some things that we could rule out:
- It looks like the site isn't using noindex tags in a way that would cause deindexing
- It looks like the robots.txt file isn't disallowing any important paths that would cause deindexation
- The http versions of both the www and non-www pages redirect to the www https version of the site, which is good
- Canonicals seem to be updated and pointing to the https version of the site (if you want to verify these rule-outs yourself, see the spot-check sketch below)
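Here's a rough sketch of that spot-check for a single key URL, using Python with requests and BeautifulSoup (the URLs are placeholders, not your site):
```python
# Rough on-page rule-out check: meta robots, X-Robots-Tag, canonical, robots.txt.
# Placeholder URLs - swap in your own key pages.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup
from urllib import robotparser

PAGE = "https://www.example.com/important-page/"

resp = requests.get(PAGE, timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

# 1. Is there a meta robots noindex in the HTML?
meta_robots = soup.find("meta", attrs={"name": "robots"})
print("meta robots:", meta_robots.get("content", "") if meta_robots else "none")

# 2. Is an X-Robots-Tag sent as an HTTP header? (Easy to miss, since it isn't in the HTML.)
print("X-Robots-Tag header:", resp.headers.get("X-Robots-Tag", "none"))

# 3. Does the canonical tag point at the https version?
canonical = next(
    (link for link in soup.find_all("link") if "canonical" in (link.get("rel") or [])),
    None,
)
print("canonical:", canonical.get("href") if canonical else "none")

# 4. Is the URL blocked for Googlebot by robots.txt?
rp = robotparser.RobotFileParser("https://www.example.com/robots.txt")
rp.read()
print("allowed by robots.txt:", rp.can_fetch("Googlebot", PAGE))
```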
Sorry for all of the questions - I just want to rule out possible causes so we can focus in on what the issue might be.
Thanks, Serge
-
Hi!
What information are you seeing in Search Console?
Assuming you have already tested all of your old URLs and the redirect paths point correctly to the new URLs, does Google Search Console indicate any problems with the number of URLs submitted to it?
Canonicals - are they in use and pointing to the correct version of the site?
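For reference, on the https pages the canonical should point at the https URL of the page itself, along these lines (example.com is just a placeholder):
```html
<!-- In the <head> of https://www.example.com/some-page/ (placeholder URL) -->
<link rel="canonical" href="https://www.example.com/some-page/" />
```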
Related Questions
-
Nofollow Outbound Links on Listings from Travel Sites?
We oversee a variety of regional, county, and town-level tourism websites, each with hundreds (or even thousands) of places/businesses represented with individual pages. Each page contains a link back to the place's main web presence if available. My fear is that a large portion of these linked-to sites are low quality, and may even be spammy. With our budgets there is no way to sort through them and assign nofollows as needed. There are also a number of broken links that we try to stay on top of, but at times some slip through due to the sheer number of pages. I am thinking about adding a nofollow to these outbound links across the board. This would not be all outbound links on the website, just the website links on the listing pages. I would love to know people's thoughts on this.
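For clarity, the change I'm considering is simply adding rel="nofollow" to the outbound website links on the listing pages, along these lines (the URL is a placeholder):
```html
<!-- Outbound link on a listing page with nofollow applied (placeholder URL) -->
<a href="http://www.example-business.com/" rel="nofollow">Visit website</a>
```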
Intermediate & Advanced SEO | Your_Workshop
-
Link building… how to get high rewarding links?
Hi guys, I have a few people in my industry with whom I've built relationships and who would like to link to my site. Are there any particular things I need to be mindful of before having them link to me? I'm just mindful of the unknown. Also, which links to use, etc.? Thanks in advance
Intermediate & Advanced SEO | edward-may
-
Troubled QA Platform - Site Map vs Site Structure
I'm running a Q&A forum that was built prioritizing UX over SEO. This decision has caused a bit of a headache, as we're 6 months into the project with 2278 Q&A pages and extremely minimal traffic coming from search engines. The structure has the following hiccups: A. The category navigation from the main Q&A page is entirely JavaScript and only navigable by users. B. We identify Googlebot and send it to another version of the Q&A platform without JavaScript. Category links don't exist in this Googlebot version of the main Q&A page. On this Google version of the main Q&A page, the Pinterest-like tiles displaying individual Q&As are capped at 10. This means that the only way Googlebot can pass link juice down to individual Q&As (after we've directed it to this page) is through 10 random Q&As. C. All 2278 of the Q&As are currently indexed in search; they are just ranked very, very poorly in SERPs. My personal assumption is that Google can't pass link juice to any of the Q&As (hence the poor SERPs) but registers them from the sitemap, so they get included in Google's index. My dilemma has me struggling between two different decisions: 1. Update the navigation in the header to remove the JavaScript and fundamentally change the look and feel of the Q&A platform. This would allow Googlebot to navigate through expert category links and pass link juice to all Q&As. Or 2. Update the redirected main Q&A page to include hard-coded category links with hundreds of hard-coded Q&As under each category page - make it similar, ugly, flat, and efficient for the crawling bots. Any suggestions would be greatly appreciated. I need to find a solution as soon as possible.
Intermediate & Advanced SEO | TQContent
-
Should I have as few internal links as possible?
On most pages of my site I have a Quick Links section, which gives three cross-sell links to other products, a newsletter sign-up link, a link to the blog, and four image links to surveys, newsletters, feedback, etc. Will these links be hurting the flow of SEO juice between pages - should the number of internal links be kept to a minimum? My site is www.over50choices.co.uk if that helps. Thanks, Ash
Intermediate & Advanced SEO | AshShep1
-
Is the Tool Forcing Sites to Link Out?
Hi, I have a tool that I wish to give to sites. It allows the user to get an accurate idea of their credit score without giving away any personal data and without having a credit search done on their file. Due to the way the tool works, and to make the implementation on other people's sites as simple as possible, the tool remains hosted by me and a one-line piece of JavaScript code just needs to be added to the code of the site wishing to use the tool. This code includes a link to my site to call the information from my server to allow the tool to show and work on the other site. My questions are: Could this cause a problem with Google as far as link quality goes? Are we forcing people to give us a backlink to use the tool (in the eyes of Google), or will Google not be able to read the JavaScript / will it ignore the link for SEO purposes? Should I make the link in the code nofollow? If I should make the link a nofollow, any tips on how to make the most of the opportunity from a link building or SEO point of view? Thanks for your help
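To make the setup concrete, the integration is a single script include along these lines (the URL is a placeholder); the reference back to my domain is the part I'm unsure about from a link-quality point of view:
```html
<!-- Hypothetical one-line embed snippet given to partner sites (placeholder URL) -->
<script src="https://tool.example.co.uk/widget/credit-score.js" async></script>
```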
Intermediate & Advanced SEO | MotoringSEO
-
Counting over-optimised links - do internal links count too?
To wit: in working out whether I've got too many over-optimised links pointing to my homepage, do I look at just external links -- or also the links from my internal pages to my homepage? In other words, can a natural link profile from internal pages help dilute over-optimisation from external links?
Intermediate & Advanced SEO | Jeepster
-
New site now links disappearing in Open Site Explorer and GWT
We launched a new site at the beginning of December 2012 and carefully 301'd all URLs from the old site to the new (custom CMS on the old site, WordPress on the new). Our rankings have slipped quite badly, but the most worrying thing is that we used to have about 1200 backlinks according to GWT/OSE before the new site launched and now we're down to about 30. Can anyone help shed some light on this please? The site is www.littleoneslondon.co.uk A few things that might help: 1. We were getting a lot of links through our job feeds (it's a nanny recruitment site) on indeed and trovitt; for some reason no new ones from these have appeared in Site Explorer, and all the old jobs are gone completely. 2. We had 1000s of not-found errors in Google Webmaster Tools, and it was once these were redirected and marked as fixed that the links disappeared. 3. We are getting quite a few 504 errors on the site due to an old proxy redirect (/blog was hosted on a different server on the old site and has not been removed yet); this will be fixed tomorrow, but could it be a factor? 4. The developer seems to have redirected all the links through WordPress directly somehow (I don't see any redirect plugins, but there are lots of pages called 'redirect'). There are no references in the htaccess file to any redirects other than from the /blog folder that the WordPress instance sits in. Sorry for the long post, I hope I've given all the details you'd need and I really appreciate any help anyone can give. Thanks, Karl
Intermediate & Advanced SEO | Bdig
-
Asking Sites to Remove Links.. What should I say?
After getting some guidance from you guys here on this forum, I have decided to go through my WMT backlinks and contact all the sites that I think are spammy and are linking back to me, and I will ask them to remove my links from their sites. Can you guys please provide some guidance as to what I should say in the letter (also, anything I should definitely not say)? Thanks for the help.
Intermediate & Advanced SEO | Prime85