Site deindexed after HTTPS migration + possible penalty due to spammy links
-
Hi all, we've recently migrated a site from http to https and saw the majority of pages drop out of the index.
One of the most extreme deindexation problems I've ever seen, but there doesn't appear to be anything obvious on-page that's causing the issue. (Unless I've missed something - please tell me if I have!)
I had initially discounted any off-page issues because there is no manual action in Search Console. However, after looking into the link profile I spotted 100 spammy porn .xyz sites all linking in (see example image).
There didn't appear to be any historic disavow files uploaded in the non-https Search Console accounts.
Any on-page suggestions, or should we just play the waiting game with the new disavow file?
-
Thanks for answering all of my questions!
It's interesting that when I do a simple site: search in Google (e.g. site:yourdomain.com), none of the main pages of your website appear. Most of the search results are either archives or comments. Typically, I've seen this kind of thing happen when something goes wrong with the redirects or when a site has been penalized.
It looks like the big dip in indexation didn't occur until about August. I would think that if you pulled the trigger on the migration in June, pages would have started dropping out of the index much sooner.
In this case, your theory about a possible penalization might be right. I'd be interested to see what happens once Google considers the disavow file (unfortunately, that will take some time).
Does anyone else have any input or possible reasons why pages on this site have dropped out of the index so quickly?
-
Hi Serge,
Thanks for your input. I've answered your questions below.
- How long ago did you switch to https? - 21st June
- Have you submitted both non-www and www versions of the https site to Google Search Console (GSC)? - Yes
- Have you kept the http versions of your website in GSC? - Yes
- From the looks of it, your sitemap has been updated to reflect the https pages. Have you submitted the updated sitemap to GSC? - Yes, although the number of submitted pages doesn't match the number of indexed pages.
- Are there any sitemap errors appearing in GSC? Any other errors? - No sitemap errors; some pages are returning 404s.
- Could you attach a screenshot of the indexation rate on both https and http versions of the site from GSC?
- Could you confirm that all redirects were done 1-to-1 and properly redirected? (301s and not 302s) - Confirmed; all tools report a 200 status after following the 301.
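For reference, this is roughly the single-hop check those tools perform - a minimal sketch in Python (the domain and URL list are placeholders):

```python
import requests

# Placeholder sample of old http URLs; in practice, pull the full list
# from the pre-migration sitemap or a crawl export.
old_urls = [
    "http://example.com/",
    "http://example.com/some-page",
]

for url in old_urls:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = [r.status_code for r in resp.history]  # status code of each redirect hop
    # Ideal result: exactly one 301 hop, landing on an https URL that returns 200
    ok = hops == [301] and resp.url.startswith("https://") and resp.status_code == 200
    print(f"{url} -> {resp.url} via {hops} (final {resp.status_code}) {'OK' if ok else 'CHECK'}")
```

A 302 anywhere in the chain, or more than one hop, would be worth fixing even though the final status is a 200.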
We are still waiting to see some results from submitting the disavow file. So far, no positive movement.
Thanks for your help!
-
Hi there,
There could be a lot of reasons why certain pages of your website are dropping out of the index. Could you answer the following questions to help us narrow down the possible cause?
- How long ago did you switch to https?
- Have you submitted both non-www and www versions of the https site to Google Search Console (GSC)?
- Have you kept the http versions of your website in GSC?
- From the looks of it, your sitemap has been updated to reflect the https pages. Have you submitted the updated sitemap to GSC?
- Are there any sitemap errors appearing in GSC? Any other errors?
- Could you attach a screenshot of the indexation rate on both https and http versions of the site from GSC?
- Could you confirm that all redirects were done 1-to-1 and properly redirected? (301s and not 302s)
Some things that we can rule out (a quick scripted spot-check for these is sketched after this list):
- It looks like the site isn't using noindex tags in a way that would cause deindexing
- It looks like the robots.txt file isn't disallowing any important paths that would cause deindexation
- Both the www and non-www http versions redirect to the https, www version of the site, which is good
- Canonicals seem to be updated and pointing to the https version of the site
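If it helps anyone, here's roughly how those rule-outs can be spot-checked - a minimal sketch (the domain and page are placeholders, and the regex checks are naive; a real audit would use an HTML parser and also check X-Robots-Tag headers):

```python
import re
import urllib.robotparser
import requests

site = "https://example.com"        # placeholder domain
page = f"{site}/an-important-page"  # hypothetical page to spot-check

# 1. robots.txt: is Googlebot allowed to crawl the page?
rp = urllib.robotparser.RobotFileParser()
rp.set_url(f"{site}/robots.txt")
rp.read()
print("Crawlable by Googlebot:", rp.can_fetch("Googlebot", page))

html = requests.get(page, timeout=10).text

# 2. Meta robots: is a noindex directive present in the page source?
noindex = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', html, re.I)
print("Noindex present:", bool(noindex))

# 3. Canonical: does it point at the https version of the page?
m = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', html, re.I)
print("Canonical:", m.group(1) if m else "none found")
```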
Sorry for all of the questions; I just want to rule out possible causes so we can focus in on what the issue could be.
Thanks, Serge
-
Hi!
What information do you see in Search Console?
Assuming you have already tested all of your old URLs and the redirect paths point correctly to the new URLs, does Google Search Console indicate any problems with the number of URLs submitted to it?
Canonicals - are they in use, and do they point to the correct version of the site?
Related Questions
-
Realtor site with external links in navigation
I have a client with a realtor site that uses IDX for the listings feed. We have several external links going over to the IDX site for various live custom searches (e.g. luxury listings, waterfront listings, etc.). We are getting a Moz spam ranking of 2/7 for both "Large Number of External Links" and "External Links in Navigation"; chances are, these are related. My questions: (1) Since the score is only 2/7, should I bother fixing this? (2) If I add rel="nofollow" to all the site-wide links (in the header, footer, and menu), will this help? I couldn't find anything definitive in the Q&A search. Looking forward to any insights!
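If it turns out to be worth doing, this is roughly how the nofollow could be applied across the nav in a template build step - a sketch using BeautifulSoup, with a hypothetical IDX hostname:

```python
from bs4 import BeautifulSoup

# Hypothetical navigation markup; "idx.example.com" stands in for the real IDX host.
nav_html = """
<nav>
  <a href="https://idx.example.com/luxury-listings">Luxury Listings</a>
  <a href="https://idx.example.com/waterfront-listings">Waterfront Listings</a>
  <a href="/about">About Us</a>
</nav>
"""

soup = BeautifulSoup(nav_html, "html.parser")
for a in soup.find_all("a", href=True):
    if "idx.example.com" in a["href"]:  # only the external IDX links
        a["rel"] = "nofollow"           # internal links stay followed

print(soup)
```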
Intermediate & Advanced SEO | | lcallander1 -
Existing 301s during site migration - what to do?
Hi - I'm looking at an old website that has lots of 301s internal to that site - what do I do with these when I move to a new site? Should I list them and adjust them so they redirect straight to the new site (instead of from one URL to another URL on the old site)? I'm thinking that if I don't, the user will have to travel through one 301 and then another to get to the new site, which doesn't seem like a great idea. Your thoughts would be welcome.
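To make the adjustment concrete, here's a sketch of the bookkeeping I have in mind (the domain and all paths are hypothetical):

```python
# Hypothetical existing 301 rules on the old site (source -> destination)
old_rules = {
    "/old-widgets": "/widgets",
    "/summer-sale": "/sale",
}

# Hypothetical mapping of old-site paths to their new-site equivalents
new_site = "https://new-site.example"
path_map = {
    "/widgets": "/products/widgets",
    "/sale": "/offers",
}

# Repoint each existing rule straight at its final new-site URL, so no
# visitor or crawler ever has to pass through two 301s in a row.
updated_rules = {src: new_site + path_map.get(dest, dest)
                 for src, dest in old_rules.items()}

for src, dest in updated_rules.items():
    print(f"301: {src} -> {dest}")
```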
Intermediate & Advanced SEO | | McTaggart0 -
Do I have too many internal links, which is diluting link juice to less important pages?
Hello Mozzers, I was looking at my homepage and subsequent category landing pages on my eCommerce site and wondered whether I have too many internal links, which could in effect be diluting link juice away from the pages I need it to flow to. My homepage has 266 links, of which 114 (43%) are duplicate links, which seems a bit much to me. One of my major competitors, a national company, has just launched a new site design, and they show only popular categories on their home page, although all categories are accessible from the menu navigation. They have only 123 links on their home page. I am wondering: if I stopped showing every category on my homepage (some of them get virtually no sales) and concentrated on the popular ones, like my competitor does, would the link juice flowing down through the site be more concentrated, since there would be fewer links for it to flow through? Is that basically how it works? Are there any negatives to duplicate links on either the home or a category landing page? We show the categories both as visual boxes to select and as selectable links on the left of the page, so I wondered how duplicate links are treated. Any thoughts greatly appreciated. Thanks, Pete
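To put rough numbers on the dilution idea - this uses the classic simplified PageRank model (an illustration only, not necessarily how Google weights links today):

```python
# Simplified PageRank-style illustration: a page passes roughly 85% of its
# value, split evenly across its outgoing links. Link counts are from above.
DAMPING = 0.85

for label, links in [("My homepage", 266), ("Competitor homepage", 123)]:
    share = DAMPING / links
    print(f"{label}: {links} links -> each link receives ~{share:.4f} of the page's value")
```

On the duplicates: it is widely believed that Google counts only the first link to a given URL for anchor-text purposes, so the 114 duplicate links may be adding little while still splitting the pool.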
Intermediate & Advanced SEO | | PeteC120 -
Site-Wide Footer Links Exception, Any Advice?
I was reading the following Q&A on site-wide footer links: http://moz.com/community/q/site-wide-links-from-another-domain-could-these-cause-a-problem
I feel my situation is slightly different, however: we have lots of international sites linking to each other through these footer links - our sites for different countries and languages, such as our German, French, and Spanish sites.
Our main UK site (http://www.cirrusresearch.co.uk/) has always ranked very well and has never really had a problem, despite always having had these followed site-wide footer links. We regularly get a high number of visitors performing English-language searches from different countries, and I don't think it is a bad thing to have our more country/language-specific sites available in the footer for visitors who may prefer a localized site.
Our main website is at least 10+ years old and has a lot of strong links compared to our competitors, but the German and Spanish sites are relatively small and mostly only 1-2 years old. My big fear is that these smaller sites would not be able to stand on their own without the footer links from our main site.
Reading the community question above caused me to question this: should I take a leap of faith and nofollow all of these site-wide footer links connecting our sites? We have never really had a ranking problem, so I don't really see the need, but would this be the best thing to do?
Thank you, James
Intermediate & Advanced SEO | | Antony_Towle0 -
.com Ranked Where .co.uk Site Should Be After Manual Penalty Revoked - Help!!!
Hi All, I wondered if someone could help me as I am at my wits' end. Our website www.domain.co.uk was hit with a manual penalty back on April 26th, 2012, for over-optimizing our inbound links, and - 9 reconsideration requests, over a year, and many removed links later - the penalty was revoked. Yay, I hear you cry!
During the year .co.uk was banned, we built .com, yet did not build any links to it. The purpose of the .com site was to attract an American audience for our products; .com was hosted on a US server and geo-targeting was set to United States in WMT.
So here is my problem: after the ban was revoked, we expected .co.uk to spring back to some reasonable positions. Nope, that is not the case. Google is now ranking our .com site where our .co.uk should be, in positions 1 to 10 for powerful keywords. .com has zero link equity while .co.uk's is very reasonable. So how can I rectify this balls-up and get .co.uk listed back where it should be? I am not bothered where .com ranks.
Note: to the best of my knowledge there are NO cross-domain 301s or the like, only an image link between the two sites. I have posted this on the WMT forum and it has fallen on deaf ears! ...Help me, Moz members, you're my only hope!
Thanks in advance, Richard
PS: If anyone would like the URLs in question, PM me and I will let you know.
Intermediate & Advanced SEO | | Tricky-400 -
A Client Changed the Link Structure for Their Site... Not Just Once, but Twice
I have a client who's experiencing a number of crawl errors, which I've gotten down to 9,000 from 18,000. One of the challenges they experience is that they've modified their URL structure a couple of times. First it was: site.com/year/month/day/post-name
Then it was: site.com/category/post-name
Now it's: site.com/post-name
I'm not sure of the time elapsed between these changes, but enough time has passed that URLs from the previous two structures have been indexed and now spit out 404s. What's the best, cleanest way to address this issue? I'm obviously not going to create 9k redirect rules, but there's got to be a way to resolve it moving forward.
Intermediate & Advanced SEO | | digisavvy0
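A pattern-based sketch for the question above (the regexes assume exactly the three structures described; in production these would become server rewrite rules rather than Python):

```python
import re

# Note: the /category/post-name pattern also matches any other two-segment
# path, so real rules would need to exclude legitimate current URLs.
PATTERNS = [
    # /year/month/day/post-name -> /post-name
    (re.compile(r"^/\d{4}/\d{2}/\d{2}/(?P<slug>[^/]+)/?$"), r"/\g<slug>"),
    # /category/post-name -> /post-name
    (re.compile(r"^/[^/]+/(?P<slug>[^/]+)/?$"), r"/\g<slug>"),
]

def redirect_target(path: str):
    """Return the new-style path for a legacy path, or None if no rule applies."""
    for pattern, replacement in PATTERNS:
        if pattern.match(path):
            return pattern.sub(replacement, path)
    return None

# Hypothetical examples: the two legacy structures, plus a current-style URL
for old in ["/2013/05/04/my-post", "/news/my-post", "/my-post"]:
    print(old, "->", redirect_target(old))
```
-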
Affiliate Links Added and Site Dropped in Google Only
My site was dropshipping a product and we switched to an affiliate offer, with three or four links to different affiliate products. The site dropped the next day. It had been number 1 for 6 months, has a PR of 6, and is 2 years old. It has been 2 weeks and the site hasn't jumped back. Any suggestions on how to handle this?
Intermediate & Advanced SEO | | dkash0 -
Link Architecture - Xenu Link Sleuth vs. Manual Observation Confusion
Hi, I have been asked to complete some SEO contracting work for an e-commerce store. The navigation looked a bit unclean, so I decided to investigate it first.
a) Manual observation: Within the catalogue view, I loaded up the page source, hit Ctrl-F, and searched for "href". It turns out there are 750-odd links on this page, and most of the other sub-catalogue and product pages also have about 750 links. Ouch! My SEO knowledge is telling me this is non-optimal.
b) Link Sleuth: I crawled the site with Xenu Link Sleuth and found 10,000+ pages. I exported into Open Calc and ran a pivot table to count the number of pages per site level. The results looked like this:
Level | Pages
0     | 1
1     | 42
2     | 860
3     | 3268
Now this looks more like a pyramid. I think this is because Link Sleuth can only read one 'layer' of the nav bar at a time - it doesn't 'hover' and read the rest of the nav bar (unlike searching the page source for "href", which finds everything).
Question: How are search spiders going to read the site? As in (a) or as in (b)? Thank you!
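For anyone repeating the pivot, a minimal sketch (the filename and the "Level" column name are assumptions - adjust them to the actual export):

```python
import csv
from collections import Counter

# Count pages per crawl depth from a crawler's CSV export.
depth_counts = Counter()
with open("xenu_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        depth_counts[int(row["Level"])] += 1

for level in sorted(depth_counts):
    print(f"Level {level}: {depth_counts[level]} pages")
```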
Intermediate & Advanced SEO | | DigitalLeaf0