Website Not Performing after switch to HTTPS
-
We recently switched our client's website to HTTPS, but since the move we've seen a huge drop in rankings (off the map) and traffic. Our meta tags for the homepage are not being picked up by Google, although they were appearing correctly before the switch.
We've implemented all redirects, resubmitted the URL to Google, and updated GSC.
GSC is also reporting errors in our XML sitemap, stating there are no URLs to crawl.
Has anyone had similar issues? What do you all recommend?
Help greatly appreciated
-
Thank you, Joe! I'll take a look at the article link you provided. Hopefully I'll find the culprit.
-
Hi there,
Sounds frustrating. When did this migration occur?
I'm sure you've double-checked, but you implemented a 301 redirect from http://www. to https://www., correct? And the URLs were mapped 1:1, with no bulk redirect to the homepage or anything like that?
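If you want to spot-check that quickly, here is a minimal sketch (assuming Python with the `requests` library and a hypothetical list of your old URLs) that confirms each HTTP URL answers with a single 301 pointing at its exact HTTPS counterpart:

```python
import requests

# Hypothetical sample of old HTTP URLs to spot-check; swap in real paths.
old_urls = [
    "http://www.example.com/",
    "http://www.example.com/category/widgets",
    "http://www.example.com/about-us",
]

for url in old_urls:
    # Don't follow the redirect; we only want to see the first hop.
    resp = requests.head(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    expected = url.replace("http://", "https://", 1)
    ok = resp.status_code == 301 and location.rstrip("/") == expected.rstrip("/")
    print(f"{url} -> {resp.status_code} {location} {'OK' if ok else 'CHECK THIS'}")
```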
The site's internal navigation was updated to link directly to the https://www. URLs, correct? If you are forcing each link in the site's navigation through a redirect, you may be diluting homepage link equity and slowing site performance. This could potentially be part of the issue.
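One way to sanity-check the navigation is to pull a page and list any internal links that still point at http://; a rough sketch, assuming the `requests` and `beautifulsoup4` packages are installed and using a hypothetical domain:

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical homepage URL; swap in the real one.
page = requests.get("https://www.example.com/", timeout=10)
soup = BeautifulSoup(page.text, "html.parser")

# Collect internal anchors that still use the insecure scheme.
insecure = [
    a["href"] for a in soup.find_all("a", href=True)
    if a["href"].startswith("http://www.example.com")
]
print(f"{len(insecure)} internal links still use http://")
for href in insecure:
    print(href)
```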
This article discusses some potential reasons rankings can drop after migrating from HTTP to HTTPS; I would recommend reviewing them.
Hope this helps!
Related Questions
-
Can I have multiple 301s when switching to the HTTPS version?
Hello, our programmer recently updated our HTTP website to HTTPS. Does it matter if we have TWO 301 redirects? Here is an example: http://www.colocationamerica.com/dedicated_servers/linux-dedicated.htm 301s to https://www.colocationamerica.com/dedicated_servers/linux-dedicated.htm, which then 301s to https://www.colocationamerica.com/linux-dedicated-server. We're getting pulled in two different directions. I read https://moz.com/blog/301-redirection-rules-for-seo and don't know if two 301s are acceptable. Please let me know. Greatly appreciated!
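For reference, you can see exactly how many hops a crawler encounters by following the chain and printing each redirect; a quick sketch using Python's `requests` library with the URL from the example above:

```python
import requests

# Each entry in resp.history is one redirect hop the client followed.
start = "http://www.colocationamerica.com/dedicated_servers/linux-dedicated.htm"
resp = requests.get(start, allow_redirects=True, timeout=10)

for hop in resp.history:
    print(f"{hop.status_code}  {hop.url}  ->  {hop.headers.get('Location')}")
print(f"Final: {resp.status_code}  {resp.url}  ({len(resp.history)} redirect(s))")
```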
Intermediate & Advanced SEO | Shawn1240 -
Problems indexing a website built with Magento
Hi all, my name is Riccardo and I work for a web marketing agency. Recently we've been having some problems getting this website, www.farmaermann.it, which is built on Magento, indexed. According to Google Webmaster Tools the sitemap is fine (no errors) and was uploaded correctly; however, only 72 of 1,772 URLs have been indexed, and we submitted the sitemap in Webmaster Tools 8 days ago. We also checked the structure of the robots.txt against several Magento guides and it looks well formed. In addition, we noticed that some pages in Google's search results have titles that do not match the page title defined in the Magento backend. In short, we can't tell whether these indexing problems are related to the sitemap, the robots.txt, or something else. Has anybody had the same kind of problems? Thank you all for your time and consideration. Riccardo
Intermediate & Advanced SEO | advmedialab -
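One rough way to narrow down a problem like the Magento one above is to pull the sitemap, count the URLs it exposes, and test a sample against robots.txt; a sketch assuming Python with the `requests` package and that the sitemap lives at the default /sitemap.xml path (if it is a sitemap index, the loc entries will point to child sitemaps instead):

```python
import requests
import xml.etree.ElementTree as ET
from urllib import robotparser

SITE = "https://www.farmaermann.it"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Pull the sitemap and list every <loc> it contains.
sitemap = requests.get(f"{SITE}/sitemap.xml", timeout=10)
root = ET.fromstring(sitemap.content)
urls = [loc.text for loc in root.findall(".//sm:loc", NS)]
print(f"{len(urls)} URLs in the sitemap")

# Check a sample of those URLs against robots.txt.
rp = robotparser.RobotFileParser(f"{SITE}/robots.txt")
rp.read()
for url in urls[:20]:
    if not rp.can_fetch("Googlebot", url):
        print("Blocked by robots.txt:", url)
```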
Keyword Stuffing - Ecommerce websites
Hey Mozzers, I'm undertaking a content audit and it's going very well; we have written better content for the first set of pages. It still needs some improvement, but we have a good base and starting point from which we can make an SEO log and work on it over time. For the content I used the following formula for how many times to include a keyword: word count / length of keyword (e.g. 600 words / 3-word keyword = 200), then 1-4% of this (2-8 times). This has worked well for me in the past and has been a good base guide. I ran the pages through the Moz optimiser and every single page scored an A for keyword page optimisation. However, many of the pages failed on keyword stuffing, which obviously has high priority. My dilemma is that Moz counts 15 as the cutoff for keyword stuffing, and within the written text we have done really well using the keyword a set number of times, but these pages are product category pages. In the most extreme cases the keyword is listed 7-9 times in the side nav menu and another 7-9 times in the product category listings. Take for example *** - it is optimised for thermometers (I know it's a tough single-word keyword, and we have fairly modest aims with it; I'm using it here for example purposes). The word is used a reasonable number of times within the article but is sent through the roof by the links to the sub-categories. This page, for example, mentions the keyword 30 times. Can anybody suggest any ways to improve on this? Is how we display the categories in the nav bar and on the page excessive? As always, many thanks!
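For what it's worth, the formula above is easy to express in a few lines, which also makes it simple to compare the target range against what the full rendered page (nav and category links included) actually contains; a small Python sketch:

```python
import re

def target_range(word_count, keyword_words, low=0.01, high=0.04):
    """Word count / keyword length, then 1-4% of that, per the formula above."""
    slots = word_count / keyword_words
    return round(slots * low), round(slots * high)

def keyword_occurrences(text, keyword):
    """Count whole-phrase occurrences of the keyword, case-insensitive."""
    return len(re.findall(re.escape(keyword), text, flags=re.IGNORECASE))

# Example from the question: 600 words, 3-word keyword -> roughly 2-8 uses.
print(target_range(600, 3))  # (2, 8)

page_text = "Thermometers for every use ..."  # paste the full rendered page copy here
print(keyword_occurrences(page_text, "thermometers"))
```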
Intermediate & Advanced SEO | ATP0 -
XML Sitemaps - Multi-lingual website
Hi Mozzers, I am working with a large website that has some of its content translated across multiple languages. I am planning on using The Media Flow to create an hreflang sitemap for content in the various languages. Please see the attached image for the questions below. Thanks! Section highlighted yellow: when there is a URL that does not have a translated version, should it not be included in the same hreflang sitemap? Alternatively, could I just remove the languages that are not being targeted, so this would just reflect English-language targeting?
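Not a definitive answer, but one common way to handle this is to emit hreflang alternates only for the translations that actually exist, so a URL with no translated version simply carries no alternate entries; a small Python sketch using hypothetical URLs:

```python
from xml.sax.saxutils import escape

# Hypothetical mapping: each URL -> the language versions that actually exist.
pages = {
    "https://www.example.com/en/widgets": {
        "en": "https://www.example.com/en/widgets",
        "es": "https://www.example.com/es/widgets",
    },
    # This page has no translations, so it gets no hreflang entries at all.
    "https://www.example.com/en/only-in-english": {
        "en": "https://www.example.com/en/only-in-english",
    },
}

print('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"')
print('        xmlns:xhtml="http://www.w3.org/1999/xhtml">')
for url, translations in pages.items():
    print(f"  <url>\n    <loc>{escape(url)}</loc>")
    if len(translations) > 1:  # only emit hreflang when real alternates exist
        for lang, href in translations.items():
            print(f'    <xhtml:link rel="alternate" hreflang="{lang}" href="{escape(href)}"/>')
    print("  </url>")
print("</urlset>")
```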
Intermediate & Advanced SEO | J-Banz0 -
How to deal with duplicates on an e-commerce website
Hi guys, so we have an e-commerce website and we have some products that are exactly the same but come in different colours. Let's say, for example, we have a Samsonite Chronolite and this bag comes in 55cm, 65cm and 75cm variations. The same bag may also come in 4 different colours. The bags are the same and therefore have the same information, apart from the title tag varying with the size and colour; the descriptions are the same. How do I avoid Google thinking I am duplicating pages or have duplicated pages? Google treats this as duplication in the scenario I have described. Any suggestions? Best regards,
Intermediate & Advanced SEO | iBags2 -
Website.com/blog/post vs website.com/post
I have clients with WordPress sites and clients with just a WordPress blog on the back of a website. The clients with entire WordPress sites seem to be ranking better. Do you think the URL structure could have anything to do with it? Does having that extra /blog folder decrease SEO effectiveness? Setting up a few new blogs now...
Intermediate & Advanced SEO | PortlandGuy0 -
Ajax website and SEO
Hi all, a client of mine has a website similar to Pinterest, all in Ajax. So imagine an Ajax grid-based animal-lover site called domain.com. The domain has three different categories: Cats, Dogs, Mice. When you click on a category, the site doesn't handle the URL and doesn't change it. So instead of the URL going from domain.com to domain.com/cats, it runs the Ajax script and just shows all the cat pins, and when you click on each pin/post it opens a page such as domain.com/Pin/123/PostTitle. It doesn't reference the category. However, a page domain.com/cats does exist and you can go there directly. Is it an SEO issue that the pins aren't grouped under a category? How does Google handle Ajax these days? It used to be really bad, but if Pinterest is doing so well I'm assuming times have changed. Any other things to be wary of for a grid-based/Ajax site? I am happy to pay for an hour or two for a more in-depth audit/tips if you can give feedback on the above. Fairly urgent. Thanks
Intermediate & Advanced SEO | Profero1 -
How does a competing website with clearly black-hat SEO tactics have a far higher domain authority than our website, which only uses legitimate link-building tactics?
Using the SEOmoz link analysis tools, we looked at a competing website's external followed links and discovered a large number of links going to blog pages on domains with authorities in the 90s (the blog page authorities themselves were between 40 and 60). However, the single blog post written by this website was exactly the same in every instance and had been posted in August 2011. Some of these blog sites had 160 or so links pointing back to this competing website, whose domain authority is 49 while ours is 28; their MozTrust is 5.43 while ours is 5.18. Examples of the blogs that link to the competing website are: http://advocacy.mit.edu/coulter/blog/?p=13 and http://pest-control-termite-inspection.posterous.com/ However, many of these links are "nofollow" and yet still show up in Open Site Explorer as some of this competing website's top linking pages. Admittedly, they have 584 linking root domains while we have only 35, but if most of them are the kind of websites listed above, we don't understand how Google is rewarding them with a higher domain authority. Our website is www.anteater.com.au. Are these tactics now the only way to get ahead?
Intermediate & Advanced SEO | Peter.Huxley590