Traffic has not recovered from the https switch a year ago.
-
I have an ecommerce site that was switched to https a year ago, almost to the day. Traffic to our category pages is about half of what it was. The redirects were put in place properly, and everything in Webmaster Tools looks good. Is there anything I may not have thought of?
I want to add that the drop is only in Google; Bing stayed just fine.
-
I have read in so many places that the switch caused a dip for others as well. I had a really bad experience with a site move once, so I had a checklist of everything and double- and triple-checked it, but it has just been a slow decline.
-
We experienced the same thing, and I am fairly certain that we did EVERYTHING right. I just think the algorithms are messed up a little. I even did a competitor analysis and found that all the websites that made the https move have experienced a major dip. I cannot tie it to the move date, but it is clearly visible in SEMrush.
I have a feeling that Google endorsed this https move because they need the referrer data to make their analytics product work better over time, but while this web-wide move is happening they accept some collateral damage. I even hired consultants, and there is no proof anywhere that https is the "positive ranking signal" Matt Cutts vaguely indicated. Then again, he only said it is a ranking signal, and by that wording it might as well be a negative one. That's my hunch so far.
-
Hi Cyrus,
1. I believe that pagination is implemented correctly. Is there anything specific you think I should check?
2. Canonicals are in place.
3. The category pages do not have their own introductory text.
4. We have the title tags and descriptions set.
I also wanted to add that we have the correct schema on the pages.
-
We've actually seen Google get harsh on category-type pages across a wide number of industries and sites. It's even happened here at Moz. If your HTTPS is implemented correctly (and it sounds like you are reasonably certain it is), you might want to look at other areas.
I'd look at your category pages and make sure:
- Pagination is implemented correctly
- Canonicals are in place, where appropriate
- If possible, each category should have its own introductory text, e.g. https://moz.com/ugc/category/link-building
- Basically, do everything you can to treat your category pages like actual landing pages worthy of search traffic, including unique content, value, title tags, descriptions, etc.
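One way to spot-check a few of those items across many category pages is a small script. Here is a rough sketch using only Python's standard library; the markup and URL below are invented for illustration, not the poster's actual pages:

```python
from html.parser import HTMLParser

class CategoryPageAudit(HTMLParser):
    """Collects the canonical URL, title, and meta description from one page."""
    def __init__(self):
        super().__init__()
        self.canonical = None
        self.title_parts = []
        self.description = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content")
        elif tag == "title":
            self._in_title = True

    def handle_data(self, data):
        if self._in_title:
            self.title_parts.append(data)

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

def audit(html):
    """Return a list of basic on-page problems found in the given HTML."""
    p = CategoryPageAudit()
    p.feed(html)
    issues = []
    if not p.canonical:
        issues.append("missing canonical")
    elif not p.canonical.startswith("https://"):
        issues.append("canonical not https")
    if not "".join(p.title_parts).strip():
        issues.append("missing title")
    if not p.description:
        issues.append("missing meta description")
    return issues

# Hypothetical category page markup for illustration:
page = ('<html><head><title>Oak Desks</title>'
        '<link rel="canonical" href="http://example.com/desks"></head></html>')
print(audit(page))  # ['canonical not https', 'missing meta description']
```

In practice you would feed this the fetched HTML of each category URL; a canonical still pointing at the http version after a migration is exactly the kind of quiet error that survives a "correct" move.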
-
I don't see where he asked about the site structure, but no, it didn't change.
Reporting has not changed, no new filters, we block our company's visits, tracking code is consistent.
-
You didn't answer Dirk's question (above). Has the site structure changed at all?
Has your reporting changed? Have you added any new filters? Did you forget to block your own company's visits from being tracked? Is the tracking code consistent on all pages? (Although it's probably not a reporting problem if, as you say, rankings and sales have also dropped.)
It's good you're doing the audit. There doesn't appear to be an obvious problem.
-
All pages have dipped a little, but the category pages seem to have lost the bulk of it. We have had rankings and sales drops. Canonicals are implemented correctly, and sitemaps have been updated properly.
-
1. The traffic decline wasn't sudden or initially very steep. If you look at our traffic it looks like a pyramid, with the peak being when we switched to https. It has just been a slow, gradual decline ever since.
2. The migration was Sept 11 last year, I don't think there was anything that week.
3. User behavior has stayed constant.
4. No spike in errors; the migration went very smoothly.
-
Is it just the category pages that have lost traffic? Have rankings and sales also changed significantly? Are canonicals pointing to https? Have sitemaps been updated?
-
Did the traffic drop occur right after the migration to https or a few months/weeks later?
Was the migration close to the date of an algorithm change?
Did you see any change in behaviour of your users after migration (time on page, bounce rate, avg. pages/session,...)?
Was there a spike of errors in WMT after migration or did everything go quite smoothly?
Was it just a migration to https - or did other elements change on the website?
To be very honest, trying to figure out one year after a migration what went wrong is an almost impossible task, especially because you no longer have access to the WMT data from the time of the migration.
The best you can do is dive deep into your analytics figures (search traffic), compare data from before and after the migration, and try to understand what might have had an impact.
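That before/after comparison can be sketched in a few lines, assuming you export daily organic sessions as (date, sessions) pairs from your analytics tool; the dates and numbers below are made up for illustration (only the September 11 migration date comes from the thread):

```python
from datetime import date

def average_sessions(rows, start, end):
    """Mean daily sessions for rows whose date falls in [start, end)."""
    vals = [sessions for day, sessions in rows if start <= day < end]
    return sum(vals) / len(vals)

# Hypothetical export of daily organic sessions around the migration date.
migration = date(2014, 9, 11)
rows = [
    (date(2014, 9, 8), 1200), (date(2014, 9, 9), 1180),
    (date(2014, 9, 10), 1210),
    (date(2014, 9, 12), 1150), (date(2014, 9, 13), 1090),
    (date(2014, 9, 14), 1020),
]

before = average_sessions(rows, date(2014, 9, 1), migration)
after = average_sessions(rows, migration, date(2014, 10, 1))
change = (after - before) / before * 100
print(f"{change:+.1f}% vs. pre-migration baseline")
```

Running the same comparison per landing page (rather than sitewide) is what would surface a pattern like "only the category pages declined."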
rgds,
Dirk
-
I am in the middle of doing an audit to see if I may have missed something. We are fully mobile-optimized. Maybe it was a penalty, but there has never been a single black-hat trick used on the site. Panda hit that month, but we have just been on a slow decline for the last year, to the point that traffic is now down 50%. As an ecommerce site, I can't think of a scenario where Panda would hit us unless we were doing something we shouldn't have.
-
HTTPS did cause the site speed to slow down a little bit. We knew that was coming, so right after launch we did some optimizations; the site is now faster with https than it was before with http.
-
It's impossible to say without seeing the site and, likely, without seeing analytics. What I can tell you is that the issue may not have anything to do with HTTPS. There have been updates to Google's algorithms, among many other changes. Mobile optimization has become a huge factor, for instance. I would run an audit of your site for both technical and SEO issues to see if fixing those might help.
-
Moving to https can have an impact on your site's performance, which may counter the potential benefits of migrating. If you compare page load times in Analytics before and after the migration, did they go up, go down, or remain stable?
Dirk
Related Questions
-
Scary bug in search console: All our pages reported as being blocked by robots.txt after https migration
We just migrated to https and two days ago created a new property in Search Console for the https domain. The Webmaster Tools account for the https domain now shows, for every page in our sitemap, the warning: "Sitemap contains urls which are blocked by robots.txt." The Search Console dashboard also shows a red triangle warning that our root domain is blocked by robots.txt.
1) When I test the URLs in the Search Console robots.txt test tool, all looks fine.
2) When I fetch as Google and render the page, it renders and indexes without problem (it would not if it were really blocked by robots.txt).
3) We temporarily emptied the robots.txt completely, submitted it in Search Console, and uploaded the sitemap again: same warnings, even though no robots.txt was online.
4) We ran a Screaming Frog crawl on the whole website and it indicates that no page is blocked by robots.txt.
5) We carefully revised the whole robots.txt and it does not contain any row that blocks relevant content on our site or our root domain (the same robots.txt was online for the last decade in the http version without problems).
6) In Bing Webmaster Tools I could upload the sitemap, and so far no error is reported.
7) We resubmitted the sitemaps: same issue.
8) I already see our root domain with https in the Google SERPs.
The site is https://www.languagecourse.net. Since the site has significant traffic, if Google really interprets our site as blocked by robots.txt for any reason, we will be in serious trouble.
Intermediate & Advanced SEO | lcourse
This is really scary. Even if it is just a bug in Search Console and does not affect crawling of the site, it would be great if someone from Google could look into the reason for it, since for a site owner this can really raise cortisol to unhealthy levels. Has anybody ever experienced the same problem? Does anybody have an idea where we could report this issue?
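For anyone wanting to sanity-check what a crawler should see, Python's standard library ships a robots.txt parser that mirrors this kind of test. The rules and URLs below are invented for illustration, not this site's actual robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents; parse() takes an iterable of lines.
robots_txt = """\
User-agent: *
Disallow: /checkout/
Disallow: /cart/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

for url in ("https://example.com/", "https://example.com/checkout/step1"):
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "->", "allowed" if allowed else "blocked")
```

If this parser, Screaming Frog, and the Search Console tester all agree that nothing is blocked, that supports the "reporting bug, not crawling problem" reading of the warnings.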
How do I get the sub-domain traffic to count as sub-directory traffic without moving off of WordPress?
I want as much traffic as possible to my main site, but right now my blog lives on a blog.brand.com URL rather than brand.com/blog. What are some good solutions for getting that traffic to count as traffic to my main site if my blog is hosted on WordPress? Can I just create a sub-directory page and add a rel canonical to the blog post?
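Whether the eventual answer is rel=canonical tags or 301 redirects, either approach needs a consistent mapping from subdomain URLs to subfolder URLs. Here is a minimal sketch of that mapping, using the poster's placeholder names (blog.brand.com, brand.com):

```python
from urllib.parse import urlsplit, urlunsplit

def subdomain_to_subfolder(url, subdomain="blog", root="brand.com"):
    """Map blog.brand.com/post -> brand.com/blog/post; other URLs pass through."""
    parts = urlsplit(url)
    if parts.netloc == f"{subdomain}.{root}":
        return urlunsplit((parts.scheme, root, f"/{subdomain}{parts.path}",
                           parts.query, parts.fragment))
    return url

print(subdomain_to_subfolder("https://blog.brand.com/my-post"))
# https://brand.com/blog/my-post
```

A function like this would generate the target URL for each post's canonical tag or redirect rule, keeping the subdomain-to-subfolder mapping uniform across the whole blog.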
Intermediate & Advanced SEO | johnnybgunn
Visibility for https://goo.gl/gJH7eh
Hi Mozzers, I am wondering if anyone can help me with the following. At the start of May this year we really lost visibility for the homepage of this site https://goo.gl/gJH7eh. This was particularly noticeable by tracking rankings for the term 'oak furniture'. We previously ranked on page 1 for the term 'oak furniture', but since May the homepage has struggled to make the top 100 positions for this term. We're confident that we have done everything within Google's guidelines, but it seems something is really holding the homepage back. The site ranks on page 1 for 'oak furniture' on Bing. The site had previously had a manual penalty for unnatural links (warning received several years ago). These links had a particular emphasis on using the anchor text 'oak furniture'. When we took over the site we did an extensive link clean up and disavow and managed to get the penalty removed at the end of October 2013. Any help would be greatly appreciated. Karen
Intermediate & Advanced SEO | OFS
SSL Certificate valid for SEO https
Hi everybody! I have talked with my hosting provider and he offers me two kinds of SSL. I've read that the best option for SEO is to serve the whole site over https (not only the payment pages), but my developer team is telling me that applying this kind of security to the whole site will be negative for all the websites contained under this IP! So I wonder if somebody who has https implemented correctly and working properly for SEO could advise me: which kind of certificate is the correct one, and what specific things should I consider with my hosting provider, if it's true that implementing https on the whole website could be a disaster because I'd be blocking the robots and endangering my domains on this server? Please, any help would be really appreciated. Thanks in advance!
Intermediate & Advanced SEO | Estherpuntu
Site experiencing drop in Google rankings and organic traffic after redesign.
Hello, The company that I work for recently implemented a complete redesign of our company website. The former site was old, cumbersome, and in desperate need of an update. We streamlined the site structure and made sure to redirect as many pages as we could find to new, thematically related pages with 301 redirects. After the launch of our new site we saw a large upswing in "soft" 404 errors, despite the fact that most of these pages do redirect upon inspection. In relation to the soft 404s, for example, is it merely a matter of labeling them as fixed if they redirect properly, or could there be an underlying issue with the site itself? Also, a majority of the urls labeled "not found" in Webmaster Tools are properly redirected. Do these merely need to be marked as fixed, or is there something else that needs fixing, like the sitemap structure? I appreciate any and all input. Beyond Indigo
Intermediate & Advanced SEO | BeyondIndigo
301 Redirect? How to leverage the traffic on our old domain.
I've seen multiple questions about this, but there are a few different answers on ways to approach it, so I figured I'd ask for our situation specifically. Any advice would be appreciated. We formed a new company with a new name and domain while at the same time buying an existing company in our industry. The domain and site of the company we acquired is ranking for some valuable keywords and still getting a significant amount of traffic (about half of what our new site is getting). A big downside has been that when they moved that site to a different server, something happened that made the site uneditable, so it's full of bad pricing and information. Because of that, we've had a maintenance page up for a little while: the site was generating calls to our sales team (GOOD), but the customer was seeing incredibly incorrect information (BAD). Rather than correcting those issues or figuring out why the site is uneditable, we just want to find a way to leverage that traffic and have those visitors end up at our new site. Would we 301 redirect the entire domain to our new one? If we did that, would the old domain still keep the majority of its PageRank?
Intermediate & Advanced SEO | HuskyCargo
[Need advice!] A particular question about a subdomain to subfolder switch
Hello Moz Community! I was really hoping to get your help on an issue that has been bothering me for a while now. I know there is a lot written about this topic, but I couldn't find a good answer for my particular question. We are running several web applications that are similar but also different from each other. Right now, each one has its own subdomain (mainly for technical reasons), like this: webapp1.rootdomain.com, webapp2.rootdomain.com, etc. Our root domain currently points to webapp1.rootdomain.com with a 301. Now we are thinking about making two changes: changing to a subfolder structure, like this: rootdomain.com/webapp1, rootdomain.com/webapp2, etc.; and changing our root domain to a landing page (listing all the apps) and taking out the 301 to webapp1. We want to make these changes mainly for SEO reasons. I know the advantages are not so clear between subdomain and subfolder, but we think it could be the right way to push the root domain and profit more from juice passing to the different apps. The problem is that we had a bad experience when we first switched our first web app from the root domain (rootdomain.com) to a subdomain (webapp1.rootdomain.com) to make it equal with the other apps. Our traffic dropped a lot, and it took us six weeks to get back to the same level as before. Maybe it was the 301 not passing all the juice, or maybe it was the switch to the subdomain; we are not sure. So I guess my question is: do you think moving web apps to subfolders is the right thing to do to pass more juice from the root to the subfolders? Will it again bring huge drops in traffic once we make that change? Is it worth taking that risk, or the initial drop, because it will pay off in the future? Thanks a lot in advance! Your answers would help me a lot.
Intermediate & Advanced SEO | ummaterial
Recovering from robots.txt error
Hello, A client of mine is going through a bit of a crisis. A developer (at their end) added Disallow: / to the robots.txt file. Luckily the SEOmoz crawl ran a couple of days after this happened and alerted me to the error. The robots.txt file was quickly updated, but the client has found that the vast majority of their rankings have gone. It took a further 5 days for GWMT to register that the robots.txt file had been updated, and since then we have "Fetched as Google" and "Submitted URL and linked pages" in GWMT. GWMT is still showing that the vast majority of pages are blocked in the "Blocked URLs" section, although the robots.txt file shown below it is now fine. I guess what I want to ask is: What else can we do to recover these rankings quickly? What timescales can we expect for recovery? More importantly, has anyone had any experience with this sort of situation, and is full recovery normal? Thanks in advance!
Intermediate & Advanced SEO | RikkiD22