How to Evaluate Original Domain Authority vs. Recent 'HTTPS' Duplicate for Potential Domain Migration?
-
Hello Everyone,
Our site has used ‘http’ for the domain since the start. Everything has been set up for this structure and Google is only indexing these pages. Just recently a second version was created on ‘https’. We know having both live is the worst-case scenario, but now that both are up, is it worth just switching over, or would the original domain authority warrant keeping it on ‘http’ and redirecting the ‘https’ version?
Assuming speed and other elements wouldn’t be an issue and it's done correctly.
Our thought was that if we could do this quickly it would be easier to just redirect the ‘https’ version, but we weren't sure whether the pros of ‘https’ would be worth the resources.
Any help or insight would be appreciated.
Please let us know if there are any further details we could provide that might help.
Looking forward to hearing from all of you!
Thank you in advance for the help.
Best,
-
Yes, I would recommend moving to https over the current http. It is very unlikely that you will have an issue with Google, and end users will feel much more secure.
If you are doing this on a bespoke site (not a very common CMS, or built by hand): tools like Let's Encrypt (letsencrypt.org) give you a free certificate, and so does CloudFlare.com. So if you want HTTP/2, HSTS, forced https, and a certificate, the free CloudFlare tier will do everything you need for https. Costs for a reverse proxy CDN or WAF range from free to around $5,000, for example:
- stackpath.com ($20)
- incapsula.com ($59)
- sucuri.net WAF ($19)
- Armor.com ($600-7,000), WAF only (under $200)
Speedyrails.com has always had deals on CloudFlare (35-40% off) if you're going to use it.
If you need an EV cert, or just want a regular certificate, I strongly recommend looking at third-party resellers; you can save a lot of money. I like DigiCert and GlobalSign personally; most of the others are on the Symantec (formerly VeriSign) roots. You can definitely get away with getting one for $5-100 on Namecheap, but remember to search for the best price on Google.
These services tend to make the job a lot easier, and my personal opinion is that every site owner should have a WAF in front of the sites they care about.
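For context on where a free certificate actually plugs in on a hand-built Apache setup: the https side is just a vhost pointing at the certificate files. The hostname and paths below are placeholders (the paths are the ones certbot/Let's Encrypt typically writes to), so adjust them to your own server.
<VirtualHost *:443>
    ServerName www.example.com
    SSLEngine on
    # Typical Let's Encrypt/certbot locations -- replace with your own cert paths
    SSLCertificateFile /etc/letsencrypt/live/example.com/fullchain.pem
    SSLCertificateKeyFile /etc/letsencrypt/live/example.com/privkey.pem
</VirtualHost>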
-
I'm happy to be of help!
-
You're welcome
-
Thank you both for the help, this is great. We were asking more in relation to choosing which version to make the priority. It seems like you would recommend switching over to ‘https’ even though all of the pages currently indexed are ‘http’? Just wanted to make sure we understood you correctly.
We will make sure to go through all of the documentation you sent over before moving as well.
Please let us know if there is anything we could provide further details for that might help.
Looking forward to hearing from you!
Thank you again for all the help.
Best,
-
This will help too
http://www.aleydasolis.com/en/search-engine-optimization/http-https-migration-checklist-google-docs/
& use http://www.aleydasolis.com/htaccess-redirects-generator/https-vs-http/
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{SERVER_PORT} !^443$
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
</IfModule>
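One note on that snippet: the %{SERVER_PORT} check only works when Apache itself terminates TLS on port 443. If TLS is terminated in front of Apache (for example by CloudFlare or another CDN/WAF mentioned above), a sketch along these lines, keying off %{HTTPS} and the X-Forwarded-Proto header, is usually more reliable; the header name is an assumption about your particular proxy, so test it on a staging copy first.
<IfModule mod_rewrite.c>
RewriteEngine On
# Redirect only when the request did not arrive over TLS at Apache itself...
RewriteCond %{HTTPS} off
# ...and no proxy/CDN in front of us says the visitor already used https
RewriteCond %{HTTP:X-Forwarded-Proto} !https
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
</IfModule>
Both conditions must match before the redirect fires, which avoids a redirect loop when the CDN talks plain http to the origin but https to the visitor.
-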
As Romans stated, you will need to go into Search Console and add all four properties (http and https, each with and without www). Then pick which one you want to be your canonical, or chosen, URL.
- On the Search Console home page, click the site you want.
- Click the gear icon, and then click Site Settings.
- In the Preferred domain section, select the option you want: HTTP:// or HTTPS://.
**The to-do list: https://support.google.com/webmasters/answer/6332964**
**Make certain that you force https:// on your hosting environment or WAF/CDN.**
**Check it using a redirect mapper:**
- https://varvy.com/tools/redirects/
- If you get lost and need to fix something
- https://online.marketing/guide/https/
- https://www.deepcrawl.com/blog/news/2017-seo-tips-move-to-https/
Add HSTS once everything is definitely working.
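For reference, HSTS is just a response header, so a minimal Apache sketch (assuming mod_headers is enabled) looks like the one below. The max-age is deliberately short; raise it, and only consider includeSubDomains or preload, once every page and subdomain is confirmed to work over https.
<IfModule mod_headers.c>
# HSTS: tell browsers to use https for this host. Browsers ignore this header
# when received over plain http, but ideally send it from the https vhost only.
Header always set Strict-Transport-Security "max-age=86400"
</IfModule>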
Make sure everything is working correctly before Google recrawls the site.
all the best,
Tom
-
-
It is not a big deal. To avoid duplicate content, just put the appropriate redirects in place, register both properties in Search Console (the 'http' and the 'https' version), and then set the right version, in this case https.
Go to Search Console > Add a property > https version > Verify
Then go to Site Settings and set the new version of your website. With those simple steps you will have everything up and running.