General SSL Questions After Move
-
Hello,
We have moved our site to https, and Google Analytics seems to be tracking correctly. However, I have seen some conflicting information: should I create a new view in Analytics?
Additionally, should I also create a new https property in Google Search Console and set it as the preferred domain? If so, should I keep the old sitemap for my http property while updating the sitemap to https only for the https property?
Thirdly, should I create a new property as well as new sitemaps in Bing Webmaster Tools?
Finally, after doing a crawl on our http domain, which has a 301 to https, the crawl stopped after the redirect. Is this a result of using a free crawling tool, or will bots not be able to crawl my site after this redirect?
Thanks for all the help in advance, I know there are a lot of questions here.
-
No.
Just keep an eye on it. If you continue to see impressions and clicks, scan your site again just to make sure all content has been converted and there aren't any lingering files hardcoded with http. Your server-side redirect will take care of it. The only possible downside is the unnecessary redirect, which slows rendering a bit. Not a showstopper.
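For the scan, even something as simple as flagging hardcoded http:// references in each page's HTML will surface lingering files. A minimal sketch (Python, standard library only; the sample markup and example.com URL are hypothetical):

```python
import re

def find_insecure_refs(html):
    """Return plain-http URLs referenced in href/src attributes.

    Note: https:// URLs don't match, since the pattern requires the
    literal scheme 'http://'.
    """
    return re.findall(r'(?:href|src)=["\'](http://[^"\']+)', html, re.IGNORECASE)

# Hypothetical page markup: one insecure image, one secure link.
sample = '<img src="http://example.com/logo.png"><a href="https://example.com/">home</a>'
print(find_insecure_refs(sample))  # ['http://example.com/logo.png']
```

Run it over each page your crawler fetches; any non-empty result is a file that still needs converting.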
-
Thanks for the great responses, Donna and Trenton. I just had one follow-up question: I am still seeing small amounts of Search Console data on my http property. While I have read that this is not uncommon, it is mildly concerning since I force https server-side. Should I be alarmed by this?
-
Awesome, thank you for answering everything!
-
Hi Tom3_15,
Should I create a new view in Analytics? No. There's no need, and you want to be able to easily compare before-and-after data.
Should I also create a new https property in Google search console and set it as the preferred domain? Yes
If so, should I keep the old sitemap for my http property while updating the sitemap to https only for the https property? Keep the old sitemap for your http property. Add the new sitemap (with https URLs) to the new (https) property.
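Generating that new sitemap can be as simple as rewriting the scheme on every <loc> entry of the existing one. A minimal sketch (Python, standard library only; the example.com URL is hypothetical), assuming a standard XML sitemap:

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace from the sitemaps.org protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def https_sitemap(xml_text):
    """Rewrite every <loc> in a sitemap from http:// to https://."""
    ET.register_namespace("", NS)  # keep the default namespace on output
    root = ET.fromstring(xml_text)
    for loc in root.iter("{%s}loc" % NS):
        if loc.text and loc.text.startswith("http://"):
            loc.text = "https://" + loc.text[len("http://"):]
    return ET.tostring(root, encoding="unicode")

old = ('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
       '<url><loc>http://example.com/page</loc></url></urlset>')
print(https_sitemap(old))  # every <loc> now starts with https://
```

Upload the rewritten file to the https property and leave the original on the http property untouched.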
Thirdly, should I create a new property as well as new sitemaps in Bing Webmaster Tools? Yes
Finally, after doing a crawl on our http domain, which has a 301 to https, the crawl stopped after the redirect. Is this a result of using a free crawling tool, or will bots not be able to crawl my site after this redirect? I don't know why your crawl stopped after the redirect. I use Screaming Frog and it continues to crawl after the 301, so it might, as you suggest, be a limitation of the particular tool you're using. Bots should be able to continue to crawl your site after the redirect. Google certainly can.
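For what it's worth, all a crawler has to do on a 301 is follow the Location header to the new URL and keep going. A minimal sketch of that logic (Python; the `fetch` callback and example.com responses are simulated stand-ins, not a real HTTP client):

```python
def follow_redirects(url, fetch, max_hops=10):
    """Follow 3xx responses the way a well-behaved crawler does.

    `fetch` is any callable returning (status_code, location_or_None).
    Returns (final_url, chain_of_urls_visited).
    """
    chain = [url]
    for _ in range(max_hops):
        status, location = fetch(url)
        if status not in (301, 302, 307, 308) or not location:
            return url, chain
        url = location
        chain.append(url)
    raise RuntimeError("redirect loop or too many hops")

# Simulated responses: the http URL 301s to https, which serves the page.
responses = {
    "http://example.com/": (301, "https://example.com/"),
    "https://example.com/": (200, None),
}
final, chain = follow_redirects("http://example.com/", responses.get)
print(final)  # https://example.com/
```

If a tool stops at the 301 instead of doing this, that's the tool's limitation, not a property of your site.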