HTTPS certificate expired: https URLs still in Google's index
-
Hi Guys
This week the security certificate for our website expired, and we now have to wait until next Tuesday for it to be reinstated.
Our website is indexed with the https URLs, but we had to drop https from the site so that visitors aren't faced with the security warning most browsers show, asking whether you're sure you want to visit what they see as an untrusted site.
So now we are basically left with only the www (http) URLs.
My question: what should we do to prevent Google from penalizing us, since if Googlebot comes to crawl those https URLs, there will be nothing there?
I did resubmit the site to Google for crawling, but I guess it will take time before Google picks up that we now only want the www URLs in the index.
Can somebody please give me some advice on this?
Thanks
Dave
-
My guess would be that the person in charge of procuring and/or installing the cert took a day of vacation today, and Monday things are closed for the Labor Day holiday... just guessing : )
That or just working it into their schedule. Sometimes at larger companies the people managing the website for SEO/Content/etc. are not the same people managing things on the back-end.
-
Why is it going to take 4 days for them to fix your SSL? That's the question I would want answered in your position. SSL certificates are easy to replace so what's the holdup here?
-
Hi Dave,
If I were in your shoes, I'd set up a rule to 302 redirect all of your https pages to their http equivalent until Tuesday when you get your cert.
This way, if anyone clicks one of your https pages in Google's index, they will be brought to the appropriate page and not get a security warning.
The 302 will also tell Google, "Hey, this is just a temporary redirect... don't worry about indexing things differently." Because if you get Google to index all of the http versions and you don't 301 those, you will get a ton of 404 errors. Which is fine... but it gets messy fast.
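A minimal sketch of that rule, assuming an Apache server with mod_rewrite enabled (the thread doesn't say which server is in use, and www.example.com stands in for your actual domain):

```apache
# Temporarily (302) send every https request to its http equivalent.
# Remove this rule once the new certificate is installed on Tuesday.
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://www.example.com/$1 [R=302,L]
```

On nginx the equivalent would be a `return 302 http://www.example.com$request_uri;` inside the SSL server block.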
Hope that helps.
Mike
Related Questions
-
News articles on our website are being indexed, but not showing up for search queries.
News articles on distributed.com are being indexed by Google, but not showing up for any search queries. In Google Search, I can copy and paste the entire first paragraph of the article, and the listing still won't show up in search results. For example, https://distributed.com/news/dtcc-moves-closer-blockchain-powered-trades doesn't rank AT ALL for "DTCC Moves Closer to Blockchain-Powered Trades", the title of the article. We've tried the following so far:
- re-submitted the sitemap to Search Console
- checked manual actions in Search Console
- checked for any noindex/nofollow tags
Please help us solve this SEO mystery!
Intermediate & Advanced SEO | BTC_Inc
-
Any way to force a URL out of Google index?
As far as I know, there is no way to truly FORCE a URL to be removed from Google's index. We have a page that is being stubborn. Even after it was 301 redirected to an internal secure page months ago and a noindex tag was placed on it in the backend, it still remains in the Google index. I also submitted a request through the remove outdated content tool https://www.google.com/webmasters/tools/removals and it said the content has been removed. My understanding though is that this only updates the cache to be consistent with the current index. So if it's still in the index, this will not remove it. Just asking for confirmation - is there truly any way to force a URL out of the index? Or to even suggest more strongly that it be removed? It's the first listing in this search https://www.google.com/search?q=hcahranswers&rlz=1C1GGRV_enUS753US755&oq=hcahr&aqs=chrome.0.69i59j69i57j69i60j0l3.1700j0j8&sourceid=chrome&ie=UTF-8
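One detail worth noting here: a noindex directive only works if Googlebot can actually fetch the page and see its markup, and a URL that answers with a 301 never serves its HTML. For reference, the tag in question (shown as a sketch, not the poster's actual page) looks like:

```html
<!-- This must be returned with a 200 response for Googlebot to see it;
     a URL that 301-redirects away never serves this markup, so the
     noindex placed "in the backend" may never be seen by the crawler. -->
<meta name="robots" content="noindex">
```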
Intermediate & Advanced SEO | MJTrevens
-
Changing URLs: from a short, well-optimised URL to a longer one – what's the traffic risk?
I'm working with a client whose website is relatively well optimised, though it has a pretty flat structure and a lot of top-level pages. They've invested in their content over the years and managed to rank well for key search terms. They're currently in the process of changing CMS, and as a result of new folder structuring in the CMS, the URLs for some pages look to have significantly changed. E.g. the existing URL is website.com/grampians-luxury-accommodation, which ranked quite well for "luxury accommodation grampians". The new URL when the site is launched on the new CMS would be website.com/destinations/victoria/grampians. My feeling is that the client is going to lose a bit of traffic as a result of this. I'm looking for information, approaches, or case studies to demonstrate the degree of risk, and to help make a recommendation to mitigate it.
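The usual mitigation for a URL change like this is a permanent redirect from each old path to its new one. Assuming an Apache host (the question doesn't say what the site runs on), the mapping could be sketched as one rule per moved page:

```apache
# Preserve as much of the old URL's ranking signal as possible by
# 301-redirecting the flat URL to its new nested equivalent.
Redirect 301 /grampians-luxury-accommodation /destinations/victoria/grampians
```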
Intermediate & Advanced SEO | moge
-
Any issue redirecting 100s of domains into one website's internal pages?
Hi all, imagine if you will that I was the owner of many domains, say 100 demographically rich keyword domains, and my plan was to redirect these into one website, each into a different relevant subfolder. E.g.:
- www.dewsburytilers.com > www.brandname.com/dewsbury/tilers.html
- www.hammersmith-tilers.com > www.brandname.com/hammersmith/tilers.html
- www.tilers-horsforth.com > www.brandname.com/horsforth/tilers.html
...and another hundred or so 301 redirects. The backlinks to these domains are slim but relevant (the majority of the domains do not have any backlinks at all). Can anyone see a problem with this practice? If so, what would your recommendations be?
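A sketch of one such mapping, assuming Apache with name-based virtual hosts (the question doesn't say what server these domains would sit on):

```apache
# One keyword domain 301s wholesale to its matching subfolder page.
# RedirectMatch is used rather than Redirect so every path on the old
# domain lands on the single target page instead of appending its path.
<VirtualHost *:80>
    ServerName www.dewsburytilers.com
    RedirectMatch 301 ^/.* http://www.brandname.com/dewsbury/tilers.html
</VirtualHost>
```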
Intermediate & Advanced SEO | Fergclaw
-
Indexing a new website with several million pages
Hello everyone, I am currently working for a huge classified-ads website which will be launched in France in September 2013. The website will have up to 10 million pages. I know the indexing of a website of such size should be done step by step, and not all in one go, to avoid a long sandbox risk and to keep more control over it. Do you guys have any recommendations or good practices for such a task? Maybe some personal experience you might have had? The website will cover about 300 jobs:
- in all regions (= 300 × 22 pages)
- in all departments (= 300 × 101 pages)
- in all cities (= 300 × 37,000 pages)
Do you think it would be wiser to index a couple of jobs at a time (for instance 10 jobs every week), or to index by levels of pages (for example, first the jobs-by-region pages, then the jobs-by-department pages, etc.)? More generally speaking, how would you proceed in order to avoid penalties from Google and to index the whole site as fast as possible? One more detail: we'll rely on a (big?) press follow-up and on a link-building effort that still has to be determined. Thanks for your help! Best regards, Raphael
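For scale, the figures quoted in the question actually add up to a little over the 10 million pages mentioned:

```python
# Sanity check on the page counts described above, using the
# question's own figures: 300 jobs, each with one page per
# region (22), per department (101), and per city (37,000).
jobs = 300
pages_per_job = 22 + 101 + 37_000
total_pages = jobs * pages_per_job
print(total_pages)  # 11,136,900 pages
```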
Intermediate & Advanced SEO | Pureshore
-
Can anyone tell me if this website was built with Frontpage or another cookie cutter drag and drop website creator by looking at the source code?
Can anyone tell me if this website was built with Frontpage or another cookie cutter drag and drop website creator by looking at the source code? http://naturespremiumpestdefense.com/ Thanks, Russell
Intermediate & Advanced SEO | ULTRASEM
-
Construction website
Hi, I have a construction website that is aimed at tradesmen. There are two goals for the site:
1. To allow potential customers to sign up for a trade account.
2. To allow existing customers to access products and log in to their account to make an order.
The site is full of categories and products which should be indexed so we rank for these trade products. The homepage redesign is where I am having an issue. Currently the site is set up like a standard retail site but without prices, which are viewable only when logged in. The homepage is designed such that there are several calls to action about promotions, services, and applying for a trade account, aimed at both existing and potential customers. At the moment conversion is poor when it comes to getting potential customers to apply for a trade account. This is because there is too much distraction from this goal, and they are allowed to engage with other areas of the site freely. The main purpose of the homepage should be to encourage potential customers to sign up; the secondary purpose is for existing customers to access their accounts and products. I believe potential customers should not be exposed to the categories and products, as it is a distraction from the primary goal. Potential customers, i.e. tradesmen, would already have a certain understanding of the types of products we provide, so I don't feel it is necessary to let them browse the rest of the site unless they have an account. What are your thoughts on that?
Here is my lack of understanding: on the homepage, if I restrict access to categories and products to existing account holders only, where a login is required to proceed, would that mean Google cannot access these pages to index them? Or is this only controlled by nofollow tags and robots.txt? Obviously not indexing is undesirable.
I do understand potential customers will need some information about our range of products, but the idea is to encourage them to sign up for an account to see this information. The more information that is provided to a potential customer, the higher the probability that person decides against applying for an account. Restricting access creates a motivator to reveal information, and we capture their data so we can converse with them personally. This increases the probability of retaining their interest by providing a customised service based on their needs. All of this makes perfect sense to me; the only query/obstacle I have is the indexing of the site. If Google cannot index pages that are restricted by account access, then I would like suggestions to solve/compromise/optimise the above.
Just to clarify the desired behaviour of indexed pages: if one of our product pages appears in search, the person clicking the link would either be redirected or shown a login or sign-up screen.
Thank you so much for your help. Antonio
Intermediate & Advanced SEO | AVSFencingSupplies
-
Member request pages: indexed or noindexed?
We run a service website; basically, users of the site post requests to get certain items fixed/serviced. Through Google Analytics we have found that we get lots of traffic to these request pages from people searching for those particular items. E.g. a member's request page titled "Cost to fix large Victorian oven" has got many visits from searchers searching for "large Victorian oven". The traffic to these pages is about 40% of our Google organic traffic, but it didn't convert well into more users/requests and has roughly a 67% bounce rate. So my question is: should we keep these pages indexed, and if yes, what can we do to improve the conversion rate and reduce the bounce rate? Many thanks guys. David
Intermediate & Advanced SEO | sssrpm