Website Displayed by Google as HTTPS When All Secure Content Is Blocked - Causing Indexing Problems
-
Basically, I have no inbound links going to https://www.mysite.com, but Google is indexing the homepage only as https://www.mysite.com
In June, I was re-included in the Google index after receiving a penalty... Most of my site's pages recovered fairly well, but my homepage did not recover for its top keywords.
Today I noticed that when I search for my site, it's displayed as https://
Robots.txt blocks all content on any secure page, which leaves me somewhat clueless about what I need to do to fix this. Not only does it pose a problem for some users who click through, but I think it's causing the homepage's indexing problem.
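A setup like the one described usually serves a separate robots.txt on the secure host, along these lines (a hypothetical sketch, not the poster's actual file):

```
# robots.txt served only on the https:// host
# (the http:// host serves its own, normal robots.txt)
User-agent: *
Disallow: /
```

Worth noting: a URL blocked from crawling this way can still be indexed from external links alone, and Googlebot cannot see a canonical tag or a redirect on a page it isn't allowed to fetch, which may be why the https homepage lingers in the index.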
Any ideas? Redirect Googlebot only? Will a canonical tag fix this?
Thx
-
Yeah, I have all of that in place. I found one external link from an https page, and one on my blog that was just an error one of my employees made. Two links total, at least that's what I found. Robots.txt is blocking everything you mentioned, and my header uses absolute paths.
I do agree with you on one thing: once kicked, the little things that may not have mattered over the past 15 years all of a sudden pop up as problems... At the same time, I have heard the complete opposite: people are kicked and then right back where they used to be a few weeks after being re-included.
Competitive sabotage is definitely happening, unless some random person who happens to live in the same city as my competitor just went AWOL and decided to spam my off-site forums, attempt to hack the website multiple times, and add me to a spam link ring.
Anyway, a webmaster says he has changed the canonical on their end to http, although it hasn't changed yet. I'm sure this could take a few days or longer to take effect. Hopefully that is the fix; we'll see, though. Thanks for the advice!
-
Someone could probably have an answer for you within minutes if they had the domain URL available.
RE: Competitive sabotage, I very highly doubt it.
RE: Having just occurred - That is often a sticking point for no good reason. Don't be so concerned with why it wasn't an issue before; focus on how to fix it now. Google's algorithm changes all the time, and your standing in it changes all the time. Trust can be lost if you get a penalty, even after you get out of it. One external link too many going to https, or one change in the crawl path so Googlebot ends up on the https site via a relative-path link... things can suddenly change for a variety of reasons. However, if you do what is being suggested, you are very likely to put this issue behind you.
Here is what I do with eCommerce sites, typically:
- Rel canonical both versions to the http version
- Add a robots.txt block and a robots meta noindex tag to shopping cart pages
- Use absolute paths, if possible (e.g. http://www.domain.com/file/ instead of .../file/), especially in your primary navigation and footer links.
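The first two items above boil down to a couple of head tags (a sketch using the placeholder domain from the example, not any real site):

```html
<!-- On BOTH the http and https versions of a normal page,
     pointing at the http version: -->
<link rel="canonical" href="http://www.domain.com/file/" />

<!-- On shopping cart pages only, alongside the robots.txt block: -->
<meta name="robots" content="noindex" />
```

The point of canonicalizing both versions to http is that whichever protocol Googlebot happens to arrive on, it is told the same preferred URL.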
If that doesn't work please let us know and we can evaluate the site for you.
Good luck!
-
Hmm, see, no major changes have been made to the cart. The website has ranked for 15 years, so the https thing just popped up after the penalty/re-inclusion.
I'm wondering, since the canonical tag was added fairly recently: do you think I should just fetch the homepage and submit it again? Or even add a new page and fetch/crawl/submit that, just to get a fresh crawl? Crawl stats show about 2,250 pages crawled per day on average, so I was expecting this https thing to be gone by now... regardless of why they chose to index it over my normal URL.
Thanks for the input.
-
How about changing all of your links from relative to absolute in the HTML? If bots are truly only reaching https pages through internal navigation after visiting the shopping cart, this would solve that, yes? Just a thought.
-
If that is the case, then your shopping cart is not behaving right: https will exist for every page on your site, and it shouldn't. What cart are you using? I would redirect everything outside of the payment, cart, and contact pages to non-secure. There is also a disconnect between what robots directives actually do and what people think they do. They are a suggestion: noindex means don't add the page to the index, but it does not mean don't visit the page. I see spiders on pages that are blocked from them all the time.
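In Apache terms, the redirect described above might be sketched like this, assuming hypothetical /cart, /checkout, and /contact paths (the real cart software's URLs would differ):

```apache
# .htaccess sketch (Apache mod_rewrite): force http:// everywhere
# except the cart, checkout, and contact paths.
# Path names are placeholders; adjust to the cart's actual URLs.
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteCond %{REQUEST_URI} !^/(cart|checkout|contact) [NC]
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```

A 301 here, rather than a 302, signals that the http URL is the permanent one, which is what you want Google to consolidate on.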
-
My only concern with doing a redirect is this: the shopping cart is https, so if you start the checkout process you will enter https. If a person decides to continue shopping, they will stay on https; but since the checkout page is restricted to bots, essentially https shouldn't exist for Google and shouldn't show up in any searches.
The sitemap is clean, and a canonical is in place...
I have been having some issues with a competitor. Is it possible they submitted the https://www.mysite.com/ version of my website, knowing that Google would prefer this version?
Thanks for the advice.
-
I would redirect the https version to http. Then I would make sure that there is a canonical tag in place. Next, I would go over my sitemap and make sure that there isn't a link to the https page in there. After that you should be set. I wouldn't put it in the robots.txt, though.