Website Displayed by Google as HTTPS When All Secure Content Is Blocked - Causing Indexing Problems
-
Basically, I have no inbound links going to https://www.mysite.com , but Google is indexing the homepage only as https://www.mysite.com
In June, I was re-included in Google's index after receiving a penalty... Most of my site's links recovered fairly well. However, my homepage did not recover for its top keywords.
Today I noticed that when I search for my site, it's displayed as https://
My robots.txt blocks crawling of every secure page, which leaves me somewhat clueless about what I need to do to fix this. Not only does it pose a problem for some users who click, but I think it's causing the homepage's indexing problem.
Any ideas? Should I redirect Googlebot only? Will a canonical tag fix this?
Thanks
-
Yeah, I have all of that in place. I found one external link from an https page, and one on my blog that was just an error one of my employees made. Two links total, at least that's all I found. Robots.txt is blocking everything you mentioned, and my header uses absolute paths.
I do agree with you on one thing: once you're kicked, the little things that may not have mattered over the past 15 years all of a sudden pop up as problems... At the same time, I have heard the complete opposite: people get kicked and then they are right back where they used to be a few weeks after being re-included.
Competitive sabotage is definitely happening, unless a random person who happens to live in the same city as my competitor just went AWOL and decided to spam my offsite forums, attempt to hack the website multiple times, and add me to a spam link ring.
Anyway, a webmaster says he has changed the canonical on their end to http, although it hasn't changed yet. I'm sure this could take a few days or longer to take effect. Hopefully that is the fix; we'll see, though, and thanks for the advice!
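For reference, a canonical pointing the https version of a page back to http, like the one that webmaster is adding, would look something like this (a sketch with a placeholder domain, not the actual site):

```html
<!-- Placed in the <head> of the https page; example.com is a placeholder domain -->
<link rel="canonical" href="http://www.example.com/" />
```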
-
Someone could probably have an answer for you within minutes if they had the domain URL available.
RE: Competitive sabotage, I very highly doubt it.
RE: Having just occurred - That is often a sticking point for no good reason. Don't worry so much about why it wasn't an issue before; focus on how to fix it now. Google's algorithm changes all the time, and your standing in the algorithm changes all the time. Trust can be lost if you get a penalty, even after you get out of it. One external link too many going to https, or one change in the crawl path so Googlebot ends up on the https site via a relative-path link... things can suddenly change for a variety of reasons. However, if you do what is being suggested, you are very likely to put this issue behind you.
Here is what I do with eCommerce sites, typically:
- Rel canonical both versions to the http version
- Add a robots.txt block and a robots meta noindex tag to shopping cart pages
- Use absolute paths, if possible (e.g. http://www.domain.com/file/ instead of .../file/), especially in your primary navigation and footer links.
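As a rough sketch of those three points (the domain and the /cart/ path are placeholders, not anyone's actual setup):

```html
<!-- 1) On both the http and https versions of a page, canonical to the http URL -->
<link rel="canonical" href="http://www.example.com/some-page/" />

<!-- 2) On shopping cart pages, a robots meta noindex tag -->
<meta name="robots" content="noindex" />

<!-- 3) Absolute internal links in the primary navigation and footer -->
<a href="http://www.example.com/category/">Category</a>
```

The robots.txt side of point 2 would be a block like `User-agent: *` / `Disallow: /cart/`, with the path adjusted to wherever the cart actually lives.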
If that doesn't work please let us know and we can evaluate the site for you.
Good luck!
-
Hmm, see, no major changes have been made to the cart. The website has ranked for 15 years, so the https thing only popped up after the penalty/re-inclusion.
I'm wondering, since the canonical tag was added fairly recently: do you think I should just fetch the homepage and submit it again? Or even add a new page and fetch/crawl/submit that?
Just to get a fresh crawl? Crawl stats show about 2,250 pages crawled daily on average, so I was expecting this https thing to be gone by now... regardless of why Google chose to index it over my normal URL.
Thanks for the input
-
How about changing all of your links from relative to absolute in the HTML? If bots are truly only getting to the https pages via internal navigation after visiting the shopping cart, this would solve that, yes? Just a thought.
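To illustrate: a relative link inherits whatever protocol the visitor (or Googlebot) arrived on, while an absolute link pins the protocol. The domain and paths here are placeholders:

```html
<!-- Relative: followed from https://www.example.com/cart/, this resolves to the https version -->
<a href="/category/widgets/">Widgets</a>

<!-- Absolute: always resolves to http, no matter where the crawler entered -->
<a href="http://www.example.com/category/widgets/">Widgets</a>
```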
-
If that is the case, then your shopping cart is not behaving correctly. Https will exist for every page on your site, and it shouldn't. What cart are you using? I would redirect everything outside of the payment, cart, and contact pages to non-secure. There is also a disconnect between what robots directives actually do and what people think they do. They are a suggestion: noindex means don't add the page to the index, but it does not mean don't visit the page. I see spiders on pages that are blocked from them all the time.
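A minimal sketch of that kind of selective redirect, assuming an Apache server with mod_rewrite; the excluded paths are hypothetical and would need to match the actual cart software:

```apacheconf
# .htaccess - 301 everything on https back to http,
# except the pages that genuinely need to stay secure (paths are placeholders)
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteCond %{REQUEST_URI} !^/(cart|checkout|payment|contact) [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```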
-
My only concern with doing a redirect is this: the shopping cart is https, so if you start the checkout process you will enter https.
If a person decides to continue shopping, they will stay on https. But since the checkout page is restricted to bots, https essentially doesn't exist for them and shouldn't show up in any searches.
The sitemap is clean, and a canonical is in place...
I have been having some issues with a competitor. Is it possible they submitted the https://www.mysite.com/ version of my website, knowing that Google will prefer this version?
Thanks for the advice
-
I would redirect the https version to http. Then I would make sure that there is a canonical tag in place. Next, I would go over my sitemap and make sure that there isn't a link to the https page in there. After that you should be set. I wouldn't put it in the robots.txt, though.