How can I prevent Google and other search engines from crawling my secured (https:) pages?
-
Let me know your thoughts guys. Thanks in advance!
-
Your best bet is to place a meta noindex tag on each secure page. If it's only a few pages, you could add it by hand. If it's many, you should be able to check each page's protocol with whatever server-side language you're using, and add the tag dynamically on all secure pages.
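As a rough sketch of the dynamic approach (the function name and setup here are illustrative, not tied to any particular framework): check whether the current request is HTTPS, and only then emit the noindex tag into the page head.

```python
def robots_meta(is_https):
    """Return a meta noindex tag for secure pages, nothing otherwise."""
    if is_https:
        return '<meta name="robots" content="noindex">'
    return ''

# In each page template, inject the result into the <head>.
# How you detect HTTPS depends on your stack (e.g. the request scheme
# or a server variable); here it's just passed in as a boolean.
head = '<head><title>My page</title>{}</head>'.format(robots_meta(True))
```

The same idea translates directly to PHP, ASP, or any other server-side language: one conditional around the meta tag, driven by the request protocol.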
If you use robots.txt to exclude the pages, Google can still show them in search results, with the description below them that reads, "A description for this result is not available because of this site's robots.txt – learn more." Personally, I don't care for that.
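For context, a robots.txt rule that blocks crawling looks like the following (the `/secure/` path is hypothetical); as noted above, blocked URLs can still appear in results, just without a description:

```
User-agent: *
Disallow: /secure/
```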
-
Hi there, blocking the HTTPS version of your pages from being crawled by the search engines is a bit tricky. You may need to serve a separate robots.txt file for HTTPS requests.
This article explains the process in more detail:
http://www.seoworkers.com/seo-articles-tutorials/robots-and-https.html
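One common way to serve a different robots.txt over HTTPS (an assumption about your setup: Apache with mod_rewrite enabled, and a separate `robots_ssl.txt` file in the document root) is a rewrite rule keyed on the SSL port, along these lines:

```apache
# Sketch: answer requests for /robots.txt on port 443 (HTTPS)
# with robots_ssl.txt instead of the regular robots.txt.
RewriteEngine On
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^robots\.txt$ /robots_ssl.txt [L]
```

The HTTPS-only `robots_ssl.txt` can then disallow crawling, while the regular robots.txt for HTTP stays open.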
Hope it helps.
Best,
Devanur Rafi
-
Hi esiow,
You have a choice: place a robots.txt file in the root folder of your website, or, if you're blocking individual pages, use the meta robots tag. See these pages for more information: http://moz.com/learn/seo/robotstxt and https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag?csw=1
I hope that helps,
Peter