How can I prevent Google and other search engines from crawling my secured pages (https)?
-
Let me know your thoughts guys. Thanks in advance!
-
Your best bet is to place a meta noindex tag on each secure page. If it's only a few pages, you could just add it by hand. If it's many, you should be able to access each page's protocol with whatever server-side language you're using and dynamically add the tag on all secure pages.
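For example, here's a rough sketch of that idea assuming a Python/Flask stack (the route and template are made up for illustration; your own server-side language will look different):

```python
# Rough sketch only: assumes a Flask app; adapt to your own stack.
from flask import Flask, request, render_template_string

app = Flask(__name__)

PAGE = """<html>
<head>{{ robots_meta|safe }}<title>Account</title></head>
<body><h1>Your account</h1></body>
</html>"""

@app.route("/account")
def account():
    # request.is_secure is True when the page was requested over HTTPS,
    # so the noindex tag is only emitted on the secure version of the page.
    meta = '<meta name="robots" content="noindex">' if request.is_secure else ""
    return render_template_string(PAGE, robots_meta=meta)
```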
If you use robots.txt to exclude the pages, Google can still show them in search results, with the description below them that reads, "A description for this result is not available because of this site's robots.txt – learn more." Personally, I don't care for that.
-
Hi there, blocking the HTTPS version of your pages from being crawled by the search engines is a bit tricky. You might need to come up with a separate robots.txt file to handle the HTTPS requests.
Here is an article that explains the process in more detail:
http://www.seoworkers.com/seo-articles-tutorials/robots-and-https.html
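If your site is generated server-side, one rough way to do this (a Python/Flask sketch purely for illustration, not taken from the article above) is to serve different robots.txt rules depending on the protocol of the request:

```python
# Rough sketch only: block all crawling over HTTPS, allow it over HTTP.
# The framework (Flask) and the exact rules are illustrative assumptions.
from flask import Flask, request, Response

app = Flask(__name__)

HTTP_RULES = "User-agent: *\nDisallow:\n"     # allow crawling of the HTTP site
HTTPS_RULES = "User-agent: *\nDisallow: /\n"  # block crawling of the HTTPS site

@app.route("/robots.txt")
def robots_txt():
    rules = HTTPS_RULES if request.is_secure else HTTP_RULES
    return Response(rules, mimetype="text/plain")
```

Keep in mind that crawlers fetch robots.txt separately for the HTTP and HTTPS versions of a site, which is why a single static file can't tell the two apart.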
Hope it helps.
Best,
Devanur Rafi
-
Hi esiow
You have a choice of placing a robots.txt file in the root folder of your website or, if blocking individual pages, using the meta robots tag. See these pages for more information: http://moz.com/learn/seo/robotstxt and https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag?csw=1
I hope that helps,
Peter
-
Related Questions
-
Does adding new pages, new slugs, or new URLs to a site affect rankings and visibility?
Hi reader, I have decided to add new pages to my site. If I add new URLs, I feel like I have to submit the sitemap again. My question is: does submitting the sitemap again with new slugs or URLs affect visibility in SERPs? If yes, how do I minimize the impact?
Web Design | SIMON-CULL
-
Why is my homepage not indexed by Google or Bing
http://www.schoppnutritionclinic.com/ Home page is not indexed by Google or Bing, but all other pages are indexed. I know that currently I am missing the robots.txt file and the sitemap. This is something I am working on as a possible solution. I would have thought Google/Bing would still have indexed this page regardless of the missing sitemap/robots.txt files. I attempted to run a Fetch and Render in Webmaster Tools and received a Not Found status.
Web Design | ChrisSams
-
Is it still necessary to have a "home" page button/link in the top nav?
Or is it not necessary to have a "home" tab/link because everybody by now knows you can get to the home page by clicking on the logo?
Web Design | FindLaw
-
Is there a Joomla! Component For A Blog Page That Is Recommended?
A business partner currently has a page on a Joomla! website that is passing for the blog page. I am not a Joomla! guy, so I don't know much about it. I do know that I don't like a lot of things and prefer Drupal; however, changing that site to Drupal is not an option. We need to upgrade the blog page so that it is more like a blog, and I know there has to be an SEO-friendly component for a Joomla! blog page. Any ideas?
Web Design | Atlanta-SMO
-
Can white text over images hurt your SEO?
Hi everyone, I run a travel website that has about 30 pre-search city landing pages. In a redesign last year we added large "hero" images to the top of the page and put our h1 headlines on top of them in white. The result is attractive, but I'm wondering if Google could be reading this page as "white text on white page", which is an obvious no-no, especially if it could seem that we're trying to hide text. Here's an example: http://www.eurocheapo.com/paris/ H1: Expert reviews of cheap hotels in Paris. I should add that our SERP rankings for these city pages have dropped (for "cheap hotels in X"), but it could obviously be related to other issues. Any advice would be appreciated. Many thanks! Tom
Web Design | TomNYC
-
Wordpress Pages not indexing in Google
Hi, I've created a WordPress site for my client. I've produced 4 content pages and 1 home page, but my sitemap only says I have 1 page indexed. Also, SEOmoz only finds 1 page. I'm lost on what the problem could be. The domain name is www.dobermandeen.co.uk. Many thanks for any help. Alex
Web Design | SeoSheikh
-
IP block in Google
Our office has a number of people performing analysis and research on keyword positions, volume, competition, etc. We have 1 external static IP address. We installed the static IP so we can filter out our visits in Google Analytics. However, by 10 AM we get impossible CAPTCHAs or even get blocked in Google. Do you have any experience with such an issue? Any solutions you can recommend? Any help would be appreciated!
Web Design | Partouter
-
How do you account for misspellings in search engine queries?
Howdy everyone, I'm pretty new to the whole SEO thing; in fact, I hadn't even heard the term until this past fall, when a company I was doing a little freelance writing for fired their SEO guy and asked if I thought I could help them with it. I have an (old) background in HTML coding and web design, but have been out of the business for over a decade. This may be a simple question, but it has come up in discussion several times... How do you make sure that users are directed to your site even if they enter keywords with spelling errors? I know that Google offers "did you mean..." links for a lot of words. Is that the best method, and if so, how do you manipulate the data so the misspellings continue to result in your site being listed? Any help on this is greatly appreciated! Marty K.
Web Design | MartinKlausmeier