Some bots excluded from crawling client's domain
-
Hi all!
My client is in healthcare in the US and, for HIPAA reasons, blocks traffic from most international sources.
a. I don't think this is good for SEO
b. The site won't allow Moz bot or Screaming Frog bot to crawl it. It's so frustrating.
We can't figure out what mechanism they are using to do this. Any help as we start down the rabbit hole to remedy it is much appreciated.
thank you!
-
The main reason it's not good for SEO is that Google crawls from different data centers around the world. So one day they may think the site is up; the next, they may think the site is gone.
Typically you use a user-agent lance to pierce these kinds of setups. In Screaming Frog, for example, you can pre-select from a variety of user-agents (including 'Googlebot' and Chrome), but you can also write your own custom user-agent.
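Before reaching for that lance, it can be worth confirming the block is user-agent based rather than pure IP/geo filtering. A minimal diagnostic sketch in Python's standard library (the URL and user-agent strings below are placeholders, not your client's real values):

```python
from urllib import error, request

def make_request(url: str, user_agent: str) -> request.Request:
    """Build a request carrying a specific User-Agent header."""
    return request.Request(url, headers={"User-Agent": user_agent})

def status_for(url: str, user_agent: str) -> int:
    """Fetch url with the given user-agent and return the HTTP status.
    Different statuses across user-agents suggest UA-based blocking;
    an identical block regardless of UA points at IP/geo filtering."""
    try:
        with request.urlopen(make_request(url, user_agent), timeout=10) as resp:
            return resp.status
    except error.HTTPError as exc:
        return exc.code

# Example probe (placeholder domain): compare a browser UA vs a crawler UA.
# for ua in ("Mozilla/5.0 (Windows NT 10.0)", "Screaming Frog SEO Spider"):
#     print(ua, status_for("https://www.example.com/", ua))
```

If both user-agents get the same 403 from the same IP, the filter is almost certainly network-level rather than header-level.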
Write a long one that looks like an encryption key. Tell your client the user-agent you have defined and have them create an exemption for it within their spam-defense system. Insert that user-agent (which no one else has or uses) into Screaming Frog and use it to let the crawler pierce the defense grid.
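Generating a key-like user-agent is trivial; one sketch using Python's standard `secrets` module (the product name is made up for illustration):

```python
import secrets

# A long random token makes the user-agent practically unguessable,
# so an exemption for it can't be abused the way a blanket
# 'Googlebot' exemption can.
token = secrets.token_hex(32)               # 64 hex characters
custom_ua = f"AuditCrawler/1.0 ({token})"   # hypothetical product name
print(custom_ua)
```

Paste the resulting string into Screaming Frog's custom user-agent field and share the same string with whoever manages the firewall rules.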
Typically you would want to exempt 'Googlebot' (as a user-agent) from these defense systems, but that comes with a risk. Anyone with basic scripting knowledge, or who knows how to install Chrome extensions, can alter the user-agent of their script (or web browser; it's under the user's control) with ease, and it is widely known that many sites make an exception for 'Googlebot', so it becomes a common vulnerability. For example, lots of publishers create URLs which Google can access and index, yet if you are a bog-standard user they ask you to turn off ad-blockers or pay a fee.
Download a user-agent switcher extension for Chrome, set your user-agent to "Googlebot", and sail right through. Not ideal from a defense perspective.
For this reason I have often wished (and I am really hoping someone from Google might be reading) that in Search Console you could register a custom user-agent string with Google. You could then exempt that string, safe in the knowledge that no one else knows it, and Google would use it to identify themselves when accessing your site and content. Then everyone could be safe, indexable and happy.
We're not there yet