SEO Implications of firewalls that block "foreign connections"
-
Hello!
A client's IT security team runs firewalls with geo-blocking enabled on the site. This is to prevent foreign connections to their applications, as part of contractual agreements with their own clients.
Does anyone have any experience with workarounds for this?
Thank you!
-
Thank you SO much!
-
I'd guess that if Google decides to crawl your site from a data center in one of the blocked regions, it will suddenly believe your whole site has gone down and become inaccessible (Google rarely crawls a single website from multiple regional data centers simultaneously).
**Exempting Googlebot via user-agent would be the only possible workaround** (that I know of). If those you are trying to block became aware of this modification, they could alter their scripts, browsers, and tools to send you the Googlebot user-agent, penetrating your firewall pretty easily.
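Just to show how trivial that spoofing is: any HTTP client can send the header, since nothing authenticates it. A quick sketch (example.com is a placeholder for your site):

```python
import urllib.request

# Any HTTP client can claim to be Googlebot; the header is not authenticated.
req = urllib.request.Request(
    "https://example.com/",
    headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                           "+http://www.google.com/bot.html)"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)  # a UA-only firewall exemption would let this through
```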
In the end, you just have to decide which matters more to you: honoring the geo-blocking agreement or keeping the site fully crawlable.
It might be possible to identify Google's data center IP addresses from server logs and exempt those instead of exempting their user-agent, but keeping up with all the changes would probably need a full-time employee. You can be sure that Google won't make it easy to identify its data centers via IP data.
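A sturdier middle ground than chasing IP lists is the reverse-DNS check that Google itself documents for verifying Googlebot: reverse-resolve the caller's IP, confirm the hostname is under googlebot.com or google.com, then forward-resolve that hostname and check it maps back to the same IP. A minimal sketch, assuming your firewall layer can call into Python (`is_verified_googlebot` is a name I made up):

```python
import socket

def is_verified_googlebot(ip: str) -> bool:
    """Return True only if `ip` reverse-resolves to a googlebot.com /
    google.com hostname AND that hostname resolves back to the same IP.
    A spoofed user-agent header cannot pass this check."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse DNS lookup
    except OSError:
        return False
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # Forward-confirming lookup: hostname must map back to the caller's IP
        forward_ips = {info[4][0] for info in socket.getaddrinfo(hostname, None)}
    except OSError:
        return False
    return ip in forward_ips
```

In practice you'd cache the result per IP (the DNS lookups add latency) and apply the normal geo rules to everyone who fails the check.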