Are there SEO implications to blocking foreign IP addresses?
-
We are dealing with a foreign company that has completely ripped off our site template, design, and branding. This is a real problem, and we've had similar things happen often enough in the past that we're considering blocking large ranges of IP addresses from accessing our site via .htaccess.
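For context, the kind of range block being considered might look like the sketch below (Apache 2.4 syntax). The CIDR ranges shown are documentation placeholders, not a real country allocation; real ranges would have to come from a GeoIP database:

```apache
# Allow everyone except the listed CIDR ranges.
# 203.0.113.0/24 and 198.51.100.0/24 are placeholder example ranges --
# substitute ranges sourced from a GeoIP database for the target country.
<RequireAll>
    Require all granted
    Require not ip 203.0.113.0/24
    Require not ip 198.51.100.0/24
</RequireAll>
```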
Is this something that could cause problems with search engine bots crawling or indexing our site? We are in the US and our site is hosted in the US, but I'm not sure whether the major search engines might be using foreign-based bots.
Looking for any insight on this or if there are any other potential SEO problems to consider.
Thanks
-
Zee, did you implement this? Outcomes?
-
If the bot is in another country and you have blocked the range, it's pretty obvious what happens... What kind of "backup" are you looking for?
If you are asking me if I have a geographical list of bots for each search engine then no, I don't. But this might be of some use to you http://productforums.google.com/forum/#!topic/webmasters/TbpNyFiJvjs
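For what it's worth, rather than tracking bot geography, Google's documented advice is to verify crawlers with a reverse-DNS check followed by a forward-DNS confirmation. A rough Python sketch of that check (the function names here are mine, not from any library):

```python
import socket

GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def looks_like_google_host(hostname):
    """First half of Google's documented check: the reverse-DNS
    hostname must end in googlebot.com or google.com."""
    return hostname.endswith(GOOGLE_SUFFIXES)

def is_verified_googlebot(ip):
    """Reverse DNS on the IP, then forward-confirm that the
    returned hostname resolves back to the same IP."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)
    except socket.herror:
        return False
    if not looks_like_google_host(host):
        return False
    try:
        # gethostbyname_ex returns (hostname, aliases, ip_list)
        return ip in socket.gethostbyname_ex(host)[2]
    except socket.gaierror:
        return False
```

The forward confirmation matters: the hostname suffix alone can be spoofed by an attacker who controls their own reverse DNS.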
Good luck with the whole site design / copyright issue, any chance you could PM me a link I would like to see what they have done... (just curious).
-
Thanks for the reply SEOKeith, but focusing "on making our site more authoritative" does not solve the problem.
The problem we have is not an SEO problem, it's a design, copyright, trademark and ethical problem. When you spend months developing and designing a site only to have it ripped off, it's not something we want to just ignore.
The damage has been done in this particular instance. However, we've had enough problems in the past with foreign visitors, and our business doesn't come from foreign countries. Because of that, blocking actual humans from accessing our site from the countries we've had problems with is a potential solution.
The solution we're considering could potentially impact the way search engines view our site and that's the question. Do you have anything to back up your comment about "blocking large ranges of IP addresses you could end up restricting access to legitimate...bots"?
-
By blocking large ranges of IP addresses you could end up restricting access to legitimate users, bots, etc.
For a start, how do you even know the site harvesting your data is in the country in question? Sure, they might be hosting there, but the boxes ripping your content could be in the US, and they could then have web heads in some other random countries serving up the content.
People copying / stealing / cloning your content is pretty common; it happens to a lot of my sites. It's just the way it is. You're not going to be able to stop it, so you might as well just focus on making your site more authoritative.