Are there SEO implications to blocking foreign IP addresses?
-
We are dealing with a foreign company that has completely ripped off our entire site template, design and branding. This is a real headache, and we've had similar things happen often enough in the past that we're considering blocking large ranges of IP addresses from accessing our site via .htaccess.
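For reference, the kind of .htaccess rules we're looking at would be something along these lines (Apache 2.4 syntax; the CIDR ranges below are reserved documentation ranges standing in for a real geo-IP country list, which we'd pull from a geo-IP provider):

```apacheconf
# Allow everyone except the listed ranges. 203.0.113.0/24 and
# 198.51.100.0/24 are reserved documentation ranges used here as
# placeholders; substitute real ranges from a geo-IP provider.
<RequireAll>
    Require all granted
    Require not ip 203.0.113.0/24
    Require not ip 198.51.100.0/24
</RequireAll>
```

(Older Apache 2.2 servers would use the `Order` / `Deny from` directives instead.)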
Is this something that could cause problems with search engine bots crawling or indexing our site? We are in the US and our site is hosted in the US, but I'm not sure whether the major search engines might be crawling from foreign-based bots.
Looking for any insight on this or if there are any other potential SEO problems to consider.
Thanks
-
Zee, did you implement this? Outcomes?
-
If the bot is in another country and you have blocked its range, then yes, obviously it won't be able to crawl you. What kind of "backup" are you looking for?
If you are asking me whether I have a geographical list of bots for each search engine, then no, I don't. But this might be of some use to you: http://productforums.google.com/forum/#!topic/webmasters/TbpNyFiJvjs
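One thing you can do regardless of where a crawler connects from is verify it the way Google documents: reverse-DNS the requesting IP, check the hostname ends in googlebot.com or google.com, then resolve that hostname forward and confirm it maps back to the same IP. A rough Python sketch of that two-step check (error handling kept minimal):

```python
import socket

# Hostname suffixes per Google's published crawler-verification method
GOOGLE_SUFFIXES = ('.googlebot.com', '.google.com')

def is_verified_googlebot(ip):
    """Reverse DNS the IP, check the suffix, then forward-confirm."""
    try:
        host = socket.gethostbyaddr(ip)[0]              # reverse DNS
    except OSError:
        return False
    if not host.endswith(GOOGLE_SUFFIXES):
        return False
    try:
        forward_ips = socket.gethostbyname_ex(host)[2]  # forward DNS
    except OSError:
        return False
    return ip in forward_ips
```

That way you never have to maintain an IP list for Googlebot at all; anything that fails the check can be treated like any other visitor.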
Good luck with the whole site design / copyright issue. Any chance you could PM me a link? I'd like to see what they have done (just curious).
-
Thanks for the reply, SEOKeith, but focusing "on making our site more authoritative" does not solve the problem.
The problem we have is not an SEO problem, it's a design, copyright, trademark and ethical problem. When you spend months developing and designing a site only to have it ripped off, it's not something we want to just ignore.
The damage has been done in this particular instance. However, we've had enough problems in the past with foreign visitors, and our business doesn't come from foreign countries. Because of that, blocking human visitors from the countries we've had problems with is a potential solution.
The solution we're considering could potentially impact the way search engines view our site and that's the question. Do you have anything to back up your comment about "blocking large ranges of IP addresses you could end up restricting access to legitimate...bots"?
-
By blocking large ranges of IP addresses you could end up restricting access to legitimate users, bots, etc.
For a start, how do you even know the site that is harvesting your data is actually in said country? Sure, they might be hosting there, but the boxes that are ripping your content might be in the US, and they could then have web heads in some other random countries serving up the content.
People copying / stealing / cloning your content is pretty common; it happens to a lot of my sites. It's just the way it is, and you're not going to be able to stop it, so you might as well just focus on making your site more authoritative.
Related Questions
-
Plugins and SEO
I'd like some expert guidance. I've searched for a theme that does what I want and finally found something I like, but I'm wondering what you all think I should do to increase its searchability. The plugin has all the listings and styling. All I need to do is paste the code into the WordPress site and voila! I have a page. Using the widget lets me allow upvotes and provide a map, etc. But it means the content is inside the widget instead of on the page. What would you modify if you wanted to keep the theme & widget and get the best results? http://best-of-sacramento.com/dentists This is my staging site.
Technical SEO | julie-getonthemap
-
Content Based on User's IP Address
Hello, A client wants us to create a page on two different sites (www.brandA.com/content and www.brandB.com/content) with similar content and serve up specific content to users based on their IP addresses. The idea is that once a user gets to the page, the content would change slightly (mainly contact information and headers) based on their location. The problem I see with this is that brandA and brandB would be different URLs, so if they're both optimized for similar terms, they could both rank and crowd up the search results (duplicate content). Have you seen something similar? What are your thoughts and/or potential solutions? Also, do you know of any sites that are currently doing something similar?
Technical SEO | Rauxa
-
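The IP-to-content branching described in that question can be sketched roughly as follows. The prefix table is entirely hypothetical (reserved documentation ranges); a production site would query a geo-IP database such as MaxMind rather than a hand-built lookup:

```python
import ipaddress

# Hypothetical prefix-to-brand table using reserved documentation
# ranges; a real site would consult a geo-IP database instead.
BRAND_BY_NETWORK = {
    ipaddress.ip_network('203.0.113.0/24'): 'brandA',
    ipaddress.ip_network('198.51.100.0/24'): 'brandB',
}

def brand_for(ip, default='brandA'):
    """Pick which brand's contact block to render for a visitor's IP."""
    addr = ipaddress.ip_address(ip)
    for network, brand in BRAND_BY_NETWORK.items():
        if addr in network:
            return brand
    return default
```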
Site Launching, not SEO Ready
Hi, So, we have a site going up on Monday that in many ways hasn't been made ready for search. The focus has been on functionality and UX rather than search, which is fair enough. As a result, I have a big list of things for the developer to complete after launch (like sorting out duplicate pages and adding titles that aren't "undefined", etc.). So, my question is whether it would be better to noindex the site until the main issues are sorted, so we essentially present search engines with the best version we can, or to have the site be indexed (duplicate pages and all) and sort these issues "live", as it were? Would either method be advisable over the other, or are there any other solutions? I just want to ensure we start ranking as well as possible as quickly as possible and don't know which way to go. Thanks so much!
Technical SEO | LeahHutcheon
-
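If a site does launch noindexed as that question describes, one common approach (assuming an Apache server with mod_headers enabled) is a site-wide X-Robots-Tag header, since it can be removed in one place at launch time:

```apacheconf
# Temporary: keep the unfinished site out of the index until cleanup
# is done. Requires mod_headers; delete this line at launch so pages
# can be recrawled and indexed.
Header set X-Robots-Tag "noindex"
```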
Differences in Sitemaps SEO wise?
I'm a bit confused about sitemaps. I'm just learning SEO, so forgive me if this is a basic question. I've submitted my site to Google Webmaster Tools using http://pro-sitemaps.com and the sitemap generator it creates. I've also seen sites do this: http://www.johnlewis.com/Shopping/ProductList.aspx and http://www.thesafestcandles.com/site-map.html, so I did something similar for my site (www.ldnwicklesscandles.com). You figure if you see everyone do it, you might as well try it too and hope it works. 😉 So I've done both 1 and 2. Which sitemap is best for SEO purposes, or should I do both? Is there any format that should or shouldn't be used for Option 2? Any site examples of good practice would be helpful.
Technical SEO | cmjolley
-
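For context on that question: "option 1" is an XML sitemap (read by search engines, submitted via webmaster tools) and "option 2" is an HTML sitemap page (browsed by users); they serve different purposes and many sites use both. A minimal XML sitemap following the sitemaps.org protocol, with a made-up URL, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/candles/vanilla</loc>
    <lastmod>2012-05-01</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```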
Does server location impact SEO?
Hi, I am about to purchase hosting for my WordPress site which is primarily targeting the UK and wondered if server location still has an impact on SEO? Also can anyone recommend a reliable hosting provider with CPanel?
Technical SEO | Wallander
-
Are there negative SEO implications to pages without any images?
Hi Mozzers, Do you think there are any negative effects of having no images on a page but several hundreds words of text? (There is a logo image and call to action buttons). Thanks!
Technical SEO | Charlessipe
-
Multiple Domains on 1 IP Address
We have multiple domains on the same C Block IP Address. Our main site is an eCommerce site, and we have separate domains for each of the following: our company blog (and other niche blogs), forum site, articles site and corporate site. They are all on the same server and hosted by the same web-hosting company. They all have unique and different content. Speaking strictly from a technical standpoint, could this be hurting us? Can you please make a recommendation for the best practices when it comes to multiple domains like these and having separate or the same IP Addresses? Thank you!
Technical SEO | Motivators
-
Bit.ly URLs. Are they SEO Friendly?
Are URL shorteners like Bit.ly considered 301 redirects? I was thinking about using them for some longer URLs in press releases, but I didn't want to lose any link juice through the service. Thanks for the info! - Kyle
Technical SEO | kchandler
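One way to answer that question for any shortener is to request a short URL without following redirects and inspect the status code and Location header yourself. The sketch below stands up a tiny local server that mimics a 301-issuing shortener, so it runs without touching bit.ly itself:

```python
import http.server
import threading
import urllib.error
import urllib.request

class FakeShortener(http.server.BaseHTTPRequestHandler):
    """Stand-in for a URL shortener: answers every request with a 301."""
    def do_GET(self):
        self.send_response(301)
        self.send_header('Location', 'https://example.com/the-long-url')
        self.end_headers()
    def log_message(self, *args):
        pass  # keep request logging quiet

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None  # make urllib raise HTTPError instead of following

def redirect_status(url):
    """Return (status_code, location_header) of the first response."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        opener.open(url)
        return 200, None
    except urllib.error.HTTPError as err:
        return err.code, err.headers.get('Location')

server = http.server.HTTPServer(('127.0.0.1', 0), FakeShortener)
threading.Thread(target=server.serve_forever, daemon=True).start()
status, location = redirect_status(f'http://127.0.0.1:{server.server_port}/abc123')
server.shutdown()
print(status, location)  # 301 https://example.com/the-long-url
```

Pointing `redirect_status` at a real short link would show whether the service uses a 301 (which passes link equity) or a 302.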