301 Redirect & Cloaking
-
Hello, everyone.
I have a question about cloaking, and I'd be really grateful for your help.
I have a site, www.example.com, which targets multiple countries, so I use subdirectories for each country.
e.g. www.example.com/us/
www.example.com/hk/ ... and so on.
When people type www.example.com, I use IP delivery to send each user to the right country version.
Here is my question.
I use a 301 redirect for the IP delivery: when a user enters www.example.com, my site reads the user's IP and sends them to the right country site with a 301 redirect.
In this case, is there any possibility that Google considers this cloaking?
Please share your ideas and thoughts with me.
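To make it concrete, here is a rough sketch of the redirect logic I am describing, written as a Python/Flask front controller. This is only an illustration: country_from_ip() is a placeholder for the GeoIP lookup, and the country list is made up.

from flask import Flask, redirect, request

app = Flask(__name__)

SUPPORTED = {"us", "hk", "uk", "de"}   # country subdirectories that exist
DEFAULT = "us"                         # fallback when the country is unknown

def country_from_ip(ip):
    """Placeholder for a GeoIP lookup; returns a two-letter country code."""
    ...

@app.route("/")
def root():
    country = (country_from_ip(request.remote_addr) or DEFAULT).lower()
    if country not in SUPPORTED:
        country = DEFAULT
    # Every visitor, crawler or human, gets the same IP-based 301.
    return redirect(f"/{country}/", code=301)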
-
Artience Girl, the information shared by Shane, Aaron and Lewis is correct.
Google wants to see the same page as it would be shown to a user under the same circumstances. If Google is crawling your page from San Jose California, then they want to see what a user from San Jose would see. If they decide to later crawl your site from their center in London, they want to see your site as it would be seen by a London user. The geo-targeting redirects you are presently doing are fine.
If you were to write code that always shows the Google crawler the US version of your site, that tactic would be defined as cloaking. Any time you write code to specifically identify a crawler and show it different content, you are cloaking.
It seems you are a bit uncomfortable with the answers, so let me set you at ease by sharing a Matt Cutts response to your question: http://www.youtube.com/watch?v=GFf1gwr6HJw
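To make the distinction concrete, here is a hedged sketch (the helper function and the URLs are placeholders, not anything from your actual site). The first function singles the crawler out by its user agent, which is cloaking; the second treats every request from the same IP identically, which is the geo-targeting you are already doing.

def country_from_ip(ip):
    """Placeholder for a GeoIP lookup; pretend every request resolves to Germany."""
    return "de"

# Cloaking: the crawler is identified by user agent and shown something a
# normal visitor from the same location would never see.
def choose_redirect_cloaking(user_agent, ip):
    if "Googlebot" in user_agent:
        return "/us/"                     # crawler-only branch -> cloaking
    return f"/{country_from_ip(ip)}/"

# Geo-targeting: every request from the same IP gets the same answer,
# crawler or not.
def choose_redirect_geotargeting(user_agent, ip):
    return f"/{country_from_ip(ip)}/"

print(choose_redirect_cloaking("Googlebot/2.1", "203.0.113.7"))      # /us/
print(choose_redirect_geotargeting("Googlebot/2.1", "203.0.113.7"))  # /de/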
-
Hi Shane Thomas,
Thanks for your feedback.
Actually, the content is not exactly the same, but it is very similar, because I sell different products in different countries.
For example, I sell 30 products in the US but only 10 in the UK, so my UK site only has pages for those 10 products. The content layout and the products are otherwise similar.
In this case, should I worry about cloaking?
Also, how can a search engine tell whether or not the intent is deceptive?
I've always wondered about that. ^^
-
Hello, Lewis-SEO. Thanks for your reply, but I am not totally following your answer.
What do you mean by a "Google-only version of the site"?
You mentioned the following:
"You will therefore need to decide which regional variation you want Google to end up at when it tries to visit/crawl the www.example.com URL"
Does this mean I should set up user-agent redirection for Googlebot, sending it to one particular regional site? e.g. send Googlebot only to www.example.com/us/, no matter which country IP address Googlebot has?
Please correct me if I am wrong, but that sounds more like cloaking to me.
Googlebot with a DE IP address should be redirected to www.example.com/de/ so it can crawl the right content, and Googlebot with a UK IP address should be redirected to www.example.com/uk/.
I think that if I send all Googlebot requests to www.example.com/us/, for example, it will only confuse Googlebot more.
Could you please be more specific about your answer? Please!
-
Hi Artience Girl,
The Google Webmaster Guidelines cover topics like this, but the key point is that geotargeting by IP address is fine as long as you are not showing Google a separate, Google-only version of the site. That would be considered cloaking.
You will therefore need to decide which regional variation you want Google to end up at when it tries to visit/crawl the www.example.com URL.
Before you do that, though, read the Google Webmaster Guidelines around this area; if you follow them, you are far less likely to end up on the wrong side of them.
Hope this helps.
-
This really does not fit the description of cloaking. The content is the same, just in different languages, right?
If that is the case, in my opinion this would not be seen as cloaking, because you are not delivering different content, just a different user experience.
Also, as long as you are not separating IP delivery by source (meaning sending spiders somewhere different than humans), this does not meet the definition of cloaking.
From Wikipedia:
"One use of IP delivery is to determine the requestor's location, and deliver content specifically written for that country. This isn't necessarily cloaking. For instance, Google uses IP delivery for the AdWords and AdSense advertising programs to target users in different geographic locations. As of 2006, many sites have taken up IP delivery to personalise content for their regular customers. Many of the top 1000 sites, including sites like Amazon (amazon.com), actively use IP delivery. None of these have been banned from search engines as their intent is not deceptive."
The keyword here is: deceptive.
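If you want to double-check your own setup, here is a rough sketch using Python's requests library (www.example.com stands in for your real domain). It requests the root URL with a normal browser user agent and with Googlebot's user agent and compares where the 301 points; if the target differs only because of the user agent, that is the "sending spiders somewhere different than humans" situation described above.

import requests

URL = "https://www.example.com/"
AGENTS = {
    "browser": "Mozilla/5.0",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

targets = {}
for name, user_agent in AGENTS.items():
    response = requests.get(URL, headers={"User-Agent": user_agent}, allow_redirects=False)
    targets[name] = response.headers.get("Location")

print(targets)
if targets["browser"] != targets["googlebot"]:
    print("Redirect target depends on the user agent - worth investigating.")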

-
I don't think this would come across as cloaking at all. It's a fairly common practice.