Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
301 Redirect & Cloaking
-
Hello, people.
I have a question about cloaking, and I would be really grateful if you could help me with it.
I have a site, www.example.com, that targets multiple countries.
So I use subdirectories to target each country,
e.g. www.example.com/us/,
www.example.com/hk/, and so on.
Therefore, when people type www.example.com, I use IP delivery to send users to their country's section.
Here is my question.
I use a 301 redirect for the IP delivery, which means that when a user enters www.example.com,
my site reads the user's IP and sends them to the right country site via a 301 redirect.
In this case, is there any possibility that Google considers it cloaking?
Please, people, share some ideas and thoughts.
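For anyone trying to picture the setup, here is a minimal sketch of an IP-based 301 redirect like the one described above. Flask and the `lookup_country()` helper are assumptions made for illustration, not details from the actual site.

```python
from flask import Flask, redirect, request

app = Flask(__name__)

# Hypothetical helper: resolves a visitor IP to a two-letter country
# code using whatever GeoIP database the site has available.
def lookup_country(ip_address: str) -> str:
    ...

# Regional sections of the site, keyed by country code.
COUNTRY_PATHS = {"US": "/us/", "HK": "/hk/", "GB": "/uk/"}

@app.route("/")
def geo_redirect():
    country = lookup_country(request.remote_addr)
    # Unknown locations fall back to a default section.
    path = COUNTRY_PATHS.get(country, "/us/")
    # The redirect depends only on the visitor's IP location,
    # never on the user agent.
    return redirect("https://www.example.com" + path, code=301)
```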
-
Artience Girl, the information shared by Shane, Aaron and Lewis is correct.
Google wants to see the same page as it would be shown to a user under the same circumstances. If Google is crawling your page from San Jose California, then they want to see what a user from San Jose would see. If they decide to later crawl your site from their center in London, they want to see your site as it would be seen by a London user. The geo-targeting redirects you are presently doing are fine.
If you were to write any code which says to always show the Google crawler the US version of your site, then that tactic would be defined as cloaking. Any time you write code to specifically identify a crawler and show it different content, then you are cloaking.
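To make that distinction concrete, here is a rough sketch; the helpers `country_path()` and `serve_page()` are hypothetical stand-ins for whatever the site actually does.

```python
# Hypothetical helpers, stand-ins for the site's real logic.
def country_path(visitor_ip: str) -> str:
    """Map a visitor IP to a regional path such as '/us/' or '/hk/'."""
    ...

def serve_page(path: str) -> str:
    """Render the page that lives under the given regional path."""
    ...

# Cloaking: the crawler is singled out by user agent and shown
# something a human visitor from the same location would not see.
def handle_request_cloaked(user_agent: str, visitor_ip: str) -> str:
    if "Googlebot" in user_agent:
        return serve_page("/us/")  # crawler-only branch
    return serve_page(country_path(visitor_ip))

# Geo-targeting: crawlers and humans are treated identically,
# routed only by where the request comes from.
def handle_request_geo(visitor_ip: str) -> str:
    return serve_page(country_path(visitor_ip))
```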
It seems you are a bit uncomfortable with the answers, so let me set you at ease by sharing a Matt Cutts response to your question: http://www.youtube.com/watch?v=GFf1gwr6HJw
-
Hi Shane Thomas.
Thanks for your feedback.
Actually, the content is not exactly the same, but it is a lot alike, because I sell different products in different countries.
For example, I sell 30 products in the US but only 10 in the UK, so my UK site only has pages for those 10 products. Of course, the content layout and products are similar.
In this case, should I worry about cloaking?
Also, how can a search engine tell whether the intent is deceptive or not?
I have always wondered about that. ^^
-
Hello, Lewis-SEO. Thanks for your reply, but I am not totally following your answer.
What do you mean by "Google only version of the site"?
You mentioned the following:
"You will therefore need to decide which regional variation you want Google to end up at when it tries to visit/crawl the www.example.com URL"
Does this mean I should set up user-agent redirection for Googlebot, sending it to one particular regional site? E.g. send Googlebot only to www.example.com/us/, no matter which country its IP address comes from?
Please correct me if I am wrong, but that sounds more like cloaking to me.
A Googlebot with a DE IP address should be redirected to www.example.com/de/ so it can crawl the right content, and a Googlebot with a UK IP address should be redirected to www.example.com/uk/.
I think that if I send all Googlebot traffic to www.example.com/us/, for example, it will only confuse Googlebot more.
Could you please be more specific about your answer? Please!
-
Hi Artience Girl
The Google Webmaster Guidelines cover topics like these, but the key point is that geotargeting by IP address is fine as long as you are not showing Google a separate, Google-only version of the site. That would be considered cloaking.
You will therefore need to decide which regional variation you want Google to end up at when it tries to visit or crawl the www.example.com URL.
But before you do that, check what the Google Webmaster Guidelines say in and around this area; if you follow them, you are less likely to end up on the wrong side of them.
Hope this helps.
-
This really does not fit the description of cloaking; the content is the same, just in different languages, right?
If that is the case, IMO this would not be seen as cloaking, as you are not delivering different content, just a different user experience.
Also, as long as you are not separating IP delivery by source (meaning sending spiders somewhere different than humans), this does not meet the definition of cloaking.
From Wikipedia:
One use of IP delivery is to determine the requestor's location, and deliver content specifically written for that country. This isn't necessarily cloaking. For instance, Google uses IP delivery for AdWords and AdSense advertising programs to target users in different geographic locations.
As of 2006, many sites have taken up IP delivery to personalise content for their regular customers. Many of the top 1000 sites, including sites like Amazon (amazon.com), actively use IP delivery. None of these have been banned from search engines **as their intent is not deceptive.** Keyword here: deceptive.
-
I don't think this would come across as cloaking at all. It's a fairly common practice.