301 Redirect & Cloaking
-
Hello, people.
I have a question about cloaking, and I would be really grateful if you could help me with it.
I have a site, www.example.com, that targets multiple countries, so I use subdirectories for the different countries,
e.g. www.example.com/us/
www.example.com/hk/ ... and so on.
When people type www.example.com, I use IP delivery to send users to the right country version.
Here is my question.
I use a 301 redirect for the IP delivery, which means that when a user enters www.example.com,
my site reads the user's IP address and sends them to the right country site with a 301 redirect.
In this case, is there any possibility that Google considers it cloaking?
Please share your ideas and thoughts.
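For reference, a minimal sketch of the kind of IP-based 301 redirect described above, assuming Apache with mod_rewrite and a GeoIP module (such as MaxMind's legacy mod_geoip) already enabled and exposing the visitor's country as GEOIP_COUNTRY_CODE; the country codes and paths are just examples:

RewriteEngine On
# Send Hong Kong visitors requesting the bare homepage to the /hk/ version
RewriteCond %{ENV:GEOIP_COUNTRY_CODE} ^HK$
RewriteRule ^$ /hk/ [R=301,L]
# Send US visitors to the /us/ version
RewriteCond %{ENV:GEOIP_COUNTRY_CODE} ^US$
RewriteRule ^$ /us/ [R=301,L]

Nothing in these rules looks at who is asking; a crawler is simply treated like any other visitor from its location.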
-
Artience Girl, the information shared by Shane, Aaron and Lewis is correct.
Google wants to see the same page as it would be shown to a user under the same circumstances. If Google is crawling your page from San Jose California, then they want to see what a user from San Jose would see. If they decide to later crawl your site from their center in London, they want to see your site as it would be seen by a London user. The geo-targeting redirects you are presently doing are fine.
If you were to write any code which says to always show the Google crawler the US version of your site, then that tactic would be defined as cloaking. Any time you write code to specifically identify a crawler and show it different content, then you are cloaking.
It seems you are a bit uncomfortable with the answers, so let me set you at ease by sharing a Matt Cutts response to your question: http://www.youtube.com/watch?v=GFf1gwr6HJw
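To make the distinction concrete, the rule below is a hypothetical Apache sketch of what crossing that line would look like (not something to deploy): it singles out the crawler by user agent instead of treating it like any other visitor from its location.

RewriteEngine On
# Cloaking: user-agent sniffing that always sends Googlebot to one version,
# regardless of where it is actually crawling from
RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
RewriteRule ^$ /us/ [R=301,L]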
-
Hi Shane Thomas.
Thanks for your feedback.
Actually, the content is not exactly the same, but it is very similar, because I sell different products in different countries.
For example, I sell 30 products in the US but only 10 in the UK, so my UK site only has pages for those 10 products. Of course, the content layout and the products are similar.
In this case, should I worry about cloaking?
Also, how can a search engine tell whether the intent is deceptive or not?
I have always wondered about that.
-
Hello, Lewis-SEO. Thanks for your reply, but I am not totally following your answer.
What do you mean by "Google only version of the site"?
You mentioned the following:
"You will therefore need to decide which regional variation you want Google to end up at when it tries to visit/crawl the www.example.com URL"
Do you mean that I should set up user-agent redirection for Googlebot and send it to one particular regional site, e.g. send Googlebot only to www.example.com/us/, no matter which country its IP address belongs to?
Please correct me if I am wrong, but that sounds more like cloaking to me.
A Googlebot crawling from a DE IP address should be redirected to www.example.com/de/ so it can crawl the right content, and a Googlebot with a UK IP address should be redirected to www.example.com/uk/.
I think that if I send all Googlebot requests to www.example.com/us/, for example, it will only confuse Google more.
Could you please be more specific about your answer?
-
Hi Artience Girl
The Google Webmaster Guidelines cover topics like this, but the key point is that geotargeting by IP address is fine as long as you are not showing Google a separate, Google-only version of the site. That would be considered cloaking.
You will therefore need to decide which regional variation you want Google to end up at when it tries to visit/crawl the www.example.com URL.
But before you do that, check the Google Webmaster Guidelines in and around this area; if you follow them, you are less likely to end up on the wrong side of them.
Hope this helps.
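One way to express that decision in the redirect rules themselves is a catch-all default placed after the country-specific conditions; a hedged sketch, assuming geo rules like the ones earlier in the thread and /us/ as the chosen default:

RewriteEngine On
# Country-specific GeoIP rules go first (see the sketch earlier in the thread)...
# ...then any remaining homepage request, including a crawler arriving from an
# unmatched location, falls through to the default regional version
RewriteRule ^$ /us/ [R=301,L]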
-
This really does not fit the description of cloaking; the content is the same, just in different languages, right?
If that is the case, then IMO this would not be seen as cloaking, as you are not delivering different content, just a different user experience.
Also, as long as you are not separating IP delivery by source (meaning sending spiders somewhere different than humans), this does not meet the definition of cloaking.
From Wikipedia:
One use of IP delivery is to determine the requestor's location, and deliver content specifically written for that country. This isn't necessarily cloaking. For instance, Google uses IP delivery for AdWords and AdSense advertising programs to target users in different geographic locations.
As of 2006, many sites have taken up IP delivery to personalise content for their regular customers. Many of the top 1000 sites, including sites like Amazon (amazon.com), actively use IP delivery. None of these have been banned from search engines as their intent is not deceptive. Keyword here: deceptive.
-
I don't think this would come across as cloaking at all. It's a fairly common practice.
Related Questions
-
I have a question about the impact of a root domain redirect on site-wide redirects and slugs.
I have a question about the impact (if any) of site-wide redirects for DNS/hosting change purposes. I am preparing to redirect the domain for a site I manage from https://siteImanage.com to https://www.siteImanage.com. Traffic to the site currently redirects in reverse, from https://www.siteImanage.com to https://siteImanage.com. Based on my research, I understand that making this change should not affect the site’s excellent SEO as long as my canonical tags are updated and a 301 redirect is in place. But I wanted to make sure there wasn’t a potential consequence of this switch I’m not considering. Because this redirect lives at the root of all the site’s slugs and existing redirects, will it technically produce a redirect chain or a redirect loop? If it does, is that problematic? Thanks for your input!
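For illustration, a minimal Apache sketch of the non-www to www move being described (the hostname comes from the question; treat the exact rules as an assumption about the setup). Because it matches on the Host header rather than on individual slugs, it sits in front of the existing page-level redirects instead of chaining with each of them; the key is to remove the old www-to-non-www rule at the same time, since leaving both in place is what creates a loop.

RewriteEngine On
# Any request that arrives on the bare domain is sent to the www host,
# preserving the requested path in a single 301 hop
RewriteCond %{HTTP_HOST} ^siteImanage\.com$ [NC]
RewriteRule ^(.*)$ https://www.siteImanage.com/$1 [R=301,L]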
Technical SEO | mollykathariner_ms
-
1000 Pages on old website. What to do with the 301 redirects for this domain?
Hi Moz Community, I have a 301 redirect question... I just acquired an old domain: it is totally in my niche, the domain is 14 years old, the website consists of 1,000 pages, it has a great number of backlinks, and it has been offline for about 2 weeks. I will place a new website online asap with a new URL structure. For the 50 best-scoring pages I wrote a new, but fully comparable/related, article, and I will put a 301 redirect from those old pages to the new ones. My question: what should I do with the other 950 URLs? Should I 301 redirect them to the homepage? Should I send those pages to the 404 page? Should I divide the 950 URLs with 301 redirects across the 50 new ones? Another solution, maybe? Any idea what would be the best solution so we can save as much Google juice as possible? Thanks in advance!
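Whatever split is chosen, the mechanics look roughly like this hedged Apache sketch (the paths are hypothetical): one page-level 301 per rewritten article, and the same one-line pattern for any of the remaining URLs you decide to point at the homepage.

# One of the 50 old pages that has a directly comparable new article
Redirect 301 /old-article.html /new-article/
# One of the other 950 URLs, if the choice is to point it at the homepage
Redirect 301 /some-other-old-page.html /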
Technical SEO | snorkel
-
CNAME vs 301 redirect
Hi all, Recently I created a website for a new client and my next job is trying to get them higher in Google. I added them to OSE and noticed some strange backlinks. To my surprise, the client has about 20 domain names, all automatically pointing to (showing) the same new main site now:
www.maindomain.nl
www.maindomain.be
www.maindomain.eu
www.maindomain.com
www.otherdomain.nl
www.otherdomain.com
...
Some of these domains have backlinks too (but not many). I suggested 301 redirecting them all to the main site, just to avoid duplicate content. But now the webhoster comes into play: "It's a problem, the client has only 1 hosting account, blablabla...". They told me they could CNAME the 20 domains to the main domain, or A-record them to an IP address. This is too technical for me, so my concrete questions are: Is it smart to do anything at all, or am I just harming my client? The main site is ranking pretty well now, and some backlinks are from the copy sites (probably because everywhere the logo links to the full main site URL). Does the CNAME or A-record solution have the same effect as a 301 redirect, from an SEO perspective? Many thanks,
Hans
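If the host can at least point the extra domains at the same account, a single Host-based rule on that one site can then 301 every request on a secondary domain to the same path on the main domain; a hedged sketch using the placeholder names from the question:

RewriteEngine On
# Any request whose Host header is not the main domain gets a single 301
# to the equivalent URL on the main domain
RewriteCond %{HTTP_HOST} !^www\.maindomain\.nl$ [NC]
RewriteRule ^(.*)$ https://www.maindomain.nl/$1 [R=301,L]

A CNAME or A record on its own only changes where the DNS points; the copies would still resolve and show the same content, which is why the redirect step is what matters from an SEO perspective.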
Technical SEO | Houdoe
-
Increase 404 errors or 301 redirects?
Hi all, I'm working on an e-commerce site that sells products that may only be available for a certain period of time. E.g. a product may only be sold for a year and then be permanently out of stock. When a product goes out of stock, the page is removed from the site regardless of any links it may have gained over time. I am trying to figure out the best way to handle these permanently out-of-stock pages. At the moment, the site is set up to return a 404 page for each of these products, and there are currently 600 (and increasing) instances of this appearing in Google Webmasters. I have read that too many 404 errors may have a negative impact on your site, and so thought I might 301 redirect these URLs to a more appropriate page. However, I've also read that too many 301 redirects may have a negative impact on your site. I foresee this becoming an issue several years down the road, when the site has thousands of expired products resulting in thousands of 404 errors or 301 redirects, depending on which route I take. Which would be the better route? Is there a better solution?
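For the 301 route the question mentions, the per-product rule is a one-liner; a hedged Apache sketch with made-up paths, redirecting an expired product to its most appropriate surviving page (its category, say). A 410 Gone response is the other conventional signal for permanently removed pages, if redirecting is not appropriate.

# Permanently out-of-stock product 301'd to a more appropriate page,
# e.g. its category (paths are placeholders)
Redirect 301 /products/discontinued-widget /category/widgets/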
Technical SEO | Oxfordcomma
-
Remove html file extension and 301 redirects
Hi, recently I asked a company to do some work on my website, but I am not sure that what they've done is right.
What I wanted was for the .html file extensions to be removed, e.g. /ash-logs.html to /ash-logs,
and also for index.html to redirect to www.timports.co.uk.
I have run crawl diagnostics and have duplicate page content and 32 duplicate page titles. This is doing my head in, please help. This is what is in the .htaccess file:
<IfModule pagespeed_module>
ModPagespeed on
ModPagespeedEnableFilters extend_cache,combine_css,collapse_whitespace,move_css_to_head,remove_comments
</IfModule>
<IfModule mod_headers.c>
Header set Connection keep-alive
</IfModule>
<IfModule mod_rewrite.c>
Options +FollowSymLinks -MultiViews
</IfModule>
DirectoryIndex index.html
RewriteEngine On
# Rewrite valid requests on .html files
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^ %{REQUEST_URI}.html?rw=1 [L,QSA]
# Return 404 on direct requests against .html files
RewriteCond %{REQUEST_URI} .html$
RewriteCond %{QUERY_STRING} !rw=1 [NC]
RewriteRule ^ - [R=404]
AddCharset UTF-8 .html
# <FilesMatch "\.(js|css|html|htm|php|xml|swf|flv|ashx)$">
#   SetOutputFilter DEFLATE
# </FilesMatch>
<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType image/gif "access plus 1 years"
ExpiresByType image/jpeg "access plus 1 years"
ExpiresByType image/png "access plus 1 years"
ExpiresByType image/x-icon "access plus 1 years"
ExpiresByType image/jpg "access plus 1 years"
ExpiresByType text/css "access 1 years"
ExpiresByType text/x-javascript "access 1 years"
ExpiresByType application/javascript "access 1 years"
ExpiresByType image/x-icon "access 1 years"
</IfModule>
<Files 403.shtml>
order allow,deny
allow from all
</Files>
Redirect 301 /PRODUCTS http://www.timports.co.uk/kiln-dried-logs
Redirect 301 /kindling_firewood.html http://www.timports.co.uk/kindling-firewood.html
Redirect 301 /about_us.html http://www.timports.co.uk/about-us.html
Redirect 301 /log_delivery.html http://www.timports.co.uk/log-delivery.html
Redirect 301 /oak_boards_delivery.html http://www.timports.co.uk/oak-boards-delivery.html
Redirect 301 /un_edged_oak_boards.html http://www.timports.co.uk/un-edged-oak-boards.html
Redirect 301 /wholesale_logs.html http://www.timports.co.uk/wholesale-logs.html
Redirect 301 /privacy_policy.html http://www.timports.co.uk/privacy-policy.html
Redirect 301 /payment_failed.html http://www.timports.co.uk/payment-failed.html
Redirect 301 /payment_info.html http://www.timports.co.uk/payment-info.html
Technical SEO | ulefos
-
Can I remove 301 redirects after some time?
Hello, We have a very large number of 301 redirects on our site and would like to find a way to remove some of them. Is there a time frame after which Google no longer needs a 301? For example, if A is 301 redirected to B, does Google know after a while not to serve A any more, and replace any requests for A with B? What about any links that point to A? Or is the only option to have all links that pointed to A point to B, so that the 301 can be removed after some time? Thank you for your help!
Technical SEO | Veva
-
Where does Wordpress store the 301 redirects?
Hi, I've just created a campaign for my new WordPress blog and found 11 301 redirects that I was not aware of. It looks like WordPress has created them automatically. Does anyone know how WordPress handles this, or where they are stored, so I can delete them? They are of no use to me. Nine of these redirects point to the same URL with an added '/' and are on pages; 1 is on a post. I've changed the permalink structure and some URLs several times, and maybe one of those times WordPress automatically created the 301 redirect. But why? I do not want to keep the old URL. The last redirect is very strange: it goes from http://www.mydomain.com/folder to http://www.mydomain.com, where folder is the folder where I installed WordPress. But again, I don't want anyone to type the URL with the folder name or even know this folder exists. Any comment on this would be greatly appreciated. Thanks a lot, David
Technical SEO | dballari
-
Do search engines treat 307 redirects differently from 302 redirects?
We will need to send our users to an alternate version of our homepage for a few hours for a certain event. The SEO task at hand is to minimize the chance of the special homepage getting crawled and cached in the search engines in place of our normal homepage. (This has happened in the past, so the concern is not imaginary.) Among other options, 302 and 307 redirects are being discussed, i.e. redirecting www.domain.com to www.domain.com/specialpage. Having used 302s and 301s in the past, I am well aware of how search engines treat them. A 302 effectively says "Hey, Google! Please get rid of the old content on www.domain.com and replace it with the content on /specialpage!" Which is exactly what we don't want. My question is: do the search engines handle 307s any differently? I am hearing that the 307 does NOT result in the content of the second page being cached with the first URL. But I don't see that in the definition below (from w3.org). Then again, why differentiate it from the 302?
307 Temporary Redirect: The requested resource resides temporarily under a different URI. Since the redirection MAY be altered on occasion, the client SHOULD continue to use the Request-URI for future requests. This response is only cacheable if indicated by a Cache-Control or Expires header field. The temporary URI SHOULD be given by the Location field in the response. Unless the request method was HEAD, the entity of the response SHOULD contain a short hypertext note with a hyperlink to the new URI(s), since many pre-HTTP/1.1 user agents do not understand the 307 status. Therefore, the note SHOULD contain the information necessary for a user to repeat the original request on the new URI. If the 307 status code is received in response to a request other than GET or HEAD, the user agent MUST NOT automatically redirect the request unless it can be confirmed by the user, since this might change the conditions under which the request was issued.
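For what it's worth, either temporary status can be issued explicitly in Apache; a hedged sketch using the placeholder URLs from the question, matching only the homepage so the rest of the site is untouched:

# Temporarily send only the homepage to the event page for the duration;
# swap 307 for 302 to compare how crawlers treat the two codes
RedirectMatch 307 ^/$ https://www.domain.com/specialpage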
Technical SEO | CarsProduction