Hiding the price HTML component for all countries except the US
-
Hello everybody,
We are planning to launch a new website soon, which will be an e-commerce website for visitors from the US and a non-e-commerce website for visitors from other countries.
In other words, on the product pages we would like the price of the product to be shown to users from the US, and hidden from users outside of the US. We thought about setting the HTML element of the price to be visible only to US users (detected by IP).
My question is: could Google's crawler see this as potential cloaking, since we hide some of the content from some of the users (while Google might crawl the site from a US IP address)?
Thanks in advance...
-
Thanks everybody!
-
I agree with Martijn. I have also previously dealt with this same scenario, i.e. hiding/changing elements of text based on an IP lookup. I can recommend MaxMind's GeoIP2 Country web service.
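A minimal server-side sketch of that approach, assuming MaxMind's GeoIP2 Python library and its GeoLite2 country database are available; the helper names (`should_show_price`, `render_product`) are hypothetical. Note the price is omitted from the markup entirely rather than hidden with CSS, so crawlers and users always see the same HTML:

```python
# Sketch: gate the price element server-side by country.
# Assumes MaxMind's geoip2 library and a GeoLite2-Country.mmdb file;
# the helper names below are made up for illustration.

def should_show_price(country_iso_code):
    """Show the price only to visitors resolved to the US."""
    return country_iso_code == "US"

def render_product(name, price, country_iso_code):
    """Return product HTML with the price omitted (not just hidden
    with CSS) for non-US visitors."""
    html = ["<div class='product'>", f"  <h1>{name}</h1>"]
    if should_show_price(country_iso_code):
        html.append(f"  <span class='price'>${price:.2f}</span>")
    html.append("</div>")
    return "\n".join(html)

# With the real GeoIP2 database reader:
# import geoip2.database
# with geoip2.database.Reader("GeoLite2-Country.mmdb") as reader:
#     code = reader.country(visitor_ip).country.iso_code
#     page = render_product("T-shirt", 5.00, code)
```

Doing this at render time, rather than hiding the element with CSS or JavaScript, keeps the served HTML consistent with what each user actually sees.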
-
Hi,
Simple answer: no. Just hiding the price from non-US visitors is far from cloaking. Cloaking normally means hiding elements that are not relevant to SEO and serving extra text and data to search engines on those pages.
Related Questions
-
Why does our competitor with lower DR and PA outrank us in Google?
Hi everyone, I really don't understand why our competitor with lower DR and PA outranks us in Google.lv (Google Latvia). Below is a screenshot showing that our company takes #2 for the keyword "gāzes baloni" in Google. Our DR is 24 and our PA is 26, whereas our competitor's DR is 23 and their PA is 19. The content on our page is much better too: we have a clear title, description, Q&A section, etc., whereas our competitor has very limited content, just photos of the product and titles. Any suggestions would be highly appreciated. Thank you very much in advance.
Technical SEO | Intergaz0
Schema.org product offer with a price range, or multiple offers with single prices?
I'm implementing Schema.org markup, (JSON-LD), on an eCommerce site. Each product has a few different variations, and these variations can change the price, (think T-shirts, but blue & white cost $5, red is $5.50, and yellow is $6). In my Schema.org markup, in each Product's Offer, I could either have a single Offer with a price range, (minPrice: $5, maxPrice: $6), or I could add a separate Offer for each variation, each with its own correct price set. Is one of these better than the other? Why? I've been looking at the WooCommerce code and they seem to do the single offer with a price range, but that could be because it's more flexible for a system that's used by millions of people.
Technical SEO | 4RS_John1
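For reference, schema.org expresses a price range with an `AggregateOffer` (using `lowPrice`/`highPrice`) rather than `minPrice`/`maxPrice` on a plain `Offer`. A sketch of both options from the question, built as JSON-LD, using the T-shirt prices above; the SKU values are made up for illustration:

```python
import json

# Option 1: a single AggregateOffer with a price range.
price_range = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "T-shirt",
    "offers": {
        "@type": "AggregateOffer",   # one offer covering all variations
        "priceCurrency": "USD",
        "lowPrice": "5.00",
        "highPrice": "6.00",
        "offerCount": 3,
    },
}

# Option 2: one Offer per variation, each with its own exact price.
per_variant = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "T-shirt",
    "offers": [
        {"@type": "Offer", "sku": "TS-BLUE", "priceCurrency": "USD", "price": "5.00"},
        {"@type": "Offer", "sku": "TS-RED", "priceCurrency": "USD", "price": "5.50"},
        {"@type": "Offer", "sku": "TS-YELLOW", "priceCurrency": "USD", "price": "6.00"},
    ],
}

print(json.dumps(price_range, indent=2))
```

The per-variant form carries more information (an exact price for each SKU) at the cost of more markup to keep in sync with the catalog.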
Why do HTML entities get crawled as content keywords in Google Search Console?
My Google Search Console shows HTML markup such as div, class, img, src, gif, and align as content keywords. Why does Google crawl HTML markup as keywords? Because of this, I would be losing traffic for my on-page content keywords. Please let me know how to solve this. Thanks, Jenifer
Technical SEO | Jenifer300
Website Redesign / Switching CMS / .aspx and .html extensions question
Hello everyone, We're currently preparing a website redesign for one of our important websites. It is our most important website, with good rankings and a lot of visitors from search engines, so we want to be really careful with the redesign. Our strategy is to keep as much in place as possible. At first we are only changing the styling of the website; we will keep the content, the structure, and as many URLs as possible the same. However, we are switching from a custom-built CMS system which created URLs like www.homepage.com/default-en.aspx
Now we would like to keep this URL the same, but our new CMS system does not support this kind of URL. The same goes for, for instance, the URL www.homepage.com/products.html
We're not able to recreate this URL in our new CMS. What would be the best strategy for SEO? Keep the URLs like this:
www.homepage.com/default-en
www.homepage.com/products
Or doesn't it really matter, since Google will view these as completely different URLs? And what would the impact of these URL changes be? Thanks a lot in advance! Best Regards, Jorg
Technical SEO | NielsB
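The usual answer to this situation is to 301-redirect each old extension URL to its new extensionless counterpart, so existing rankings and links are transferred. A minimal sketch of such a mapping, using the example paths from the question; in practice the redirects would live in the web server or CMS configuration:

```python
# Sketch: a 301 redirect map from the old CMS URLs (.aspx / .html)
# to the new extensionless URLs. Paths are the examples from the
# question; resolve() is a hypothetical helper.

REDIRECTS = {
    "/default-en.aspx": "/default-en",
    "/products.html": "/products",
}

def resolve(path):
    """Return (status, location): 301 for a mapped old URL, 200 otherwise."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path
```

A permanent (301) redirect, rather than a 302, is what signals to Google that the new URL has replaced the old one.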
2 similar websites targeting different countries
I have a website with a .com.au extension running on Zen Cart. If I load up the exact same website (with the same website name) on the .com, will my .com.au be penalised by Google? Thanks in advance.
Technical SEO | theshining0
Targeting US search traffic
Hello, I've noticed the site I'm working on gets about 30-40% of its Google organic search traffic from the US, and the rest comes from around the world. All the site's customers are in the US, so the thought is to focus on getting more traffic from the US. I know Google Webmaster Tools has a geo-targeting mechanism for the site in question, but what I don't want to do is turn that on and then see traffic from non-US sources go away; I suppose that's not so bad if traffic from the US bumps up accordingly. Do you have any experience in this area? Thanks -Mike
Technical SEO | mattmainpath0
Different HTML based on resolution
Is it acceptable in terms of SEO to display different HTML based on a user's screen resolution? I feel I'm wasting space on my site catering for all the 1024 x 768ers.
Technical SEO | niallfred0
URL restructure and phasing out HTML sitemap
Hi SEOMozzies, I love the Q&A resource and have already found lots of useful stuff! I just started as an in-house SEO at a retailer, and my first main challenge is to tidy up the complex URL structures and remove the ugly sub-site approach currently used. I have already found a number of suggestions, but it looks like I am dealing with a number of challenges that I need to resolve in a single release. So here is the current setup: The website is an e-commerce site (department store) with around 30k products. We are using multi-select navigation (non-Ajax). The main website uses a third-party search engine to power the multi-select navigation, and that search engine has a very ugly URL structure. For example www.domain.tld/browse?location=1001/brand=100/color=575&size=1 plus various other URL params, or for multi-select URLs www.domain.tld/browse?location=1001/brand=100,104,506/color=575&size=1 plus various other unused URL params. URLs are easily up to 200 characters long and not descriptive at all to our users. Many of these URLs are indexed by search engines (we currently have 1.2 million of them indexed, including session IDs and all the other nasty URL params). Next to this, the site is using a "sub site" that is sort of optimized for SEO; I'm not 100% sure this is cloaking, but it smells like it. It has a simplified navigation structure and a better URL structure for products. The layout is similar to our main site, but all complex HTML elements like multi-select and the large top navigation menus are removed. Many of these links are indexed by search engines and rank higher than links from our main website. The URL structure is www.domain.tld/1/optimized-url. Currently 64,000 of these URLs are indexed. We have links to this sub site in the footer of every page, but a normal customer would never reach this site unless they come from organic search. Once a user lands on one of these pages we try to push him back to the main site as quickly as possible.
My planned approach to improve this: 1.) Tidy up the URL structure in the main website (e.g. www.domain.tld/women/dresses and www.domain.tld/diesel-red-skirt-4563749). I plan to use Solution 2 as described in http://www.seomoz.org/blog/building-faceted-navigation-that-doesnt-suck to block multi-select URLs from being indexed, and I would like to use the URL param "location" as an indicator for search engines to ignore the link. A risk here is that all my currently indexed URLs (1.2 million) will be blocked immediately after I put this live. I cannot redirect those URLs to the optimized URLs, as the old URLs should still be accessible. 2.) Remove the links to the sub site (www.domain.tld/1/optimized-url) from the footer and redirect (301) all those URLs to the newly created SEO-friendly product URLs. URLs that cannot be matched, since there is no similar catalog location in the main website, will be redirected (301) to our homepage. I wonder if this is a correct approach, and whether it would be better to do this in a phased way rather than the currently planned big bang? Any feedback would be highly appreciated; also let me know if things are not clear. Thanks! Chris
Technical SEO | eCommerceSEO0
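Step 1 of the plan above (using the "location" param as the signal for faceted URLs) can be sketched as a simple query-string check. This assumes standard &-separated query parameters, and `is_faceted_url` is a hypothetical helper name:

```python
from urllib.parse import parse_qs, urlparse

# Sketch: flag any URL carrying the "location" query parameter as a
# faceted-navigation URL that should be kept out of the index (e.g. by
# emitting a noindex robots meta tag or a robots.txt disallow for it).

def is_faceted_url(url):
    """True if the URL carries the 'location' facet parameter."""
    params = parse_qs(urlparse(url).query)
    return "location" in params
```

The template that renders these pages could call this check and add `<meta name="robots" content="noindex">` whenever it returns true.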