Re-Direct Users But Don't Affect Googlebot
-
This is a fairly technical question. I have a site with 4 subdomains, each targeting a specific language. The brand owners don't want German users to see the prices on the French subdomain, so users are forced into a redirect to the relevant subdomain based on their IP address. If a user comes from any other country (e.g. the US), they are forced onto the UK subdomain. The client is insistent on keeping control of who sees what (I know that's a debate in its own right), but the redirects we're implementing to make that happen are making it really difficult to get all the subdomains indexed: I think Googlebot is also getting redirected and so is failing to do its job. Is there a way of redirecting users, but not Googlebot?
-
Yes, you can cookie the user based on IP and send them to the appropriate place. Since bots aren't browser-based, they don't accept cookies and will just see the default content. I don't know why they need to do this, though.
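The cookie approach above can be sketched server-side (the hostnames, cookie name, and fallback behaviour here are placeholder assumptions, not details from this thread): serve the default content to any request that doesn't echo a cookie back, and only redirect clients that do. Crawlers that don't store cookies keep receiving each subdomain's default content and can index it.

```python
# Hedged sketch of a cookie-gated geo redirect.
# First request (or any cookieless crawler): serve the default content
# and try to set a "geo_seen" cookie. Only clients that return that
# cookie on a later request get redirected to their country subdomain.

COUNTRY_SUBDOMAINS = {
    "DE": "de.example.com",
    "FR": "fr.example.com",
    "GB": "uk.example.com",
    "US": "uk.example.com",  # unmatched countries fall back to the UK site
}

def handle_request(country_code, current_host, cookies):
    """Return ('redirect', target_host) or ('serve', should_set_cookie)."""
    target = COUNTRY_SUBDOMAINS.get(country_code, "uk.example.com")
    # Redirect only clients that proved they accept cookies, and only
    # when they are on the wrong subdomain for their IP country.
    if cookies.get("geo_seen") == "1" and current_host != target:
        return ("redirect", target)
    # Cookieless requests (including most crawlers) get the page as-is.
    return ("serve", "geo_seen" not in cookies)
```

One caveat worth flagging: this only avoids redirecting crawlers because they behave like cookieless clients; it is not user-agent sniffing, which would risk being treated as cloaking.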
-
Hi Alex,
Just check out this awesome post by Gianluca on international SEO issues:
http://www.seomoz.org/blog/international-seo-dropping-the-information-dust
In the above post, read up on the hreflang tag implementation, which should be helpful in your case.
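To make the idea concrete, here is a small sketch of generating the reciprocal hreflang annotations that every language version of a page would carry (the subdomain hostnames and locale codes are placeholder assumptions; the thread doesn't name the actual four markets):

```python
# Hedged sketch: build the <link rel="alternate" hreflang="..."> tags
# that each of the four language subdomains should include, all
# pointing at the equivalent URL on every other version.

SUBDOMAINS = {
    "en-gb": "https://uk.example.com/",
    "fr-fr": "https://fr.example.com/",
    "de-de": "https://de.example.com/",
    "es-es": "https://es.example.com/",  # placeholder fourth market
}

def hreflang_tags(path=""):
    """Return the full set of hreflang link tags for a given page path."""
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{base}{path}" />'
        for lang, base in sorted(SUBDOMAINS.items())
    ]
```

The same set of tags goes in the `<head>` of every version of the page, so each subdomain cross-references all the others.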