Geographical targeting with Magento
-
We have a Magento installation with multiple stores/domains set up. There is really only one reason for the multiple domains: we use an automatic GeoIP store switcher to send each customer to the right store, so that they pay the correct shipping, see the correct pricing, etc. There are also a couple of small differences in the design templates, but all the content is identical.
So we have:
domain.com (main website)
domain.ca (where most other countries are directed, based on GeoIP)
domain.eu

Since the content is the same, what is the best strategy here? I looked at several options:
1. Custom canonical URLs, making each page on the .ca and .eu sites use the canonical URL of the .com
2. Completely block the .ca and .eu sites from robots
3. Leave it the way it is
-
Hello Maarten,
I think you should consider using rel="alternate" hreflang annotations, as discussed here. If you need more background, here is another great resource. And start following Aleyda Solis.
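As a rough illustration of what hreflang annotations look like for a setup like the one described, here is a minimal sketch. The domain-to-locale mapping is an assumption (the exact locale codes for the .ca and .eu stores depend on which languages/regions they actually target), and the helper function is purely illustrative, not Magento code:

```python
# Illustrative mapping of each store to a locale; the codes below are
# guesses based on the question and would need adjusting to the real
# language/region targeting of each store.
STORES = {
    "en-us": "https://domain.com/",  # main website
    "en-ca": "https://domain.ca/",
    "en":    "https://domain.eu/",   # generic English fallback for .eu
}

def hreflang_tags(path):
    """Build the hreflang link tags for one page.

    Every store must emit the full set, including a self-reference,
    or Google ignores the annotations.
    """
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{base.rstrip("/")}{path}" />'
        for lang, base in sorted(STORES.items())
    ]

print("\n".join(hreflang_tags("/some-product.html")))
```

Each of the three stores would emit this same set of tags in its `<head>` for the corresponding page, which tells Google the pages are translations/regional variants rather than duplicates.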
Related Questions
-
Geo Targeting Content Question
Hi, all. First question here, so be gentle, please. My question is about geo-targeted dynamic content. At the moment we run a .com domain with, for example, an article about running headphones; at the end, taking up about 40% of the content, is a review of some headphones people can buy, with affiliate links. We have a .co.uk site with the same page about running headphones and then 10 headphones for the UK market. Note: rel="alternate" is used on the pages to point to each other, therefore (hopefully) removing duplicate-content issues.

This design works well, but it means building links to two pages in the case of this example. What we are thinking of doing is to use just the .com domain and serve the product portion of the page dynamically, i.e. people in the UK see UK products and people in the US see US products.

What are people's thoughts on this technique, please? From my understanding, there wouldn't be any problem with Google for cloaking etc., because a Googlebot and a human from the same country will see the same content. The site is built in WordPress and has <html lang="en-US"> (for the .com) in the header. Would this cause problems for the page ranking in the UK etc.? The ultimate goal is to reduce link-building effort by halving the number of pages that links have to be built for. I welcome any feedback. Many thanks
Technical SEO | | TheMuffinMan0 -
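The single-domain idea in the question above can be sketched in a few lines. The country lookup and product data here are placeholders (a real setup would resolve the country from the visitor's IP via a GeoIP service); the point is that the article body is identical everywhere and only the product block varies, identically for bots and humans from the same country:

```python
# Placeholder product data; a real site would pull this from its
# affiliate feeds.
PRODUCTS = {
    "GB": ["UK headphones pick 1", "UK headphones pick 2"],
    "US": ["US headphones pick 1", "US headphones pick 2"],
}

def render_page(country_code):
    """Shared article body plus a country-specific product block.

    Countries without a curated list fall back to the US block, so a
    Googlebot and a human from the same country always see the same
    page (no cloaking).
    """
    article = "Article: choosing running headphones"
    products = PRODUCTS.get(country_code, PRODUCTS["US"])
    return [article] + products
```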
Different links to ultimately the same page on Magento
Hi everyone, I'm wondering if some of you could help me out a bit here, as I'm a bit confused. Please take a quick look at my site: https://tesorotiles.co.uk. The way it's set up, you can reach the same page via three or four different routes, as below:

https://tesorotiles.co.uk/type/wall-tiles/rho
https://tesorotiles.co.uk/by-area/bathroom-tiles/rho
https://tesorotiles.co.uk/collections/rho

These three are the exact same page, and we've done it this way to make sure there is no break in the breadcrumb. Is this OK SEO-wise, or does anyone have any recommendations? Thanks in advance
Technical SEO | | VIVO0 -
Robots.txt and Magento
Hi, I am working on getting my robots.txt up and running, and I'm having lots of problems with the robots.txt my developers generated: www.plasticplace.com/robots.txt

I ran the robots.txt through a syntax-checking tool (http://www.sxw.org.uk/computing/robots/check.html), and this is what the tool came back with: http://www.dcs.ed.ac.uk/cgi/sxw/parserobots.pl?site=plasticplace.com. There seem to be many errors in the file. Additionally, I looked at our robots.txt in WMT, and it said the crawl was postponed because the robots.txt is inaccessible. What does that mean?

A few questions:

1. Is there a need for all the lines of code that have the "#" before them? I don't think they're necessary, but correct me if I'm wrong.
2. Furthermore, why are we blocking so many things on our website? The robots can't get past anything that requires a password to access anyhow, but again, correct me if I'm wrong.
3. Is there a reason it can't just look like this?

User-agent: *
Disallow: /onepagecheckout/
Disallow: /checkout/cart/

I do understand that Magento has certain folders you don't want crawled, but is all of this necessary, and why are there so many errors?
Technical SEO | | EcomLkwd0 -
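One quick way to sanity-check robots.txt rules locally, without relying on a third-party validator, is Python's standard-library parser: feed it the directives and probe the paths you expect to be blocked. The three-line rule set below is the minimal version proposed in the question above; the probed paths are just examples:

```python
from urllib.robotparser import RobotFileParser

# The minimal rule set from the question.
rules = [
    "User-agent: *",
    "Disallow: /onepagecheckout/",
    "Disallow: /checkout/cart/",
]

parser = RobotFileParser()
parser.parse(rules)

# Probe paths against the rules: checkout URLs should be blocked,
# everything else crawlable.
print(parser.can_fetch("*", "/checkout/cart/"))   # blocked
print(parser.can_fetch("*", "/category/shoes"))   # crawlable
```

If a rule set this small already behaves the way you expect here, extra directives in the generated file are candidates for removal rather than debugging.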
Google Penguin - Target Landing Page
Hello, One of our sites was hit by the first Penguin update back in April, and ever since then we have been removing links and submitting reconsideration requests. It only seems to have affected our home page, as some of our internal landing pages are still ranking OK in the SERPs at #1 / #2. I'm just wondering: if we created a landing page for this keyword and drove high-quality, relevant links to it, could we get it to rank higher than our home page, even though the home page is on the 5th page? Hope the above makes sense. Has anybody had any joy with this?
Technical SEO | | ScottBaxterWW0 -
Magento- 1675 302 Redirects - How to Fix?
Hi folks, I would appreciate some help on this. My ecommerce website is built on Magento, and we currently have 1,675 302 redirects on it, as pointed out by the SEOmoz crawler. Does anyone have any idea how to fix this, or have you been in the same boat as me? I'm pretty sure this is also why a lot of duplicate content is being reported on the website. Help appreciated as always. Thank you!
Technical SEO | | dean19860 -
Unsure of sudden dramatic drop for target keyword(s)
Even after studying the latest Panda algorithm, which frowns upon templated looks (kind of odd), I have been doing thorough research trying to find out why a site I have been optimizing plummeted. It was hovering around #10 and fell to #37 in one to two weeks. After I read all about the Panda update, I assumed it was that, but the site falls in line and shouldn't be affected. I used a third-party duplicate-content checker and the content checked out okay. I looked at my SEO Pro campaign and noticed that the on-page report got an A grade, hitting all factors except having the keyword in the URL. So, I have a site that loads in good time, should be fairly well optimized, and, from what I can tell, falls in line with Google's guidelines, which I read through. So why the drop? I'm not saying this site is 100% perfect or optimized to the hilt; it was designed a few years ago. But still, I am having a hard time finding out why this site got slammed the way it did.

www.spineandsportschiro.com

Target keyword(s):
Roseville Chiropractor
Roseville Chiropractic

I welcome any on-page critiques, but mostly would like to know where to look, or to have some insight into why the rankings dropped, so I can try to fix them the right way. Thank you,
Technical SEO | | Boogily0 -
KW Targeted Domain Format For Local Biz
My brother owns a laundromat, and I'm putting together his site. I am targeting a small geographic area, obviously. Which domain is more SEO-friendly for the keyword "north brunswick nj laundromat"?

northbrunswicknjlaundromat.com
north-brunswick-nj-laundromat.com
northbrunswick-nj-laundromat.com
northbrunswicknj-laundromat.com

I doubt there is a huge difference between any of them, and my first instinct was to go with "northbrunswicknjlaundromat", but it seems it may be too wordy without hyphens. Do search engines care? Thanks in advance for any and all help. Nolan
Technical SEO | | NInc810 -
Should you worry about adding geo-targeted pages to your site?
Post-Panda, should I worry about adding a bunch of geo-targeted landing pages at once? It's a community; people have added their location on their profile pages. I'm worried that if we decide to make all the locations into hyperlinks that point to new geo-targeted pages, it could get us extra traffic for those geo-specific keyword phrases but penalize the site as a whole for having so many low-quality pages. What I'm thinking is to start small: turn, say, United States into a hyperlink that points to a page (which would house our community members who reside in the United States) and add extra unique content to that page. Then only add a new location page when we know we'll be adding unique content to it, so it's not basically just page sorting. Thoughts? Hope that makes sense. Thanks!
Technical SEO | | poolguy0