Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Sitelinks Issue - Different Languages
-
Hey folks,
We run different ccTLDs for revolveclothing.com (revolveclothing.es, revolveclothing.com.br, etc.) and they all have their own WMT/Google Search Console profiles with their own hreflang tags.
The problem is this.
https://www.google.fr/#q=revolve+clothing
When you look at the sitelinks, you'll see that one of them (the sales page) happens to be in Portuguese on the French site. Can anyone investigate why?
-
Dirk's answer points to some potential causes.
That said, when I click on your SERP link, I see other sitelinks (just two):
- the first >>> Robes (French for "Dresses")
- the second >>> Вся распродажа (Russian for "All sale items").
As Dirk pointed out, your site has detected my IP (almost certainly, though it might be the user agent), and when I click on the second sitelink I see this URL: http://www.revolveclothing.es/r/Brands.jsp?aliasURL=sale/all-sale-items/br/54cc7b&&n=s&s=d&c=All+Sale+Items.
The biggest problem with IP redirection is that it hurts both SEO and usability:
- SEO, because Googlebot (and other bots) will mostly be redirected to the USA version due to their IPs, even though Google also crawls from datacenters in other countries (far less often);
- usability, because you make it impossible, for instance, for a Spanish user to see the Spanish site whenever they are not in Spain. That really frustrates users.
There's a solution:
- perform the IP redirection only the first time someone clicks a link to your site, and only if that link does not correspond to the country version the user or bot is clicking from;
- present links to the other country versions of your site, so that:
- bots will follow those links and discover those versions (without being redirected again);
- users are free to go to the version of the site they actually need (without being redirected again when coming from those country-selector links).
That said, it would be better to use a system like Amazon's, which does not force a redirect based on IP, but instead detects the country and shows an on-screen notice, something like: "We see that you are visiting us from [Country X]. You may prefer visiting [URL of the user's country site]".
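A minimal sketch of that banner logic (all names hypothetical; a real deployment would resolve the visitor's country with a GeoIP database and render the banner in the page template):

```python
# Hypothetical sketch: suggest a country site instead of forcing a redirect.
# `visitor_country` would come from a GeoIP lookup; `site_country` is the
# ccTLD the request landed on; the cookie records an explicit user choice.

COUNTRY_SITES = {
    "US": "https://www.revolveclothing.com",
    "ES": "https://www.revolveclothing.es",
    "BR": "https://www.revolveclothing.com.br",
}

def country_banner(visitor_country, site_country, chose_site_cookie=False):
    """Return a suggestion banner, or None if no suggestion is needed."""
    if chose_site_cookie:
        return None  # the user already picked a version; respect it
    if visitor_country == site_country:
        return None  # already on the matching country site
    target = COUNTRY_SITES.get(visitor_country)
    if target is None:
        return None  # no dedicated site for this country
    return (f"We see that you are visiting us from {visitor_country}. "
            f"You may prefer {target}")
```

So a Spanish visitor landing on the .com site sees a suggestion, while a visitor who already chose a version (cookie set) is left alone.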
That said, I just checked the hreflang implementation, and it seems to be correct (at least after a very quick review with Flang).
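For reference, a correct hreflang block for this kind of ccTLD setup looks roughly like this (URLs illustrative; each version must list all alternates, including itself, and the annotations must be reciprocal across sites):

```html
<!-- Illustrative hreflang annotations in the <head> of the French homepage -->
<link rel="alternate" hreflang="fr" href="https://www.revolveclothing.fr/" />
<link rel="alternate" hreflang="es" href="https://www.revolveclothing.es/" />
<link rel="alternate" hreflang="pt-BR" href="https://www.revolveclothing.com.br/" />
<link rel="alternate" hreflang="x-default" href="https://www.revolveclothing.com/" />
```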
I tried searching for "Revolve clothing" from Spain in incognito mode with non-personalized search, and it correctly shows the Spanish website and Spanish sitelinks;
I tried the same search from Spain but let Google consider my user agent (browser set up for English), and I saw the .com version and English sitelinks (which is fine).
Remember, sitelinks are chosen by Google and we can only demote them.
To conclude, I think the real cause is not a genuine international SEO issue (though do check the IP redirection), but a possible, more general indexation problem.
-
If you look at the results on google.fr, I find it more surprising that, apart from the first result, all the other results shown come from the .com version rather than the .fr version. If I search for "Revolve clothing" on google.pt, I only get the US results and Instagram.
You seem to use a system of IP detection: if you visit the French site from an American IP address, you are redirected to the .com version (at least on desktop). Check this screenshot of the French site taken with an American IP address: http://www.webpagetest.org/screen_shot.php?test=150930_BN_1DSQ&run=1&cached=0 => this is clearly the US version. Remember that the main Googlebot crawls from a Californian IP, so it will mainly see the US version. There are bots that visit from other IPs, but there is no guarantee that they crawl with the same frequency and depth (https://support.google.com/webmasters/answer/6144055?hl=en). This could be the reason for your problem.
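One common mitigation (a sketch of the general technique, not something the thread itself prescribes) is to exempt known crawlers from the geo-redirect so Googlebot can reach every country version from its Californian IPs. Note that matching the user-agent string alone is spoofable; verifying Googlebot properly requires a reverse-DNS check:

```python
# Hypothetical sketch: decide whether to geo-redirect a request.
# User-agent matching is a simplification; production systems should verify
# crawlers via reverse DNS (e.g. *.googlebot.com) rather than trust the UA.

CRAWLER_TOKENS = ("googlebot", "bingbot", "yandexbot", "baiduspider")

def should_geo_redirect(user_agent, visitor_country, site_country):
    """Return True only for human visitors landing on the wrong country site."""
    ua = user_agent.lower()
    if any(token in ua for token in CRAWLER_TOKENS):
        return False  # let crawlers index the version they requested
    return visitor_country != site_country
```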
On top of that, your HTML is huge: the example page you mention has 13,038 lines of HTML, takes ages to load (16 s - http://www.webpagetest.org/result/150930_VJ_1KRP/), and weighs a whopping 6,000 KB. Google speed score: 39%. You might want to look into that.
Hope this helps,
Dirk
-
Hey Jarred, which one? http://take.ms/xTPyo My Portuguese is terrible these days.