Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Sitelinks Issue - Different Languages
-
Hey folks,
We run different ccTLDs for revolveclothing.com (revolveclothing.es, revolveclothing.com.br, etc.), and each has its own Google Search Console property with its own hreflang tags.
The problem is this.
https://www.google.fr/#q=revolve+clothing
When you look at the sitelinks, you'll see that one of them (sales page) happens to be in Portuguese on the French site. Can anyone investigate and see why?
-
Dirk's answer points to some potential causes.
That said, when I click on your SERP link, I see different sitelinks (just two):
- the first >>> Robes (French for "Dresses")
- the second >>> Вся распродажа (Russian for "All sale items")
As Dirk pointed out, your site has detected my IP (almost certainly, though it could be the user agent), and when I click on the second sitelink I land on this URL: http://www.revolveclothing.es/r/Brands.jsp?aliasURL=sale/all-sale-items/br/54cc7b&&n=s&s=d&c=All+Sale+Items.
The biggest problem with IP redirections is that they hurt both SEO and usability:
- SEO, because Googlebot (and other bots) will mostly be redirected to the US version due to their IPs; Google does also crawl from data centers in other countries, but far less often;
- usability, because you make it impossible, for instance, for a Spanish user to see the Spanish site whenever they are not in Spain. That really frustrates users.
There's a solution:
- perform the IP redirection only the first time someone clicks a link to your site, and only if that link does not correspond to the country version the user or bot is clicking from;
- present links to the other country versions of your site, so that:
  - bots will follow those links and discover those versions (without being redirected again);
  - users are free to go to the version of the site they actually need (without being redirected again when coming from those country-selector links).
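The "redirect only on the first click" rule above can be sketched roughly as follows. This is an illustration, not the site's actual setup: the country-to-ccTLD mapping, host names, and function names are all assumptions, and in practice the "already redirected" flag would be a cookie set on the first visit.

```python
# Illustrative sketch: redirect only on a visitor's first arrival, and only
# when the requested host doesn't match their country. Hosts and the
# country map below are assumptions for the example, not the real config.

CCTLD_FOR_COUNTRY = {
    "US": "revolveclothing.com",
    "ES": "revolveclothing.es",
    "FR": "revolveclothing.fr",
    "BR": "revolveclothing.com.br",
}

def geo_redirect_target(visitor_country, requested_host,
                        has_country_cookie, from_country_selector):
    """Return the host to redirect to, or None to serve the page as-is."""
    # Never redirect again once the visitor has a stored preference,
    # or when they explicitly clicked a country-selector link.
    if has_country_cookie or from_country_selector:
        return None
    local_host = CCTLD_FOR_COUNTRY.get(visitor_country)
    # Unknown country, or already on the matching ccTLD: no redirect.
    if local_host is None or local_host == requested_host:
        return None
    return local_host
```

On the first matching visit the server would issue the redirect and set the cookie, so every later request (and every click on a country-selector link) passes through untouched.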
That said, it would be even better to use a system like Amazon's, which does not force a redirection based on IP but instead detects the country and shows an on-screen notice, something like: "We see that you are visiting us from [Country X]. You may prefer visiting [URL to user's country site]".
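The Amazon-style alternative boils down to "detect, then suggest, never force". A minimal sketch, assuming hypothetical country names and site URLs (they are illustrations, not the site's real locale table):

```python
# Illustrative sketch of detect-and-suggest instead of forced redirect:
# return a banner message for the visitor, or None if no suggestion fits.
# The country/URL table is an assumption made up for this example.

LOCAL_SITES = {
    "ES": ("Spain", "https://www.revolveclothing.es"),
    "FR": ("France", "https://www.revolveclothing.fr"),
    "BR": ("Brazil", "https://www.revolveclothing.com.br"),
}

def country_suggestion(visitor_country, current_site_country):
    """Return a banner message, or None if the visitor is already local."""
    if visitor_country == current_site_country:
        return None
    entry = LOCAL_SITES.get(visitor_country)
    if entry is None:
        return None  # No localized site for this country: stay put.
    name, url = entry
    return ("We see that you are visiting us from %s. "
            "You may prefer visiting %s" % (name, url))
```

The page itself is always served as requested; the visitor decides whether to switch, so crawlers and travelling users are never trapped on the wrong version.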
I also checked the hreflang implementation, and it seems to be correct (at least after a very quick review with Flang).
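For reference, a correct implementation means every country version carries the same reciprocal set of alternate links. A small sketch that builds such a set; the locale codes and URLs below are illustrative assumptions, not the site's verified hreflang map:

```python
# Illustrative sketch: build the reciprocal <link rel="alternate"> set that
# every country version should carry. Locales and URLs are assumptions.

VERSIONS = {
    "en-us": "https://www.revolveclothing.com/",
    "es-es": "https://www.revolveclothing.es/",
    "pt-br": "https://www.revolveclothing.com.br/",
}

def hreflang_tags(versions, default_url):
    """Return the alternate-link tags, identical on every version."""
    tags = ['<link rel="alternate" hreflang="%s" href="%s" />' % (lang, url)
            for lang, url in sorted(versions.items())]
    # x-default tells search engines which version to use for
    # locales that match none of the listed languages.
    tags.append('<link rel="alternate" hreflang="x-default" href="%s" />'
                % default_url)
    return tags
```

The key property to verify is reciprocity: the .es pages and the .com pages must emit the same set, each listing all the others.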
I searched for "Revolve clothing" from Spain in incognito mode with non-personalized search, and it correctly showed me the Spanish website and Spanish sitelinks.
I tried the same search from Spain but let Google consider my user agent (set up for English search), and I saw the .com version with English sitelinks (which is fine).
Remember, sitelinks are decided by Google and we can only demote them.
To conclude, I think the real cause should be sought not in a true international SEO issue (though do check the IP redirection), but in a possible, more general indexation problem.
-
If you look at the results on google.fr, I find it more surprising that, apart from the first result, all the other results shown come from the .com version rather than the .fr version. If I search for Revolve clothing on google.pt, I only get the US results and Instagram.
You seem to use a system of IP detection: if you visit the French site from an American IP address, you are redirected to the .com version (at least on desktop). Check this screenshot of the French site taken from an American IP address: http://www.webpagetest.org/screen_shot.php?test=150930_BN_1DSQ&run=1&cached=0 => it is clearly the US version. Remember that the main Googlebot crawls from a Californian IP, so it will mainly see the US version; there are bots that visit from other IPs, but Google doesn't guarantee they crawl with the same frequency and depth (https://support.google.com/webmasters/answer/6144055?hl=en). This could be the cause of your problem.
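One way sites mitigate this, sketched below under loud assumptions: exempt known crawler user-agents from the geo redirect so bots always receive the page they requested. The bot-token list and function names are illustrative, a user-agent string can be spoofed, and a production check should confirm real crawlers via reverse-DNS lookup rather than trust the header alone.

```python
# Illustrative sketch: skip the geo redirect for known crawler user-agents
# so bots always see the version they asked for. The token list is a small
# assumption-based sample; real deployments should verify crawlers via
# reverse DNS, because the User-Agent header is trivially spoofable.

KNOWN_BOT_TOKENS = ("googlebot", "bingbot", "yandexbot")

def is_known_crawler(user_agent):
    """Cheap case-insensitive user-agent check for major crawlers."""
    ua = user_agent.lower()
    return any(token in ua for token in KNOWN_BOT_TOKENS)

def should_geo_redirect(user_agent, visitor_country, site_country):
    # Crawlers are never redirected; humans only when off their version.
    if is_known_crawler(user_agent):
        return False
    return visitor_country != site_country
```

Combined with visible country-selector links on every page, this lets each bot crawl each ccTLD directly instead of being funneled to the .com version by its Californian IP.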
On top of that, your HTML is huge: the example page you mention has 13,038 lines of HTML code and takes ages to load (16 s - http://www.webpagetest.org/result/150930_VJ_1KRP/). The page weighs a whopping 6,000 KB, and its Google speed score is 39%. You might want to look into that.
Hope this helps,
Dirk
-
Hey Jarred, which one? http://take.ms/xTPyo My Portuguese is terrible these days.