Google is mixing subdomains. What can we do?
-
Hi!
I'm experiencing something that's kind of strange for me.
I have my main domain, let's say www.domain.com. Then I have my mobile version on a subdomain, mobile.domain.com, and I also have a German version of the website at de.domain.com.
When I Google my domain, the main result links to www.domain.com, but then Google mixes all the subdomains in the sitelinks.
For example, a Sign in link may point to mobile.domain.com, a How it works link may point to de.domain.com, etc.
What's the solution? I think this is hurting my rankings a lot, because Google sees these as all the same domain when clearly they are not.
thanks!!
-
Please do go through this link, which has a wealth of information, and it's by Google, so there's nothing better to trust:
But yes, for Brazil-related pages use
<meta http-equiv="content-language" content="pt-br" />
and
-
So your suggestion is to use something like this:
<meta http-equiv="content-language" content="pt-br">
with the expression constructed from the website language as the first part and a geodetected country code as the second part of the string (the PT in pt-PT)?
-
Hi,
I understand you don't have two websites, but you said somewhere that you are using subdomains. For search engines, every subdomain is a completely different DNS record, and so is treated as a different website.
No one is saying you need to translate your website; the changes above only need to be applied to whatever languages you already have. You would need an army of people to translate into all languages, and of course a million USD! Haha!
As I said before, a language approach alone is not enough; you need to use the locale approach too. For example, English is spoken in many countries (Australia, Canada, the US, the UK, New Zealand, South Africa), and the same goes for German and a few other languages, so if you don't couple language with country, search engines will get confused.
I hope this helps
Issa
-
But I don't have two websites for Portuguese. I have one.
The same happens with German. It is not only spoken in Germany; a big part of Austria speaks German too.
I can't translate my website into all the different country and language variations. I already have more than 10, so I can tell you that's hard to maintain.
Basically, what sounds contradictory to me is that, like many websites, I'm using a language approach rather than a country approach. But Google still gets confused by it.
-
Hi again,
First of all, canonicals are not enough, but it's definitely good that you use them.
Alternate rel link tag is very important. Read this link please: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
As for the XML sitemap, do you use the language markup for each link there? If you want to know how to do that follow this link: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=2620865
As for the Portugal and Brazil subdomains, using Webmaster Tools will surely solve this issue. But even with the language rel tag you have to use different language codes: "pt" alone is incorrect; you need to specify the locale as well, so "pt-BR" for Brazilian Portuguese and "pt-PT" for European (Portugal) Portuguese.
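For instance, a sketch of those hreflang link elements (using hypothetical br.domain.com and pt.domain.com subdomains, not necessarily the poster's actual setup) — each Portuguese page would carry the full set of alternates, including itself:

```html
<!-- In the <head> of both the Brazilian and the Portugal version of a page -->
<link rel="alternate" hreflang="pt-BR" href="http://br.domain.com/" />
<link rel="alternate" hreflang="pt-PT" href="http://pt.domain.com/" />
<!-- Optional catch-all for Portuguese speakers in other countries -->
<link rel="alternate" hreflang="pt" href="http://pt.domain.com/" />
```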
I hope this clears things up.
Sorry there is no easy way
Best,
Issa
-
Thanks for the answer.
Of what you suggested, I have canonicals and the content-language meta tag.
I haven't tried the rel="alternate" hreflang tag yet. Maybe that helps.
I have sitemaps too.
The problem I see with geotargeting in Webmaster Tools is what I mentioned above. Portugal and Brazil share the same language: Portuguese. But in Webmaster Tools I can't say pt.domain.com is intended for both Brazil and Portugal; I need to pick only one.
-
Exactly, that's the issue. For example, I go to google.com.mx and I see my Spanish domain with sitelinks pointing to my Dutch domain!
-
Hi Fabrizzo,
There are a few things you will need to do to help Google decide which part of your website to show (whether it's a subdomain or a subfolder). For example, on the mobile-friendly website you will need to use the HTML annotation:
And on the desktop site you will need to add the canonical meta:
This way, you are telling Google that these two pages are the same page, but one is for mobile users and the other is for desktop users.
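The annotation markup appears to have been lost above; a sketch of the pairing per Google's separate-URLs mobile guidance, with hypothetical URLs (note that Google's documentation puts the media alternate on the desktop page and the canonical on the mobile page):

```html
<!-- On the desktop page (www.domain.com/page): -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://mobile.domain.com/page" />

<!-- On the corresponding mobile page (mobile.domain.com/page): -->
<link rel="canonical" href="http://www.domain.com/page" />
```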
As for country websites, this is what Google looks at when crawling your web pages:
- ccTLDs (country-code top-level domain names).
- Geotargeting settings. You can use the geotargeting tool in Webmaster Tools to indicate to Google that your site is targeted at a specific country. (If you have different subdomains, create a separate Webmaster Tools profile for each and assign each to a different country.)
- Server location (through the IP address of the server). The server location is often physically near your users and can be a signal about your site’s intended audience.
- Other signals. Other sources of clues as to the intended audience of your site can include local addresses and phone numbers on the pages, the use of local language and currency, links from other local sites, and/or the use of Google Places (where available).
(The source for this is Google support article #182192.)
In your situation I think you will need to: 1) use a dedicated Webmaster Tools profile for each country's domain, and 2) use rel="alternate" hreflang="x" (see examples below).
-
HTML link element. In the HTML <head> section of http://www.example.com/, add a link element pointing to the Spanish version of that webpage at http://es.example.com/, like this:
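The example markup was stripped from the post above; per Google's hreflang documentation it would look something like this:

```html
<link rel="alternate" hreflang="es" href="http://es.example.com/" />
```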
HTTP header. If you publish non-HTML files (like PDFs), you can use an HTTP header to indicate a different language version of a URL:
Link: <http://es.example.com/>; rel="alternate"; hreflang="es"
-
Sitemap. Instead of using markup, you can submit language version information in a Sitemap.
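A sketch of that sitemap markup for the same example URLs (the xhtml namespace has to be declared on the urlset element, and each URL entry lists all of its language alternates):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/" />
    <xhtml:link rel="alternate" hreflang="es" href="http://es.example.com/" />
  </url>
</urlset>
```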
I hope this helps, let me know if you have more Qs
Best,
Issa
-
What is making you think your rankings are compromised?
This is the new Google treating subdomains as part of your site, and really they are: just separated by a dot instead of a slash. Now, if results from one country are showing up in another country's Google, that's an issue, but geotargeting the subdomains in WMT will take care of that.
-
Oh, I gotcha. Yeah, that makes sense then... Irving has you on the right track. I don't know much about multi-language web work.
Still, I would no-crawl that mobile site, and that will fix at least one of your problems.
good luck!
-
Maybe mobile in particular is a bad example, because you're right, I can restrict access to it. But it's happening with the site in other languages too.
-
I have this on all my pages:
<meta http-equiv="Content-Language" content="nl" /> or this <meta http-equiv="Content-Language" content="de" />
that's why I'm clueless
-
Why would you want Google to crawl your mobile site?
-
Add meta language tags to their respective pages.
You can also add local content, like the country name, to help give Google more hints.
-
The problem I see with geotargeting through Webmaster Tools is that I can only pick a country, not a language.
For example, I have one Portuguese version for both Brazil and Portugal. I know this is not the best approach, because the two variants have their differences, but I can only say the website is for Portugal OR Brazil, not for Portuguese-speaking countries.
-
I don't want Google to stop crawling the website. I want to set this up properly so it sees that the subdomains are different.
-
Google is more and more treating subdomains as part of the site, and this is one example of how. You can demote the sitelinks. If you have a German version, for example, you can geotarget that subdomain to Germany so it appears in Germany's results.
-
Add a no-crawl (Disallow) rule in the robots.txt of each subdomain you don't want crawled?
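A minimal sketch, assuming each subdomain serves its own robots.txt from its root (robots.txt is per-host, so this file on mobile.domain.com would not affect www.domain.com):

```
# Served at http://mobile.domain.com/robots.txt
User-agent: *
Disallow: /
```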