Need advice for indexing a multilingual website
-
We are in the process of creating a Spanish subdomain of our website. I want to know what needs to be done with meta tags, sitemap.xml, and robots.txt so that Google and Bing will index both websites properly without causing the pages on the English site to lose rank.
Our English site is www.mydomain.com, and the Spanish site will be es.mydomain.com.
We are planning to put a button or link on both sites so that visitors can switch between them.
The two sites are similar, but not all pages are mirror images.
-
Thank you, John. That is helpful.
-
First, if you have a specific geographic target, set it in Google Webmaster Tools. If you don't, Google recommends that you not set a geographic target just for language purposes (see here); in that case, Google will figure out the language on its own (see here).
You have a few options for setting the language on the page, which is what Bing is looking for. They have a nice write-up about it here.
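For reference, the page-level signals those write-ups describe are plain markup. Below is a minimal sketch, assuming the standard document-language declarations Bing looks for, plus Google's hreflang alternates (not mentioned above, but the usual way to pair equivalent English and Spanish pages). The URLs are illustrative:

```html
<!-- On the Spanish version of a page (es.mydomain.com) -->
<html lang="es">
<head>
  <!-- Document-language signal (the lang attribute above also counts) -->
  <meta http-equiv="content-language" content="es">

  <!-- hreflang alternates pair this page with its English equivalent for Google -->
  <link rel="alternate" hreflang="en" href="https://www.mydomain.com/some-page/">
  <link rel="alternate" hreflang="es" href="https://es.mydomain.com/some-page/">
  <link rel="alternate" hreflang="x-default" href="https://www.mydomain.com/some-page/">
</head>
```

Since not all of your pages are mirror images, only add the hreflang pair on pages that actually have a counterpart on the other subdomain, and make sure each page references both itself and its alternate.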
Each subdomain can have its own sitemap and robots.txt, so set those up as you normally would.
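As a concrete illustration, a minimal robots.txt for the Spanish subdomain might look like the sketch below (the sitemap URL is illustrative). The English site keeps its own equivalent files, and each sitemap gets submitted under its own site profile in Google and Bing Webmaster Tools:

```text
# Served at https://es.mydomain.com/robots.txt
User-agent: *
Disallow:

Sitemap: https://es.mydomain.com/sitemap.xml
```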
Related Questions
-
Advice for structuring hotel website
Hey guys, I am currently setting up a hotel booking website and I'm not so sure how to structure it. I have landing pages for:
1. Cities
2. Sights
3. States
The main keywords are mainly "Hotels in Cityname" or "Hotels near Sightname". What would be the best SEO-friendly way of structuring the URL?
https://hotels-example.com/hotels/cities/cityname
https://hotels-example.com/hotels/sights/sightname
https://hotels-example.com/hotels/states/statename
or
https://hotels-example.com/hotels/cityname
https://hotels-example.com/hotels/sightname
https://hotels-example.com/hotels/statename
or
https://hotels-example.com/hotels-in-cityname
https://hotels-example.com/hotels-in-sightname
https://hotels-example.com/hotels-in-statename
Or are there better ways of structuring it, or am I just overthinking it? I would greatly appreciate any advice and suggestions 🙂 Best, Max
Intermediate & Advanced SEO | baresound
Need references to a company that can transition our 1,000-page website from HTTP to HTTPS without breaking our SEO backlinks and site structure
Hi y'all, I'm looking for a company or independent contractor who can transition our website from HTTP to HTTPS. I want to make sure they know what they're doing with a WordPress website. More importantly, I want to make sure no SEO juice from external sources is lost and nothing internal gets broken. Anyone have any good recommendations? You can reply back or DM me. Best, Shawn
Intermediate & Advanced SEO | Shawn124
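For context, the redirect side of an HTTP-to-HTTPS move is usually a single site-wide, path-preserving 301 at the server level (alongside updating internal links, canonicals, and the sitemap). A minimal sketch, assuming Apache/.htaccess, which is typical for WordPress hosts; your setup may differ:

```apache
# Force HTTPS with one site-wide 301, keeping the requested host and path intact
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```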
Lazy Loading of products on an E-Commerce Website - Options Needed
Hi Moz fans. We are in the process of redesigning our product pages and we need to improve the page load speed. Our developers have suggested that we load the associated products on the page using lazy loading. While I understand this will certainly have a positive impact on page load speed, I am concerned about the SEO impact. We can have upwards of 50 associated products on a page, so we need a solution. So far I have found the following solution online, which uses lazy loading and escaped fragments. The concern here is with serving an alternate version to search engines. The solution was developed by Google not only for lazy loading, but for indexing AJAX content in general.
Here's the official page: Making AJAX Applications Crawlable. The documentation is simple and clear, but in a few words the solution is to use slightly modified URL fragments.
A fragment is the last part of the URL, prefixed by #. Fragments are not propagated to the server; they are used only on the client side to tell the browser to show something, usually to move to an in-page bookmark.
If instead of using # as the prefix you use #!, this instructs Google to ask the server for a special version of your page using an "ugly" URL. When the server receives this ugly request, it's your responsibility to send back a static version of the page that renders an HTML snapshot (the not-indexed image in our case). It seems complicated, but it is not; let's use our gallery as an example. Every gallery thumbnail has to have a hyperlink like:
http://www.idea-r.it/...#!blogimage=<image-number>
When the crawler finds this markup, it will change it to:
http://www.idea-r.it/...?_escaped_fragment_=blogimage=<image-number>
Let's take a look at what you have to answer on the server side to provide a valid HTML snapshot. My implementation uses ASP.NET, but any server technology will do.

```csharp
// Read the escaped-fragment parameter the crawler sends in place of the #! fragment
var fragment = Request.QueryString["_escaped_fragment_"];
if (!String.IsNullOrEmpty(fragment))
{
    var escapedParams = fragment.Split(new[] { '=' });
    if (escapedParams.Length == 2)
    {
        var imageToDisplay = escapedParams[1];
        // Render the page with the gallery showing
        // the requested image (statically!)
        ...
    }
}
```

What's rendered is an HTML snapshot, that is, a static version of the gallery already positioned on the requested image (server side).
To make it perfect, we have to give the user a chance to bookmark the current gallery image. 90% comes for free; we only have to parse the fragment on the client side and show the requested image:

```javascript
if (window.location.hash)
{
    // NOTE: remove the initial #
    var fragmentParams = window.location.hash.substring(1).split('=');
    var imageToDisplay = fragmentParams[1];
    // Render the page with the gallery showing the requested image (dynamically!)
    ...
}
```

The other option would be to look at a recommendation engine that shows a small selection of related products instead. This would cut down the total number of related products. The concern with this one is that we are removing a massive chunk of content from the existing pages; some of it is not the most relevant, but it's content. Any advice and discussion welcome 🙂
Intermediate & Advanced SEO | JBGlobalSEO
Moving half my website to a new website: 301?
Good morning! We currently have two websites which are driving all of our traffic. Our end goal is to combine the two and fold them into each other. Can I redirect the duplicate content from one domain to our main domain even though the URLs are different? I'll give an example below (the domains are not the real domains). The CEO does not want to remove the other website entirely yet, but is willing to begin some sort of consolidation process. ABCaddiction.com is the main domain, which covers everything from drug addiction to dual diagnosis treatment. ABCdualdiagnosis.com is our secondary website, which covers everything as well. Can I redirect the entire drug addiction half of that website to ABCaddiction.com, with the eventual goal of moving everything together?
Intermediate & Advanced SEO | HashtagHustler
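For context, moving one section of a site onto another domain is normally done with pattern-based 301 redirects so each old URL points at its closest new equivalent. A minimal sketch, assuming Apache/.htaccess and a hypothetical /drug-addiction/ directory; the real paths and domains would differ:

```apache
# Hypothetical: 301 every URL under /drug-addiction/ on the secondary domain
# to the matching URL on the main domain, preserving the rest of the path.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?abcdualdiagnosis\.com$ [NC]
RewriteRule ^drug-addiction/(.*)$ https://www.abcaddiction.com/drug-addiction/$1 [R=301,L]
```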
Google is Really Slow to Index my New Website
(Sorry for my English!) A quick background: I had a website at thewebhostinghero.com which had been slapped left and right by Google (both Panda and Penguin). It also had a manual penalty for unnatural links, which was lifted in late April / early May this year. I also had another domain, webhostinghero.com, which was redirecting to thewebhostinghero.com. When I realized I would be better off starting a new website than trying to salvage thewebhostinghero.com, I removed the redirect from webhostinghero.com and started building a new website. I waited about 5 or 6 weeks before putting any content on webhostinghero.com so Google had time to notice that the domain wasn't redirecting anymore. So about a month ago, I launched http://www.webhostinghero.com with 100% new content, but I left thewebhostinghero.com online because it still brings in a little (necessary) income. There are no links between the websites except on one page (www.thewebhostinghero.com/speed/), which is set to "noindex,nofollow" and is disallowed to search engines in robots.txt. I made sure that page was deindexed before adding a "nofollow" link from thewebhostinghero.com/speed => webhostinghero.com/speed. Since the new website launched, I've been publishing new content (2 to 5 posts) daily. It's getting some traction from social networks, but it gets barely any clicks from Google search. It seems to take at least a week before Google indexes new posts, and not all posts are indexed. The cached copy of the homepage is 12 days old. In Google Webmaster Tools, it looks like Google isn't getting the latest sitemap version unless I resubmit it manually; it's always 4 or 5 days old. So is my website just too young, or could it have some kind of penalty related to the old website? The domain has 4 or 5 really old spammy links from the previous domain owner which I couldn't get rid of, but otherwise I don't think there's anything tragic.
Intermediate & Advanced SEO | sbrault74
De Index Section of Page?
Hey all! We're having a couple of issues with a certain section of our page that we don't want indexed. Basically, our cross-sells change really quickly, and big G is ranking them and linking to them even when they're long gone. Is it possible to put some kind of noindex tag on a specific section of the page? See below 🙂 http://www.freestylextreme.com/uk/Home/Brands/DC-Shoe-Co-/Mens-DC-Shoe-Co-Hoodies-and-Sweaters/DC-Black-Rob-Dyrdek-Official-Sweater.aspx Thanks!
Intermediate & Advanced SEO | elbeno
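For context, there is no robots tag that de-indexes only part of a page; one workaround that gets discussed is serving the fast-changing cross-sell block from its own URL (for example in an iframe) and marking that URL noindex. A hedged sketch with hypothetical URLs:

```html
<!-- On the product page: pull the fast-changing cross-sells from a separate URL -->
<iframe src="https://www.example.com/cross-sells/12345" title="You may also like"></iframe>

<!-- In the <head> of /cross-sells/12345 itself: keep that helper page out of the index -->
<meta name="robots" content="noindex, follow">
```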
Reducing Booking Engine Indexation
Hi Mozzers, I am working on a site with a very useful room booking engine. Helpful as it may be, all the variations (2 bedrooms, 3 bedrooms, room with a view, etc., etc.) are indexed by Google. Section 13 on search pagination in Dr. Pete's great post on Panda (http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world) speaks to our issue, but I was wondering, since two (!) years have gone by, if there are any additional solutions y'all might recommend. We want to cut down on the duplicate titles and content and get the useful-but-not-useful-for-SERPs online booking pages out of the index. Any thoughts? Thanks for your help.
Intermediate & Advanced SEO | Leverage_Marketing
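For reference, the usual way to keep booking-engine variation pages crawlable but out of the index is a page-level robots meta tag. A minimal sketch, assuming the variations live at parameterized URLs (names are hypothetical):

```html
<!-- In the <head> of a booking-engine variation page, e.g. /rooms/deluxe?bedrooms=2&view=sea -->
<meta name="robots" content="noindex, follow">
```

A rel="canonical" pointing back at the main room page is the other option commonly used for these variations; the noindex route keeps the pages usable for visitors and crawlable for links while removing the duplicate titles from the index.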
Indexing techniques
Hi, I just want confirmation about my indexing technique: whether it is good or whether it can be improved. The technique is totally white hat and can be done by one person. Any suggestions or improvements are welcome. I create the backlinks first, of course 🙂 I make a list in a public Google Doc; each doc has only ten links. After that I Digg it and add 5-6 more bookmarks. I tweet the Digg and each doc (my two Twitter accounts have a page authority of 98). I Like them on Facebook. I ping them through ping services. That's it. It works OK for the moment. Is there anything I can do to improve my technique? Thanks a lot.
Intermediate & Advanced SEO | nyanainc