Thank you Martign! That makes sense. I'm excited to see how using the HREFLANG markup helps with multi-lingual users.
J-Banz
@J-Banz
Job Title: President/CEO
Company: Banz Marketing Services, LLC
Favorite Thing about SEO
It is ever-changing, thus requiring creativity.
Latest posts made by J-Banz
-
RE: XML Sitemaps - Multi-lingual website
-
XML Sitemaps - Multi-lingual website
Hi Mozzers,
I am working with a large website that has some of its content translated into multiple languages. I am planning to use The Media Flow tool to create an HREFLANG sitemap for content in various languages. Please see the attached image for the questions below. Thanks!
Section Highlighted Yellow:
- When a URL does not have a translated version, should it simply be excluded from the HREFLANG sitemap?
- Alternatively, could I just remove the languages that are not being targeted, so the sitemap reflects English-language targeting only?
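For reference, a single hreflang sitemap entry typically looks like the sketch below. The URLs are hypothetical, and the structure follows Google's XML sitemap extension for hreflang; this is an illustrative fragment, not the attached file from the question. Under this format, a URL with no translations can simply be listed as a plain entry without alternate annotations:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <!-- A URL with translated versions lists every language variant,
       including itself, as xhtml:link alternates -->
  <url>
    <loc>http://example.org/es/blog/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://example.org/blog/"/>
    <xhtml:link rel="alternate" hreflang="es" href="http://example.org/es/blog/"/>
  </url>
  <!-- A URL with no translated version can appear as a plain entry -->
  <url>
    <loc>http://example.org/blog/english-only-post/</loc>
  </url>
</urlset>
```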
-
RE: Robots.txt File Not Appearing, but seems to be working?
I verified that I was checking /robots.txt. I had trouble verifying whether it exists under the non-www version because everything redirects to the www. I also checked whether it was being blocked, and it is not.
I went to Archive.org (Wayback Machine), and I can see the robots.txt file in previous versions of the site. I cannot, however, view it online, even though Google says it is downloading the file successfully, and the robots.txt file is successfully blocking URLs from the search index.
-
Robots.txt File Not Appearing, but seems to be working?
Hi Mozzers,
I am conducting a site audit for a client, and I am confused by what they are doing with their robots.txt file. GWT shows that there is a file and that it is blocking about 12K URLs (image attached). It also shows that the file was downloaded successfully 10 hours ago. However, when I go to the robots.txt URL, the page is blank.
Could they be doing something advanced to block URLs while hiding the file from users? It appears to be correctly blocking log-ins, but I would like to know for sure that it is working correctly. Any advice on this would be most appreciated. Thanks!
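For context, blocking log-in URLs requires nothing advanced; a minimal robots.txt like the sketch below (paths are hypothetical, not taken from the client's site) is sufficient. If the live file renders as a blank page while Google reports successful downloads, the more likely culprits are server configuration, a redirect, or user-agent-based serving rather than deliberate hiding:

```text
# Hypothetical robots.txt blocking log-in and account URLs
User-agent: *
Disallow: /login/
Disallow: /account/
```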
Jared
-
RE: International Site - Language Targeting
Thank you very much, Robert, for your thorough follow-up. I am humbled by the insights you offered, and am very glad I asked about this. It is much more detailed than I was expecting, and definitely not something to make a hasty, uninformed decision on.
-
RE: International Site - Language Targeting
Thank you, Robert, for your thorough explanation! I am sorry your first post timed out, and I appreciate the follow-up post. I added a little clarification based on what you said.
-
International Site - Language Targeting
Hi Mozzers,
I am currently conducting a technical site audit on a large website. Their main content and audience are in the US, but they have started adding translated versions of the content in other languages (about 30 in total). They are not using cookies or scripts to auto-populate the language on the page, and the pages seem to be getting indexed just fine.
Currently, they have their language distinguished by sub-folder (i.e. example.org/blog/by-language/spanish/), which I plan to 301 redirect to example.org/blog/es/ for each language. However, they are not implementing any sitemaps or hreflang header tags.
I have not dealt with this in the past as all of my work has been done on smaller US sites, so I wanted to verify the steps I plan to take to ensure this is a solid approach.
- 301 redirect example.org/language/spanish/blog/ to example.org/es/blog/
- Recommend adding hreflang markup into the header for each language. (They have a lot of pages, so they may not implement this if it is too much work.)
- Highly recommend adding XML sitemaps for each content version of the site using The Media Flow HREFLANG Sitemap Tool.
- Set up multiple Webmaster Tools accounts and geotarget them by language. I would also add the XML sitemap for each language.
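For step 2 in the list above, the per-page markup would be link elements in the HTML head (hreflang can also be sent as an HTTP header, but the head markup is the more common form). A minimal sketch for a hypothetical post available in English and Spanish; each language version carries the full set of alternates, including a self-reference:

```html
<head>
  <!-- On http://example.org/blog/some-post/ (English version) -->
  <link rel="alternate" hreflang="en" href="http://example.org/blog/some-post/" />
  <link rel="alternate" hreflang="es" href="http://example.org/es/blog/some-post/" />
</head>
```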
Is this a solid approach, given the information above? I want to make sure I am fundamentally sound on this before suggesting so many large changes. Thank you in advance for any thoughts / wisdom you can instill!
---------------------additional information---------------------
If I am hearing you correctly, I would only submit one XML sitemap for international content. It would look something like the image below. I would only use one GWT account to upload the file, and I would not need to add any additional markup on each page, as it will be located in the hreflang XML sitemap.
Finally, would it be a good or bad idea to 301 redirect their naming convention to a new, shorter one?
example.org/by-language/spanish/blog/this-is-an-example --> example.org/es/blog/this-is-an-example
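A pattern redirect like the one above is usually handled with one rule per language rather than per URL. A hedged sketch in Apache mod_rewrite, assuming the old structure is consistent and the server is Apache (neither is stated in the question):

```apache
# Hypothetical .htaccess sketch: map /by-language/spanish/... to /es/...
RewriteEngine On
RewriteRule ^by-language/spanish/(.*)$ /es/$1 [R=301,L]
# Repeat (or generalize with a RewriteMap) for each of the ~30 languages
```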
-
How to handle pagination for a large website?
I am currently doing a site audit on a large website that just went through a redesign. Looking through their Webmaster Tools, they have about 3,000 duplicate title tags. This is due to the way pagination is set up on their site.
For example: domain.com/books-in-english?page=1 and domain.com/books-in-english?page=4
What is the best way to handle these? According to Google Webmaster Tools, one viable option is to do nothing, because Google is good at distinguishing these. That said, it seems like there could be a better solution to help prevent duplicate-content issues.
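One option commonly recommended at the time of this question was rel="next"/"prev" annotations in the head of each paginated page (note that Google has since stated it no longer uses these signals for indexing). A sketch for a middle page of the hypothetical series above; pairing this with unique title tags per page (e.g., appending "Page 2") would also clear the duplicate-title report:

```html
<!-- On domain.com/books-in-english?page=2 -->
<link rel="prev" href="http://domain.com/books-in-english?page=1" />
<link rel="next" href="http://domain.com/books-in-english?page=3" />
```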
Any advice would be much welcomed.