URLs in Greek, Greeklish, or English? What is the best way to get great rankings?
-
Hello all,
I am Greek, and I have a somewhat unusual question for you.
Greek characters fall outside the ASCII range, so they are treated as special characters and need to be percent-encoded as UTF-8 before they can appear in a URL.
The question is about the URLs of Greek websites.
According to the advice of the Google Webmasters blog, we should never put raw Greek characters into the URL of a link. If we decide to have Greek characters in the URL, we should always use the encoded version; otherwise we should just use Latin characters. Leaving Greek characters unencoded is likely to cause technical difficulties with some services, e.g. search engines or other URL-processing web applications.
To give you an example, let's look at:
A) http://el.wikipedia.org/wiki/%CE%95%CE%BB%CE%B2%CE%B5%CF%84%CE%AF%CE%B1 - the URL with the percent-encoded Greek characters - which shows up in the browser as
B) http://el.wikipedia.org/wiki/Ελβετία
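To see that A and B are literally the same URL, here is a minimal Python sketch (standard library only) that converts between the two forms:

```python
from urllib.parse import quote, unquote

# The human-readable Greek path segment (form B)
greek_slug = "Ελβετία"

# Percent-encode its UTF-8 bytes, as the browser does behind the scenes
encoded = quote(greek_slug)
print(encoded)           # %CE%95%CE%BB%CE%B2%CE%B5%CF%84%CE%AF%CE%B1 (form A)

# Decoding recovers the original Greek characters
print(unquote(encoded))  # Ελβετία
```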
The problem with A is that every time we copy the URL and paste it somewhere (into an email, a social bookmarking site, a social media site, etc.), it appears in form A, full of strange characters and % signs. Such a link can sometimes cause broken-link issues, especially when we try to submit it to social networks and social bookmarking services.
On the other hand, Googlebot reads that URL just fine, but I am wondering whether the websites that keep the encoded URLs have any advantage (in comparison to the sites that use Greeklish in their URLs)!
So the question is:
For SEO purposes, is it better to use Greek characters in the URLs (encoded, like this one: http://el.wikipedia.org/wiki/%CE%95%CE%BB%CE%B2%CE%B5%CF%84%CE%AF%CE%B1), or would it be better to just use Greeklish (for example, http://el.wikipedia.org/wiki/Elvetia)?
Thank you very much for your help!
Regards,
Lenia
-
Hi Tom,
I really appreciate your detailed answer.
You give a lot of information here. Thanks.
Taking into account the three main points you mention, I would go with Greeklish.
I think I should explain the term Greeklish. The official language of Greece is Modern Greek, which has its own alphabet, distinct from the Latin one. When a Greek person writes Greek words using the Latin alphabet, the result is called Greeklish (Greek + English). It is a quick and easy way to write text messages (SMS, iMessage, etc.) without paying attention to spelling.
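There is no official standard for Greeklish, but to give a rough idea, here is an illustrative Python sketch that turns a Greek word into an ASCII-only Greeklish slug. The letter mapping below is just one common convention, not a fixed rule:

```python
import unicodedata

# One common (but by no means standard) Greek-to-Latin letter mapping
GREEK_TO_LATIN = {
    "α": "a", "β": "v", "γ": "g", "δ": "d", "ε": "e", "ζ": "z",
    "η": "i", "θ": "th", "ι": "i", "κ": "k", "λ": "l", "μ": "m",
    "ν": "n", "ξ": "x", "ο": "o", "π": "p", "ρ": "r", "σ": "s",
    "ς": "s", "τ": "t", "υ": "y", "φ": "f", "χ": "ch", "ψ": "ps",
    "ω": "o",
}

def to_greeklish(text: str) -> str:
    """Transliterate Greek text into an ASCII-only Greeklish slug."""
    # Lowercase, then strip accents (e.g. ί -> ι) via Unicode decomposition
    decomposed = unicodedata.normalize("NFD", text.lower())
    stripped = "".join(c for c in decomposed
                       if unicodedata.category(c) != "Mn")
    return "".join(GREEK_TO_LATIN.get(c, c) for c in stripped)

print(to_greeklish("Ελβετία"))  # elvetia
```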
A URL in Greeklish is understandable by people in Greece, so it can be considered a localised URL.
A URL in Greeklish can easily be shared, with no particular technical complications.
On the other hand, the Wikipedia articles use the encoded Greek characters in their URLs.
Well, since the SEO benefit is not that big, I would go with the Greeklish solution.
I would be glad to hear other experts' feedback on this subject.
Tom thank you very much!
Regards,
Lenia
-
This is a really good question, Lenia. Really, really good, in fact.
Let's break this down into a number of factors:
Having localised URLs (by that I mean URLs written in the country's language) - from an SEO perspective, I do believe there is some correlation suggesting that localised URLs help pages rank higher, in the same way that having a keyword in the URL may help; a keyword in the country's language would, by the same logic, work the same way. However, the SEO benefit of doing so isn't big - I'd see it as only a little boost - so I wouldn't let the SEO side weigh too heavily on your decision.
Now, having localised URLs from a user's perspective is something I think is very useful - I'd see it as a bigger plus for users than for SEO purposes. Localised URLs show the user that you're part of that country, not just a large corporation with an international presence but no real interest in the country. They also help users recognise and anticipate what the URLs along their journey will be. Also (I don't know how relevant this might be for you), localised URLs can definitely help with offline campaigns and promotion. Say you were running newspaper or billboard ads and wanted to track how many people visited your site as a result; you might set up a custom URL or search term for the campaign, so your newspaper advert would read "Visit www.domain.com/customURLhere/" on it. This would look infinitely better written in the localised language (although I suppose you could always set up a 301 redirect for the URL).
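To make that last point concrete, here's a toy Python sketch of such a 301 redirect - the campaign path and target are hypothetical, and in practice you'd configure this in your web server or CMS rather than hand-rolling a server:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import quote

# Hypothetical print-campaign path mapped to the real (percent-encoded) page
REDIRECTS = {"/elvetia": "/wiki/" + quote("Ελβετία")}

class CampaignRedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target:
            self.send_response(301)             # permanent redirect
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), CampaignRedirectHandler).serve_forever()
```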
Ultimately, however, I think your decision should largely be influenced by the technical implications. The SEO difference would be slight whichever method you choose, so I would go with whatever solution is easiest for you technically - it's better to serve user and SEO factors with a setup that's easy to maintain than to wrestle with technical problems for a slight SEO gain.
Just my input on the issue - I'd love to hear more from others, as I think it's a great question which could do with the input of some of the talented folk here.