URLs in Greek, Greeklish or English? What is the best way to get great ranking?
-
Hello all,
I am Greek and I have quite a strange question for you.
Greek characters are non-ASCII, so they are treated as special characters and have to be UTF-8 (percent) encoded in URLs.
The question is about the URLs of Greek websites.
According to the advice on the Google Webmasters blog, we should never put raw Greek characters into the URL of a link. If we decide to use Greek characters, we should always use the encoded version, or otherwise just use Latin characters in the URL. Leaving Greek characters un-encoded could cause technical difficulties with some services, e.g. search engines or other URL-processing applications.
To give you an example, let's look at:
A) http://el.wikipedia.org/wiki/%CE%95%CE%BB%CE%B2%CE%B5%CF%84%CE%AF%CE%B1 - which is the URL with the encoded Greek characters, and which shows up in the browser as
B) http://el.wikipedia.org/wiki/Ελβετία
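(For anyone curious about the mechanics, this is just standard UTF-8 percent-encoding. A minimal sketch in Python using only the standard library; Ελβετία is the example word from above:)

from urllib.parse import quote, unquote

# Percent-encode the Greek path segment: each character becomes its UTF-8 bytes,
# and each byte is written as %XX - exactly the form of URL A.
encoded = quote("Ελβετία")
print(encoded)            # %CE%95%CE%BB%CE%B2%CE%B5%CF%84%CE%AF%CE%B1

# Decoding reverses the process and gives the human-readable form the browser
# shows in the address bar - URL B.
print(unquote(encoded))   # Ελβετία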
The problem with A is that every time we copy the URL and paste it somewhere (in an email, on a social bookmarking site, on a social media site, etc.) it appears in form A, full of % escapes and strange-looking characters. Such a link can sometimes break, especially when we try to submit it to social networks and social bookmarking sites.
On the other hand, Googlebot reads that URL fine, but I am wondering whether sites that keep the encoded Greek URLs gain any advantage in comparison to sites that use Greeklish in their URLs.
So the question is:
For SEO purposes, is it better to use Greek characters in the URLs (encoded, like http://el.wikipedia.org/wiki/%CE%95%CE%BB%CE%B2%CE%B5%CF%84%CE%AF%CE%B1), or would it be better to use just Greeklish (for example http://el.wikipedia.org/wiki/Elvetia)?
Thank you very much for your help!
Regards,
Lenia
-
Hi Tom,
I really appreciate your detailed answer.
You give a lot of information here. Thanks.
Taking into account the three main points you mention, I would go with Greeklish.
I should probably explain the term Greeklish. The official language of Greece is Modern Greek, which has its own alphabet, different from the Latin one. When a Greek person writes Greek words using the Latin alphabet, that is called Greeklish (Greek + English). It is a quick and easy way to write text messages without paying attention to spelling.
A URL in Greeklish is understandable to people in Greece, so it can be considered a localised URL.
A URL in Greeklish can easily be shared, with no particular technical complications.
On the other hand, Wikipedia articles use the encoded Greek characters in their URLs.
So, since the SEO benefit is not that big, I think I would go with the Greeklish solution.
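If it helps anyone, generating Greeklish slugs can be automated. Here is a rough sketch in Python; the character map is deliberately simplified and just my own assumption, not an official transliteration standard such as ELOT 743:

import re

# Simplified Greek-to-Latin map (lowercase letters, accented vowels included).
# A real transliteration scheme would also handle digraphs like "ου" or "μπ".
GREEK_TO_LATIN = {
    "α": "a", "β": "v", "γ": "g", "δ": "d", "ε": "e", "ζ": "z", "η": "i",
    "θ": "th", "ι": "i", "κ": "k", "λ": "l", "μ": "m", "ν": "n", "ξ": "x",
    "ο": "o", "π": "p", "ρ": "r", "σ": "s", "ς": "s", "τ": "t", "υ": "y",
    "φ": "f", "χ": "ch", "ψ": "ps", "ω": "o",
    "ά": "a", "έ": "e", "ή": "i", "ί": "i", "ό": "o", "ύ": "y", "ώ": "o",
}

def greeklish_slug(title):
    # Lowercase, transliterate character by character, then collapse anything
    # that is not a-z or 0-9 into single hyphens.
    latin = "".join(GREEK_TO_LATIN.get(ch, ch) for ch in title.lower())
    latin = re.sub(r"[^a-z0-9]+", "-", latin)
    return latin.strip("-")

print(greeklish_slug("Ελβετία"))  # -> elvetia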
I would be glad to hear other experts' feedback on this subject.
Tom thank you very much!
Regards,
Lenia
-
This is a really good question, Lenia. Really, really good, in fact.
Let's break this down into a number of factors:
Having localised URLs (by that I mean URLs written in the country's language) - From an SEO perspective, I do believe there is some correlation suggesting that localised URLs help pages rank higher, in the same way that having a keyword in the URL may help - and having that keyword in the country's language would, by extension, work the same way. However, the SEO benefit isn't big; I'd see it as only a little boost, so I wouldn't let the SEO side weigh too heavily on your decision.
Now, having localised URLs from a user perspective is something I think is very useful - I'd see it as a bigger plus for users than for SEO. Localised URLs show users that you're genuinely part of that country, not just a larger corporation with an international presence and no real interest in the country. They also help users recognise and anticipate what the URLs along their journey will be. Also (I don't know how relevant this might be for you), localised URLs can definitely help with offline campaigns and promotion. Say you were running newspaper or billboard ads and wanted to track how many people visited your site as a result: you might set up a custom URL or search term for the campaign, so your newspaper advert would carry "Visit www.domain.com/customURLhere/". That would look infinitely better written in the local language (although I suppose you could always set up a 301 redirect for the URL).
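To illustrate that last point, a vanity URL in the local language can simply 301 to the canonical landing page. A minimal sketch, assuming a small Flask app - the routes and the tracking parameter are hypothetical examples, not anything from Lenia's actual site:

from flask import Flask, redirect

app = Flask(__name__)

# The localised vanity URL printed on the billboard or newspaper ad...
@app.route("/prosfores/")
def campaign_vanity_url():
    # ...permanently redirects (301) to the canonical campaign page, carrying a
    # tracking parameter so visits driven by the offline ad can be measured.
    return redirect("/offers/summer?utm_source=billboard", code=301)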
Ultimately, however, I think your decision should be driven largely by the technical implications. The SEO difference would be slight whichever method you choose, so I would go with whichever solution is easiest for you technically - it sounds like it would be easier to accommodate the user and SEO factors that way than to wrestle with technical problems for a slight SEO gain.
Just my input on the issue, and so I'd love to hear more from others on it - as I think it's a great question which could do with the input of some of the talented folk here.