Do search engines understand special/foreign characters?
-
We carry a few brands whose names contain special foreign characters, e.g., Kühl and Lolë, but do search engines recognize special Unicode characters? Obviously we would want to spend more energy optimizing keywords that potential customers can type with a keyboard, but is it worthwhile to throw in some encoded keywords and anchor text for people who copy-paste these words into a search?
Do search engines typically equate special characters to their closest English equivalent, or are "Kuhl", "K&uuml;hl" and "Kühl" three entirely different terms?
-
Thanks Tom.
While it seems that search engines generally handle these characters well, I still had to ask because I did two Google searches: one for "lole" and another for "lolë", and I got very different results. On the first search my website came up on SERP page one, but on the second we were nowhere to be found on any of the first 15 pages.
What's more, every one of the first 10 SERP pages or so contained at least one instance of the character with an accent in the title, description, or within the on-page copy. So it seems that special characters and regular ASCII characters are not one and the same, or at least they are not weighted the same.
I do have to agree with you that most users will not go through the trouble of entering Alt codes in their search bars. On the other hand, the fact that we barely register on SERPs for the company's DBA name might be cause for some concern.
-
Hi David,
Google/Bing etc. have very few problems recognising such characters in the Latin alphabet. It looks like you are mainly concerned with umlauts, which Google handles intelligently. For example...
-
Google will identify the difference between a search for "Küchen" (kitchens in German) and "Kuchen" (cake in German) and offer up relevant results. This is true in Google US and Google UK, not just localised Googles.
-
Search suggestions work just fine with these characters, and even with the standardised way of rewriting them when there is no accessible way to type them (for umlauts this is with an e following the letter). For example, in Google.de, type "Kue" and you will be given the suggestion "Küchen".
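For illustration, that rewriting convention is mechanical enough to sketch in a few lines. This is just the standard German orthographic fallback (base letter followed by "e", and ß becoming "ss"), not anything a search engine exposes as an API:

```python
# Standard German fallback spellings for characters that are hard to
# type on a non-German keyboard. This is an orthographic convention,
# not a search-engine feature.
UMLAUT_MAP = {
    "ä": "ae", "ö": "oe", "ü": "ue",
    "Ä": "Ae", "Ö": "Oe", "Ü": "Ue",
    "ß": "ss",
}

def german_fallback(term: str) -> str:
    """Return the keyboard-friendly spelling of a German term."""
    return "".join(UMLAUT_MAP.get(ch, ch) for ch in term)

print(german_fallback("Küchen"))  # Kuechen
```

Google's suggestion engine clearly understands both spellings as the same query, which is why typing "Kue" surfaces "Küchen".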
You are mainly concerned with brands, which muddies the waters a little because many people in English speaking markets won't bother/know how to type the umlauts. However, Google normally handles this well and recognises the intent.
I would recommend you consistently use the brand name with the foreign characters, as intended. Google/Bing and co. shouldn't have any problems. Which HTML encoding you use is by the by, in my opinion, as long as the characters are rendering correctly.
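To illustrate the encoding point, here is a quick sketch (Python standard library only) showing that a named entity, a numeric character reference, and the literal character all decode to the same string, and that Unicode normalisation collapses the composed and decomposed forms of "ü":

```python
import html
import unicodedata

# Three ways of writing "ü" in HTML source all decode to the same
# rendered text, which is why the choice of encoding shouldn't matter
# to a crawler as long as the character renders correctly.
literal = "Kühl"
named = html.unescape("K&uuml;hl")    # named entity
numeric = html.unescape("K&#252;hl")  # numeric character reference

assert literal == named == numeric

# One subtlety: "ü" can also be stored as "u" plus a combining
# diaeresis (NFD form). Normalising to NFC gives one canonical form.
decomposed = "Ku\u0308hl"  # 'u' + U+0308 combining diaeresis
assert unicodedata.normalize("NFC", decomposed) == literal

print("all forms equivalent")
```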
-
Related Questions
-
Blocking Dynamic Search Result Pages From Google
Hi Mozzers, I have a quick question that probably won't have just one solution. Most of the pages that Moz crawled for duplicate content were dynamic search result pages on my site. Could this be a simple fix of just blocking these pages from Google altogether? Or would Moz just flag these pages as critical crawl errors instead of content errors? Ultimately, I contemplated whether or not I wanted to rank for these pages, but I don't think it's worth it considering I have multiple product pages that rank well. I think in my case the best option is probably to leave out these search pages, since they have more of a negative impact on my site, resulting in more content errors than I would like. So would blocking these pages from the search engines and Moz be a good idea? Maybe a second opinion would help: what do you think I should do? Is there another way to go about this, and would blocking these pages do anything to reduce the number of content errors on my site? I appreciate any feedback! Thanks! Andrew
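One common pattern for this situation is to disallow the internal search results in robots.txt. This is only a sketch — the paths are hypothetical and would need to match the site's actual search URL structure:

```
# robots.txt — hypothetical URL patterns; adjust to the site's own
# search result paths and query parameters
User-agent: *
Disallow: /search
Disallow: /*?q=
```

Note that Disallow prevents crawling but does not guarantee de-indexing of pages already in the index; a `<meta name="robots" content="noindex">` on the search result pages handles that case, and it only works if those pages remain crawlable.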
Intermediate & Advanced SEO | drewstorys
-
Our website is candere.com. Its PA and backlink status are different for https://www.candere.com, http://www.candere.com, https://candere.com, and http://candere.com. Recently, we completely moved from http to https.
How can we fix it so that we do not lose ranking and authority?
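The usual fix is a single 301 rule that folds all four variants into one canonical origin. Sketched below for Apache's .htaccess, assuming that setup; the choice of https://www.candere.com as the canonical host is just an example:

```apache
# Sketch: 301 every protocol/host variant to one canonical origin so
# rankings and link equity consolidate on a single set of URLs.
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.candere.com/$1 [L,R=301]
```

Canonical tags and the XML sitemap should point at the same variant, so every signal agrees on one set of URLs.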
Intermediate & Advanced SEO | Dhananjayukumar
-
Duplicate content on .com .au and .de/europe/en. Would it be wise to move to .com?
This is the scenario: A webstore has evolved into 7 sites in 3 shops:
example.com/northamerica
example.de/europe
example.de/europe/en
example.de/europe/fr
example.de/europe/es
example.de/europe/it
example.com.au
.com/northamerica, .de/europe/en and .com.au all have mostly the same content on them (all 3 are in English). What would be the best way to avoid duplicate content? An answer would be very much appreciated!
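One approach (a sketch only, reusing the URL structure from the question) is hreflang annotations in the `<head>` of each page, telling search engines the three English stores are regional alternates rather than duplicates:

```html
<!-- Sketch: hreflang alternates for the three English-language stores.
     Each page lists all alternates, including itself. -->
<link rel="alternate" hreflang="en-us" href="https://example.com/northamerica/" />
<link rel="alternate" hreflang="en" href="https://example.de/europe/en/" />
<link rel="alternate" hreflang="en-au" href="https://example.com.au/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/northamerica/" />
```

Whether consolidating everything onto a single .com would be better depends on how much the regional stores differ commercially; hreflang lets the existing structure stand without the duplicate content penalty risk.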
Intermediate & Advanced SEO | SEO-Bas
-
Maintaining SEO with Ecommerce Search Refinement
Hey everyone, I have an interesting scenario I'd appreciate some feedback on. I'm working on restructuring a client site for a store design, and he had previously built a bunch of landing pages, mostly for SEO value; some of them aren't even accessible from the main nav and contain a lot of long-tail type targets. These pages are generating organic traffic, but the whole thing is not very user-friendly, because it's cumbersome to drill down into specific categories (that many of the landing pages fulfill) without going through 3 or 4 pages to get there. For example, if I want to buy orange shoes, I can see specific kinds of orange shoes, but not ALL the orange shoes, even though there is an SEO page for orange shoes that is otherwise inaccessible from the main navigation. If that wasn't too confusing, essentially the usability solution to this is implementing some search refinement so that the specific subcategories can be drilled into easily, in fewer steps. My issue is that I'm hesitant to implement this even though I know it would be an overall benefit to the site, because of the existence of these SEO pages and being wary of destroying the organic traffic they're already receiving. My plan was to see to it that the specific category pages are built with the necessary keywords and content to attract those organic visits, but I'm still nervous it might not be enough. Does anyone have any suggestions for this circumstance, and also just for maximizing SEO efforts on a site with search refinement and how to minimize loss? From a usability standpoint, search refinement is great, but how do you counter the significant SEO risks that come with it? Thanks for your help!
Intermediate & Advanced SEO | BrandLabs
-
Penguin/Panda/Domain Purchase
If I move forward with the acquisition:
1. Should I, if there is a way, just acquire the domain and then attempt to unlink existing links?
2. Can I just buy the domain, completely kill the site, and then build again from scratch? Even if I do that, the links to the domain will still be out there.
3. Should I even move forward with the purchase if I know these tactics have been used?
Thanks!
Intermediate & Advanced SEO | dbuckles
-
Alternative links in the search results.
Hello, this is a short question. Please look at this SERP screenshot: http://imgur.com/1EMen — how do they get the other links under their results? Cornel
Intermediate & Advanced SEO | Cornel_Ilea
-
Question about 301 redirect for trailing / ?
I am cleaning up a fairly large site. Some pages have a trailing slash on the end and some don't. Some of the existing backlinks used a trailing slash in the URL and some didn't. We aren't concerned with picking a particular one; we just want to get one set and stick to it from now on. I am wondering, could I clean this up within the same redirect in the .htaccess file that takes care of the www and non-www? For example:

RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} ^www.domain.com/ [NC]
RewriteRule ^(.*)$ http://domain.com$1 [L,R=301]

I currently use that to redirect the www. to the non-www, as you can see. However, here is what I was confused about: would this code be enough to redirect ALL pages with a / to the ones without, or would I also need to add a second block to my .htaccess, like below?

RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} ^domain.com/ [NC]
RewriteRule ^(.*)$ http://domain.com$1 [L,R=301]

RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} ^www.domain.com/ [NC]
RewriteRule ^(.*)$ http://domain.com$1 [L,R=301]

That way, even the non-www pages with a trailing slash will redirect to the non-www without the trailing slash. Hopefully you understand what I am getting at. I just want to redirect EVERYTHING to the non-www WITHOUT a /. Thank you, Jake

Intermediate & Advanced SEO | PEnterprises
-
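One detail worth flagging in the rules quoted above: `%{HTTP_HOST}` contains only the hostname from the Host header, never the path, so a condition like `^www.domain.com/` can never match a trailing slash. A sketch of how the two redirects are usually combined (`domain.com` is a placeholder, and this assumes Apache with per-directory .htaccess rewrites):

```apache
# Sketch: strip trailing slashes and the www prefix, each via a 301.
# %{HTTP_HOST} never includes the path, so the slash must be matched
# in the RewriteRule pattern, not in a host condition.
RewriteEngine On
RewriteBase /

# Remove a trailing slash, but leave real directories alone
# (mod_dir expects directory URLs to end in "/")
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)/$ http://domain.com/$1 [L,R=301]

# Redirect www to non-www
RewriteCond %{HTTP_HOST} ^www\.domain\.com$ [NC]
RewriteRule ^(.*)$ http://domain.com/$1 [L,R=301]
```

Also note that in per-directory rewrites the captured path has no leading slash, so the substitution needs `http://domain.com/$1` rather than `http://domain.com$1` to avoid mangled URLs.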
Is there a search marketing / keyword tool in existence that can solve my need?
I'm looking for a tool that can do the following:
Organize a keyword universe and its data/metrics:
-Track keyword data over time (search volumes/trends, relative competition metrics, rankings, etc.)
-Sort keywords into buckets/silos/ad groups
-Allow you to assign individual keywords to multiple silos/groups and show the relationships between groups based on keyword relationships
-Incorporate a site map
-Tie keyword targets to static pages, informational content (SEO), and landing pages (PPC)
-Help with KW and/or competitive research (optional)
-Tie into web analytics / marketing on-demand software (optional)
I know that this is a lot of functionality, but for enterprise search marketing, this could be a game changer for my strategy (if it exists currently) or for the industry (if it doesn't exist). Please share your solution suggestions here.
Intermediate & Advanced SEO | PTC4SEO