On-Page Issues with UTF-8 and ISO-8859-1
-
Hi guys,
I hope somebody can help me figure this out. On one of my sites I set the charset to UTF-8 in the Content-Type meta tag, and the file itself is also saved as UTF-8. If I type German special characters like ä, ö, ß and the like, they get displayed as a tilted square with a question mark inside (the Unicode replacement character).
If I change the charset to ISO-8859-1 they are displayed properly in the browser, but services like Twitter still have the issue and stop "importing" content once they reach one of those special characters.
I would like to avoid having to HTML-encode all on-page content, so my preference would be to use UTF-8.
You can see it in action when you visit this URL, for example: http://www.skgbickenbach.de/aktive/1b/artikel/40-minuten-fußball-reichen-nicht_1045?charset=utf-8
Remove the ?charset parameter and the charset is set to ISO-8859-1.
Hope somebody has an answer or can point me in the right direction.
Thanks in advance and have a great day all.
Jan
-
Hi Jan,
I'm following up on old questions. Did Theo's answer solve your problem for you?
-
Wow good answer!
-
Getting a web page to display your content as true UTF-8 requires everything to be set to UTF-8 encoding. 'Everything' includes your database, your database tables, your database fields, the connection from PHP to your database, the header as set by PHP, the header as set by HTML (the meta tag), the content itself, etc.
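For example, a stripped-down PHP page with the main pieces in place might look roughly like this. This is just a sketch: the mysqli credentials, table and column names are placeholders, and your table/column collations still need to be set to a utf8 variant on the MySQL side.

```php
<?php
// Send the page as UTF-8 at the HTTP level; this must match the meta tag below
// and the encoding the file itself is saved in.
header('Content-Type: text/html; charset=utf-8');

// Open the database connection and force the connection charset to UTF-8.
// Without this, text can be silently transcoded between PHP and MySQL.
// (Credentials, table and column names are placeholders.)
$db = new mysqli('localhost', 'db_user', 'db_password', 'db_name');
$db->set_charset('utf8mb4'); // use 'utf8' on older MySQL versions

$result = $db->query('SELECT title, body FROM articles LIMIT 1');
$row = $result->fetch_assoc();
?>
<!DOCTYPE html>
<html lang="de">
<head>
  <!-- Must agree with the header() call above and the file's actual encoding. -->
  <meta http-equiv="Content-Type" content="text/html; charset=utf-8">
  <title><?php echo htmlspecialchars($row['title'], ENT_QUOTES, 'UTF-8'); ?></title>
</head>
<body>
  <p><?php echo htmlspecialchars($row['body'], ENT_QUOTES, 'UTF-8'); ?></p>
</body>
</html>
```

If any single layer is still ISO-8859-1 (very often it is the database connection), you end up with exactly the replacement characters you describe.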
The following resources were extremely helpful to me when I was switching to UTF-8 (which is by far the better encoding compared to ISO-8859-1):
http://www.phpwact.org/php/i18n/utf-8
http://www.phpwact.org/php/i18n/utf-8/mysql
Bonus tip: make sure your content and files are saved as UTF-8 without a BOM (byte order mark); this will save you lots of trouble later!
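If you are not sure whether an editor has saved a file with a BOM, a quick check along these lines will tell you and strip it (the path is just a placeholder):

```php
<?php
// Check a file for a UTF-8 BOM (the byte sequence EF BB BF) and strip it if found.
// The path is a placeholder; run this against any template or include you suspect.
$path = 'templates/header.php';

$data = file_get_contents($path);
if (substr($data, 0, 3) === "\xEF\xBB\xBF") {
    file_put_contents($path, substr($data, 3));
    echo "Removed BOM from $path\n";
} else {
    echo "No BOM in $path\n";
}
```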
Related Questions
-
Google Search Console showing canonical issues
I have a problem: canonical tags are present in my web pages, but Google Search Console is still showing a canonical issue. For example, check this page: https://kilid.com
Technical SEO | ParastooDezyani -
Subpages have Page Authority of 1 while Home has DA 50
How come our subpages all have a PA of 1 if the home page has a DA of 50? Technical specifics: the mega menu opens on click only; category pages don't exist (home/i-do-not-exist-as-page-category/PA-1-subpage); all subpages have a high number of links to resources (over 200). What would be the most obvious cause for the low PA? Would the external link profile be the main reason? Thanks in advance, I would be happy to answer your questions. Kind regards
Technical SEO | brainfruit -
Best Practice for Blocking a Site from One Country's Search Engines
A client cannot appear in any search engines in one given country, but they are fine in the rest of the world. Has anybody had any experience blocking a site from appearing in just google.de, bing.de and yahoo.de, for example?
Technical SEO | Salience_Search_Marketing -
Development Website Duplicate Content Issue
Hi,
We launched a client's website around 7th January 2013 (http://rollerbannerscheap.co.uk). We originally constructed the website on a development domain (http://dev.rollerbannerscheap.co.uk), which was active for around 6-8 months (the dev site was unblocked from search engines for the first 3-4 months, but then blocked again) before we migrated dev --> live.
In late January 2013 we changed the robots.txt file to allow search engines to index the website. A week later I accidentally logged into the DEV website and also changed its robots.txt file to allow the search engines to index it. This obviously caused a duplicate content issue as both sites were identical. I realised what I had done a couple of days later and blocked the dev site from the search engines with the robots.txt file.
Most of the pages from the dev site had been de-indexed from Google apart from 3: the home page (dev.rollerbannerscheap.co.uk) and two blog pages. The live site has 184 pages indexed in Google, so I thought the last 3 dev pages would disappear after a few weeks. I checked back in late February and the 3 dev site pages were still indexed in Google. I decided to 301 redirect the dev site to the live site to tell Google to rank the live site and to ignore the dev site content. I also checked the robots.txt file on the dev site and this was blocking search engines too. But still the dev site is being found in Google wherever the live site should be found.
When I do find the dev site in Google it displays this: Roller Banners Cheap » admin dev.rollerbannerscheap.co.uk/ A description for this result is not available because of this site's robots.txt – learn more.
This is really affecting our client's SEO plan and we can't seem to remove the dev site or rank the live site in Google. In GWT I have tried to remove the subdomain. When I visit Remove URLs, I enter dev.rollerbannerscheap.co.uk but then it displays the URL as http://www.rollerbannerscheap.co.uk/dev.rollerbannerscheap.co.uk. I want to remove a subdomain, not a page. Can anyone help please?
Technical SEO | SO_UK -
Please Help! I lost my #1 ranking in the SERPs
PensacolaRealEstate.com: I am new to SEO. I lost my first-place ranking on Google, which I held for YEARS. I think I lost it with the last update... How do I analyze my site to find what is penalizing me? How can I get back to the top?
Technical SEO | JML1179 -
Canonical Issues with WordPress
Hi all, I have just started using WordPress SEO by Yoast and I'm still having a hard time correcting my canonical issues for all posts with .html at the end. The plugin allows you to add a '/' to the end for canonical purposes, but just for pages, not posts. What is the best way in WordPress to change my posts from .html/ to .html? I really don't want to go to the hassle of creating a new 301 redirect in my .htaccess for each URL. I hate the .html, but if it is going to stay, how can I make sure I get the .html/ link juice back to them? Many thanks!
Technical SEO | RunningInTheRain -
Having some weird crawl issues in Google Webmaster Tools
I am seeing a large number of errors in the 'Not Found' section that are linked to old URLs that haven't been used for 4 years. Some of the URLs being linked to are not even in the structure that we used to use for URLs. Nevertheless, Google is saying they are now 404ing, and there are hundreds of them. I know the best way to attack this is to 301 them, but I was wondering why all of these errors would be popping up. I can't find anything in the Google index when searching for the link in quotes, and in Webmaster Tools the 'linked from' source shows as unavailable. Any help would be awesome!
Technical SEO | Gordian -
Issue with 'Crawl Errors' in Webmaster Tools
Have an issue with a large number of 'Not Found' web pages being listed in Webmaster Tools. In the 'Detected' column, the dates are recent (May 1st - 15th). However, clicking into the 'Linked From' column, all of the link sources are old, many from 2009-10. Furthermore, I have checked a large number of the source pages to double-check that the links don't still exist, and they don't, as I expected.
Firstly, I am concerned that Google thinks there is a vast number of broken links on this site when in fact there is not. Secondly, if the errors do not actually exist (and never actually have), why do they remain listed in Webmaster Tools, which claims they were found again this month?! Thirdly, what's the best and quickest way of getting rid of these errors?
Google advises that using the 'URL Removal Tool' will only remove the pages from the Google index, NOT from the crawl errors. The info is that if they keep getting 404 returns, they will automatically get removed. Well, I don't know how many times they need to get that 404 in order to get rid of a URL and link that haven't existed for 18-24 months?! Thanks.
Technical SEO | RiceMedia