Doctype language declaration problem
-
Hello,
I have a problem with an SEMrush warning on a website audit for www.enjoyprepaid.com. It tells me "5852 pages are lacking language declaration", but I don't understand what that means or how to actually fix it. I also ran the W3C validator and got a doctype and a language error, but again I don't understand what they mean or how to fix them: https://validator.w3.org/nu/?doc=http%3A%2F%2Fwww.enjoyprepaid.com%2FAfghanistan-calling-cards-2.html
-
Thanks Peter! The reposts were due to a bug we ran into yesterday. I'll lock this thread to prevent further responses.
-
@mods - maybe repost from here: https://moz.com/community/q/doctype-language-declaration-problem-2
I'll drop the answer here too, just in case the old post gets deleted. Yes, iso-8859-15 is a very outdated encoding; the validator suggests you use UTF-8 instead. I believe this is also what is triggering the SEMrush issue.
To fix it, just declare the document language and the UTF-8 encoding in the page markup (see the sketch below), and the bug will be fixed.
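A minimal sketch of the kind of markup this fix refers to, assuming English content (lang="en" is an assumption here, so use the page's actual language code; the title and body text are placeholders):

<!DOCTYPE html>
<!-- The lang attribute below is the "language declaration" that SEMrush and
     the W3C validator check for; the meta charset switches the encoding
     from iso-8859-15 to UTF-8 -->
<html lang="en">
  <head>
    <meta charset="utf-8">
    <title>Page title</title>
  </head>
  <body>
    <p>Page content goes here.</p>
  </body>
</html>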
-
Related Questions
-
Linking to a Resource from a multi-language Page
I have a multi-language page where the content is available in several translated versions. I want to link to a resource that is only available in English. Is it a good idea to link to this resource from all language versions, or should I include the link only in the English version of my page? In the first scenario, for example, a Spanish and a German language version would link to a page in English. Is this OK, or could it be considered spam?
Technical SEO | ConverterApp
-
301 redirected all my pages to my new domain, now I have a problem with Google Search Console
Hi guys! I bought a new domain name and redirected all my URLs from the old domain to the new one. Everything worked perfectly, but now I have a little problem. I want to use the 'Address Change' option in Google Search Console. Step 1 works (select the new website in the list). Step 2 works (confirm that the 301s are working). Step 3 asks me to verify the old domain (huh!?) in order to complete the request. Obviously that doesn't work, because my 301s work! If I try to verify the old website by putting a Google verification file in the root of my domain, Google tries to access it and is automatically redirected to the new domain. I must be missing something, lol. Help!
Technical SEO | benoit_2018
-
Canonical tag is pointing to the same page that it is already on, is this a problem?
So we have a WordPress site with the all-in-one-seo-pack installed. I have just noticed in our crawl diagnostics that a canonical tag has been put in place on every single one of our pages, but they are all pointing to the pages they are already on. Is this a problem? Should I be worried about this and delve more deeply to figure out why this has happened and get it removed? Thanks
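For context, a self-referencing canonical is simply a canonical tag whose href is the URL of the page it sits on, something like this sketch (the URL is a placeholder, not from the site in question):

<!-- In the <head> of https://www.example.com/blue-widgets/ -->
<link rel="canonical" href="https://www.example.com/blue-widgets/" />

A tag like this is generally considered harmless; it just confirms that the page is its own preferred version.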
Technical SEO | cttgroup
-
Multi-language websites!
Hello, What is the best way for a website to have multiple languages? For example, www.site.com is in English, but the content will be translated into 4 other languages: .es, .it, .fr, .de. Should I buy 4 different domains (www.site1.es, www.site2.it, etc.), create folders like site.com/es/ and site.com/it/, or create subdomains like es.site.com? A few months ago I was 100% sure I would go and buy different domains and create different content in those languages, but now I am not so sure. Thank you very much for your help!
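Whichever structure is chosen (country domains, subfolders, or subdomains), the language versions are usually tied together with hreflang annotations. A sketch for the subfolder option, with placeholder URLs:

<!-- Placed in the <head> of every language version, listing all alternates -->
<link rel="alternate" hreflang="en" href="https://www.example.com/" />
<link rel="alternate" hreflang="es" href="https://www.example.com/es/" />
<link rel="alternate" hreflang="it" href="https://www.example.com/it/" />
<link rel="alternate" hreflang="fr" href="https://www.example.com/fr/" />
<link rel="alternate" hreflang="de" href="https://www.example.com/de/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />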
Technical SEO | willyg
-
Removing robots.txt on WordPress site problem
Hi, I'm a little confused. I ticked the box in WordPress to allow search engines to crawl my site (I had previously asked them not to), but Google Webmaster Tools is telling me I still have robots.txt blocking them, so I'm unable to submit the sitemap. I checked the source code and the robots instruction has gone, so I'm a little lost. Any ideas, please?
Technical SEO | Wallander
-
Problem generating backlinks
Hello everyone, Over the past couple of days I have been using a variety of different free and paid programs to check how many backlinks I have. I have used at least 15 different ones; SEOmoz is the newest, and I am awaiting those results. All but one of those programs have said that I have 0 backlinks. The one said I have 11. The thing is, over the past month I have submitted my site to over 500 PR0 directories and 250+ PR3-PR7 directories, made well over 75 article submissions to over 20 websites (almost all of them were approved and up and running, and some of these websites have a PR of 7), sent a couple dozen press releases, set up profiles linking to my site on over 50 forums, and set up profiles linking to my site on a variety of different Web 2.0 sites, and still nothing. I have been doing this every day for over a month. Anyone have any ideas?
Technical SEO | tarik3001
-
Problem with canonical url and session ids
Hi, I have a problem with the following website: http://goo.gl/EuF4E Google always indexes the site with the session ID, although I use a canonical URL on this page. Indexed pages: http://goo.gl/RQnaD Sometimes it goes right, but sometimes wrong. Is it because we separate our session ID with ";" as the separator? In Google Webmaster Tools, I can't choose jsessid as a parameter, so I think Google does not recognize it. But if we have to change it (e.g. to "?" as the separator), we would have to spend many days on programming. Any ideas? Thanks for your help!
Technical SEO | tdberlin
-
Product ratings causing 302 redirect problem
I am working on an ecommerce site and my crawl report came back with 7,000+ 302 redirects, maxing out at 10,000 pages because of all the redirects. The site really only has maybe 1,500 pages (dynamic content aside). After looking into it a little more, I see it is because of the product rating system. They have a star rating system that kind of looks like Amazon's. The only problem is that each star is a link to a dynamic address that records the vote and then 302s back to the original page the vote was cast from. So virtually every page on this site links out anywhere from 15 to 45 times and 302s back to itself, losing virtually all of its PR. Am I correct in that assumption, or am I missing something? I don't see the links being blocked by robots.txt or noindex/nofollowed. It is also an anonymous rating system, where a rating can be cast from any category page displaying a product or any product page. To make matters worse, every page links to a printable version, which duplicates the issue by repeating the whole thing over again.

So, assuming I am correct that this site has a major PR leak on virtually every page, what is the best recommendation to fix it? 1. Block all of those links in robots.txt, 2. noindex/nofollow these links, 3. put the rating system behind a submit button or disallow anonymous ratings, or 4. something else?

Looking at the product ratings on the site, virtually everything is rated between 2-3 stars out of 5 and has about the same number of votes, except fewer votes on deeper pages. I don't believe this is real at all, since this site gets almost no traffic and maybe one sale a week; there is no way any product has been rated 50 times. I think the crawler is voting as it crawls, doing it 5 times for every product, which is why everything is rated 2.5 out of 5. This is an X-Cart site, in case anyone cares. Any suggestions?
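As an illustration of option 2, a sketch of what the rating links might look like with rel="nofollow" added, so crawlers are discouraged from following (and effectively voting through) the dynamic vote URLs. The URL pattern and parameter names here are placeholders, not X-Cart's actual ones:

<div class="product-rating">
  <!-- Each star links to a vote URL that 302s back to the product page;
       rel="nofollow" tells crawlers not to follow these links -->
  <a href="/rate.php?product=123&amp;stars=1" rel="nofollow">1 star</a>
  <a href="/rate.php?product=123&amp;stars=2" rel="nofollow">2 stars</a>
  <a href="/rate.php?product=123&amp;stars=3" rel="nofollow">3 stars</a>
  <a href="/rate.php?product=123&amp;stars=4" rel="nofollow">4 stars</a>
  <a href="/rate.php?product=123&amp;stars=5" rel="nofollow">5 stars</a>
</div>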
Technical SEO | BlinkWeb