Error after scanning with browseo.net
-
Good day!
I have done a scan of my site with browseo.net (and a few other similar scanners) and got the mess seen in the screenshot.
I've tried deleting all the files in the website folder and replacing them with a single image file, but it still shows the same error.
What could this mean, and should I be worried?
P.S.
I found my answer after contacting the helpful support of browseo.net:
It took me some time to figure out what was going on, but it seems as if you are mixing content types. Browsers are quite smart when it comes to interpreting content, so they are much more forgiving than we are.
Browseo crawls your website and detects that you are setting utf-8 as part of the meta information. By doing so, it converts the content into a different character encoding than the one it is supposed to be in. In a quick test, I tried to fetch the content type based on the response object, but without any success. So I suspect that in reality your content is not utf-8 encoded when you parse it into Joomla. The wrong character encoding is then carried over for the body (which explains why we can still read the header information). All of this explains the error.
In order for it to work in Browseo, you'd have to set the content type correctly, or convert your own content into utf-8 before parsing. It may be that you are storing this incorrectly in the database (check your db settings for a content type other than utf-8) or that other settings are a bit messed up. The good news is that Google is probably interpreting your website correctly, so you won't be punished for this, but it is perhaps something to look into…
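The mismatch described above (HTTP header or stored content saying one charset while the meta tag declares utf-8) can be checked mechanically. Here is a minimal sketch of that comparison; the function name and regexes are my own, not browseo's actual code:

```python
import re

def charset_mismatch(content_type_header, html_head):
    """Return (header_charset, meta_charset) when they disagree, else None.

    A crawler reads the charset from the HTTP Content-Type header first;
    if the <meta> tag declares something else, the body may be decoded
    with the wrong encoding even though the headers stay readable.
    """
    header = re.search(r"charset=([\w-]+)", content_type_header or "", re.I)
    meta = re.search(r'charset=["\']?([\w-]+)', html_head or "", re.I)
    header_cs = header.group(1).lower() if header else None
    meta_cs = meta.group(1).lower() if meta else None
    if header_cs and meta_cs and header_cs != meta_cs:
        return header_cs, meta_cs
    return None

# Example: the server header says iso-8859-1 while the page claims utf-8
print(charset_mismatch(
    "text/html; charset=iso-8859-1",
    '<meta http-equiv="Content-Type" content="text/html; charset=utf-8">',
))
# → ('iso-8859-1', 'utf-8')
```

Running this against your own pages (header from the response, meta tag from the body) would confirm whether the two declarations actually agree.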
From Paul Piper
-
Is the link to an image? If so, that is the cause. Try uploading a simple HTML page instead; that should work.
Related Questions
-
Question about 404 Errors
About two months ago, we deleted some unnecessary pages on our website that were no longer relevant. However, Moz is still saying that these deleted pages are returning 404 errors when a crawl test is done. The pages are no longer there, as far as I can see. What is the best solution for this? I have a page that is similar to the older page, so is it a good choice to just redirect the bad page to my good page? If so, what's the best way to do this? I found some useful information searching, but none of it truly pertained to me. I went around my site to make sure there were no old links directing traffic to the nonexistent page, and there are none.
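If the deleted page has a close substitute, the usual fix is a 301 in the site's .htaccess (Apache shown; the paths below are placeholders for the actual URLs):

```apache
# .htaccess — permanently redirect the removed page to its closest match
Redirect 301 /old-removed-page/ /similar-current-page/
```

Moz's crawler will typically keep listing the old URL until its next crawl sees the redirect instead of the 404.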
Technical SEO | | Meier0 -
Help! How to Remove Error Code 901: DNS Errors (But to a URL that doesn't exist!)
I have two urgent errors saying there are 2 x error code 909s detected. These don't link to any page, but I can tell there is a mistake somewhere; I just don't know what needs changing.
http://www.justkeyrings.co.ukhttp/www.justkeyrings.co.uk/printed-promotional-keyrings
http://www.justkeyrings.co.ukhttp/www.justkeyrings.co.uk/blank-unassembled-keyrings
Could someone help, please?
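One common cause of doubled-domain URLs like these (a guess, not a diagnosis of this specific site) is an href that was meant to be absolute but is missing the "://", so crawlers resolve it as a relative path against the current domain:

```python
from urllib.parse import urljoin

# An href written as "http/..." instead of "http://..." is not an absolute
# URL, so a crawler resolves it relative to the page it was found on,
# producing a doubled-domain URL much like the ones reported above.
broken_href = "http/www.justkeyrings.co.uk/printed-promotional-keyrings"
resolved = urljoin("http://www.justkeyrings.co.uk/", broken_href)
print(resolved)
# → http://www.justkeyrings.co.uk/http/www.justkeyrings.co.uk/printed-promotional-keyrings
```

Searching the site's templates for hrefs beginning with "http/" or "http//" (no colon) would confirm or rule this out.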
Technical SEO | | FullSteamBusiness0 -
301 vs 500 Errors for discontinued products
I have a client that has around 15 "products" (pages containing details of the products, rather than e-commerce product pages) that have been discontinued. The client has suggested 301s, but unless the alternative products are replacement products, am I correct that we should be using a 500 error?
Technical SEO | | MentorDigital0 -
What should I do with URLs that cause site map errors?
Hi Mozzers, I have a client who uses an important customer database and offers gift cards via https://clients.mindbodyonline.com, linked from the navigation. This causes sitemap errors whenever the sitemap is submitted, since the domain is different. Should I ask to remove those links from the navigation? If so, where can I relocate those links? If not, what should I do to get a sitemap without any errors? Thanks!
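The sitemap protocol only allows URLs on the host the sitemap was submitted for, so cross-domain links belong in the navigation but not in the sitemap file itself. A quick sketch of the check (example.com URLs and the function name are placeholders of mine):

```python
from urllib.parse import urlparse
from xml.etree import ElementTree

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def foreign_urls(sitemap_xml, site_host):
    """Return sitemap <loc> entries whose host differs from the site's own.

    These are the entries a validator flags as errors: every URL in a
    sitemap must live on the host the sitemap was submitted for.
    """
    root = ElementTree.fromstring(sitemap_xml)
    locs = [el.text.strip() for el in root.iter(NS + "loc")]
    return [u for u in locs if urlparse(u).netloc != site_host]

sitemap = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/gift-cards</loc></url>
  <url><loc>https://clients.mindbodyonline.com/</loc></url>
</urlset>"""
print(foreign_urls(sitemap, "www.example.com"))
# → ['https://clients.mindbodyonline.com/']
```

In other words: keep the gift-card links in the navigation, but strip any clients.mindbodyonline.com URLs out of the generated sitemap and the errors should go away.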
Technical SEO | | Ideas-Money-Art0 -
4xx errors - but no broken links found by Xenu
In my SEOmoz crawl report I get multiple 4xx errors, all on the same type of link: www.zylom.com/nl/help/contact/9/, differing only in the number at the end and the language. But if I look in the source code, the link is simply:
<a class="bigbuttonblue" style="float:right; margin-left:10px;" href="/nl/help/contact/9/?sid=9&e=login" onfocus="blur()" title="contact">contact</a>
I already tested the helpful little tool Xenu, but it also doesn't report any broken links for the URLs in the 4xx error report. Could somebody suggest why these 4xx errors keep coming? Could it be that the SEOmoz crawler strips the '?sid=9&e=login' part from the URL? Because if you open the link, you first get a pop-up with a login screen. Thanks for your answers already.
Technical SEO | | Letty0 -
Seeking help correcting a large number of 404 errors; 95% traffic halt
Hi, the following GWT screen tells a bit of the story: site: http://bit.ly/mrgdD0 http://www.diigo.com/item/image/1dbpl/wrbp
On about Feb 8 I decided to fix a large number of 'duplicate title' warnings being reported in GWT "HTML Suggestions". These were for URLs which differed only in parameter case, and which had canonical tags, but were still reported as duplicates in GWT. My traffic had been steady at about 1,000 clicks/day. At midnight on 2/10, Google traffic completely halted, down to 11 clicks/day. I submitted a reconsideration request and was told 'no manual penalty'. Also, the sitemap indexes in GWT showed 'pending' around the clock starting then.
By about the 18th, the 'duplicate titles' count had dropped to about 600, and the next day traffic hopped right back to about 800 clicks/day. That held for a week, then traffic stopped again on the 26th, down to 10/day.
I then noticed that GWT was reporting 20K page-not-found errors, which has now grown to 35K! I realized that bogus internal links were being generated because I had failed to disable the PHP warning messages, so I disabled PHP warnings and fixed what I thought was the source of the errors. However, the not-found count continues to climb, and I don't know where these bad internal links are coming from, because the GWT report lists the link sources as 'unavailable'.
I went through a similar problem last year and it took months (4) for Google to digest all the bogus pages and recover. If I have to wait that long again I will lose much $$. Assuming that the large number of 404 internal errors is the reason for the sudden shutoff: a) how can I verify the source of these internal links, given that Google says the source pages are 'unavailable'? b) Most critically, how can I do a reset and have Google re-spider my site, or block the signature of these URLs, to get rid of these errors ASAP? Thanks
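On (b): if the bogus links share a recognizable pattern, one option is to answer them with "410 Gone" rather than 404, which Google tends to drop from its index faster. A sketch for Apache, assuming the bad URLs share a signature (the pattern below is a placeholder, not the actual one):

```apache
# Return "410 Gone" for the bogus generated URLs so Google drops them
# faster than plain 404s would age out. Replace the pattern with the
# actual signature of the generated links.
RedirectMatch gone ^/bogus-pattern/.*$
```

This doesn't solve finding the source of the links, but it stops the error count from compounding while you do.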
Technical SEO | | mantucket0 -
Domain.com and domain.com/ redirect(error)
When I view my campaign report, I'm seeing duplicate content/meta for mydomain.com and mydomain.com/ (with a slash). I already applied a 301 redirect as follows:
redirect 301 /index.php/ /index.php
Where am I messing up here?
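For what it's worth, mydomain.com and mydomain.com/ are the same URL at the root (the path of a bare domain is implicitly "/"), so a redirect between those two isn't possible; this kind of duplicate usually comes from /index.php serving the same content as /. A common .htaccess approach for that case (Apache mod_rewrite; a sketch to adapt, not a drop-in fix for this exact site):

```apache
RewriteEngine On
# Only match direct client requests for index.php, not internal rewrites,
# to avoid a redirect loop with the CMS front controller.
RewriteCond %{THE_REQUEST} \s/index\.php[\s?/]
RewriteRule ^index\.php/?$ / [R=301,L]
```
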
Technical SEO | | cgman0 -
Most Common Errors & Warnings
Hello there, I would like to ask for some basic tips regarding commonly found errors and warnings:
Title Element Too Long
Duplicate Page Content
Duplicate Page Title
How could I fix these? Any help would be greatly appreciated. Regards,
Technical SEO | | Bretly0