How do I fix 608s, please?
-
Hi,
I'm on the free trial and finding it very useful. I've fixed all my 301s, but now I have a load of 608s, and I don't know what they are!
I feel like I've cured herpes only to get gonorrhea! Can anyone help? I have 41 608s, which is more than the 301s I had. I hope they're unrelated!
I won't bore you with the whole list, but some of the URLs are:
Error Code 608: Page not Decodable as Specified Content Encoding
http://sussexchef.com/catering-at-mr-mrs-currys-50th-wedding-anniversary/guestsarrive----608
Error Code 608: Page not Decodable as Specified Content Encoding
http://sussexchef.com/funeral-catering/picture4-2----608
Error Code 608: Page not Decodable as Specified Content Encoding
-
Weird... yeah, under normal operations, output buffering shouldn't be on. There probably are legitimate uses for it for more complex sites, but not as a default option.
-
Hello folks
I read this post when I had the 608 issue, but I couldn't figure out what the problem was, so I'm here to share what mine turned out to be.
My site runs on PHP, and it had output buffering turned on with "ob_start()". Since I wasn't really using it, I removed it and any other buffering calls, and the problem stopped. Hope it helps.
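For anyone hitting the same thing, here's a rough illustration of the pattern I think was biting me (the ob_gzhandler part is my guess at the general case, not necessarily what your site does): if PHP compresses the output and the web server is also set up to compress it, the Content-Encoding header can end up not matching the body, which is exactly what the 608 complains about.

    <?php
    // Risky when the host's web server (e.g. mod_deflate) is also gzipping responses:
    ob_start('ob_gzhandler');   // PHP buffers and gzips the output here...
    echo 'page content';
    ob_end_flush();             // ...and a second compression layer on top of it
                                // can leave the body unreadable as the declared encoding.

    // My fix was simply to delete the ob_start()/ob_end_flush() calls and let
    // one layer (the server, or nothing at all) handle compression.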
-
Hi All
So after calling GoDaddy a second time, I got through to a guy who suggested I go into WordPress, open the Permalinks settings, select a setting other than the one that was already selected, and save; then select the one it was on previously and save again.
It fixed all the problems instantly.
I just thought I'd share this in case anyone else has this problem and searches the forum for 608s in the future.
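(For anyone more technical: from what I can tell, re-saving the permalink settings just makes WordPress flush its rewrite rules and regenerate the .htaccess file, which is presumably what cleared whatever bad setting was causing the encoding errors. A developer could do the same thing once in code, something like the hypothetical snippet below, though the admin screen is the safer route.)

    // Hypothetical one-off equivalent of re-saving permalinks in wp-admin;
    // run it once (e.g. from a temporary snippet or mu-plugin) and then remove it.
    add_action('init', function () {
        flush_rewrite_rules();   // rebuilds rewrite rules and rewrites .htaccess
    });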
All the best and thanks for your help
Ben
-
Hi
Yes, the site is behaving really badly in Internet Explorer and Firefox; Chrome seems to recover almost instantly, though.
-
Thanks Dr Pete.
I'm toying with the idea of moving to a new theme, so I may leave it for now. My other issue is my sitemaps: I'm using plugins to generate them, but Google Webmaster Tools doesn't like them. I guess that's something for me to research further.
Thanks again
-
I will add that it's entirely possible that this is a minor, if odd, problem, and Google is crawling the pages fine. You seem to be indexed properly. Fixing it is a nice-to-have, but I doubt it would be worth a big investment unless you've got other issues that need fixing.
-
Do you know what kind of hosting you're running with GoDaddy? Is it Apache, Windows, etc.? I used to do some hosting with them, and I'm trying to remember where that would be set. It depends completely on the web server, though.
-
Hi Dr Pete
It's amazing how much advice you get on the Moz forum; I basically ditched my developer and subscribed to Moz instead. I called GoDaddy, but they couldn't recreate the problem at their end. I even emailed them your reply and they still couldn't help.
I'll take a look around the server settings soon and see if I can figure it out. If I can't, can anyone recommend a web developer? The last two I've had have moved on to other things.
Thank you all for your help so far, it's most kind of you!
Ben
-
Unfortunately, this is a server-side issue, so the fix is completely different depending on your setup. Basically, the server is most likely trying to compress your pages (using something like Gzip) and the settings are probably wrong, so the final encoding isn't quite right.
At first, I was going to say that our crawler might just be finicky on this one, but when I try to load these pages on Google Chrome, I get a temporary error, after which the page loads. This definitely could be causing you some problems.
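If you (or whoever ends up helping you) want to reproduce the symptom outside of Chrome, a rough, hypothetical check in PHP would be to request a page with gzip allowed and see whether the body actually decodes as the headers claim. This is just an illustration, not the exact test our crawler runs:

    <?php
    // Ask for a gzip-encoded response and verify the body really is valid gzip.
    $ctx  = stream_context_create(['http' => ['header' => "Accept-Encoding: gzip\r\n"]]);
    $body = file_get_contents('http://sussexchef.com/', false, $ctx);

    $claimsGzip = (bool) preg_grep('/^Content-Encoding:\s*gzip/i', $http_response_header);
    if ($claimsGzip && @gzdecode($body) === false) {
        echo "Headers say gzip, but the body doesn't decode - that's the 608.\n";
    } else {
        echo "Response decodes as advertised (or isn't gzipped at all).\n";
    }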
I tried to check out your setup with BuiltWith, but it's actually choking on the Gzip errors, too:
http://builtwith.com/sussexchef.com
Step 1 might be to just shut the compression/encoding off, and then try to work out the settings. You're probably going to have to pull in your hosting company and/or developer.
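As a very rough sketch of what "shutting it off" can mean on a typical shared PHP host (this assumes GoDaddy's Linux/Apache hosting and is only an illustration; your developer or host should confirm the right place to do it): make sure only one layer is compressing, for example by switching PHP's own output compression off and letting the web server handle it, or vice versa.

    <?php
    // Illustrative only: disable PHP-level output compression so it doesn't
    // fight with server-level compression (mod_deflate / mod_gzip).
    // Equivalent php.ini / .user.ini setting:  zlib.output_compression = Off
    ini_set('zlib.output_compression', 'Off');   // must run before any output is sent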
-
Hi, yeah, us Brits have a way of getting our point across.
Thanks for your response. Moz says:
608: Home page not decodable as specified Content-Encoding
The server response headers indicated the response used gzip or deflate encoding, but our crawler could not understand the encoding used. To resolve 608 errors, fix your site server so that it properly encodes the responses it sends.
The problem is that I don't understand the issue, so I have no idea how to fix it!
-
Hey SussexChef83!!
LOL "I feel like I've cured herpes only to get gonorrhea!"THAT is some funny stuff!!
Check out this article from Moz about HTTP errors in Crawl Reports.
http://moz.com/help/guides/search-overview/crawl-diagnostics/errors-in-crawl-reports
Not sure if this provides any real help or if it just diagnoses the gonorrhea in detail. Basically, what I gather is that you need to encode the responses your site is sending in a different way than you currently have set. Hope this helps!!