GoDaddy and Soft 404s
-
Hello,
We've found that a website we manage has a list of not-found URLs in Google Webmaster Tools which are "soft 404s" according to Google. I went to the hosting company, GoDaddy, to explain and to see what they could do. As far as I can see, GoDaddy's servers are responding with a 200 HTTP status code - meaning the page exists and was served properly - rather than a true 404 response. They have sort of disowned this as their problem. This is a WordPress site. 1) Has anyone seen this problem before with GoDaddy? Is it a GoDaddy problem? 2) Do you know a way to sort this issue? When I use the command site:mydomain.co.uk, the number of URLs indexed is about right except for 2 or 3 "soft 404" URLs. So I wonder why Webmaster Tools reports so many yet I can't see them all in the index?
-
We haven't tried the plugin yet. The pages that aren't found route to a custom 404 page, so we can see a 302 redirect to that page and then a 200 because the custom page was displayed. Following advice on other forums, we tried forcing the 404 return code before the page is loaded, but this seems to be getting ignored or overwritten by GoDaddy.
I understand some people view the 200 as correct because a page was loaded successfully, but Google does ask for a 404 status when a page is not found.
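For context, the approach we tried was along these lines - a minimal sketch of forcing the status in WordPress before any output is sent (assuming a standard theme, with the snippet added to the theme's functions.php):

<?php
// Force a real 404 status whenever WordPress decides the request matches
// no content, before the custom 404 template is rendered.
add_action( 'template_redirect', function () {
    if ( is_404() ) {
        status_header( 404 );  // sends "HTTP/1.1 404 Not Found"
        nocache_headers();     // stops caches from storing the page as a 200
    }
} );

If the response still comes back as a 302 followed by a 200 even with something like this in place, the redirect to the custom page is presumably happening at the server level before WordPress runs, which is what we want GoDaddy to look at.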
-
Hi again, Al123al! Are you able to provide any info about your CMS? Or did the Redirection plugin recommendation take care of it? If so, please mark Dan's response as a Good Answer.
-
What CMS platform are you using? If you're on WordPress, for example, you can use the Redirection plugin to redirect any non-existent URL to an existing, relevant page.
Alternatively you can do the same with your .htaccess file.
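For example, with Apache's mod_alias the rules can look something like this (a sketch only - the paths are placeholders to adapt to your own URLs):

# Point a removed URL at the closest relevant page that still exists
Redirect 301 /old-page/ /replacement-page/

# Or map a whole removed section onto its new location with a pattern
RedirectMatch 301 ^/old-section/(.+)$ /new-section/$1

Anything you choose not to redirect should be left to return a genuine 404 so Google can drop it from the index.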
-
The URLs don't exist but I can't see a way of having them return a 404.
-
Hi AL123al! Did Dan's response help? We'd love an update.
-
I have a few sites on GoDaddy and haven't seen anything unusual occurring with soft 404s.
-
It depends on the cause - are they a large percentage of the total indexed pages? By the sound of it they're only 2 or 3 from a total of how many?
The solution is usually to check why your pages aren't returning a proper 404 error code if they don't exist, or whether there is an issue with them being redirected somewhere.
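For example, you can check what a missing URL actually returns from the command line (swap in one of the URLs Webmaster Tools is reporting - the path below is just a placeholder):

# Show the status code of each hop, following any redirects
curl -sIL http://www.mydomain.co.uk/some-page-that-does-not-exist/

A correct setup returns a single 404 Not Found; a soft 404 typically shows up as a 302 hop to the custom error page followed by a 200.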
-
Related Questions
-
Disallowing WP 'author' page archives
Hey Mozzers. I want to block my author archive pages, but not the primary page of each author. For example, I want to keep /author/jbentz/ but get rid of /author/jbentz/page/4/. Can I do that in robots.txt by using a * where the author name would be populated? So, basically... my robots file would include something like this... Disallow: /author/*/page/ Will this work for my intended goal... or will this just disallow all of my author pages?
Technical SEO | | Netrepid0 -
Will you get more 'Google juice' if your social links are in your website's header, rather than its footer?
Hi team, I'm in the process of making some aesthetic changes to my website. It's getting quite cluttered, so the main purpose is to clean up its look. I currently have 3 social links in the header, right at the top, and I would really like to move these to the footer to remove some clutter in the header. My concern is that moving them may have an impact on the domain's ranking in Google. Website: www.mountainjade.co.nz We've made some huge gains against our competitors over the past 6 months and I don't want to jeopardise that. Any help would be much appreciated as I'm self-taught in SEO and have learnt through making mistakes. This time however, with Moz, I'd rather get some advice before I make any decisions! Thanks in advance, Jake S
Technical SEO | | Jacobsheehan0 -
Medium-sized forum with 1000s of thin-content gallery pages. Disallow or noindex?
I have a forum at http://www.onedirection.net/forums/ which contains a gallery with 1000s of very thin-content pages. We've currently got these photo pages disallowed for the main Googlebot via robots.txt, but we do allow the Google Images crawler access. Now I've been reading that we shouldn't really use disallow, and instead should add a noindex tag on the page itself. It's a little awkward to edit the source of the gallery pages (and to keep any amends the next time the forum software gets updated). What's the best way of handling this? Chris.
Technical SEO | | PixelKicks0 -
Client error 404
I have got a lot (100+) of 404s. I got more the last time, so I rearranged the whole site. I even changed it from .php to .html. I went to the web host to delete all of the .php files from the main server. Still, after yesterday's crawl I got 404s on my (deleted) .php pages. There are also other links that show an error but aren't there. Maybe those pages were there before the site's remodelling, but I don't think so because .html pages are also affected. How can this be happening?
Technical SEO | | mato0 -
Javascript to manipulate Google's bounce rate and time on site?
I was referred to this "awesome" solution to high bounce rates. It is supposed to "fix" bounce rates and lower them through this simple script. When the bounce rate goes way down, then rankings dramatically increase (interesting study, but not my question). I don't know JavaScript, but simply adding a script to the footer and watching everything fall into place seems a bit iffy to me. Can someone with experience in JS help me by explaining what this script does? I think it manipulates the reporting it does to GA, but I'm not sure. It was supposed to be placed in the footer of the page, and then you sit back and watch the dollars fly in. 🙂
Technical SEO | | BenRWoodard1 -
Blank pages in Google's webcache
Hello all, Is anybody experiencing blank pages in Google's 'Cached' view? I'm seeing just the page background and none of the content for a couple of my pages, but when I click 'View Text Only' all of the content is there. Strange! I'd love to hear if anyone else is experiencing the same. Perhaps this is something to do with the rollout of Google's updates last week?! Thanks, Elias
Technical SEO | | A_Q0 -
404 Errors
Hello Team, I noticed that my site has 1,000s of 404 errors. Not sure how this happened - maybe when I updated our CMS. My question is, should I worry about them? Should I delete them or just leave them alone? Thank you for your feedback!
Technical SEO | | Dallas0 -
Does 'framing' a website create duplicate content?
Something I have not come across before, but hope others here are able to offer advice based on experience: A client has independently created a series of mini-sites, aimed at targeting specific locations. The tactic has worked very well and they have achieved a large amount of well-targeted traffic as a result. Each mini-site is different, but then in the nav, if you want to view prices or go to the booking page, that then links to what at first appears to be their main site. However, you then notice that the URL is actually situated on the mini-site. What they have done is 'framed' the main site so that it appears exactly the same even when navigating through this exact replica site. Checking the code, there is almost nothing there - in fact there is actually no content at all. Below the head, there is a piece of code: <frameset rows="*" framespacing=0 frameborder=0> <frame src="http://www.example.com" frameborder=0 marginwidth=0 marginheight=0> <noframes>Your browser does not support frames. Click here to view.</noframes> </frameset> Given that the main site content does not appear to show in the source code, do we have an issue with duplicate content? The issue is that these 'referrals' are showing in Analytics, despite the fact that the main site's code does not appear in the source, which is slightly confusing for me. They have done this without consultation and I'm very concerned that this could potentially be creating duplicate content of their ENTIRE main site on dozens of mini-sites. I should also add that there are no links to the mini-sites from the main site, so if you guys advise that this is creating duplicate content, I would not be worried about creating a link-wheel if I advise them to link directly to the main site rather than the framed pages. Thanks!
Technical SEO | | RiceMedia0