Apache not directing to 404?
-
I have a PHP website that produces an actual page when /index.php/GarbageURL/MoreDirectories/Page.suffix/DirectoryAgain is typed into a browser.
Why?
How?
For what purpose?
The content and HTML are produced in the source, but the images and CSS are broken because of the file's apparent location, obviously. I don't understand what this default behaviour is for.
-
Hi Rory,
That's not Apache's default behaviour. If the file doesn't exist, Apache would normally return a 404 error. You might have a .htaccess file on your server with a rewrite rule that allows this to happen. It might also be a misconfiguration trying to serve a wrong default 404 page.
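Two common causes are worth checking. First, URLs of the /index.php/extra/segments form can be served even without any rewrites, because Apache passes the trailing segments to index.php as PATH_INFO (AcceptPathInfo is typically enabled for script handlers). Second, a front-controller rewrite rule like the following (a hypothetical .htaccess fragment, shown only as an illustration) routes every request for a non-existent file through index.php, so Apache never gets the chance to return a 404:

```apache
# Hypothetical .htaccess: send any request that doesn't match a real
# file or directory on disk to index.php, which then answers with 200.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^ index.php [L]
```

If you find a rule like this and don't need it, removing it should restore normal 404 behaviour for bogus paths.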
Also, to fix your broken CSS/images/links when the page is served from a subdirectory, you should use URLs relative to the root of your domain: instead of simply using href="css/main.css", use href="/css/main.css".
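A quick way to see why the path-relative references break on those deep URLs is to resolve both styles against the bogus page address (the URLs below are just illustrative stand-ins):

```python
from urllib.parse import urljoin

# The deep, bogus URL the browser is actually on (example domain):
page = "http://example.com/index.php/GarbageURL/MoreDirectories/Page.suffix/"

# A path-relative href resolves against the current (bogus) directory:
broken = urljoin(page, "css/main.css")
# A root-relative href always resolves against the domain root:
fixed = urljoin(page, "/css/main.css")

print(broken)  # http://example.com/index.php/GarbageURL/MoreDirectories/Page.suffix/css/main.css
print(fixed)   # http://example.com/css/main.css
```

The first resolved URL points at a file that doesn't exist, which is exactly why the styling and images disappear on those pages.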
Finally, have you looked at your HTTP headers to see whether this page returns a 404 or a 200 status code? For your default 404 page, you might want to look at Apache's ErrorDocument directive: http://httpd.apache.org/docs/2.0/mod/core.html#errordocument
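You can inspect the status line with `curl -I http://yoursite.com/index.php/GarbageURL/...` or your browser's network tab. As a self-contained sketch of what a correctly configured server should answer (the tiny local server below is purely a stand-in for Apache, not your actual setup):

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_HEAD(self):
        # Only one real page exists; everything else gets a 404,
        # which is what a correctly configured server should return.
        if self.path == "/index.php":
            self.send_response(200)
        else:
            self.send_response(404)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo quiet

# Bind to an ephemeral port; the daemon thread exits with the process.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

def status_of(path):
    conn = http.client.HTTPConnection("127.0.0.1", port)
    conn.request("HEAD", path)
    status = conn.getresponse().status
    conn.close()
    return status

print(status_of("/index.php"))                          # 200
print(status_of("/index.php/GarbageURL/Page.suffix"))   # 404
```

If your bogus URLs come back 200 instead of 404, that confirms a rewrite or misconfiguration is answering for them. For the custom error page, the directive in the link above takes the form `ErrorDocument 404 /your-404-page.html` (path is an example).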
Regards,
Guillaume Voyer.
Related Questions
-
301 redirect effect on SERPs
Hi Moz Community, please can I hit you with a scenario and get your thoughts? We have a client site - clientsite.com - with reasonable rankings for some of our client's target search terms/branded terms. We have built language specific subdomains - it.clientsite.com, de.clientsite.com - which have been manually translated into local languages. These subdomains have robots 'noindex' as we only want to drive traffic to clientsite.com. We've installed a geo location tool on clientsite.com that 301s visitors to the appropriate subdomain, so content is served in their local language. clientsite.com will be the 'catch all' for locations where sub domains have not yet been created. If Google crawls clientsite.com and is 301ed to a sub domain, will we lose SERPS? The sub domains will have the same content (99% the same content anyway) as clientsite.com, but in local languages. Cheers guys. Steve
Intermediate & Advanced SEO | steviechat1
Spike then Drop in Direct Traffic?
We've been doing some SEO work over the last few weeks, and earlier this week we saw a large spike in traffic. Yay, we all thought, but then yesterday the traffic levels returned to pre-celebratory levels. I've been doing some digging to try and find out what was different on Monday and Tuesday this week. Mondays are usually big traffic days for us anyway, but this week's was by far the biggest, and Tuesday was higher still, our best day ever. After some poking, I found that the direct traffic followed the same pattern as our overall traffic levels (image attached). The first spike coincides with an email we sent out that day, but we just don't know where the later spike came from. I understand loosely that direct traffic isn't easily traceable, but can anyone help us understand more about this second spike? Thanks!
Intermediate & Advanced SEO | HB170
When should you 410 pages instead of 404
Hi All, We have approx 6,000 404 pages. These are for categories etc. we don't do anymore, and there is no near replacement, so basically no reason or benefit to have them at all. I can see in GWT that these are still being crawled/found and therefore taking up crawler bandwidth. Our SEO agency said we should 410 these pages. I am wondering what the difference is and how Google treats them differently. Does anyone know when you should 410 pages instead of 404? Thanks, Pete
Intermediate & Advanced SEO | PeteC120
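On the 404-vs-410 distinction that question raises: a 410 ("Gone") explicitly tells crawlers the page was removed deliberately and will not return, whereas a 404 only says it wasn't found, so Google tends to drop 410s from its index somewhat faster. In Apache, retiring known paths with a 410 can be done with mod_alias's `Redirect gone` directive; a hypothetical fragment (the paths are examples only):

```apache
# Hypothetical .htaccess fragment: return "410 Gone" for
# category URLs that have been retired with no replacement.
Redirect gone /old-category/
Redirect gone /discontinued-products/
```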
Webmaster tools 404
Hey, I'm getting a soft 404 error on a webpage that has content and is definitely not a 404. We've redirected a load of URLs to the webpage. The old URLs had parameters which are no longer used by the new URL, and these parameters have been carried over in the redirect. Is this what's causing the soft 404 error, or is there another problem that may need addressing? Also, a canonical has been set on the webpage. Thanks, Luke.
Intermediate & Advanced SEO | NoisyLittleMonkey1
How should I go about repairing 400,000 404 error pages?
My thinking is to make a list of the most linked-to and most trafficked error pages and just redirect those, but I don't know how to get all that data, because I can't even download all the error pages from Webmaster Tools, and even then, how would I get backlink data except by checking each link manually? Are there any detailed step-by-step instructions on this that I missed in my Googling? Thanks for reading!
Intermediate & Advanced SEO | DA20130
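The triage that question describes can be sketched as a small script: given an export of error URLs with hit and backlink counts, keep only the pages worth redirecting and rank them. A minimal sketch with made-up data (the column names `url`, `hits`, and `links` are hypothetical; in practice you'd load a CSV export from your analytics and link tools):

```python
def prioritize_404s(rows, min_hits=10, min_links=1, top_n=100):
    """Rank 404 URLs worth redirecting: keep pages that have real
    traffic or at least one backlink, most valuable first."""
    keep = [r for r in rows
            if int(r["hits"]) >= min_hits or int(r["links"]) >= min_links]
    # Backlinks matter most for preserving equity, then traffic.
    keep.sort(key=lambda r: (int(r["links"]), int(r["hits"])), reverse=True)
    return keep[:top_n]

# Made-up example data standing in for a crawl-error export:
rows = [
    {"url": "/old-page", "hits": "250", "links": "3"},
    {"url": "/typo-url", "hits": "2", "links": "0"},
    {"url": "/old-category", "hits": "40", "links": "0"},
]
for r in prioritize_404s(rows):
    print(r["url"])  # /old-page, then /old-category; /typo-url is dropped
```

Everything below the thresholds can safely stay a 404 (or 410); redirecting all 400,000 is neither necessary nor useful.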
Thousands of 404 Pages Indexed - Recommendations?
Background: I have a newly acquired client who has had a lot of issues over the past few months. What happened is he had a major issue with broken dynamic URLs where they would start infinite loops due to redirects and relative links. His previous SEO didn't pay attention to the sitemaps created by a backend generator, and it caused hundreds of thousands of pages to be indexed. Useless pages. These useless pages were all bringing up a 404 page that didn't have a 404 server response (it had a 200 response), which created a ton of duplicate content and bad links (relative linking). Now here I am, cleaning up this mess. I've fixed the 404 page so it creates a 404 server response. Google Webmaster Tools is now returning thousands of "not found" errors, great start. I fixed all site errors that cause infinite redirects. Cleaned up the sitemap and submitted it. When I search site:www.(domainname).com I am still getting an insane number of pages that no longer exist. My question: How does Google handle all of these 404s? My client wants all the bad pages removed now, but I don't have that much control over that. It's a slow process getting Google to remove these pages that are returning a 404. He is continuously dropping in rankings still. Is there a way of speeding up the process? It's not reasonable to enter tens of thousands of pages into the URL Removal Tool. I want to clean house and have Google just index the pages in the sitemap.
Intermediate & Advanced SEO | BeTheBoss0
Soft 404's from pages blocked by robots.txt -- cause for concern?
We're seeing soft 404 errors appear in our Google Webmaster Tools section on pages that are blocked by robots.txt (our search result pages). Should we be concerned? Is there anything we can do about this?
Intermediate & Advanced SEO | nicole.healthline4
Should I let my Apache server automatically compress site information?
My internet service provider has an option to let Apache compress site information. They give you two options: compress all content, or compress only specific MIME types. Is this good for SEO?
Intermediate & Advanced SEO | Naghirniac0
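On that compression question: compression is generally good for performance, and indirectly for SEO, since page speed is a ranking signal. The "MIME type specific" option matters because already-compressed formats such as JPEG or ZIP gain nothing from another compression pass. A hypothetical mod_deflate fragment for the type-specific option (illustrative only; the exact types to list depend on the site):

```apache
# Hypothetical mod_deflate setup: compress only text-based MIME types,
# leaving already-compressed formats (images, archives) alone.
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css text/plain
    AddOutputFilterByType DEFLATE application/javascript application/json
</IfModule>
```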