Generating 404 Errors but the Pages Exist
-
Hey
I have recently come across an issue where several of a site's URLs are being reported as 404s by bots such as Xenu, SEOmoz, Google Webmaster Tools, etc. The funny thing is, the pages exist and display fine.
This happens on many of the pages that use the MODX CMS, but the index page is fine. The WordPress blog in /blog/ also works fine.
The only thing I can think of is that I have a conflict in the .htaccess file, but troubleshooting this is difficult; every tool I have found online seems useless.
I have tried rolling back to previous versions, but that still does not fix it.
Anyone had any experience of similar issues?
Many thanks
K.
-
FYI, we finally found our error. The short URL turned out to have the same name as the folder (photo-gallery), so once this was changed, WordPress was able to access the correct path. A bit of custom JavaScript had to be amended as well, but that was limited to our own code. Using your web-sniffer.net link we were able to test immediately and fix it fairly quickly. Thank you for your help!
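For anyone who hits the same thing, here is a minimal sketch of how you could scan for this kind of collision, where a physical folder shadows a WordPress permalink slug. The document root and slug list below are hypothetical placeholders, not values from the site discussed above.

```python
import os

# Hypothetical values -- replace with your own document root and permalink slugs.
DOCROOT = "/var/www/example-site"
SLUGS = ["photo-gallery", "about-us", "contact"]

def find_collisions(docroot, slugs):
    """Return slugs that share a name with a real folder under the document root."""
    existing_dirs = {
        entry for entry in os.listdir(docroot)
        if os.path.isdir(os.path.join(docroot, entry))
    }
    return sorted(existing_dirs.intersection(slugs))

if __name__ == "__main__":
    for slug in find_collisions(DOCROOT, SLUGS):
        print(f"Possible conflict: folder '{slug}' shadows the permalink /{slug}/")
```

Renaming either the folder or the slug, as described above, removes the ambiguity the rewrite rules otherwise have to resolve.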
-
That's true, Ryan. I guess it is coding related, really.
Issues like this are a real pain in the ass. And most people don't even check WMT to realise the issues exist. TBH, I don't check as often as I should.
-
I agree with you, Paul.
As you pointed out, one possible cause is a CMS-related issue, which I would refer to as "coding", meaning something in the code used to present the website. Perhaps there is a better way to phrase it, but nothing comes to mind at the moment.
Another possibility you mentioned is LiteSpeed, which would be a server-side issue directly. Either way, it is a legitimate issue which should be addressed.
-
FWIW, I don't think it's a coding issue. If it were coding, it would consistently show either a 200 OK or a 404; it wouldn't serve a 404 only some of the time.
If you're using LiteSpeed, I'd guarantee that is the issue, and if you're using Joomla, that's another prime culprit.
-
Please keep in mind that a 404 error does not mean the page doesn't exist. It means your server is sending a response code indicating that it doesn't exist.
When I installed LiteSpeed on my server, this issue happened over and over again.
I believe Joomla, for example, has some kind of security module that serves a 404 if a single IP requests a page too many times. I remember running Screaming Frog on a friend's Joomla site and tons of 404s were showing up.
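If you want to test for that kind of rate-limiting yourself, here is a minimal sketch that requests one URL repeatedly and prints the status code of each response; if the codes flip from 200 to 404 (or 403) after a burst of requests, a security or throttling module is the likely culprit. The URL and attempt count are placeholders, not values from this thread.

```python
import time
import urllib.error
import urllib.request

# Hypothetical values -- point this at a page on your own site before running it.
URL = "http://www.example.com/some-page/"
ATTEMPTS = 50

def status_of(url):
    """Fetch a URL and return the HTTP status code, even for error responses."""
    try:
        with urllib.request.urlopen(url) as response:
            return response.status
    except urllib.error.HTTPError as err:
        return err.code

for attempt in range(1, ATTEMPTS + 1):
    print(f"request {attempt}: HTTP {status_of(URL)}")
    time.sleep(0.1)  # small delay between requests; lower it to simulate a faster crawler
```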
-
The dev team are looking into it; it must be quite a complex .htaccess issue. We will get to the bottom of it this week and post any findings.
-
Thanks Ryan! I will get it looked at...Sue
-
@DentalID, the same reply I offered to Guy applies to you as well. This is an SEO issue which does need to be fixed. Something on your end is causing the page to return a 403 response code. You really need a programmer to get in there and determine the root cause of the issue. You could try asking your web host if you have managed hosting, but this level of assistance would normally be outside the scope of managed hosting support.
-
Guy,
In looking at the page, this appears to be a legitimate problem. Your server settings allow you to present a page with any header code you wish; you can 301 a page but still present it with a 200 code if you want. Presently it appears the page is being presented fine, but your server is sending a 404 header code.
I can't tell the actual source of the problem other than to say it appears to be on your end and should be fixed. I originally looked at the code with the MozBar but then checked independently with another tool as well: http://web-sniffer.net/
All tools show a 404 header code for the page. This response code is generated by your web server.
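If it helps to see this outside of a browser plugin, here is a minimal sketch of the same check in Python: it fetches a URL and prints the status code separately from the body, which makes it obvious when a server ships a perfectly good page under a 404 header. The URL is a placeholder, not the actual page discussed here.

```python
import urllib.error
import urllib.request

# Hypothetical URL -- substitute the page your crawler reports as an error.
URL = "http://www.example.com/problem-page.html"

try:
    response = urllib.request.urlopen(URL)
    status, headers, body = response.status, response.headers, response.read()
except urllib.error.HTTPError as err:
    # A 404 or 403 lands here, but the server may still have sent a full HTML page.
    status, headers, body = err.code, err.headers, err.read()

print(f"Status code:  {status}")
print(f"Content-Type: {headers.get('Content-Type')}")
print(f"Body length:  {len(body)} bytes")  # a "missing" page can still carry real content
```

A browser will happily render whatever HTML comes back regardless of the status line, which is why the page "displays fine", but crawlers and search engines trust the status code instead.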
-
We are having a similar problem with this URL: http://dentalimplantsportland.com/photo-gallery/ and also the following locations:
http://cosmeticdentistportland.net/photo-gallery/
http://dentalveneersportland.com/photo-gallery/
SEOmoz and Google Webmaster Tools show them as 403 errors, but the pages display fine. I am not able to tell whether this is really a problem for SEO or whether we should rebuild this gallery system, and I would really love your input.
This is WordPress with a Spry gallery...
Thanks so much!
-
It is just a small affiliate site I am looking at - this page creates a 404.
http://www.insure-uk.com/post-office-car-insurance.html
We are currently testing on some beta servers. Hopefully it will be fixed soon, as otherwise the page will lose its indexation.
-
I also see this now and again, but on the next crawl they fix themselves. I assume robots cannot always reach the page, for a number of reasons.
-
Can you offer an example of a URL which is causing this problem?
-
I have had the same issues; I think it is often the bot's problem.
Just to be certain, check that your links are correct and test them manually. Also ensure your sitemap is up to date and that you are not blocking the crawlers with meta robots tags, robots.txt, or some weird stuff in .htaccess.
I have found that renaming pages or moving them will often cause 404 issues with crawlers.
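A quick way to run that kind of sanity check in bulk is sketched below: for each URL it prints the HTTP status code and whether robots.txt allows Googlebot to fetch it. The URLs and user agent here are placeholders, not anyone's real site.

```python
import urllib.error
import urllib.request
import urllib.robotparser
from urllib.parse import urlparse

# Hypothetical values -- swap in the pages your crawler is flagging.
URLS = [
    "http://www.example.com/",
    "http://www.example.com/photo-gallery/",
    "http://www.example.com/blog/",
]
USER_AGENT = "Googlebot"

def status_of(url):
    """Return the HTTP status code for a URL, including error responses."""
    try:
        with urllib.request.urlopen(url) as response:
            return response.status
    except urllib.error.HTTPError as err:
        return err.code

for url in URLS:
    parts = urlparse(url)
    robots = urllib.robotparser.RobotFileParser(f"{parts.scheme}://{parts.netloc}/robots.txt")
    robots.read()
    allowed = robots.can_fetch(USER_AGENT, url)
    print(f"{url} -> HTTP {status_of(url)}, robots.txt allows {USER_AGENT}: {allowed}")
```

Note that this only covers the status code and robots.txt; meta robots tags still need to be checked in the page source.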
-
Related Questions
-
Too many SEO changes needed on a page. Create a new page?
I've been doing some research on a keyword with Page Optimization. I'm finding there are a lot of changes suggested. I'm wondering whether, because of the number of changes required, it is better to create a new page entirely from scratch with all the suggestions implemented, OR to change the current page? Thanks, Chris
Intermediate & Advanced SEO | | Chris29181 -
Open Site Explorer - Top Pages that don't exist / result of a hack(?)
Hi all, Last year a website I monitor got hacked, or infected with malware, I'm not sure which. The result I got to see is hundreds of 'not found' entries in Google Search Console / Crawl Errors for non-existent pages relating to / variations of 'Canada Goose'. There are also a couple of such links showing up in SERPs. Here's an example of the page URLs: ourdomain.com/canadagoose.php ourdomain.com/replicacanadagoose.php I looked for advice on the webmaster forums and was recommended to just keep marking them as 'fixed' in the console; sooner or later they'll disappear. Still, a year later, they appear. I've just signed up for a Moz trial and, in Open Site Explorer -> Top Pages, the top 2-5 pages relate to these non-existent pages: URLs that are the result of this 'Canada Goose' spam attack. The non-existent pages each have around 10 linking root domains, with around 50 inbound links. My question is: is there a more direct action I should take here? For example, informing Google of the offending domains with these backlinks. Any thoughts appreciated! Many thanks
Intermediate & Advanced SEO | | macthing1 -
PDF or HTML Page?
One of our sales team members has created a 25-page Word document as a topical page. The plan was to make this into an HTML page with a table of contents. My thought was: why not make it a PDF? Is there any downside to using a PDF vs. an HTML page? If the PDF was properly optimized, would it perform just as well? The goal is to have folks click back to our products and hopefully buy after reading about how they work.
Intermediate & Advanced SEO | | Sika220 -
Duplicate content within sections of a page but not full page duplicate content
Hi, I am working on a website redesign. The client offers several services, and within those services some elements cross over with one another. For example, they offer a service called Modelling, and when you click onto that page, several elements that build up that service are featured, in this case 'mentoring'. Now mentoring is common to other services and will therefore feature on other service pages. Each page will feature a mixture of content unique to that service and small sections of duplicate content, and I'm not sure how to treat this. One option we have come up with is to take the user through to a unique page hosting all the content; however, some features do not warrant a page being created for them. Another idea is to have the feature pop up with inline content. Any thoughts/experience on this would be much appreciated.
Intermediate & Advanced SEO | | J_Sinclair0 -
Why is my XML sitemap ranking on the first page of Google for hundreds of keywords versus the actual relevant page?
I still need this question answered, and I know it's something I must have changed. Google is ranking my sitemap for hundreds of key terms versus the actual page. It's great to be on the first page, but not my sitemap... Geeeez...
Intermediate & Advanced SEO | | ursalesguru0 -
How do you transition a keyword rank from a home page to a sub-page on the site?
We're currently ranking #1 for a valuable keyword, but the result on the SERP is our home page. We're creating a new product page focused on this keyword to provide a better user experience and create more relevant content. What is the best way to make a smooth transition to make the product page rank #1 for the keyword instead of the home page?
Intermediate & Advanced SEO | | buildasign0 -
Blocking Pages Via Robots, Can Images On Those Pages Be Included In Image Search
Hi! I have pages within my forum where visitors can upload photos. When they upload photos they provide a simple statement about the photo but no real information about the image, definitely not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image search is important to us. The URL structure is like this: domain.com/community/photos/~username~/picture111111.aspx I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results. This would be something like this: User-agent: googlebot Disallow: /community/photos/ Can I disallow Googlebot specifically, rather than just using User-agent: *, which would then allow googlebot-image to pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist in visibility, but the actual act of blocking the pages and getting the images picked up... Is this possible? Thanks! Leona
Intermediate & Advanced SEO | | HD_Leona0 -
Is 404'ing a page enough to remove it from Google's index?
We set some pages to 404 status about 7 months ago, but they are still showing in Google's index (as 404's). Is there anything else I need to do to remove these?
Intermediate & Advanced SEO | | nicole.healthline0