What if a 404 error is not possible?
-
Hi Everyone,
I get a 404 error on my site if the URL is simply wrong, but for some parameters, for example when a page has been deleted or has expired, I get an error page indicating that the ID is wrong, but no 404 error.
It is very difficult for me to program a function in PHP that solves the problem and to modify the .htaccess with mod_rewrite. I have asked the developer of the system to take a look, but I am not sure I will get an answer soon.
I can control the content of the deleted/expired page, but its URL will be very similar to the URLs that are fine (in fact the URL may have been fine before and has simply expired since).
Thinking about solutions: I can set the expired/deleted pages to noindex. Would that help to avoid duplicate title/description/content problems? If a user goes to, for example, mywebsite.com/1-article/details.html and it has expired, I can set the head section to noindex. Would that be good enough?
Another question: is it possible to return these pages as 404s without having to do it directly in the .htaccess, so avoiding the mod_rewrite problems I am having? Some magical tag in the head section of the page?
Many thanks in advance for your help,
Best Regards,
Daniel
-
The pages should not show up at all once they are de-indexed.
-
Hi Takeshi, thanks for the answer again.
Would it prevent the deleted/expired pages from showing up as soft 404s in Webmaster Tools?
-
Ok, sounds like a noindex,follow in the header is the best solution then. That will keep the no-longer-existent pages from being indexed while still preserving any link juice the page may have acquired.
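In practice that is just a conditional meta robots tag in the listing's head. A minimal sketch of what it could look like in a PHP template, assuming the CMS exposes some flag for expired or deleted listings (the $listingIsGone variable here is a placeholder, not the CMS's real API):

```php
<?php
// Placeholder flag: swap in however the CMS reports that this listing
// has expired or been deleted.
$listingIsGone = true;

// noindex drops the page from the index; follow still passes link equity.
$robots = $listingIsGone ? 'noindex,follow' : 'index,follow';
?>
<meta name="robots" content="<?php echo $robots; ?>">
```

Once the expired URLs are recrawled with that tag in place, they should gradually drop out of the index.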
-
Hi Again,
@Takeshi Young: Thanks for your answer.
I will try to explain what is happening a little better.
We are using a CMS for classified ads. The script can generate "SEO friendly" URLs, which are based on mod_rewrite. If a listing has an ID number, let's say "5", that listing's URL will look like this:
http://mydomain.com/5-listingname/details.html
After the listing expires, the URL is no longer valid, and if a user tries to visit the listing, the script delivers a page with a message indicating that the listing is no longer active. The HTTP code is 200 "OK". If the listing is deleted, a user trying to visit the URL gets a similar message, also with an HTTP code of 200. This is a problem, because that page should return a 404 code, telling the search engine that the page is gone.
If a user tries to visit an invalid page, for example:
http://mydomain.com/invalidpage.html
then the system delivers the 404 page that is set in the .htaccess file. But because the script recognises the numeric parameter in the deleted/inactive listing URL, it does not deliver the 404 error but a page with a message, and that page is a soft 404 error, which is bad for SEO.
Repairing the script so that it delivers a proper 404 header is beyond my knowledge, but I can customize the page showing the error message as much as I want.
Then I have two questions:
-
If I set the soft 404 error page to noindex, will that be good enough to keep the site from being affected by the problem?
-
Is there any way of telling the search engine that a page is a 404, other than using the Apache .htaccess? Like a tag in the head section, or any other trick that would help with this problem?
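Worth noting: no tag in the head section can change the HTTP status code, since the status is sent in the response headers before any HTML. But if the error page is rendered by a PHP template that can be edited, the status can be sent from PHP without touching .htaccess at all. A minimal sketch under that assumption (the flag is a placeholder for however the script detects an expired or deleted listing):

```php
<?php
// Placeholder flag: swap in the CMS's real expired/deleted check.
$listingIsGone = true;

if ($listingIsGone) {
    // Must run before any HTML is output, otherwise the headers are already sent.
    http_response_code(410);  // 410 "Gone"; use 404 for "Not Found" instead
    // On PHP older than 5.4: header('HTTP/1.1 410 Gone');
}
?>
<p>Sorry, this listing is no longer active.</p>
```

Visitors still see the friendly message; only the status code changes, which is what stops the URL from being reported as a soft 404.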
Thanks in advance for your help,
Daniel
-
-
Why are these parameters an issue for you? Where are they getting linked from? If it's from a high-authority external site, it may make sense to 301 redirect them. If they're just low-quality sites, it's probably safe to ignore them.
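If the 301 route is taken and .htaccess stays off the table, the redirect can also be issued from PHP before any output is sent. A minimal sketch with a placeholder target URL (for example, pointing an expired listing back at its old category page):

```php
<?php
// Placeholder target: replace with wherever the expired listing should point.
$newUrl = 'http://mydomain.com/some-category/';

header('Location: ' . $newUrl, true, 301);  // 301 = moved permanently
exit;  // stop the script so nothing else is sent after the redirect header
```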
Related Questions
-
Crawl errors - 2,513 not found. Response code 404
Hi,
Technical SEO | JamesHancocks1
I've just inherited a website that I'll be looking after. I've looked in the Search Console in the Crawl Errors section and discovered thousands of URLs that point to non-existent pages on Desktop. There are 1,128 on Smartphone.
Some are odd and make no sense, for example: bdfqgnnl-z3543-qh-i39634-imbbfuceonkqrihpbptd/. Not sure why these are occurring, but what's the best way to deal with them to improve our SEO?
| # | URL | Response | Detected |
| --- | --- | --- | --- |
| 1 | northeast/ | 404 | 8/29/18 |
| 2 | blog/2016/06/27/top-tips-for-getting-started-with-the-new-computing-curriculum/ | 404 | 8/10/18 |
| 3 | eastmidlands | 404 | 8/21/18 |
| 4 | eastmidlands/partner-schools/pingle-school/ | 404 | 8/27/18 |
| 5 | z3540-hyhyxmw-i18967-fr/ | 404 | 8/19/18 |
| 6 | northeast/jobs/maths-teacher-4/ | 404 | 8/24/18 |
| 7 | qfscmpp-z3539-i967-mw/ | 404 | 8/29/18 |
| 8 | manchester/jobs/history-teacher/ | 404 | 8/5/18 |
| 9 | eastmidlands/jobs/geography-teacher-4/ | 404 | 8/30/18 |
| 10 | resources | 404 | 8/26/18 |
| 11 | blog/2016/03/01/world-book-day-how-can-you-get-your-pupils-involved/ | 404 | 8/31/18 |
| 12 | onxhtltpudgjhs-z3548-i4967-mnwacunkyaduobb/ | | |
Cheers.
Thanks in advance,
James.
-
GWT giving me 404 errors based on old and deleted site map
I'm getting a bunch of 404 crawl errors in my Google Webmaster Tools because we just moved our site to a new platform with a new URL structure. We 301 redirected all the relevant pages. We submitted a new sitemap and then deleted all the sitemaps with the old website URL structure. However, Google keeps crawling the OLD URLs and reporting back the 404 errors. It says that the website is linking to these 404 pages via an old, outdated sitemap (which, if you go to it, shows a 404 as well, so it's not as if Google is reading these old sitemaps now). Instead it's as if Google has cached the old sitemap but continues to use it to crawl these non-existent pages. Any thoughts?
Technical SEO | Santaur
-
Duplicate Page Errors
Hey guys, I'm wondering if anyone can help... Here is my issue... Our website:
Technical SEO | TCPReliable
http://www.cryopak.com
It's built on the Concrete5 CMS. I'm noticing a ton of duplicate page errors (9,530 to be exact). Looking at the issues, it seems to be caused by the CMS. For instance, the home page appears to be duplicated: http://www.cryopak.com/en/
http://www.cryopak.com/en/?DepartmentId=67
http://www.cryopak.com/en/?DepartmentId=25
http://www.cryopak.com/en/?DepartmentId=4
http://www.cryopak.com/en/?DepartmentId=66
Do you think this is an issue? Is there any way to fix it? It seems to be happening on every page. Thanks, Jim
-
Error on Magento database 301 bulk update
Hi all, One of my clients has a Magento website and I recently received 404 errors for about 600 links in GWT. I tried to set up the 301 redirects via a bulk upload, but I get errors. It's Magento 1.7 and I have the following columns in the CSV file; the first sample row is included as well.
Technical SEO | sedamiran
| url_rewrite_id | store_id | id_path | request_path | target_path | is_system | options | description | category_id | product_id |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 125463 | 1 | 22342342_54335 | old_link | new_link | 0 | RP | NULL | NULL | NULL |
The error message I receive is below. I was wondering if anyone has tried this before and knows how to fix it. Manual redirection works fine, but this first batch of 600 errors is probably just a start and I'll be getting more 404 errors soon, so somehow I need to figure out how to fix this. I'd appreciate it if anyone who has experience with this could guide me through it. Thanks in advance. Here is the error:
SQL query: INSERT INTO 'mgn_core_url_rewrite' VALUES ( 'url_rewrite_id', 'store_id', 'id_path', 'request_path', 'target_path', 'is_system', 'options', 'description', 'category_id', 'product_id' )
MySQL said: #1452 - Cannot add or update a child row: a foreign key constraint fails ('ayb_mgn2'.'mgn_core_url_rewrite', CONSTRAINT 'FK_101C92B9EEB71CACE176D24D46653EBA' FOREIGN KEY ('category_id') REFERENCES 'mgn_catalog_category_entity' ('entity_id') ON DELETE CASCADE ON)
-
Numerous 404 errors in crawl diagnostics (non-existent pages)
I'm as new as they come to SEO, so please be gentle... I have a WordPress site set up for my photography business. Looking at my crawl diagnostics, I see several 4xx (client error) alerts. These all point to non-existent pages on my site, e.g.:
http://www.robertswanigan.com/happy-birthday-sara/109,97,105,108,116,111,58,104,116,116,112,58,47,47,109,97,105,108,116,111,58,105,110,102,111,64,114,111,98,101,114,116,115,119,97,110,105,103,97,110,46,99,111,109
Totally lost on what could be causing this. Thanks in advance for any help!
Technical SEO | Swanny811
-
Best action to take for "error" URLs?
My site has many error URLs that Google Webmaster Tools has identified as pages without titles. These are URLs such as: www.site.com/page???1234
For these URLs, should I:
1. Add a canonical tag pointing to the correct page (the one being displayed at the error URL)?
2. Add a 301 redirect to the correct URL?
3. Block the pages in robots.txt?
Thanks!
Technical SEO | theLotter
-
How to fix duplicate page content error?
SEOmoz's Crawl Diagnostics is complaining about a duplicate page error. Examples of links that have the duplicate page content error:
http://www.equipnet.com/misc-spare-motors-and-pumps_listid_348855
http://www.equipnet.com/misc-spare-motors-and-pumps_listid_348852
These are not duplicate pages. Some values are different on the two pages, like the listing #, EquipNet tag #, and price. I am not sure how to highlight the things that differ between the two pages, like the "Equipment Tag #" and "listing #". Would the issue be resolved if I used some style attribute to highlight those values on the page? Please help me with this, as I am not really sure why SEOmoz thinks both pages have the same content. Thanks!
Technical SEO | RGEQUIPNET
-
How do crawl errors from the SEOmoz tool set affect rankings?
Hello - The other day I presented the crawl diagnostic report to a client. We identified duplicate page title errors, missing meta description errors, and duplicate content errors. After reviewing the report, we presented it to the client's web company, which operates a closed-source CMS. Their response was that these errors are not worth fixing and in fact are not hurting the site. We are having trouble getting the errors fixed and I would like your opinion on this matter. My question is: how bad are these errors? Should we fix them or not? Will fixing the errors have an impact on our site's rankings? Personally, I think the question is silly. I mean, the errors were found using the SEOmoz tool kit; these errors have to be affecting SEO... right? The attached image shows the result of the Crawl Diagnostics run, which crawled 1,400 pages. NOTE: Most of the errors are coming from pages like blog/archive/2011-07/page-2, /blog/category/xxxxx-xxxxxx-xxxxxxx/page-2, and testimonials/147/xxxxx--xxxxx (xxxx represents information unique to the client). Thanks for your insight!
Technical SEO | Gabe