404 Errors generating in WP
-
Our crawl reports are returning several 404 errors for pages with URLs that look like:
/category/consulting/page/5/
The tag changes, the page number changes, but the result is always the same: A big glaring 404. Our sites are built on WordPress Multi-site, and I am fairly certain this issue is on the WP end, but I can't figure out why it is generating pages out to infinity, essentially, from the tags and categories.
It is worse on some sites than others, but is happening across the board (my initial concern was that it might be a theme issue, but that does not seem to be the case).
If anyone has run into this issue and knows a fix, your insight would be greatly appreciated.
Thanks!
-
This is exactly the approach you want if you run into problems like this. Once you have isolated the bad URLs with Screaming Frog, you can easily pick them out and redirect them, either through your host or through a plugin, depending on where you're hosted and how your site is managed.
-
Here's what to do:
- Crawl the site with Screaming Frog (you can do up to 500 URLs with the free version).
- Look for the suspect pages in your crawl list.
- Click once on the page in the list.
- Below, click "To Links" - this box will show you the page(s) linking to the bad 404 page.
- View the "From" page (which links to the bad page) in your browser.
- Look at the source code for that page - do a "find" and paste in the bad page's URL. This will show you where it's linking from.
- Fix the spot in your theme or template where it is linking to pages that do not exist.
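For the last two steps, a quick command-line alternative to view-source is to pull the referring page and grep for the bad URL. The URL and file below are placeholders for illustration; in practice you would curl the actual "From" page that Screaming Frog reported:

```shell
# In practice, fetch the real referring page:
#   curl -s 'https://example.com/category/consulting/' > referrer.html
# For illustration, write a small stand-in file instead:
cat > referrer.html <<'HTML'
<div class="pagination">
  <a href="/category/consulting/page/5/">Older posts</a>
</div>
HTML

# Print every line (with its number) that links to the 404 URL
grep -n 'category/consulting/page/5' referrer.html
```

The line numbers that grep prints tell you exactly where in the template markup the bad link lives.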
-
I want to first say I apologize for the wording of my first answer; I was using voice recognition and it made a few mistakes. I also agree with what Dan has said. He really knows WordPress and I would take his advice, and the fact that he has endorsed Lynn's answer makes me believe it is the correct one. Screaming Frog SEO Spider is one of the best tools in the world for any type of website, and the paid version can help you easily create 301s to get rid of your 404s.
Here is a great Screaming Frog guide by Seer:
http://www.seerinteractive.com/blog/screaming-frog-guide
I hope this helps,
Thomas
-
Hi Melissa, you've received some great responses. Did any of them help you resolve your issue?
-
I second Lynn's answer. You need to find where the link is coming from to begin with. You could also use Screaming Frog SEO Spider or Webmaster Tools; they will all get you the same thing. Find out where the bad URLs are linked from, and then you can narrow down the source of the bad code, or whatever it may be.
-Dan
-
Yoast's SEO plugin for WordPress will eliminate the /1/ /2/ /3/ page effect.
http://yoast.com/wordpress/seo/
I hope this is of help. You're running multisite through a non-subdomain setup, I'm expecting, correct?
-
Hi Melissa,
It is actually pretty common for WP-based sites to spin out repeated 404 errors (or at least to have the potential to do this). It could be a theme issue, a plugin issue, a setup issue, or a combination of all three, depending on the site.
First thing to do is figure out where the link is coming from. Download your Moz error report as CSV and filter by 404s. On the left you will have the 404 page, and on the far right you will have the referring page. Go to the referring page, view the source, and try to find that link. Keep an eye out for relative links from deep-structure pages as well: if you have page /category/page/5 and the link is a relative href (say href="6" rather than an absolute URL), it can have the effect of spawning repeated 404s like /category/page/5/6, and so on.
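The CSV filtering step can be sketched from the command line. The file below is a small stand-in; a real Moz export has more columns, so the awk field numbers would need adjusting:

```shell
# Stand-in for the Moz crawl error export (real exports have more
# columns; adjust the awk field numbers to match yours):
cat > crawl.csv <<'CSV'
url,status,referrer
/category/consulting/page/4/,200,/category/consulting/
/category/consulting/page/5/,404,/category/consulting/page/4/
/contact/,404,/about/
CSV

# Keep only the 404 rows: bad URL on the left, referring page on the right
awk -F',' '$2 == 404 {print $1, "<-", $3}' crawl.csv
# -> /category/consulting/page/5/ <- /category/consulting/page/4/
#    /contact/ <- /about/
```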
Once you have found the link, you should be able to identify the template part which is producing them and then act accordingly to either edit the template or adjust the settings/plugin so that they stop. If you can give us a real example of a 404 and its linking page we should be able to give you more specific info.
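The relative-link spawning described above can be made concrete with a quick sketch (the paths are illustrative):

```shell
# A bare relative link like href="6" resolves against the directory of
# the page it appears on, so on a deep paginated page it just appends:
base="/category/page/5/"   # the page being viewed (note trailing slash)
rel="6"                    # a bare relative link in the template
echo "${base}${rel}/"      # -> /category/page/5/6/  (a brand-new 404)

# A root-relative link avoids the problem entirely:
echo "/category/page/${rel}/"   # -> /category/page/6/  (the intended page)
```

Each crawl of a spawned page discovers another bare relative link, which is why the 404s appear to go on "out to infinity."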
-
I seriously doubt the issue is with WP itself; perhaps a plugin is conflicting with how the page is rendered. Make sure you have a database and file backup. You could FTP in and rename the plugins folder to _plugins (this will disable all plugins) and check whether the issue persists. Rename the folder back to plugins once you finish testing.
NOTE: with some plugins you will have to manually go in and enable and configure them again after you do this.
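The rename trick can be sketched as follows (wp-demo is a throwaway stand-in path; on a live site you would perform the same two renames inside wp-content over FTP or SSH):

```shell
# Stand-in for a WordPress install root:
mkdir -p wp-demo/wp-content/plugins

# Disable ALL plugins at once by renaming the folder...
mv wp-demo/wp-content/plugins wp-demo/wp-content/_plugins
ls wp-demo/wp-content/    # -> _plugins

# ...check whether the 404s persist, then restore it:
mv wp-demo/wp-content/_plugins wp-demo/wp-content/plugins
ls wp-demo/wp-content/    # -> plugins
```

If the 404s disappear while the folder is renamed, re-enable plugins one at a time to find the culprit.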
-
Hi, 404s are extremely common. I would build a custom 404 page, and I would also use the plugin below, but make sure you understand that it is by no means perfect. You must check the logs consistently and make sure the redirects it creates aren't mistaken; for instance, I used it once and it forwarded my feed back to my homepage.
http://wordpress.org/plugins/redirection/
However, this is a very good plugin that has been progressively getting better, and it is recommended by some of the best people in WP. I strongly recommend using it, and I hope I've helped you; if you need any more help, please let me know.
sincerely,
Thomas
Yes, this will also work if you're using Nginx, a much faster web server than Apache or even LiteSpeed. It is normally very hard to create 301s in an Nginx environment, but this plugin is able to do so. Most managed WordPress hosts are the ones actually running Nginx these days, and with this Redirection plugin they are able to write the correct 301s without you having to spend time making the corrections yourself.
The specialty of this Redirection plugin is catching 404s and redirecting them to the page they should logically point to. For instance, if /about/ were returning a 404 and /about-us/ existed, the plugin can easily figure out that /about-us/ is the correct redirect and make the correction automatically. You can also turn this mode off once you believe 404s are no longer an issue.
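For reference, anyone with direct access to their Nginx config could write the same /about/ to /about-us/ redirect by hand. A rough sketch (the directive placement is illustrative, and managed hosts usually don't expose this file, which is why the plugin matters):

```nginx
server {
    # ... existing listen / server_name / root directives ...

    # Permanent (301) redirect for the renamed page
    location = /about/ {
        return 301 /about-us/;
    }
}
```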
You can also use an Apache-only mode which, depending on what type of web server you're running, might meet your needs better.
I think this is an outstanding plugin and definitely believe it is a huge help if you have any amount of 404s.
I hope I've been of help,
Thomas