How to remove 404 pages in WordPress
-
I used the crawl tool and it returned a 404 error for several pages that I no longer have published in WordPress. They must still be on the server somewhere?
Do you know how to remove them? I don't think they exist as files on the server the way an HTML file would, since WordPress uses a database.
I figure that getting rid of the 404 errors will improve SEO; is this correct?
Thanks,
David
-
Yeah, as others have noted, there is often a live link somewhere else that still points to a page that is now gone.
So the 404 is really about the LINKING page: as long as that link is out there, it will keep pointing at the non-existent page. A 301 can help (see the sketch below), or (this was fun) you can 301 the incoming 404 link BACK to the linking page itself.
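If you go the 301 route in WordPress, a minimal sketch is below. The paths are hypothetical placeholders, and a redirects plugin can do the same job without any code.

```php
<?php
// A minimal sketch, not a drop-in solution: 301 removed URLs to replacements.
// The paths below are hypothetical examples; this could live in a small plugin
// or the theme's functions.php.
add_action( 'template_redirect', function () {
    // Only act on requests WordPress could not match to existing content.
    if ( ! is_404() ) {
        return;
    }

    // Map removed URLs to the pages that replace them (example paths only).
    $redirects = array(
        '/old-page/'    => '/new-page/',
        '/old-product/' => '/products/',
    );

    // Normalize the requested path to a trailing-slash form for lookup.
    $path = untrailingslashit( (string) wp_parse_url( $_SERVER['REQUEST_URI'], PHP_URL_PATH ) ) . '/';

    if ( isset( $redirects[ $path ] ) ) {
        wp_safe_redirect( home_url( $redirects[ $path ] ), 301 ); // permanent redirect
        exit;
    }
} );
```

The point is that the 301 lives on your site, so any stale link out there lands somewhere useful instead of on an error page.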
Hee hee... yeah, not such a good idea, but a tactic we did have to use about four years ago to get a spam directory to buzz off!
-
Hey David
Once you publish a page or post in WordPress and submit a sitemap, those URLs are out in the wild. I've run into this a lot since I use WordPress often. Once you trash a page and delete it permanently, it isn't stored anywhere in the WordPress CMS; the URLs just return 404s because the pages existed and now no longer do.
As stated above, just make sure you are not linking to your trashed page anywhere on your site.
I've done a couple of things with 404 pages on my WordPress sites:
1. Make an awesome 404 page so that people stay on the site if they land on it by accident. Google will eventually stop crawling the 404s, so this is a good way to keep users engaged in the meantime (a bare-bones template sketch follows after this list).
2. 301 redirect the 404s to relevant pages. This helps keep your link juice and also helps the user experience, since visitors reach a relevant page.
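As a rough illustration of point 1, here is a bare-bones 404.php sketch for a theme. The markup and copy are placeholders, not anything specific from this thread, and most themes already ship a 404 template you can customize instead.

```php
<?php
/**
 * A bare-bones 404.php sketch for a WordPress theme. The copy is placeholder
 * text; WordPress loads this template automatically for unmatched requests
 * and already sends the HTTP 404 status for you.
 */
get_header();
?>

<main class="error-404">
    <h1>Sorry, that page is gone.</h1>
    <p>It may have been removed. Try a search, or head back to the homepage.</p>

    <?php get_search_form(); // built-in WordPress search form ?>

    <p><a href="<?php echo esc_url( home_url( '/' ) ); ?>">Return to the homepage</a></p>
</main>

<?php
get_footer();
```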
Hope that helps!
-
404s are a natural part of websites, and Google understands that. As long as you aren't linking to pages on your own site that are 404ing, you're fine. So basically, just make sure your website is not the source of your 404s.
-
Anything you type after your domain that isn't an actual page will return a not-found error; it doesn't mean the page still exists somewhere. [Try entering yourdomain.com/anythingyouwant and you will get a 404.] Or am I misunderstanding the question? In any case, 404 errors are not necessarily bad for SEO, as long as they are not harming the user experience.
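If you want to double-check what a URL really returns, here is a quick sketch using WordPress's HTTP API (example.com is a placeholder); your browser's dev tools or curl -I will tell you the same thing.

```php
<?php
// A quick sketch for confirming the HTTP status a URL actually returns.
// example.com is a placeholder; run this inside WordPress, for example with
// WP-CLI: `wp eval-file check-status.php`.
$urls = array(
    'https://example.com/a-real-page/',
    'https://example.com/anythingyouwant', // non-existent path: expect a 404
);

foreach ( $urls as $url ) {
    $response = wp_remote_head( $url );

    if ( is_wp_error( $response ) ) {
        echo $url . ' -> request failed: ' . $response->get_error_message() . "\n";
        continue;
    }

    echo $url . ' -> HTTP ' . wp_remote_retrieve_response_code( $response ) . "\n";
}
```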