Custom 404 Page Indexing
-
Hi - We created a custom 404 page based on SEOmoz recommendations. But... the page seems to be receiving traffic via organic search. Does it make more sense to set this page to "noindex" via its meta tag?
-
Sorry that I missed this response! You want to find a server header checker tool (a Google search on those keywords will turn up several) and then run your 404 page through it. That will tell you whether that page is truly serving a 404.
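If you'd rather check it yourself than hunt for a tool, a few lines of Python will do the same job. This is just a sketch: the URL below is a placeholder, so swap in a made-up address on your own domain so that the custom 404 page is the one answering the request.

```python
# Stand-in for a server header checker: request the URL and print the
# HTTP status code the server actually returns.
import urllib.error
import urllib.request

# Placeholder URL; use a non-existent path on your own domain so the
# custom 404 page is triggered.
url = "https://www.example.com/this-page-does-not-exist"

try:
    with urllib.request.urlopen(url) as response:
        # urlopen only reaches this branch for 2xx responses
        print(f"{url} returned {response.status} (the page is NOT serving a 404)")
except urllib.error.HTTPError as e:
    # 4xx and 5xx responses raise HTTPError; e.code holds the status
    print(f"{url} returned {e.code}")
```

If this prints 200, the "404 page" is really a normal page in Google's eyes, which would explain why it is being indexed and picking up organic traffic.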
-
This is a great idea that I missed. Any hints or links on setting this up correctly for a basic HTML site? We are old school.
-
Have you checked the response code of your custom 404 page and made sure that it's returning a 404 and not a 200?
-
Hi sftravel,
Good question.
There has to be some kind of keyword cannibalization happening if your custom 404 page is really ranking higher than the page you've optimized for those keywords.
In short... yes, noindex your 404 page. A 404 error page showing up as a search result does nothing but harm your site's reputation, as it more than likely makes visitors think the site is ill-maintained.
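If you do add the noindex, it's worth confirming the directive actually appears in the HTML the server sends back. Here's a rough Python sketch; the URL is a placeholder, and it assumes the directive lives in a robots meta tag rather than an X-Robots-Tag header.

```python
# Fetch a page and report any robots meta tag it carries, so you can
# confirm the noindex directive made it into the served HTML.
import urllib.error
import urllib.request
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content attribute of any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.robots_content = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.robots_content = attrs.get("content", "")

# Placeholder URL; point it at your custom 404 page.
url = "https://www.example.com/404.html"

try:
    with urllib.request.urlopen(url) as response:
        body = response.read().decode("utf-8", errors="replace")
except urllib.error.HTTPError as e:
    # Error responses (including 404s) still carry a body we can parse.
    body = e.read().decode("utf-8", errors="replace")

parser = RobotsMetaParser()
parser.feed(body)
print("robots meta tag:", parser.robots_content or "none found")
```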
The more important question is this:
**Is the 404 page itself ranking and receiving the traffic, or is there a page on your site that has moved and is now yielding the 404 error?**
I'd suggest using the Queries and Landing Pages sections under Traffic Sources (Search Engine Optimization) in Google Analytics to track down which search queries are leading to your 404 page. Google Webmaster Tools will work too. I'm almost positive that the problem is simply a page that was ranking and has since moved. If so, a simple 301 redirect to the correct URL should solve it without harming the page's rankings or decreasing organic traffic. Traffic may actually improve, as I'm sure the bounce rate for clicks from that specific query will decrease.
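Once a redirect like that is in place, you can sanity-check it by requesting the old URL without following redirects and confirming it answers with a 301 that points at the right destination. A minimal sketch, with placeholder URLs:

```python
# Request the old URL without following the redirect, then inspect the
# status code and Location header.
import urllib.error
import urllib.request

# Placeholders; substitute the moved page and its intended new address.
OLD_URL = "https://www.example.com/old-page.html"
EXPECTED_TARGET = "https://www.example.com/new-page.html"

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None tells urllib not to follow the redirect, so the 3xx
    # response surfaces as an HTTPError we can inspect below.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)

try:
    opener.open(OLD_URL)
    print(f"{OLD_URL} did not redirect at all")
except urllib.error.HTTPError as e:
    location = e.headers.get("Location")
    print(f"{OLD_URL} returned {e.code}, Location: {location}")
    if e.code == 301 and location == EXPECTED_TARGET:
        print("Permanent redirect to the expected URL; looks good")
```

A 302 or a chain of redirects here is worth fixing too, since a single clean 301 passes the old page's equity most reliably.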
Hope this helps.
Anthony
-
Related Questions
-
No meta description pulling through in SERP with React website - Requesting Indexing & Submitting to Google with no luck
Hi there, A year ago I launched a website using React, which has caused Google to not read my meta descriptions. I've submitted the sitemap and there was no change in the SERP. Then I tried "Fetch and Render" and requested indexing for the homepage, which did work; however, I have over 300 pages and I can't do that for every one. I have requested a fetch, render and index for "this url and linked pages," and while Google's cache has updated, the SERP listing has not. I looked in the Index Coverage report in the new GSC and it says the URLs are valid and indexable, and yet there's still no meta description. I realize that Google doesn't have to index all pages, and that Google may also choose not to use your meta description, but I want to make sure I do my due diligence in making the website crawlable. My main questions are:
1. If Google didn't reindex ANYTHING when I submitted the sitemap, what might be wrong with my sitemap?
2. Is submitting each URL manually bad, and if so, why?
3. Am I simply jumping the gun, since it's only been a week since I requested indexing for the main URL and all the linked URLs?
4. Any other suggestions?
-
What happens if I 301 Redirect my homepage to a different page on site
If I were to 301 redirect the index page of my website to a page in a different subdirectory of my site, would that adversely affect SEO? Does your home page need to be in the root of your site? I'm asking because a developer has told me it would be best to do that, since he needs to install OpenCart in the root of our domain...
-
Redirecting duplicate pages
For whatever reason, X-Cart creates duplicates of our categories and articles, so that we have URLs like www.k9electronics.com/dog-training-collars and www.k9electronics.com/dog-training-collars/, or http://www.k9electronics.com/articles/anti-bark-collar and http://www.k9electronics.com/articles/anti-bark-collar/. Now our SEO guy says that we don't have to redirect these because Google is "smart enough" to know they are the same, and that we should "leave it as-is". However, everything I have read online says that Google sees this as duplicate content and that we should redirect to one version or the other (with or without the trailing slash), depending on which one most of our internal links already point to, which is with a slash. What should we do? Redirect or leave it as is? Thanks!
-
Google also indexed trailing slash version - PLEASE HELP
Hi Guys, We redesigned the website and somehow our canonical extension decided to add a trailing slash to all URLs. Previously our canonical URLs didn't have a trailing slash. During the redesign we didn't change the URLs; they remained the same, but we now have two versions indexed, one with a trailing slash and one without. I've now fixed the issue and removed the trailing slash from the canonical URLs. Is this the correct way of fixing it? Will our rankings be affected in a negative way? Is there anything else I need to do? The website went live last Tuesday. Thanks
-
Flag page elements to not be loaded by Instapaper and co.
Does anybody know if there is a way to mark certain elements (especially navigation menus) so that Instapaper and co. don't pull them? I'm looking for a quick solution (best would be if it was CSS-based), nothing fancy like parsing the user agent. That would be plan B. I've added role="navigation", id="navigation" and class="navigation" to the nav elements in the hope that it would work. It seems like it does not: sometimes the elements are present in the page generated by Instapaper, sometimes not. Thank you for any replies and have a great day! Jan
-
Google indexing Quickview popups
Hi Guys, I can't seem to find any info on this. Maybe you can help. We are using X-Cart as our shopping cart. When you land on a product page you have the option to "Quickview" the item. Google is picking up the Quickview URLs and the vote-on-product URLs. I have added the following to the robots.txt file but am not sure if this will work. Any help on this would be great.
Disallow: /?popup=Y
Disallow: /?mode=add
Undesired URL examples:
http://www.funlove.com/store/6_Pack_Shooter_Beer_Belt/?mode=add_vote&vote=60
http://www.funlove.com/store/6_pack_shooter_beer_belt/?popup=Y
-
Best way of conserving link juice from non-important pages
If I have a bunch of non-important pages on my website which are of little use in the search engines' index (e.g. contact-us pages, or pages which are near-duplicates and conflict with keywords targeting other pages), what is the best way of retaining the link juice that would normally be passed to these pages? The most recent discussion I have read says that with nofollow you effectively just lose link juice rather than conserving it, so that doesn't seem a great option. If I noindex these pages, would that conserve the link juice within the site, or again would it just be lost? It seems quite a tricky situation: many pages are legitimate for customer usability but are not worth having in the search engines' index, and you'd be better off consolidating link juice, so it seems you get penalised for making something "for users". Thanks
-
Do Pages That Rearrange Set Off Any Red Flags for Google?
We have a broad content site that includes crowdsourced lists of items. A lot of the pages allow voting, which causes the content on the pages (sometimes the content is up to 10 pages deep) to completely rearrange, and therefore spread out and switch pages often among the (up to 10) pages of content. Now, could this be causing any kind of duplicate content issue or any other kind of red flag for Google? I know that the more a page changes the better, but if it's all the same content being moved up and down constantly, could Google think we're pulling some kind of "making it look like we have new content" scheme and ding us for these pages? If so, what would anyone recommend we do? Let's take an example of a list of companies with bad customer service. We let the internet vote them up and down all the time, and the order changes depending on the votes in real time. Is that page doomed, or does Google see it and love it?