Custom 404 Page Indexing
-
Hi - We created a custom 404 page based on SEOMoz recommendations. But... the page seems to be receiving traffic via organic search. Does it make more sense to set this page to "noindex" via its meta tag?
-
Sorry that I missed this response! You want to find a server header checker tool (a Google search on those keywords will turn up several) and then run your 404 page's URL through it. That will tell you whether the page is truly serving up a 404.
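If you'd rather check from a script than a web tool, a few lines of Python do the same job. A minimal sketch using only the standard library; the URL is a hypothetical placeholder you'd swap for a page that doesn't exist on your own site:

```python
import urllib.request
import urllib.error

def response_code(url):
    """Return the HTTP status code the server actually sends for a URL."""
    try:
        with urllib.request.urlopen(url) as response:
            # Reaching here for a missing page means your "404" page is a soft 404
            return response.status  # e.g. 200
    except urllib.error.HTTPError as e:
        return e.code  # 404 is what you want to see for a missing page

# Hypothetical URL - point this at a page that does not exist on your site
print(response_code("http://www.example.com/this-page-does-not-exist"))
```

If this prints 200, the custom 404 page is being served as a normal page, and search engines will treat it as one.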
-
This is a great idea that I missed. Any hints / links on setting this up correctly for a basic HTML site? We are old school.
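For reference, if the site is plain HTML served by Apache (an assumption - confirm with your host), the custom 404 page is usually wired up with a single ErrorDocument line in the site's .htaccess file. A sketch with a hypothetical filename:

```apache
# Serve /404.html with a real 404 status whenever a page is not found.
# Use a local path here: a full http:// URL makes Apache redirect
# and return a 302 instead of a 404.
ErrorDocument 404 /404.html
```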
-
Have you checked the response code of your custom 404 page and made sure that it's returning a 404 and not a 200?
-
Hi sftravel,
Good question.
There has to be some kind of keyword cannibalization happening if your custom 404 page is really ranking higher than the page you've optimized for those keywords.
In short... yes. Noindex your 404 page. A 404 error page showing up as a search result does nothing but harm your site's reputation, as it more than likely makes users think the site is ill-maintained.
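For reference, the noindex itself is one meta tag in the head of the custom 404 page:

```html
<!-- Tells search engines not to index this page -->
<meta name="robots" content="noindex">
```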
The more important question is this:
**Is the 404 page itself ranking and receiving the traffic, or is there a page on your site that has moved and is now yielding the 404 error?**
I'd suggest using the Queries and Landing Pages sections under Traffic Sources (Search Engine Optimization) in Google Analytics to track down which search queries are leading to your 404 page. Webmaster Tools will work too. I'm almost positive that it's simply a page that was ranking and has since moved that's causing the problem. If so, a simple 301 redirect to the correct URL should solve the problem without harming the page's ranking or decreasing organic traffic. Traffic may actually improve, since the bounce rate for clicks from that query should drop.
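If the site runs on Apache (an assumption - other servers use different directives), that 301 is a single line in .htaccess. A sketch with hypothetical paths:

```apache
# Permanently redirect the moved page's old URL to its new location
Redirect 301 /old-page.html http://www.example.com/new-page.html
```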
Hope this helps.
Anthony