Any idea why this is reporting a 404 in Moz Tools?
-
I did away with a vague category and 301-redirected the category URL to the home page. However, the link is being reported as a 404 in Moz Tools when it scans my site. Here's the link, and as you can see it redirects to the home page. Just curious if I did something wrong. Thanks.
-
Totally agree with Ryan: you should redirect https to http.
Trivial maybe, but I can see no reason not to do it. It takes no effort and makes your site even more accessible to users who mistype, or who key your address directly into the URL bar while on an https site, etc.
*This was in response to Ryan's post which started "Perhaps I am being too picky..." I accidentally clicked the wrong reply button!!
-
Hi Rick,
While I know that Ryan decided to drop the https:// protocol problem, I just wanted to explain why it could be an issue for you.
It may not be a concern for a lot of personal sites, but for a site serving a strongly recognized brand it definitely should be. That recognition could come from your online visibility and/or from your involvement in offline communities or activities. Basically, if you have a reputation or a following, people who know of your site will be much more likely to type your URL straight into a browser to go there.
I, for example, have come to know and love noahsdad.com through your involvement here in SEOmoz Q&A. I've visited the site, love your work, and from time to time it crosses my mind to drop in and see what Noah has been up to lately. Since I know the site's domain, when that happens I click inside the address field at the top of my browser and replace everything after the www of whatever site is open with noahsdad.com.
Now, if the page I had open in my browser when I did that happened to be using the https:// protocol and I didn't realize it (which often happens), I would actually be asking my browser to go to https://www.noahsdad.com... and I think you can now see why this could be an issue for you.
Hope that helps
Sha
Thumbs up for the catch too, Ryan!
-
It could be that Moz Tools is looking at a cached version. You could try fetching it in Google Webmaster Tools to see how Google sees it (which is what you should really worry about).
But as far as I can see, you have a 301 in place for that category to your home page: www.webconfs.com/http-header-check.php?submit=submit&url=http://noahsdad.com/mom-md
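If you want to double-check it yourself outside of that tool, here is a minimal sketch using Python's requests library (assuming you have it installed). It asks for the old category URL without following redirects, so you see exactly what the server returns:

```python
import requests

# Request the old category URL without following redirects,
# so we see the status code the server itself sends back.
url = "http://noahsdad.com/mom-md"
response = requests.head(url, allow_redirects=False)

print(response.status_code)              # 301 means the redirect is in place
print(response.headers.get("Location"))  # should point at the home page
```

If that prints 301 with your home page as the Location, the redirect itself is fine and the 404 is more likely a stale crawl. (Some servers answer HEAD differently than GET, so swap in requests.get if the result looks odd.)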
-
I would need to see the full record from the crawl report in order to respond. Perhaps you can upload the record to a web server and share the link?
-
No worries.
My real question is still why the redirect is showing up as a 404 when the site is crawled. I'd still be interested in figuring that out if you have any thoughts.
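If it helps, here's a rough way to check it (a small sketch with Python's requests library, just for illustration): it follows the redirect chain the way a crawler would and prints each hop, so anything returning a 404 along the way should show up.

```python
import requests

# Follow redirects the way a crawler would and print every hop,
# to see whether anything in the chain actually returns a 404.
url = "http://noahsdad.com/mom-md"
response = requests.get(url, allow_redirects=True)

for hop in response.history:
    print(hop.status_code, hop.url)        # each intermediate redirect
print(response.status_code, response.url)  # the final destination
```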
Thanks.
-
I am going to let this issue drop since it is such a small item for a personal site. A few last thoughts:
**WordPress automatically redirects to the non-https version.**
No, it does not. On the SEOmoz site, you are seeing the proper redirect. For example, if you take this Q&A post and prefix it with https:// you will wind up on this exact Q&A post in the http:// protocol. If you look at the MozBar, you will see the redirect.
On your site, you are taken to another page from your hosting company.
**I don't know many personal sites that pay for an https cert**
There is no need for you to purchase an SSL certificate. That is not what I was suggesting.
I apologize for bringing this trivial matter up. Please disregard.
-
Ryan,
Thanks for your feedback. WordPress automatically redirects to the non-https version.
Regarding https sites, I don't know many personal sites that pay for an https cert (or even why there would be a need). Also, I don't link to anything with https. It doesn't really seem necessary (unless I'm missing something).
Heck, I just tried to go to https://seomoz.com and their site doesn't even go there.
-
Perhaps I am being too picky, especially for a non-business site.
I would point out that even though your site is noahsdad.com, you took the effort to redirect www.noahsdad.com to noahsdad.com, right? Why did you take that extra step?
Whatever the response, the same concept would apply to the redirect from https protocol to http. The issue may never come up, but then again it only requires minimal effort to close this gap.
-
Yah, but my site is http://noahsdad.com/ - should I expect https to also work?
-
Try going to the following address: https://noahsdad.com
-
Yep, the crawl just happened today.
Also, you'll have to excuse my ignorance, but I'm not sure what you mean by the last half of your comment: "my site does not handle https protocol well"?
Can you explain that to me? (I'm still learning.)
Thanks for taking the time to help by the way.
-
Are you certain you crawled the site after the redirect was in place?
If so, can you share the full record?
By the way, while looking at this issue I noticed your site does not handle the https protocol well. Try using it on your home page and see what happens. If your site does not use https, I would suggest redirecting all https requests to their http equivalent.
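If you'd like to test that without a browser, here is a rough sketch using Python's requests library (just an illustration, not anything official): it requests your home page over https and reports what comes back. A certificate error, or anything other than a redirect to the http version, points to the gap I'm describing.

```python
import requests

# Request the home page over https and see how the server responds.
# A certificate error, or a response other than a redirect back to
# the http version, suggests https requests are not being handled.
try:
    response = requests.get("https://noahsdad.com/",
                            allow_redirects=False, timeout=10)
    print(response.status_code, response.headers.get("Location"))
except requests.exceptions.SSLError as error:
    print("SSL problem:", error)
except requests.exceptions.ConnectionError as error:
    print("Connection problem:", error)
```

On most hosting setups the actual redirect is a small rewrite rule, which is why I call it a minimal-effort fix.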