404 Errors in WMT
-
Currently my website has about 10,000 404 errors because WordPress is adding /feed/ to the end of every URL on my site. Should I restrict /feed/ in robots.txt?
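For reference, the rule I'm considering would look something like this (I understand the `*` wildcard is an extension supported by Googlebot rather than part of the original robots.txt standard, and that Disallow only stops crawling — it doesn't fix the underlying 404s or remove already-indexed URLs):

```text
User-agent: *
Disallow: /feed/
Disallow: /*/feed/
```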
-
Yeah, I'll have my developer look at the issue for me. Thank you.
-
Thanks for the info via direct message. As far as I know, those /feed/ URLs should not return 404s. For example, I checked my own site:
http://www.evolvingseo.com/2014/08/15/hiring-evolver-number-one/feed/ - and that returns a 200 OK.
I am not sure why WordPress would be doing this, to be honest. Do you have a developer working with you? Or, if it's a theme issue, you could contact the theme vendor about it.
-
Hi There
As mentioned above, it would be optimal to see an example - or if you can't share the site, just a generic example. It may be that WordPress is adding feed URLs where they don't need to be, so we'd need to take a good look.
-
Good morning!
Before you go cutting 10,000 404s, I personally would try to address why you're getting 404 errors for your RSS feed in the first place.
Having 10,000 errors is a broad number: plenty of those could be duplicates, and some of them are probably not related to the RSS feed at all. If there is anything I have learned in SEO, it's that I can almost never use broad strokes when painting, and if I do, I have to be absolutely SURE what my brush is covering. A little while back, Matt Cutts made a video about RSS feeds and the benefit they can have for websites. They are not as important as the blog itself, but still, it's a nice feature that you can take advantage of if you already have it.
The reason I bring this up: if you make the broad move to restrict /feed/, how do you know for certain that you aren't cutting off other pages that have helped?
I don't know enough about your website to truly advise, but I would take a look at all of those errors: put them into a spreadsheet, first get rid of all the duplicates, and then pull out all of the /feed/ 404s to get as specific a number as possible.
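If you export the crawl errors from GWMT to a spreadsheet or CSV, that dedupe-and-split step can be sketched in a few lines of Python (the sample URLs below are hypothetical; point it at your own export instead):

```python
# Split a list of 404 URLs into /feed/ and non-/feed/ buckets,
# removing duplicates first, so you know the real size of the problem.

def split_404s(urls):
    # Deduplicate and drop blank rows from the export.
    unique = sorted(set(u.strip() for u in urls if u.strip()))
    # A URL counts as a feed URL if its path ends in /feed (with or
    # without a trailing slash).
    feed = [u for u in unique if u.rstrip("/").endswith("/feed")]
    other = [u for u in unique if u not in feed]
    return feed, other

if __name__ == "__main__":
    # Hypothetical sample of exported crawl errors (duplicates included).
    errors = [
        "http://example.com/post-1/feed/",
        "http://example.com/post-1/feed/",
        "http://example.com/old-page/",
        "http://example.com/post-2/feed/",
    ]
    feed_404s, other_404s = split_404s(errors)
    print(len(feed_404s), len(other_404s))  # prints: 2 1
```

That gives you the true count of feed-related 404s versus everything else, which is the number you actually need before deciding on a blanket rule.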
Look at your referrals in GWMT and GA and see if your RSS feed is bringing you any traffic or referrals at all. If it isn't helping, then I think you can return a 410 code for /feed/, although, as was pointed out to me, there really isn't much benefit to using a 410 over just letting the 404s die on their own.
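If you do decide to serve a 410 and you're on Apache, a sketch of the .htaccess rule might look like the following (the [G] flag is mod_rewrite's "Gone" flag; this is a generic sketch, so test it on a staging copy first, since a bad rewrite rule can take down the whole site):

```text
# Hypothetical .htaccess sketch: answer any /feed/ request with 410 Gone.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteRule ^(.*/)?feed/?$ - [G,L]
</IfModule>
```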
Hope that helps a little!
Related Questions
-
Help! How to Remove Error Code 901: DNS Errors (But to a URL that doesn't exist!)
I have 2 urgent errors saying there are 2 x error code 909's detected. These don't link to any page - but I can tell there is a mistake somewhere - I just don't know what needs changing. http://www.justkeyrings.co.ukhttp/www.justkeyrings.co.uk/printed-promotional-keyrings http://www.justkeyrings.co.ukhttp/www.justkeyrings.co.uk/blank-unassembled-keyrings Could someone help please?
Technical SEO | FullSteamBusiness -
GWT giving me 404 errors based on old and deleted site map
I'm getting a bunch of 404 crawl errors in my Google Webmaster Tools because we just moved our site to a new platform with a new URL structure. We 301 redirected all the relevant pages, submitted a new sitemap, and then deleted all the sitemaps for the old URL structure. However, Google keeps crawling the OLD URLs and reporting back the 404 errors. It says the website is linking to these 404 pages via an old, outdated sitemap (which, if you go to it, returns a 404 as well, so it's not as if Google is reading these old sitemaps now). Instead, it's as if Google has cached the old sitemap and continues to use it to crawl these non-existent pages. Any thoughts?
Technical SEO | Santaur -
Error report in Bing Evaluated size of HTML....
Hi, whilst checking Bing's SEO Analyzer I got this error message for our page www.tidy-books.co.uk/childrens-bookcases: "Evaluated size of HTML is estimated to be over 125 KB and risks not being fully cached. (Issue marker for this rule is not visible in the current view)" Just wondering what needs to be done about it and what it actually means? Thanks
Technical SEO | tidybooks -
Unfindable 404s
So I have noticed that my site has some really strange 404s that are only being linked to from internal links on the site. When I go to the pages that Webmaster Tools suggests, I can't actually find the link that is pointing to the 404. In that instance, what do you do? Any help would be much appreciated 🙂
Technical SEO | Adamshowbiz -
Does it really matter to set up 301 redirects for not-found error pages?
I have a very simple question about not-found error pages. Is it really necessary to set up a 301 redirect for every not-found error page detected in Google Webmaster Tools? Honestly, I don't want to set up 301 redirects except for pages with external links pointing at them. So, what impact will following this process have on ranking?
Technical SEO | CommercePundit -
404s and duplicate content
I have real estate websites that add new pages when new listings come on the market and delete pages when a property is sold. My concern is that a significant number of 404s are created, and the listing pages that are added will be the same as those of others in my market who use the same IDX provider. I could go with a different IDX provider that uses an iframe, which doesn't create new pages, but when I used an iframe before, my time on site was 3 min with 2.5 pages per visit, and now it's 6+ min with 7.5 pages per visit. The new pages add fresh content daily, so which is better: fresh content and better on-site metrics (with the 404s), or fewer 404s and no duplicate content but weaker on-site metrics? Any thoughts on this issue? Any advice would be appreciated.
Technical SEO | AnthonyLasVegas -
Impact of "restricted by robots" crawler error in WT
I have been wondering about this for a while now with regard to several of my sites. I am getting a list of pages that I have blocked in the robots.txt file. If I restrict Google from crawling them, then how can it consider their existence an error? In one case, I have even removed the URLs from the index. Do you have any idea of the negative impact associated with these errors, and how do you suggest I remedy the situation? Thanks for the help
Technical SEO | phogan -
404 Errors
Hello team, I noticed that my site has thousands of 404 errors. I'm not sure how this happened; maybe it was when I updated our CMS. My question is: should I worry about them? Should I delete them or just leave them alone? Thank you for your feedback!
Technical SEO | Dallas