404 Errors in WMT
-
My website currently has about 10,000 404 errors in Webmaster Tools because WordPress is adding /feed/ to the end of every URL on my site. Should I restrict /feed/ in robots.txt?
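For reference, restricting feeds in robots.txt would look like the fragment below. Note that a Disallow only stops crawlers from fetching those URLs going forward; it does not fix the 404s themselves or remove the errors already reported in Webmaster Tools (the wildcard form is an assumption based on Google's pattern-matching support):

```
User-agent: *
Disallow: /feed/
Disallow: /*/feed/
```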
-
Yeah, I'll have my developer look at the issue for me. Thank you.
-
Thanks for the info via direct message. As far as I know, those /feed/ URLs should not return 404s. I checked my own site as an example:
http://www.evolvingseo.com/2014/08/15/hiring-evolver-number-one/feed/ - and that returns a 200 OK.
I'm not sure why WordPress would be doing this, to be honest. Do you have a developer working with you? Or if it's a theme, you could contact the theme vendor about it.
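For anyone who wants to spot-check this themselves, here is a small Python sketch. The helper names (`feed_url`, `check_status`) are my own, not from the thread; it simply derives the /feed/ variant of a post URL the way WordPress normally forms it and fetches its HTTP status:

```python
from urllib.request import Request, urlopen


def feed_url(post_url):
    """Return the WordPress /feed/ variant of a post URL."""
    return post_url.rstrip("/") + "/feed/"


def check_status(url, timeout=10):
    """Fetch a URL and return its HTTP status code (hypothetical helper)."""
    req = Request(url, headers={"User-Agent": "feed-checker"})
    try:
        with urlopen(req, timeout=timeout) as resp:
            return resp.getcode()
    except Exception as exc:
        # urllib raises HTTPError for 404s; the status is on .code
        return getattr(exc, "code", None)


# Example usage (requires network access):
#   check_status(feed_url("https://example.com/some-post"))
# On a healthy WordPress install this should come back as 200.
```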
-
Hi there,
As mentioned above, it would be optimal to see an example - or if you can't share the site, just a generic example. It may be that WordPress is adding feed URLs where they don't need to be, so we'd need to take a good look.
-
Good morning!
Before you go cutting 10,000 404s, I would personally try to address why you're getting 404 errors for your RSS feed.
Having 10,000 errors is a broad number; plenty of those could be duplicates, and some are probably not related to the RSS feed at all. If there is anything I have learned in SEO, it's that I can almost never use broad strokes when painting, and if I do, I have to be absolutely SURE what my brush is covering. A little while back Matt Cutts made a video about RSS feeds and the benefit they can have for websites. They are not as important as the blog itself, but it's still a nice feature you could take advantage of if you already have it.
The reason I bring this up: if you broadly restrict /feed/, how do you know for certain that you aren't cutting off other pages that have helped?
I don't know enough about your website to truly advise, but I would take a look at all of those errors, put them into a spreadsheet, first remove all duplicates, and then pull out all of the /feed/ 404s to get as specific a number as possible.
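As a sketch of that triage step in Python (the function name and the exact matching rule are my own assumptions, not part of the answer):

```python
def triage_404s(urls):
    """Dedupe a list of crawl-error URLs and split out the /feed/ ones.

    Hypothetical helper for the spreadsheet step described above:
    preserves first-seen order, strips whitespace, and treats any URL
    whose path ends in /feed (with or without a trailing slash) as a
    feed URL.
    """
    unique = list(dict.fromkeys(u.strip() for u in urls))
    feed = [u for u in unique if u.rstrip("/").endswith("/feed")]
    other = [u for u in unique if u not in feed]
    return feed, other
```

Paste your Webmaster Tools export in as a list of URLs and the two counts tell you how much of the 10,000 is actually the feed problem versus everything else.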
Look at your referrals in GWMT and GA and see if your RSS feed is bringing you any traffic or referrals at all. If it isn't helping, you could return a 410 code for /feed/, although, as was pointed out to me, there really isn't much benefit to a 410 over just letting the 404s die on their own.
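If you do decide the feeds aren't worth keeping, the 410 can be returned server-side. On an Apache server with mod_rewrite (an assumption; check what your host runs), that might look like this in .htaccess:

```apache
# Sketch: answer 410 Gone for any URL ending in /feed/
# (assumes Apache with mod_rewrite enabled)
RewriteEngine On
RewriteRule ^(.*/)?feed/?$ - [G,L]
```

The `[G]` flag is mod_rewrite's shorthand for a 410 Gone response.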
Hope that helps a little!
Related Questions
-
Googlebot crawl error: JavaScript method is not defined
Hi all, I have this problem that has been a pain in the ****. I get tons of crawl errors in my logs from "Googlebot" saying a specific JavaScript method does not exist. I then go to the affected page and test it in a web browser, and the page works without any JavaScript errors. Can someone help with resolving this issue? Thanks in advance.
Technical SEO | | FreddyKgapza0 -
Once on https, should Moz still be picking up errors on http?
Hello, Should Moz still be picking up http errors if the site's on https? Or has the https migration not been done properly? I'm getting duplicate errors among other things. Cheers, Ruth
Technical SEO | | Ruth-birdcage1 -
404 Errors for Form Generated Pages - No index, no follow or 301 redirect
Hi there, I wonder if someone can help me out and provide the best solution for a problem with form-generated pages. I have blocked the search results pages from being indexed using the noindex tag, and I wondered if I should take the same approach for the following pages. I have seen a huge increase in 404 errors since the new site structure and forms went live. Every time a form is filled in, a new page is generated, which only Google Search Console is reporting as a 404. Whilst some 404s can be explained and resolved, I wondered what is best to prevent Google from crawling pages like this: mydomain.com/webapp/wcs/stores/servlet/TopCategoriesDisplay?langId=-1&storeId=90&catalogId=1008&homePage=Y
1. Implement a 301 redirect using rules, so that all these pages redirect to the homepage. Whilst in theory this protects any linked-to pages, it does not resolve why GSC is recording them as 404s in the first place, and 100,000+ redirected links might look spammy to Google.
2. Place a noindex tag on these pages too, so they will not get picked up, in the same way the search result pages are not being indexed.
3. Block them in robots.txt. This would prevent any result pages from being crawled, which would improve the crawl time currently being taken up. However, I'm not entirely sure the block is possible: I would need to block anything after the domain/webapp/wcs/stores/servlet/TopCategoriesDisplay?. Hopefully this is possible?
The noindex tag will take time to set up, as it needs to be scheduled with the development team, but the robots.txt change would be a quicker fix as it can be done in GSC. I really appreciate any feedback on this one. Many thanks
Technical SEO | | Ric_McHale0 -
To avoid errors in our Moz crawl, we removed subdomains from our host. (First we tried 301 redirects, also listed as errors.) Now we have backlinks all over the web that are broken. How bad is this, from a pagerank standpoint?
Our Moz crawl kept telling us we had duplicate page content even though our subdomains were redirected to our main site. (Pages from wineracks.vigilantinc.com were 301 redirected to vigilantinc.com/wineracks.) Now, to solve that problem, we have removed the wineracks.vigilantinc.com subdomain. The error report is better, but now we have broken backlinks - thousands of them. Is this hurting us worse than the duplicate content problem?
Technical SEO | | KristyFord0 -
What to do with 404 errors when you don't have a similar new page to 301 to?
Hi, If you have 404 errors for pages where you don't have similar content pages to 301 them to, should you just leave them (the 404s are optimised/good quality with related links and branding etc.) so they eventually get de-indexed since they no longer exist, or should you use 'remove URL' in GWT? Cheers, Dan
Technical SEO | | Dan-Lawrence0 -
Remove 404 errors
I've got a site (www.dikelli.com.au) that has some 404 errors. I'm using Dreamweaver to manage the site, which was built for me, but I can't seem to figure out how to remove the 404 pages as they're not showing up in the directory. How would I fix this up?
Technical SEO | | sterls0 -
Webmaster tools...URL Errors
Hi mozzers, Quick question: what's the best thing to do about URL errors in Webmaster Tools? They are all 404s pointed at from external sites, many of them junk spam sites. Should I mark them as "fixed" or just leave them? I'm hoping Google is aware it's out of my control if spam sites want to link to 404s on my site. Peter
Technical SEO | | PeterM220 -
Funky 404 error on reports
The report is showing a 404 error where a URL is being appended to the end of the address. It does not show up on the website or in the Sitemap, so I am wondering if I am missing something or if it is a system error?
Technical SEO | | ccbseo0