What's worse: 404 errors or a huge .htaccess file?
-
We have changed our site architecture pretty significantly and now have many fewer pages (albeit with more robust content and focused linking).
My question is: what should I do about all the 404 errors? (Keep in mind, I am only finding these in Bing Webmaster Tools, not Moz or GWT.)
Is it worse to have all those 404 errors (hundreds of them), or to have a massive .htaccess file full of redirects for pages that are only getting hits from the Bing crawler?
Any insight would be great.
Thanks
-
It's not ideal to have a .htaccess file so massive that it slows down your page load time significantly. But if the pages you removed have inbound links that matter, you'll likely want to preserve that SEO value and handle them properly with 301 redirects.
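For illustration, here's a minimal sketch of what those redirects can look like in .htaccess (the paths are invented placeholders, not your actual URLs):

```apache
# One Redirect line per retired URL: old path -> new permanent home.
# "301" tells crawlers the move is permanent, so link equity follows.
Redirect 301 /old-services-page.html /services/
Redirect 301 /2012/blog-post-title/ /blog/blog-post-title/
Redirect 301 /products/widget-old.html /products/widget/
```

Each line is cheap on its own; the concern only arises when thousands of them force Apache to scan the file on every request.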
My $0.02: Test!
Do a page-load test with the .htaccess file temporarily removed, and then another one with it on and live. If there's no significant difference in load time, you should be okay.
We have sites with hundreds or even thousands of lines in the .htaccess file and they run pretty quickly.
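One quick way to run that comparison from the command line (www.example.com is a placeholder; curl's -w option prints the total request time):

```sh
# Time the homepage five times; run once with the .htaccess rules live,
# then again with the file temporarily renamed, and compare the numbers.
for i in 1 2 3 4 5; do
  curl -o /dev/null -s -w '%{time_total}s\n' https://www.example.com/
done
```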
That said, here's why 404 pages aren't ideal to serve:
According to Rand Fishkin's Moz blog post, "Are 404 Pages Always Bad for SEO?" (http://moz.com/blog/are-404-pages-always-bad-for-seo):

"When faced with 404s, my thinking is that unless the page:
A) Receives important links to it from external sources (Google Webmaster Tools is great for this),
B) Is receiving a substantive quantity of visitor traffic,
and/or C) Has an obvious URL that visitors/links intended to reach,
it's OK to let it 404."
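If you do let some pages 404, it's worth serving a helpful custom 404 page rather than the bare server default. A minimal .htaccess sketch (assuming you've created a /404.html page with site navigation and a search box; the filename is a placeholder):

```apache
# Serve a helpful custom page instead of the bare server error.
# The path must be site-relative so Apache still returns a true 404 status
# (a full URL here would turn it into a 302 redirect instead).
ErrorDocument 404 /404.html
```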
According to Moz's guide to redirection best practices (http://moz.com/learn/seo/redirection), you want to use a 301 redirect to indicate that the content has moved permanently.
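When hundreds of old URLs share a common structure, you can often collapse them into a handful of pattern rules instead of one line each, which keeps the .htaccess file small. A sketch using mod_alias's RedirectMatch (the URL patterns are invented for illustration):

```apache
# Collapse a whole retired blog archive into its new section with one
# regex rule instead of hundreds of individual Redirect lines.
RedirectMatch 301 ^/2012/(.*)$ /blog/$1

# Retired pages with no one-to-one equivalent can all point at the
# closest relevant landing page.
RedirectMatch 301 ^/archive/.*$ /blog/
```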
Finally, here's a post describing how to plan a more SEO-friendly migration, with a great infographic:
http://moz.com/blog/achieving-an-seo-friendly-domain-migration-the-infographic

Hope this helps!
Thanks,
-- Jeff -
I think the 404 errors would be more of a concern. A large .htaccess shouldn't be a problem, especially if you're only talking about a few hundred redirects.