Site architecture change - +30,000 404s in GWT
-
So recently we decided to change the URL structure of our online e-commerce catalogue, to make it easier to maintain in the future.
But since the change, we have (partially expected) 30K+ 404s in GWT. When we made the change, I set up 301 redirects based on our Apache server logs, but the number has just escalated.
Should I be concerned about "plugging" these 404s, either by removing them via the URL removal tool or by carrying on with 301 redirects? It's quite labour-intensive, and most of these URLs have no incoming links, so is there any point?
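For reference, here's a rough sketch of how redirect candidates can be pulled from an Apache access log, for anyone tackling the same problem (it assumes the standard combined log format, and the log path is a placeholder):

```python
import re
from collections import Counter

LOG_PATH = "/var/log/apache2/access.log"  # placeholder; point at your own log

# Matches the request path and status code in Apache's combined log format,
# e.g.: 1.2.3.4 - - [date] "GET /old-catalogue/widget HTTP/1.1" 404 512 ...
LINE_RE = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+" (\d{3}) ')

counts = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LINE_RE.search(line)
        if match and match.group(2) == "404":
            counts[match.group(1)] += 1

# The most-hit 404 paths are the best candidates for a 301 redirect;
# anything that never appears in the log is probably safe to let go.
for path, hits in counts.most_common(50):
    print(f"{hits:6d}  {path}")
```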
Thanks,
Ben
-
Hi Ben,
The answer to your question boils down to usability and link equity:
- Usability: Did the old URLs get lots of Direct and Referring traffic? E.g., do people have them bookmarked, type them directly into the address bar, or follow links from other sites? If so, there's an argument to be made for 301 redirecting the old URLs to their equivalent, new URLs. That makes for a much more seamless user experience, and increases the odds that visitors from these traffic sources will become customers, continue to be customers, etc.
- Link equity: When you look at a Top Pages report (in Google Webmaster Tools, Open Site Explorer, or ahrefs), how many of those most-linked and/or best-ranking pages are old product URLs? If product URLs are showing up in these reports, they definitely need a 301 redirect to an equivalent new URL so that link equity isn't lost (see the sketch just after this list for one way to spot them in bulk).
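As a rough illustration of that check, if you export the Top Pages report to CSV, a few lines of scripting will flag old product URLs that still carry links. The filename, column names, and URL pattern below are hypothetical - match them to whatever your export and site actually use:

```python
import csv

OLD_PREFIX = "/old-catalogue/"  # hypothetical old URL pattern; adjust to your site

# Assumes a CSV export with "URL" and "Linking Root Domains" columns
# (column names vary by tool; check your export's header row).
with open("top_pages.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if OLD_PREFIX in row["URL"] and int(row["Linking Root Domains"]) > 0:
            # This old URL still has external links, so it needs a 301.
            print(row["URL"], "-", row["Linking Root Domains"], "linking domains")
```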
However, if (as is common with a large number of ecommerce sites) your old product URLs got virtually zero Direct or Referring traffic and had virtually zero deep links, then letting the URLs go 404 is just fine. I seem to remember a link churn report from the early days of LinkScape which found that something on the order of 80% of the URLs they had discovered would be 404 within a year. URL churn is a part of the web.
If you decide not to 301 those old URLs, then you simply want to serve a really consistent signal to engines that they're gone and not coming back. JohnMu from Google recently suggested that there's a tiny difference in how Google treats 404 versus 410 response codes: 404s are often re-crawled (which leads to those 404 error reports in GWT), whereas a 410 is treated as a more "permanent" indicator that the URL is gone for good, so 410s are removed from the index a tiny bit faster. Read more: http://www.seroundtable.com/google-content-removal-16851.html
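On Apache, a 410 can be served with mod_alias's "Redirect gone /old-path" directive or mod_rewrite's [G] flag. Once that's in place, a quick audit along these lines (just a sketch using the requests library; the URL list filename is a placeholder) can confirm the retired URLs are all sending that consistent signal:

```python
import requests

# One retired URL per line; placeholder filename - e.g., an export from GWT.
with open("retired_urls.txt", encoding="utf-8") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    # HEAD keeps the check cheap; don't follow redirects, since a 301 here
    # would mean the URL is mapped to a new one rather than retired.
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status != 410:
        print(f"{status}  {url}  <- expected 410")
```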
Hope that helps!
-
Hi,
Are you sure these old URLs are not being linked from somewhere (probably internally)? Maybe the sitemap.xml was forgotten and is still pointing to all the old URLs? For 404s to show up in GWT there generally needs to be a link to them from somewhere, so as a first step, go to the 404s in GWT and look at where they are linked from (you can do this with Moz reports as well). If an internal page like a sitemap, or some forgotten menu/footer feature, is still linking to the old pages, then you certainly want to clear that up! Once you have fixed any internal linking issues, you should have a significantly reduced list of 404s and can then deal with the remainder on a case-by-case basis (assuming they are being triggered by external links).
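If you want to test the sitemap theory quickly, here's a rough sketch that pulls sitemap.xml and flags any entries that no longer resolve (it assumes a single, non-index sitemap, and the domain is a placeholder):

```python
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder; use your domain
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status >= 400:
        # A stale sitemap entry - Google will keep recrawling and reporting it.
        print(f"{status}  {url}")
```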
Hope that helps!