Site architecture change - +30,000 404s in GWT
-
So recently we decided to change the URL structure of our online e-commerce catalogue - to make it easier to maintain in the future.
But since the change we have a (partly expected) 30K+ 404s in GWT. When we made the change I was setting up 301 redirects based on our Apache server logs, but the number has just escalated.
Should I be concerned about "plugging" these 404s, either by removing them via the URL removal tool or by carrying on with the 301 redirects? It's quite labour intensive, and most of these URLs have no incoming links, so is there any point?
Thanks,
Ben
-
Hi Ben,
The answer to your question boils down to usability and link equity:
- Usability: Did the old URLs get lots of Direct and Referring traffic? E.g., do people have them bookmarked, type them directly into the address bar, or follow links from other sites? If so, there's an argument to be made for 301 redirecting the old URLs to their equivalent, new URLs. That makes for a much more seamless user experience, and increases the odds that visitors from these traffic sources will become customers, continue to be customers, etc.
- Link equity: When you look at a Top Pages report (in Google Webmaster Tools, Open Site Explorer, or ahrefs), how many of the most-linked and/or best-ranking pages are old product URLs? If product URLs are showing up in these reports, they definitely need a 301 redirect to an equivalent, new URL so that link equity isn't lost (see the sketch after this list for one way to handle those redirects in bulk).
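If the new structure maps predictably onto the old one, a handful of pattern-based rewrite rules can often cover thousands of old URLs and take most of the labour out of the 301 work. A minimal sketch for Apache, assuming mod_rewrite is available and using made-up /catalogue/ and /products/ paths; swap in your real old and new patterns:

```apache
RewriteEngine On

# Pattern-based 301: /catalogue/1234-some-product -> /products/some-product
# (captures the slug after the numeric ID and drops the ID)
RewriteRule ^/?catalogue/\d+-([^/]+)/?$ /products/$1 [R=301,L]

# One-off exceptions that don't fit the pattern can still get individual rules
RewriteRule ^/?catalogue/clearance\.html$ /products/clearance [R=301,L]
```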
However, if (as is common with a large number of ecommerce sites) your old product URLs got virtually zero Direct or Referring traffic and had virtually zero deep links, then letting those URLs go 404 is just fine. I seem to remember a link churn report from the early days of LinkScape reporting that something on the order of 80% of the URLs they had discovered would be 404 within a year. URL churn is a part of the web.
If you decide not to 301 those old URLs, then you simply want to serve a really consistent signal to engines that they're gone and not coming back. JohnMu from Google recently suggested that there's a small difference in how Google treats 404 versus 410 response codes: 404s are often re-crawled (which leads to those 404 error reports in GWT), whereas a 410 is treated as a more "permanent" indicator that the URL is gone for good, so 410s are removed from the index a little faster. Read more: http://www.seroundtable.com/google-content-removal-16851.html
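If you do go down that route, Apache can return the 410 for the whole retired section with a single rule. Another small sketch, again using a hypothetical /catalogue/ path and assuming nothing live is still served under it:

```apache
RewriteEngine On

# Return 410 Gone for everything under the retired path
# (the G flag sends a 410 response and stops further rewriting)
RewriteRule ^/?catalogue/ - [G]
```

mod_alias's "Redirect gone /catalogue" does the same job if you'd rather not touch mod_rewrite.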
Hope that helps!
-
Hi,
Are you sure these old URLs are not being linked to from somewhere (probably internally)? Maybe the sitemap.xml was forgotten and still points to all the old URLs? For 404s to show up in GWT there generally needs to be a link to them from somewhere, so as a first step go to the 404 report in GWT and look at where each URL is linked from (you can do this with Moz reports as well). If an internal page like a sitemap, or some forgotten menu/footer feature, is still linking to the old pages, then you certainly want to clear that up. Once you have fixed any internal linking issues you should have a significantly reduced list of 404s and can then deal with the remainder on a more case-by-case basis (assuming they are being triggered by external links).
Hope that helps!