Killing 404 errors on our site in Google's index
-
When we moved a site across to Magento, redirects were naturally a large part of the job: ensuring all the old product and category URLs linked up correctly with the new site structure.
However, we came up against an issue where we needed to add, delete, then re-add products. This, coupled with a misunderstanding of the CSV upload processing, meant that although the old URLs redirected, some of the new Magento URLs changed and then didn't redirect:
For example:
mysite/product
would get deleted, re-added, and become:
mysite/product-1324
We now know what we did wrong, so we can make sure it doesn't happen again if we delete and re-add a product. But Google still holds all these old URLs in its index, which means people search for products on Google, click through, then land on the 404 page - far from ideal.
We had assumed that, with continual sitemap updates and time, Google would realise and update the URLs accordingly. But this hasn't happened - we are still getting plenty of 404 errors on certain product searches. (These aren't appearing in SEOmoz; there are no links to the old URLs anywhere on the site. Only Google's index still contains them.)
Aside from going through and finding the products affected (no easy task) and setting up redirects for each one, is there any way we can tell Google 'these URLs are no longer a thing, forget them and move on, let's make a fresh start and Happy New Year'?
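For anyone else stuck at the "finding the products affected" step, one option is to pull every Googlebot request that 404'd out of the server access logs. A minimal sketch, assuming the standard Apache/Nginx combined log format (the access.log path is a placeholder):

```python
import re
from collections import Counter

# Matches the standard Apache/Nginx "combined" log format.
# Assumption: your server logs in this format; adjust the regex if not.
LOG_LINE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_404s(log_path):
    """Count 404 responses served to Googlebot, keyed by request path."""
    hits = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as f:
        for line in f:
            m = LOG_LINE.match(line)
            if m and m.group("status") == "404" and "Googlebot" in m.group("agent"):
                hits[m.group("path")] += 1
    return hits

if __name__ == "__main__":
    # "access.log" is a placeholder path; point it at your real log.
    for path, count in googlebot_404s("access.log").most_common(50):
        print(f"{count:6d}  {path}")
```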
-
No canonical back to the main product page?
-
Both helpful replies, thanks. Further investigation led me to this Magento bug:
http://www.magentocommerce.com/bug-tracking/issue/?issue=13662
(You need a Magento account to see the bug report.)
It seems there's a separate underlying issue which we need to fix first: the rewrite table grows every time we reindex Magento, creating a new URL for every configurable product (i.e. a product with one or more associated products sharing the same name, used for displaying different sizes and colours). This means Google is picking up a new page for each configurable product after every reindex: different URL, same content, same product SKU - a technical SEO nightmare!
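To see how far the duplication has gone, you can query the rewrite table directly for products that have accumulated multiple rewrites. A minimal sketch, assuming Magento 1's stock core_url_rewrite schema (the connection details are placeholders):

```python
import mysql.connector  # pip install mysql-connector-python

# Connection details are placeholders; point them at your Magento database.
conn = mysql.connector.connect(
    host="localhost", user="magento", password="secret", database="magento"
)

# In the stock Magento 1 schema, core_url_rewrite holds one row per rewrite,
# with product_id set for product URLs. Some multiplicity is normal (one row
# per store view and per category path), so look for the outliers at the top.
QUERY = """
    SELECT product_id, COUNT(*) AS rewrites
    FROM core_url_rewrite
    WHERE product_id IS NOT NULL
    GROUP BY product_id
    HAVING COUNT(*) > 1
    ORDER BY rewrites DESC
    LIMIT 50
"""

cur = conn.cursor()
cur.execute(QUERY)
for product_id, rewrites in cur.fetchall():
    print(f"product {product_id}: {rewrites} rewrites")
cur.close()
conn.close()
```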
-
Hey Sean
This should take care of itself, but there are a few things you can do to help.
**1.** First, using WebBug or a similar tool, make sure the page is returning an HTTP 404 or 410 status code. A page can display some kind of 404-like message while still sending a 200 back to Google, and Google will only update its index and remove the URLs if it actually receives a 4XX code (see the sketch after this list for a quick way to check in bulk).
**2.** Then, you can log into Webmaster Tools and manually remove URLs from your site:
Webmaster Tools > Optimisation > Remove URLs
Alternatively, you could always just manually add some 301 redirects for those pages, which may be the quickest way to sort this out and certainly provides the best experience for any users clicking on those links in the SERPs.
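For point 1, here's a minimal sketch to bulk-check the status code each URL actually returns, rather than trusting what the page looks like (urls.txt is a placeholder: one URL per line):

```python
import requests  # pip install requests

def check_status(urls):
    """Report the HTTP status code each URL actually returns.

    allow_redirects=False so a 301 shows up as a 301 rather than
    as the status of whatever page it redirects to.
    """
    for url in urls:
        try:
            status = requests.get(url, allow_redirects=False, timeout=10).status_code
        except requests.RequestException as exc:
            status = f"error: {exc}"
        # Anything that isn't a 404/410 (or an intentional 301) is a
        # candidate "soft 404" that Google will happily keep indexed.
        print(f"{status}  {url}")

if __name__ == "__main__":
    with open("urls.txt") as f:
        check_status(line.strip() for line in f if line.strip())
```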
Hope that helps!
Marcus
-
It's a complex thing. Not sure if this will help you or not:
Example meta tag
Add the following meta tag to the HTML source of your page:

```html
<meta http-equiv="expires" content="Mon, 27 Sep 2010 14:30:00 GMT">
```
Related Questions
-
How to get a large number of URLs out of Google's index when there are no pages to noindex?
Hi, I'm working with a site that has created a large group of URLs (150,000) that have crept into Google's index. If these URLs actually existed as pages, which they don't, I'd just noindex them and over time the number would drift down. The thing is, they were created through a complicated internal linking arrangement that adds affiliate code to the links and forwards them to the affiliate. Googlebot would crawl a link that looks like it points to the client's own domain and wind up on Amazon or somewhere else with some affiliate code. Googlebot would then grab the original link on the client's domain and index it... even though the page served is on Amazon or somewhere else. Ergo, I don't have a page to noindex. I have to get this 150K block of cruft out of Google's index, but without actual pages to noindex, it's a bit of a puzzler. Any ideas? Thanks! Best... Michael P.S. All 150K URLs seem to share the same URL pattern... exampledomain.com/item/... so /item/ is common to all of them, if that helps.
-
Removing Parameterized URLs from Google Index
We have duplicate eCommerce websites, and we are in the process of implementing cross-domain canonicals. (We can't 301 - both sites are major brands.) So far this is working well - rankings are improving dramatically in most cases. However, what we are seeing in some cases is that Google has indexed a parameterized page on the site being canonicaled (the site that is getting the canonical tag - the "from" page). When this happens, both sites are being ranked, and the parameterized page appears to be blocking the canonical. The question is: how do I remove canonicaled pages from Google's index? If Google doesn't crawl the page in question, it never sees the canonical tag, and we still have duplicate content. Example: A. www.domain2.com/productname.cfm%3FclickSource%3DXSELL_PR is ranked at #35, and B. www.domain1.com/productname.cfm is ranked at #12. (Yes, I know that upper case is bad. We fixed that too.) Page A has the canonical tag, but page B's rank didn't improve. I know there are no guarantees that it will improve, but I am seeing a pattern: page A appears to be preventing Google from passing link juice via the canonical. If Google doesn't crawl page A, it can't see the rel="canonical" tag. We likely have thousands of pages like this. Any ideas? Does it make sense to block the "clickSource" parameter in GWT? That kind of scares me.
-
Should you allow an auto dealer's inventory to be indexed?
Due to the way most auto dealership websites populate inventory pages, should you allow inventory to be indexed at all? The main benefit is more content. The problem is that it creates duplicate or near-duplicate content. It also creates a ton of crawl errors, since the inventory turnover is so short and fast. I would love some help on this. Thanks!
-
Refocusing a site's content
Here's a question I was asked recently, and I can really see it going either way, but want to double-check my preference. The site has been around for years and over that time expanded its content into a variety of areas that are not really core to its mission, income, or themed content. These jettisonable other areas have a fair amount of built-up authority but don't really contribute anything to the site's bottom line. The site is considering what to do with these off-theme pages, and the two options seem to be: leave them in place, but make them hard for users to find, thus preserving their authority as inlinks to other core pages. Or... just move on and 301 the pages to whatever is half-way relevant. The 301-the-pages camp seems to believe that focusing the site's existing/remaining content on three or four narrower areas will have benefits for what Google sees the site as being about. So, instead of being about 12 different things that aren't too related to each other, the site will be about 3 or 4 things that are kind of related to each other. Personally, I'm not eager to let go of the old pages, because they do produce some traffic and have some authority value to help the core pages via in-context and navigation links. On the other hand, maybe focusing more would have search benefits. What do you think? Best... Darcy
-
Will pages irrelevant to a site's core content dilute SEO value of core pages?
We have a website with around 40 product pages. We also have around 300 pages for individual ingredients used in the products, and on top of that we have some 400 pages for individual retailers which stock the products. The ingredient pages have the same basic short info about each ingredient, and the retailer pages just have the retailer's name, address and contact details. The question is: should I add noindex to all the ingredient and/or retailer pages so that the focus is entirely on the product pages? Thanks for your help!
-
Is it possible for a multi-doctor practice to have the practice's picture displayed in Google's SERP?
Google now includes pictures of authors in its results pages. Therefore, a single-doctor practice can include her picture in Google's SERP (http://markup.io/v/dqpyajgz7jkd). How can a multi-doctor practice display the practice's picture, as opposed to a single doctor's? A search for plastic surgery in Chicago displayed this (query: plastic surgery Chicago): http://markup.io/v/bx3f28ynh4w5. I found one example of a search result showing a picture of both doctors for a multi-doctor practice (query: houston texas plastic surgeon): http://markup.io/v/t20gfazxfa6h
-
Can literally any site get 'burned'?
Just curious what people think. The SEOmoz trust on my site has gone up, all while Google is dropping us in the rankings for lots of keywords. Just curious if this can happen to anyone, or whether once you are 100% 'trusted' you're good. We went from 120,000 page views down to about 50,000, all while doubling content, improving the design (at least from a user perspective), and getting more natural links. Seems counterintuitive to Google's mantra of ranking quality. I would guess 'authority' sites never get hit by these updates, right? So when you make it, you've made it (at least from a dropping-like-a-rock perspective; obviously you have to keep working). I'm guessing we just need a bunch more quality links, but I would hate to work on building links, quality content, trust, etc. for it to be something so finicky long-term.
-
404'd pages still in index
I recently launched a site and shortly afterward performed a URL rewrite (not the greatest idea, I know). The developer 404'd the old pages instead of using a permanent 301 redirect. This caused a mess in the index. I have tried to use Google's removal tool to remove these URLs from the index. The pages were being removed, but now I am finding them in the index as bare URLs pointing at the 404'd page (i.e. no title tag or meta description). Should I wait this out, or go back now and 301 redirect the old URLs (which currently 404) to the new URLs? I am sure this is the reason for my lack of rankings, as the rest of my site is pretty well optimized and I have some quality links.
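If you do go the 301 route, one way to avoid hand-writing hundreds of rules is to generate them from an old-to-new URL mapping. A minimal sketch, assuming an Apache setup and a hypothetical redirects.csv with old_path,new_path rows:

```python
import csv

def emit_apache_redirects(csv_path):
    """Print one Apache 'Redirect 301' (mod_alias) directive per mapping row."""
    with open(csv_path, newline="") as f:
        for row in csv.reader(f):
            if len(row) != 2:
                continue  # skip blank or malformed lines
            old_path, new_path = (field.strip() for field in row)
            print(f"Redirect 301 {old_path} {new_path}")

if __name__ == "__main__":
    # "redirects.csv" is a placeholder: rows like "/product-1324,/product".
    emit_apache_redirects("redirects.csv")
```

Paste the output into your .htaccess (or vhost config) and spot-check a few of the redirects before trusting the whole batch.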