Ecommerce SEO - Indexed product pages are returning 404s due to product database removal. HELP!
-
Hi all,
I recently took over an e-commerce start-up project from a co-worker who left the company last week. The previous project manager had uploaded ~2000 products without setting up a robots.txt file, and as a result all of the product pages were indexed by Google (verified via Google Webmaster Tools).
The problem came about when he deleted the entire product database from our hosting service (GoDaddy) and performed a fresh install of PrestaShop on our hosting plan. All of the created product pages are now gone, and I'm left with ~2000 broken URLs returning 404s. Currently, the site does not have any products uploaded. As far as I know, I have to either:
- canonicalize the broken URLs to the new corresponding product pages,
or
- request that Google remove the broken URLs (I believe this is only a temporary solution, since Google only honors URL removal requests for 90 days)
What is the best way to approach this situation? If I set up canonicalization, would I have to recreate the deleted pages (to match the URL addresses) and have those pages point to the new product pages via canonical tags?
Alex
-
Everett,
You're right on the money. I don't think you could have summarized my problem any better. I will take Dana's and your advice and let the pages sit "indexed" for a while and serve a 404. According to GWT's Index Status, the product pages were indexed about a month ago, so I guess it won't hurt to wait a few more weeks until those pages drop out of Google's index naturally, especially since the site development won't be done for another 6-7 weeks.
Thanks a bunch for all of your insights!
-
Right on, Everett. I agree 100%.
-
I want to make sure everyone, including myself, understands you, Alex. Correct me if I'm wrong, but you're saying that the website is totally new (a start-up) and nothing (at least nothing owned by the company you're with) has ever been on that domain name. While building the site, the previous guy accidentally allowed the development version of the site to be indexed, and/or allowed product pages that you don't want on the site at all to be indexed. Since it is a brand-new site, those "old" pages that were deleted didn't have any external links, and didn't have any traffic from Google or elsewhere outside of the company.
IF that is the case, then you can probably just let those pages stay as 404s. Eventually, since nobody is linking to them, they will drop out of the index on their own.
I wouldn't use the URL removal tool in this case. For one thing, it is a dangerous tool, and if you don't have experience with this sort of thing it could do more harm than good. It should only take a few weeks for those URLs that were briefly live and indexed to go away, as long as you are serving a 404 or 410 HTTP response code on those URLs.
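If you want to double-check that those old URLs really are serving a 404 or 410 (and not a "soft 404" - a not-found page served with a 200 status, which won't drop out of the index), here is a minimal sketch in Python. The URLs in it are placeholders, not Alex's actual domain:

```python
# Check what HTTP status each old product URL actually returns, so you
# can confirm Google will see a proper 404/410 rather than a soft 404
# (a "not found" page served with a 200 status code).
import urllib.request
import urllib.error

def will_drop_from_index(status: int) -> bool:
    """404 and 410 both tell Google the page is gone."""
    return status in (404, 410)

def fetch_status(url: str) -> int:
    """Return the HTTP status code for a URL, including error codes."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # 4xx/5xx raise an exception, but the code is what we want

if __name__ == "__main__":
    # Placeholder URLs -- substitute the real list of ~2000 old product pages.
    old_urls = [
        "https://www.example-shop.com/old-product-1.html",
        "https://www.example-shop.com/old-product-2.html",
    ]
    for url in old_urls:
        status = fetch_status(url)
        verdict = "will drop out" if will_drop_from_index(status) else "check this one"
        print(f"{status}  {verdict}  {url}")
```

Anything in the report that comes back 200 is worth investigating, because those are the pages that will sit in the index indefinitely.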
I hope this helps. Please let us know if we have misinterpreted your problem.
-
Understood Alex. Yes, of course you would have to rebuild the pages first before you can 301, but it sounds like you are planning on rebuilding them (otherwise you wouldn't be able to use canonical tags either, because there wouldn't be a page to put them on).
I wouldn't just give up and ask Google to remove all of the old URLs. I agree with what Mike has to say about that below. A 302 is a good option if you are worried about the 404s sitting in the index while you are rebuilding your product pages. If you are still on the same platform (it sounds like that didn't change), I would suggest rebuilding as many of the old URLs as you can (if they were good SEO-friendly URLs). That way you could bypass the 301 redirect. If you want to create your pages so that product options are rolled in and separate colors of things no longer need separate pages, you can then choose whether to 301 redirect those old URLs or simply let them 404.
404s aren't necessarily always a bad thing. Regarding the 2,000 of them you have now, if some of those pages just need to go away, you can let them 404 and they will eventually drop out of Google's index. You aren't required to manually submit them via GWT in order for them to be removed.
-
Hi Mike,
Thanks for weighing in. Recreating all of the old pages seems like a pain in the butt... Besides, the site never launched, so those pages had no traffic at all. Given that, do you think it's a good idea to go through the URL removal process in GWT and purge the broken links completely from Google's index?
- Alex
-
Hi Dana,
Thank you for your advice. I'm new at SEO, so I may be wrong but...
Mapping out the old/new URLs on a spreadsheet and setting up 301 redirects to the new URLs is not a plausible option in my opinion, mainly because the new URLs literally do not exist (I have not created ANY product pages). According to your suggestion, I would have to create new product pages and do a 301 redirect from the broken URLs to the newly created pages? Not quite sure if I'm understanding you correctly...
In addition, the previous project manager wasn't SEO-savvy (I'm not either... sigh...), so he didn't know that creating separate pages for a product with multiple attributes (such as flavor and size) would result in major duplicate content issues.
The site is going through a major design/layout overhaul, and I intend to come up with an SEO strategy before creating any categories or products.
Thus...
Do you think it's better to submit a URL removal request in GWT and get rid of the indexed URLs completely? I just re-read Google's policy on URL removal, and it states that as long as the URLs return a 4xx status (404 or 410, I'm assuming), Google will honor the removal request.
- Alex
-
Rel Canonical is not quite the right thing for this sort of issue.
If you're worried about the 404s sitting around too long and losing traffic for the moment, you can 302 everything to a landing page, category page, or homepage while you work on setting everything else up. You have two choices at this point: 1) recreate all of the old pages and old URLs, then remove the 302s, or 2) add new products and new URLs, then, as Dana said, map out all your new product URLs and old URLs to determine which old URL should be 301 redirected where. Then set up your necessary 301s and test that they all work.
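As a rough sketch of that temporary 302 approach, assuming an Apache host where redirects live in .htaccess (typical for a GoDaddy/PrestaShop setup) - the product paths and holding page below are made up for illustration:

```python
# Generate temporary .htaccess rules that 302 every dead product URL
# to a holding page, so visitors and crawlers don't hit raw 404s
# while the new product pages are being built.
def holding_302_rules(old_paths, holding_page="/"):
    """Emit one 'Redirect 302 <old> <holding>' line per dead path."""
    return [f"Redirect 302 {path} {holding_page}" for path in old_paths]

if __name__ == "__main__":
    # Hypothetical old product paths -- replace with the real ~2000 URLs.
    dead_paths = [
        "/17-chocolate-bar-small.html",
        "/18-chocolate-bar-large.html",
    ]
    for rule in holding_302_rules(dead_paths, holding_page="/coming-soon"):
        print(rule)
```

Once the real pages exist, these temporary rules get swapped for 301s (or simply deleted for pages you want to let 404), since a 302 signals the move is not permanent.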
-
Hi Alex, I am sorry to hear about this. What a mess, no? If it were me, I wouldn't rely solely on the canonical tag. I would also create a spreadsheet and map all the old URLs to the new URLs and set up 301 redirects from the old to the new. 2,000 isn't too bad. You can probably knock them out in 2-3 days...but be sure to test all of the 301s and make sure they are performing the way you expect them to. Hope that helps a little!
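As a sketch of that spreadsheet approach - the mapping rows here are hypothetical, and it assumes Apache-style Redirect directives, which a typical PrestaShop .htaccess supports:

```python
# Turn a two-column old-URL/new-URL mapping (the "spreadsheet",
# exported as CSV text) into 301 redirect rules, flagging obvious
# mistakes like duplicate old paths or self-redirect loops.
import csv
import io

def build_301_rules(csv_text):
    """Return (rules, problems) from 'old_path,new_path' CSV rows."""
    rules, problems, seen = [], [], set()
    for old, new in csv.reader(io.StringIO(csv_text)):
        old, new = old.strip(), new.strip()
        if old in seen:
            problems.append(f"duplicate old path: {old}")
            continue
        if old == new:
            problems.append(f"redirect loop: {old}")
            continue
        seen.add(old)
        rules.append(f"Redirect 301 {old} {new}")
    return rules, problems

if __name__ == "__main__":
    # Hypothetical mapping -- in practice this comes from the spreadsheet.
    mapping = "/old-widget.html,/new-widget.html\n/old-gadget.html,/new-gadget.html"
    rules, problems = build_301_rules(mapping)
    print("\n".join(rules + problems))
```

The duplicate/loop checks are the "test that they are performing the way you expect" part: with 2,000 rows it is easy to paste the same old URL twice or map a page to itself, and catching that before the rules go live is much cheaper than debugging redirect chains afterward.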