Ecommerce SEO - Indexed product pages are returning 404s due to product database removal. HELP!
-
Hi all,
I recently took over an e-commerce start-up project from one of my co-workers (who left the job last week). The previous project manager had uploaded ~2,000 products without setting up a robots.txt file, and as a result, all of the product pages were indexed by Google (verified via Google Webmaster Tools).
The problem came about when he deleted the entire product database from our hosting service (GoDaddy) and performed a fresh install of Prestashop on our hosting plan. All of the created product pages are now gone, and I'm left with ~2,000 broken URLs returning 404s. Currently, the site does not have any products uploaded. As far as I know, I have to either:
- canonicalize the broken URLs to the new corresponding product pages,
or
- request that Google remove the broken URLs (I believe this is only a temporary solution, since Google honors URL removal requests for 90 days).
What is the best way to approach this situation? If I set up canonicalization, would I have to recreate the deleted pages (to match the URL addresses) and have those pages point to the new product pages via canonical tags?
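For reference, this is (I believe) the kind of robots.txt that would have kept the dev site out of crawlers' reach in the first place - a minimal sketch that blocks everything during development and gets removed at launch:

```text
# robots.txt - block all crawlers while the site is in development
User-agent: *
Disallow: /
```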
Alex
-
Everett,
You're right on the money. I don't think you could have summarized my problem any better. I will take Dana's and your advice and let them sit "indexed" for a while, serving a 404. According to GWT's Index Status, the product pages were indexed about a month ago, so I guess it won't hurt to wait a few more weeks until those pages drop out of Google's index naturally, especially since the site development won't be done for another 6-7 weeks.
Thanks a bunch for all of your insights!
-
Right on, Everett. I agree 100%.
-
I want to make sure everyone, including myself, understands you, Alex. Correct me if I'm wrong, but you're saying that the website is totally new (a start-up) and nothing (at least nothing owned by the company you're with) has ever been on that domain name. While building the site, the previous guy accidentally allowed the development version of the site to be indexed, and/or allowed product pages that you don't want on the site at all to be indexed. Since it is a brand-new site, those "old" pages that were deleted didn't have any external links, and didn't have any traffic from Google or elsewhere outside of the company.
IF that is the case, then you can probably just let those pages stay as 404s. Eventually, since nobody is linking to them, they will drop out of the index on their own.
I wouldn't use the URL removal tool in this case. For one thing, it is a dangerous tool, and if you don't have experience with this sort of thing it could do more harm than good. It should only take a few weeks for those URLs that were briefly live and indexed to go away, as long as you are serving a 404 or 410 HTTP response code on those URLs.
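If you want to send the stronger 410 signal instead of the default 404, and your host runs Apache, something like this in .htaccess would do it. The path pattern here is just a placeholder - adjust it to whatever your old Prestashop URLs actually share:

```apache
# Hypothetical pattern - adjust to match your old product URL structure.
# The [G] flag makes Apache answer "410 Gone" instead of the default 404.
RewriteEngine On
RewriteRule ^old-products/ - [G]
```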
I hope this helps. Please let us know if we have misinterpreted your problem.
-
Understood Alex. Yes, of course you would have to rebuild the pages first before you can 301, but it sounds like you are planning on rebuilding them (otherwise you wouldn't be able to use canonical tags either, because there wouldn't be a page to put them on).
I wouldn't just give up and ask Google to remove all of the old URLs. I agree with what Mike has to say about that below. A 302 is a good option if you are worried about the 404s sitting in the index while you are rebuilding your product pages. If you are still on the same platform (it sounds like that didn't change), I would suggest rebuilding as many of the old URLs as you can (if they were good SEO-friendly URLs). That way you could bypass the 301 redirect. If you want to create your pages so that product options are rolled in and separate colors of things no longer need separate pages, you can then choose whether to 301 redirect those old URLs or simply let them 404.
404s aren't necessarily always a bad thing. Regarding the 2,000 of them you have now, if some of those pages just need to go away, you can let them 404 and they will eventually drop out of Google's index. You aren't required to manually submit them via GWT in order for them to be removed.
-
Hi Mike,
Thanks for weighing in. Recreating all of the old pages seems like a pain in the butt... Besides, the site never launched, so those pages never had any traffic at all. Given that, do you think it's a good idea to go through the URL removal process in GWT and purge the broken links completely from Google's index?
- Alex
-
Hi Dana,
Thank you for your advice. I'm new at SEO, so I may be wrong but...
Mapping out the old/new URLs on a spreadsheet and setting up 301 redirects to the new URLs is not a plausible option in my opinion, mainly because the new URLs literally do not exist yet (I have not created ANY product pages). Following your suggestion, I would have to create new product pages first and then do a 301 redirect from the broken URLs to the newly created pages? Not quite sure if I'm understanding you correctly...
In addition, the previous project manager wasn't SEO-savvy (I'm not either... sigh...), so he didn't know that creating separate pages for a product with multiple attributes (such as flavor and size) would result in major duplicate content issues.
The site is going through a major design/layout overhaul, and I intend to come up with an SEO strategy before creating any categories or products.
Thus...
Do you think it's better to submit a URL removal request in GWT and get rid of the indexed URLs completely? I just re-read Google's policy on URL removal, and it states that as long as the URLs return a 4xx status (404 or 410, I'm assuming), Google will honor the removal request.
- Alex
-
Rel Canonical is not quite the right thing for this sort of issue.
If you're worried about the 404s sitting around too long and losing traffic in the meantime, you can 302 everything to a landing page, category page, or homepage while you work on setting everything else up. You have two choices at this point: 1) recreate all of the old pages and old URLs, then remove the 302s, or 2) add new products and new URLs, then, as Dana said, map out all your new product URLs and old URLs to determine which old URL should be 301 redirected where. Then set up the necessary 301s and test that they all work.
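On Apache, that temporary "302 everything to a holding page" idea can be a single rule. A rough sketch, assuming the homepage is the holding page - remove it once the real 301 map is in place:

```apache
# Temporary: 302 requests for pages that don't exist to the homepage while
# the new product pages are built. Note this is a blunt catch-all - it will
# also catch any rewrite-based URLs that aren't real files on disk.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^ / [R=302,L]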
-
Hi Alex, I am sorry to hear about this. What a mess, no? If it were me, I wouldn't rely solely on the canonical tag. I would also create a spreadsheet and map all the old URLs to the new URLs and set up 301 redirects from the old to the new. 2,000 isn't too bad. You can probably knock them out in 2-3 days...but be sure to test all of the 301s and make sure they are performing the way you expect them to. Hope that helps a little!
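One part of that testing is easy to automate from the spreadsheet before the redirects even go live: make sure no "new" URL is itself listed as an "old" URL, which would create redirect chains (or loops). A small sketch in Python - the example paths are made up:

```python
def find_chains(redirects):
    """Given {old_url: new_url}, return entries whose target is itself
    redirected, i.e. entries that would create a redirect chain or loop."""
    return {old: new for old, new in redirects.items() if new in redirects}

redirects = {
    "/old-sneakers.html": "/sneakers",
    "/old-boots.html": "/old-sandals.html",   # chain: target is also redirected
    "/old-sandals.html": "/sandals",
}

print(find_chains(redirects))
```

Any entry this flags should be repointed directly at its final destination so Google never has to follow two hops.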