Deleting 30,000 pages all at once - good idea or bad idea?
-
We have 30,000 pages that we want to get rid of. Each product in our database has its own page, and these particular 30,000 products are no longer relevant. The pages have very little content and are essentially identical apart from minor title changes.
We no longer want them weighing down our database, so we are going to delete them.
My question is: should we get rid of them in smaller batches, say 2,000 pages at a time, or is it better to get rid of all of them in one fell swoop? Which is less likely to raise a flag with Google? Does anyone have experience with this?
-
Hi
This happened to a company I was working with recently. They deleted thousands of pages without notifying the SEO team, which led to a lot of work in WMT and a lot of digging around to figure out where we had lost links. If you've got a load of inbound links pointing at these pages and nothing has been done (301s, etc.), watch your DA fall, as happened to this company. Ouch. This is what happens when Tech don't "like" marketing.
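For anyone in the same spot, a quick way to see which of the doomed URLs actually have inbound links is to cross-reference the deletion list against a backlink export from whatever tool you use. A rough sketch in Python (the file names and the "Target URL" column are assumptions, adjust to what your tool actually exports):

```python
import csv

# URLs scheduled for deletion, one per line (hypothetical file name)
with open("urls_to_delete.txt") as f:
    doomed = {line.strip() for line in f if line.strip()}

# Backlink export from your tool of choice; the "Target URL" column
# is an assumption -- adjust to match the real export format.
linked = set()
with open("backlink_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        linked.add(row["Target URL"].strip())

needs_redirect = sorted(doomed & linked)   # linked pages: 301 these
safe_to_drop = doomed - linked             # no known backlinks

print(f"{len(needs_redirect)} pages have inbound links and deserve a 301:")
for url in needs_redirect:
    print("  " + url)
print(f"{len(safe_to_drop)} pages have no known backlinks.")
```

Anything in the overlap deserves a 301 before the page goes; the rest can probably just die.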
-
To make it look organic to Google, I would say it depends on how large the site is. If the site only has 30,000 pages beyond these, it would look very odd for half the site to be removed at once. To be safe, do a batch of, say, 10,000 and let Google re-crawl before doing the next batch. If it seems like you've been penalized, do a smaller batch the next time.
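To make the batching concrete, here's a minimal sketch of a throttled delete, run once per batch on a schedule (the database, table, and column names are all invented, and you'd obviously take a backup before running anything like this):

```python
import sqlite3

BATCH_SIZE = 2000  # tune per batch; run on a schedule, e.g. a weekly cron

# Hypothetical database and schema -- adjust to your own setup.
conn = sqlite3.connect("products.db")
cur = conn.execute(
    "DELETE FROM products WHERE id IN "
    "(SELECT id FROM products WHERE retired = 1 LIMIT ?)",
    (BATCH_SIZE,),
)
conn.commit()
print(f"Removed {cur.rowcount} retired product pages this run; "
      "wait for Google to re-crawl before the next batch.")
conn.close()
```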
Then again, if you are already being penalized and are desperate for a change, already falling into the black hole of the Google SERPs, then do them all at once. After all, you're already doing poorly, and any corrective action is better than nothing.
Just my thoughts; there are probably better experts here than me.
-
I would delete all of them at once. BAM!
-
Not a problem. Any advice on deleting all at once or deleting in bits and pieces?
-
In that case, I agree with EGOL. Just drop them all.
-
Sorry, I edited after you posted... I agree. If the 301s are more work than they're worth, then there's no need to do them.
Good luck.
-
This is more of a theoretical question, but if we're not getting traffic to these pages and there isn't a way for people to reach them, do we need to 301?
Adding 301s will increase the work required on the dev side, and I'm not sure it's worth the effort. Any ideas?
-
I would delete these pages as soon as possible, since you have determined that they are not of value. It is possible that all of these pages are dead weight on your site.
Chop chop.
-
Visitors can't actually get to these pages, so it wouldn't be an issue for them. We have also checked, and these pages have had no traffic for over two years, so we're not worried about it from a user-facing standpoint.
We're planning on 404ing them, because the likelihood of any backlinks is as close to zero as possible. I'm just wondering whether it's better to 404 them in batches or all at once.
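For anyone wanting to run the same kind of check, scanning the access logs for external referrers to those URLs is one way to do it. A rough sketch assuming the standard combined log format, with one path per line in the URL list (the file names and domain are placeholders):

```python
import re

# Paths slated for removal, one per line (placeholder file name)
with open("urls_to_delete.txt") as f:
    doomed = {line.strip() for line in f if line.strip()}

# Combined log format: "GET /path HTTP/1.1" status bytes "referrer" "agent"
pattern = re.compile(r'"(?:GET|HEAD|POST) (\S+) [^"]*" \d+ \S+ "([^"]*)"')

external_hits = {}
with open("access.log") as log:
    for line in log:
        m = pattern.search(line)
        if not m:
            continue
        path, referrer = m.groups()
        # Count only visits referred from outside our own site
        if (path in doomed and referrer not in ("", "-")
                and "oursite.com" not in referrer):
            external_hits[path] = external_hits.get(path, 0) + 1

for path, count in sorted(external_hits.items(), key=lambda kv: -kv[1]):
    print(count, path)
print(f"{len(external_hits)} of {len(doomed)} pages saw external referrals.")
```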
-
I suppose the most important question is: what will be replacing them?
You don't necessarily want 30k 404 pages appearing overnight and causing issues for visitors. Personally, I'd go through all the pages, determine the most relevant alternative page for each one, and 301 redirect the old pages to the new ones. That's a lot of work for 30,000 pages, but it's probably the best way to avoid inadvertently losing traffic, backlinks, and positive link equity.
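The mapping itself can be semi-automated. As a purely illustrative sketch, if each retired product has an obvious parent category, something like this could emit the redirect rules as an nginx include (the CSV layout, paths, and file names are all invented; Apache .htaccess rules could be generated the same way):

```python
import csv

# Hypothetical export: each row pairs a retired product path with the
# category page it should redirect to, e.g.:
#   /products/blue-widget-2003,/category/widgets
rules = []
with open("retired_products.csv", newline="") as f:
    for old_path, target_path in csv.reader(f):
        # Note: paths containing regex special characters would need escaping
        rules.append(f"rewrite ^{old_path}$ {target_path} permanent;")

# Write an nginx include file; "permanent" makes each rule a 301
with open("retired_redirects.conf", "w") as out:
    out.write("\n".join(rules) + "\n")

print(f"Wrote {len(rules)} permanent (301) redirect rules.")
```

For 30k entries an nginx map block would be more efficient than individual rewrite lines, but the idea is the same.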