Deleting 30,000 pages all at once - good idea or bad idea?
-
We have 30,000 pages that we want to get rid of. Each product within our database has its own page, and these particular 30,000 products are no longer relevant. They have very little content on them and are basically the exact same page with a few title changes.
We no longer want them weighing down our database, so we are going to delete them.
My question is: should we get rid of them in smaller batches, say 2,000 pages at a time, or is it better to get rid of all of them in one fell swoop? Which is least likely to raise a flag to Google? Does anyone have experience with this?
-
Hi
This happened to a company I was working with recently: they deleted thousands of pages without notifying the SEO team, which led to a lot of work in WMT and a lot of digging around to understand where we had lost links. If you've got a load of inbound links pointing at these pages and nothing has been done (301s, etc.), then watch your DA fall, as happened to this company. Ouch. This is what happens when Tech doesn't "like" marketing.
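As a side note, before deleting anything it's worth checking which of the doomed URLs actually have inbound links, so you know which ones need a 301. A minimal sketch of that check, assuming a backlink export in CSV form with a "Target URL" column (the file names and column name here are hypothetical, not from this thread):

```python
import csv

# Hypothetical inputs: a list of pages slated for deletion and a backlink
# export (e.g. from your link tool of choice) with one row per inbound link.
with open("pages_to_delete.txt") as f:
    to_delete = {line.strip() for line in f if line.strip()}

linked = set()
with open("backlink_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        target = row["Target URL"].strip()
        if target in to_delete:
            linked.add(target)

# Pages with inbound links are candidates for a 301; the rest can simply 404.
print(f"{len(linked)} of {len(to_delete)} pages have at least one inbound link")
for url in sorted(linked):
    print(url)
```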
-
To make it look organic to Google, I would say it depends on how large the site is. If the site only has an additional 30,000 pages beyond these, then it would look really odd for half the site to be removed at once. To be safe, do a batch of, say, 10,000 and let it index before doing the next batch. If it seems like you've been penalized, do a smaller batch the next time (there's a rough batching sketch at the end of this reply).
Then again, if you are already being penalized and are desperate for a change, already falling out of control into the black hole of Google SERPs, then do them all at once. After all, you're already doing poorly; any corrective action would be better than nothing.
Just my thoughts. There are probably better experts here than me.
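If you do go the batch route, the splitting itself is easy to script. A rough sketch, assuming a plain text file with one URL per line (the file names and batch size are placeholders, not details from the question):

```python
# Split the full list of pages into batches so they can be removed
# (and dropped from the sitemap) one batch at a time.
BATCH_SIZE = 10_000

with open("pages_to_delete.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

batches = [urls[i:i + BATCH_SIZE] for i in range(0, len(urls), BATCH_SIZE)]

for n, batch in enumerate(batches, start=1):
    with open(f"delete_batch_{n}.txt", "w") as out:
        out.write("\n".join(batch) + "\n")
    print(f"Batch {n}: {len(batch)} URLs")
```

Each batch file can then go to the dev team, with the next one only going out once the previous batch has dropped out of the index.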
-
I would delete all of them at once. BAM!
-
Not a problem. Any advice on deleting all at once or deleting in bits and pieces?
-
In that case, I agree with EGOL. Just drop them all.
-
Sorry, I edited after you posted... I agree. If the 301s are more work, then there's no need to do them.
Good luck.
-
This is more of a theoretical question, but if we're not getting traffic to these pages and there isn't a way for people to get to them, do we need to 301?
Adding 301s will increase the work we'll need to do on the dev side, and I'm not sure it's worth the effort. Any ideas?
-
I would delete these pages as soon as possible, since you have determined that they are not of value. It is possible that all of these pages are dead weight on your site.
Chop chop.
-
Visitors can't actually get to these pages, so it wouldn't be an issue for them. We have also checked, and there has been no traffic to any of these pages in over two years, so we're not worried about it from a user-facing standpoint.
We're planning on 404ing them because the likelihood of any of them having backlinks is about as close to zero as possible. I'm just wondering whether it's better to 404 them in batches or all at once.
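For what it's worth, if the pages go away without redirects, returning a 410 Gone instead of a plain 404 is a common way to tell crawlers the removal is deliberate. A minimal sketch of how that could be wired up, assuming a Flask app and a pre-built set of retired product slugs (both are assumptions for illustration, not details from this thread):

```python
from flask import Flask, abort

app = Flask(__name__)

# Hypothetical set of retired product slugs; in practice this would be loaded
# from the database or from the batch files prepared for the clean-up.
RETIRED = {"old-widget-a", "old-widget-b"}

@app.route("/products/<slug>")
def product_page(slug):
    if slug in RETIRED:
        # 410 signals that the page was removed on purpose and isn't coming back.
        abort(410)
    # ... render the live product page here in the real application ...
    return f"Product page for {slug}"
```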
-
I suppose the most important question is: what will be replacing them?
You don't necessarily want 30k 404 pages appearing overnight and causing issues for visitors. Personally, I'd say you should go through all the pages to determine the most relevant alternative page for each one and then 301 redirect the old pages to the new ones. That'll be a lot of work for 30,000 pages, but it would probably be the best way to avoid inadvertently losing traffic, backlinks and positive link equity.
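To give a sense of the work involved, here's a rough sketch of generating that redirect map, assuming each retired product record already carries a category you can fall back to (the data layout and output format are assumptions for illustration):

```python
import csv

# Hypothetical export of retired products: old URL plus the category it belonged to.
# The "most relevant alternative" here is simply the category landing page.
redirects = []
with open("retired_products.csv", newline="") as f:
    for row in csv.DictReader(f):
        old_url = row["url"]
        category_url = f"/category/{row['category_slug']}"
        redirects.append((old_url, category_url))

# Write the map in a generic old,new format for the dev team to wire into
# whatever redirect mechanism the server uses.
with open("redirect_map.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["old_url", "new_url"])
    writer.writerows(redirects)

print(f"Wrote {len(redirects)} 301 candidates to redirect_map.csv")
```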