Best practice for removing pages
-
I've got some crappy pages that I want to delete from a site.
I've removed all the internal links to those pages and resubmitted new sitemaps that no longer include them; however, the pages are still indexed in search (as you would expect).
My question is, what's the best practice for removing these pages?
Should I just delete them and be done with it, or 301 redirect them to a nicer generic page until they are removed from the search results?
-
Thanks for all the responses.
Think I'll opt for a server-side 301 to a suitable page.
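For anyone landing here later, a minimal sketch of what that server-side 301 could look like in an Apache .htaccess file (assuming Apache with mod_alias; the example paths are placeholders, not from this thread):

    # Permanently redirect each retired page to a relevant page that still exists
    Redirect 301 /old-crappy-page/ /nicer-generic-page/
    Redirect 301 /another-old-page/ /nicer-generic-page/

Visitors and crawlers requesting the old URLs get a 301 response pointing at the new location, so any remaining link value is consolidated there over time.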
-
An alternative for removing them from search results, besides a 301, is a meta robots noindex tag. You would use this if you do not want them passing any link value to the pages they would redirect to, and it doesn't sound like you want them to.
-Dan
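For reference, the meta robots noindex Dan mentions is just a tag placed in the head of each page you want dropped from the index; a hypothetical example:

    <!-- placed in the <head> of each page that should be removed from the index -->
    <meta name="robots" content="noindex, follow">

Using "noindex, follow" removes the page from results while still letting crawlers follow any remaining links on it; "noindex, nofollow" would stop that as well.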
-
Hi Shelly
If the pages are 'crappy' as you describe, then it's a good idea to delete them, as poor-quality pages can negatively affect the site overall.
If you are going to delete the pages, then yes, cisaz is right: ensure they are 301 redirected to a relevant page, both to retain some of the link benefit and to give direct visitors somewhere to go other than a 404 error page.
Also, request removal of the deleted pages' URLs within Google Webmaster Tools to speed up their de-indexing. (You will need a noindex robots tag on the pages and/or a Disallow rule in the robots.txt file.)
Regards
Simon
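A minimal sketch of the robots.txt Disallow Simon mentions, assuming the deleted pages share a common path (the paths are placeholders; adjust to your own URLs):

    User-agent: *
    Disallow: /old-section/
    Disallow: /old-crappy-page.html

The Disallow stops those paths being crawled, while the noindex tag and/or the Webmaster Tools removal request is what actually gets them out of the index.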
-
I would go with your second idea: 301 redirect them to a nicer generic page until they are removed from the search results.
You don't want to lose any backlink juice you may have.
Hope this helps. ; )