When auditing a website, when do you decide to delete pages?
-
Given that the Panda algorithm includes engagement and user experience, when would you consider deleting a page that has poor engagement and conversion metrics?
For example, consider a page that ranks well organically and receives (relatively) decent traffic from search. However, this page has poor engagement metrics compared to other pages on the site, does not convert visitors as well as other pages on the site, and doesn't have any external links. Would you consider deleting this page?
Which metrics do you use when auditing a site and considering a web page for removal (bounce rate, average time on site, pages per visit, linking root domains, visits, revenue per visit, etc.)?
Are some metrics weighed more than others? What kind of thresholds do you use?
Finally, is there a situation when you would choose NOT to delete pages, even considering the above?
-
For example, consider a page that ranks well organically and receives (relatively) decent traffic from search. However, this page has poor engagement metrics compared to other pages on the site, does not convert visitors as well as other pages on the site, and doesn't have any external links. Would you consider deleting this page?
I would improve the page.
Beef up the content, add enticing links to send traffic to a more valuable page, or add AdSense to earn money if the traffic is low quality.
Which metrics do you use when auditing a site and considering a web page for removal (bounce rate, average time on site, pages per visit, linking root domains, visits, revenue per visit, etc.)?
If someone brought me a site that needed help I would do keyword research to determine if they are covering the important queries for their line of business. If they are not I would have a content plan to get them covered. If they are covered but performing poorly we would improve those pages.
Looking at the numbers you suggest is like cutting off a foot because you have a blister on your toe. Decide instead if the foot is valuable. If yes, cure it.
-
Hard to beat what Dan has said here.
The only thing I could possibly add is to monitor whether Google has added those pages to the index, and/or removed them. I find it telling to see what Google acknowledges by way of its own search results.
-
Hi There
First off, I rarely delete pages. It's better and easier to noindex them. That way you get them out of the SERPs and reduce the poor user metrics, but people can still find the pages otherwise, and you don't have to 301 redirect them. You can of course delete pages if you feel they are just a bad user experience overall, but I noindex as a starting point.
Anyhow, here's how I assess it. First, I use a custom report with the following metrics (you can play around with them):
- pageviews
- entrances
- new visits
- avg time on page
- exits
- exit rate
- "page" for the dimension
Thresholds - starting point (I use filters)
- pageviews - I start with over 50
- avg time on page - less than 30 seconds
- exit rate - greater than 80%
I like to end up with a list of maybe 50-100 pages that fall within the thresholds. Every site is different. But I try to isolate 50-100 of the worst pages (we're assuming maybe a 2,500+ page site).
You can also apply a segment to look at just Google organic traffic; in some cases that's more accurate.
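If you export that custom report to a spreadsheet or CSV, the threshold filters above are easy to apply in a few lines of code. This is a rough sketch only; the column names and the sample rows are made up for illustration, and the thresholds are the starting points mentioned above, not rules.

```python
# Sketch: apply the threshold filters above to rows exported from a
# Google Analytics custom report. Field names here are assumptions;
# adjust them to match your actual export.

def flag_weak_pages(rows, min_pageviews=50, max_avg_time=30, min_exit_rate=0.80):
    """Return pages that clear the traffic floor but show poor engagement:
    pageviews over 50, average time on page under 30s, exit rate over 80%."""
    return [
        r["page"]
        for r in rows
        if r["pageviews"] > min_pageviews
        and r["avg_time_on_page"] < max_avg_time
        and r["exit_rate"] > min_exit_rate
    ]

# Hypothetical sample rows standing in for a real GA export
sample = [
    {"page": "/widgets",  "pageviews": 120, "avg_time_on_page": 12, "exit_rate": 0.91},
    {"page": "/pricing",  "pageviews": 300, "avg_time_on_page": 95, "exit_rate": 0.40},
    {"page": "/old-post", "pageviews": 20,  "avg_time_on_page": 5,  "exit_rate": 0.95},
]

print(flag_weak_pages(sample))  # prints ['/widgets']
```

Sort the result by pageviews descending and take the top 50-100 rows to get the worst-pages list described above.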
Hope that helps! Interested to see what other people do.
-Dan