When auditing a website, when do you decide to delete pages?
-
Given that the Panda algorithm factors in engagement and user-experience signals, when would you consider deleting a page that has poor engagement and conversion metrics?
For example, consider a page that ranks well organically and receives (relatively) decent traffic from search. However, this page has poor engagement metrics compared to other pages on the site, does not convert visitors as well as other pages on the site, and doesn't have any external links. Would you consider deleting this page?
Which metrics do you use when auditing a site and considering a web page for removal (bounce rate, average time on site, pages per visit, linking root domains, visits, revenue per visit, etc.)?
Are some metrics weighted more heavily than others? What kinds of thresholds do you use?
Finally, is there a situation when you would choose NOT to delete pages, even considering the above?
-
For example, consider a page that ranks well organically and receives (relatively) decent traffic from search. However, this page has poor engagement metrics compared to other pages on the site, does not convert visitors as well as other pages on the site, and doesn't have any external links. Would you consider deleting this page?
I would improve the page.
Beef up the content, add seductive links to pull traffic through to a more valuable page, and add AdSense to earn money if the traffic is low quality.
Which metrics do you use when auditing a site and considering a web page for removal (bounce rate, average time on site, pages per visit, linking root domains, visits, revenue per visit, etc.)?
If someone brought me a site that needed help, I would do keyword research to determine whether they are covering the important queries for their line of business. If they are not, I would build a content plan to get them covered. If they are covered but performing poorly, we would improve those pages.
Looking at the numbers you suggest is like cutting off a foot because you have a blister on your toe. Decide instead if the foot is valuable. If yes, cure it.
-
Hard to beat what Dan has said here.
The only thing I could possibly add is to monitor whether Google has added those pages to the index, and/or removed them. I find it telling to see what Google acknowledges by way of its own search results.
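If you want to script that check rather than eyeballing site: searches, here's a rough sketch against the Search Console URL Inspection API. Treat it as an assumption-laden illustration, not a drop-in script: the credentials file, property URL, and page URL below are placeholders, and the API requires a verified Search Console property.

```python
# Rough sketch: assumes google-api-python-client is installed and
# "sa.json" is a service account key with read access to the
# verified Search Console property below.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file("sa.json", scopes=SCOPES)
service = build("searchconsole", "v1", credentials=creds)

body = {
    "inspectionUrl": "https://www.example.com/some-thin-page/",  # page to check
    "siteUrl": "https://www.example.com/",                       # your GSC property
}
result = service.urlInspection().index().inspect(body=body).execute()

# coverageState reads like "Submitted and indexed" or
# "Crawled - currently not indexed" - exactly the
# acknowledgement-from-Google signal described above.
print(result["inspectionResult"]["indexStatusResult"]["coverageState"])
```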
-
Hi there,
First off, I rarely delete pages. It's better and easier to noindex them. That way you get them out of the SERPs and stop the poor user metrics from counting against you, but people can still find the pages through other channels, and you don't have to set up 301 redirects. You can delete pages if you feel they are just a bad user experience overall, of course - but I noindex as a starting point.
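For reference, here's a minimal sketch of the noindex approach. The Flask app is purely an assumed example stack; the same X-Robots-Tag header (or a robots meta tag in the page head) works anywhere.

```python
# Minimal sketch, assuming a Flask app (the route and content are made up).
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/thin-page")
def thin_page():
    # Serve the page normally, so visitors and internal links still work...
    resp = make_response("<h1>Page content stays available to visitors</h1>")
    # ...but tell search engines not to index it - no 301s needed.
    resp.headers["X-Robots-Tag"] = "noindex"
    return resp

if __name__ == "__main__":
    app.run()
```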
Anyhow, regardless, here's how I assess it - first I use a custom report with the following metrics (you can play around with them):
- pageviews
- entrances
- new visits
- avg time on page
- exits
- exit rate
- "page" for the dimension
Thresholds - starting point (I use filters):
- pageviews - I start with over 50
- avg time on page - less than 30 seconds
- exit rate - greater than 80%
I like to end up with a list of maybe 50-100 pages that fall within the thresholds. Every site is different. But I try to isolate 50-100 of the worst pages (we're assuming maybe a 2,500+ page site).
You can also apply a segment if you want to look at just Google organic traffic - in some cases that can be more accurate.
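If you'd rather apply those thresholds to a raw export instead of GA's filter UI, here's a quick sketch. The file name and column names are assumptions - rename them to match your own CSV, and it assumes the values have been cleaned to plain numbers.

```python
import pandas as pd

# Hypothetical export of the custom report above; adjust column names
# to your CSV (exit rate as 0.85, time on page in seconds, etc.).
df = pd.read_csv("ga_page_report.csv")

worst = df[
    (df["pageviews"] > 50)           # enough traffic for the numbers to mean something
    & (df["avg_time_on_page"] < 30)  # seconds
    & (df["exit_rate"] > 0.80)
].sort_values("exit_rate", ascending=False)

# On a 2,500+ page site, aim to isolate roughly the 50-100 worst pages.
print(worst[["page", "pageviews", "avg_time_on_page", "exit_rate"]].head(100))
```

Sorting by exit rate is just one choice - sort by pageviews instead if you want the high-traffic offenders at the top of the list.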
Hope that helps! Interested to see what other people do.
-Dan