When auditing a website, when do you decide to delete pages?
-
Given that the Panda algorithm includes engagement and user experience, when would you consider deleting a page that has poor engagement and conversion metrics?
For example, consider a page that ranks well organically and receives (relatively) decent traffic from search. However, this page has poor engagement metrics compared to other pages on the site, does not convert visitors as well as other pages on the site, and doesn't have any external links. Would you consider deleting this page?
Which metrics do you use when auditing a site and considering a web page for removal (bounce rate, average time on site, pages per visit, linking root domains, visits, revenue per visit, etc.)?
Are some metrics weighted more heavily than others? What kinds of thresholds do you use?
Finally, is there a situation when you would choose NOT to delete pages, even considering the above?
-
For example, consider a page that ranks well organically and receives (relatively) decent traffic from search. However, this page has poor engagement metrics compared to other pages on the site, does not convert visitors as well as other pages on the site, and doesn't have any external links. Would you consider deleting this page?
I would improve the page.
Beef up the content, add enticing links that move traffic to a more valuable page, and add AdSense to earn money if the traffic is low quality.
Which metrics do you use when auditing a site and considering a web page for removal (bounce rate, average time on site, pages per visit, linking root domains, visits, revenue per visit, etc.)?
If someone brought me a site that needed help, I would do keyword research to determine whether they are covering the important queries for their line of business. If they are not, I would build a content plan to get them covered. If they are covered but performing poorly, we would improve those pages.
Deleting pages based on the numbers you suggest is like cutting off a foot because you have a blister on your toe. Decide instead whether the foot is valuable; if it is, cure it.
-
Hard to beat what Dan has said here.
The only thing I could possibly add is to monitor whether Google has added those pages to the index and/or removed them. I find it telling to see what Google acknowledges in its own search results.
-
Hi There
First off, I rarely delete pages. It's better and easier to noindex them. That way you get them out of the SERPs and reduce the poor user metrics, but people can still find the pages through other channels, and you don't have to set up 301 redirects. You can of course delete pages if you feel they are just a bad user experience overall - but I use noindex as a starting point.
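For illustration, here is a minimal sketch of the noindex approach, assuming a small Flask-served site (the framework, routes, and path list are my own assumptions, not part of the original answer). The same effect is more commonly achieved by adding <meta name="robots" content="noindex"> to the page template or by setting the X-Robots-Tag header at the web server or CDN.

```python
from flask import Flask, request

app = Flask(__name__)

# Hypothetical list of thin pages we want out of the SERPs
# but still reachable by visitors and internal links.
NOINDEX_PATHS = {"/old-faq", "/printable-guide", "/tag/misc"}

@app.after_request
def add_noindex_header(response):
    # X-Robots-Tag: noindex is the HTTP-header equivalent of
    # <meta name="robots" content="noindex">, so templates stay untouched.
    if request.path in NOINDEX_PATHS:
        response.headers["X-Robots-Tag"] = "noindex"
    return response

@app.route("/old-faq")
def old_faq():
    return "Still available to visitors, just excluded from search engine indexes."

if __name__ == "__main__":
    app.run(debug=True)
```

Removing a path from the set lets the page be re-indexed on a later crawl, which is part of why noindex is a lower-risk starting point than deletion.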
Anyhow, regardless, here's how I assess it: first, I use a custom report with the following metrics (you can play around with them):
- pageviews
- entrances
- new visits
- avg time on page
- exits
- exit rate
- "page" for the dimension
Thresholds, as a starting point (I use filters):
- pageviews - I start with over 50
- avg time on page - less than 30 seconds
- exit rate - greater than 80%
I like to end up with a list of maybe 50-100 pages that fall within the thresholds. Every site is different, but I try to isolate the 50-100 worst pages (we're assuming a site of maybe 2,500+ pages).
You can also apply a segment if you want to look at just Google organic traffic - that can be more accurate in some cases.
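To make those thresholds concrete, here is a rough sketch of how you might filter a CSV export of such a report, assuming pandas and hypothetical column names (the file name, columns, and whether exit rate is a fraction or a percentage are all assumptions to adjust for your own export):

```python
import pandas as pd

# Hypothetical export of the custom report described above;
# adjust the file name and column names to match your own export.
report = pd.read_csv("page_report.csv")
# expected columns: page, pageviews, avg_time_on_page, exit_rate, source_medium

# Optional: restrict to Google organic traffic, as suggested above.
organic = report[report["source_medium"] == "google / organic"]

# Starting thresholds: >50 pageviews, <30s average time on page, >80% exit rate.
# exit_rate is assumed to be a fraction (0-1); use 80 instead of 0.80 if it's a percentage.
candidates = organic[
    (organic["pageviews"] > 50)
    & (organic["avg_time_on_page"] < 30)
    & (organic["exit_rate"] > 0.80)
]

# Keep a manageable shortlist of the worst offenders (roughly 50-100 pages).
worst_pages = candidates.sort_values("exit_rate", ascending=False).head(100)
print(worst_pages[["page", "pageviews", "avg_time_on_page", "exit_rate"]])
```

Pages that surface here are candidates for a noindex or an improvement pass, not automatic deletion, in line with the advice above.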
Hope that helps! Interested to see what other people do.
-Dan