When auditing a website, when do you decide to delete pages?
-
Given that the Panda algorithm takes engagement and user experience into account, when would you consider deleting a page that has poor engagement and conversion metrics?
For example, consider a page that ranks well organically and receives (relatively) decent traffic from search. However, this page has poor engagement metrics compared to other pages on the site, does not convert visitors as well as other pages on the site, and doesn't have any external links. Would you consider deleting this page?
Which metrics do you use when auditing a site and considering a web page for removal (bounce rate, average time on site, pages per visit, linking root domains, visits, revenue per visit, etc.)?
Are some metrics weighed more than others? What kind of thresholds do you use?
Finally, is there a situation when you would choose NOT to delete pages, even considering the above?
-
For example, consider a page that ranks well organically and receives (relatively) decent traffic from search. However, this page has poor engagement metrics compared to other pages on the site, does not convert visitors as well as other pages on the site, and doesn't have any external links. Would you consider deleting this page?
I would improve the page.
Beef up the content, add seductive links to send traffic to a more valuable page, or add AdSense to earn money if the traffic is low quality.
Which metrics do you use when auditing a site and considering a web page for removal (bounce rate, average time on site, pages per visit, linking root domains, visits, revenue per visit, etc.)?
If someone brought me a site that needed help, I would do keyword research to determine whether they are covering the important queries for their line of business. If they are not, I would put together a content plan to get them covered. If they are covered but performing poorly, we would improve those pages.
Looking at the numbers you suggest is like cutting off a foot because you have a blister on your toe. Decide instead if the foot is valuable. If yes, cure it.
-
Hard to beat what Dan has said here.
The only thing I could possibly add is to monitor whether Google has added those pages to the index and/or removed them. I find it telling to see what Google acknowledges by way of its own search results.
-
Hi There
First off, I rarely delete pages. It's better and easier to noindex. That way you get them out of the SERPs and reduce the poor user metrics, but people can still find the pages otherwise and you don't have to 301 redirect them, etc. You can delete if you feel they are just a bad user experience overall, of course - but I noindex as a starting point.
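If you do go the noindex route, it's worth confirming the directive is actually being served (via the robots meta tag or an X-Robots-Tag header). Here's a rough sketch of that check - it assumes Python with the requests library, and the URLs are placeholders:

```python
# Rough sketch: check whether a list of pages is actually serving a
# noindex directive. URLs below are placeholders.
import re
import requests

URLS = [
    "https://www.example.com/thin-page-1",
    "https://www.example.com/thin-page-2",
]

# Note: this regex only catches the common name-before-content attribute
# order; it's a quick sanity check, not a full HTML parse.
META_ROBOTS = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]*content=["\']([^"\']*)["\']',
    re.IGNORECASE,
)

for url in URLS:
    resp = requests.get(url, timeout=10)
    header = resp.headers.get("X-Robots-Tag", "")
    match = META_ROBOTS.search(resp.text)
    meta = match.group(1) if match else ""
    noindexed = "noindex" in header.lower() or "noindex" in meta.lower()
    print(f"{url}: noindex={'yes' if noindexed else 'no'}")
```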
Anyhow, regardless, here's how I assess it - first, I use a custom report with the following metrics (you can play around with them):
- pageviews
- entrances
- new visits
- avg time on page
- exits
- exit rate
- "page" for the dimension
Thresholds - starting point (I use filters):
- pageviews - I start with over 50
- avg time on page - less than 30 seconds
- exit rate - greater than 80%
I like to end up with a list of maybe 50-100 pages that fall within the thresholds. Every site is different. But I try to isolate 50-100 of the worst pages (we're assuming maybe a 2,500+ page site).
You can throw a segment on there if you want to isolate just Google organic traffic - in some cases that could be more accurate.
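If you'd rather work outside the GA interface, here's a rough sketch of those same starting-point filters applied to a CSV export of the report - it assumes Python with pandas, and the file name and column names are placeholders you'd adapt to your own export:

```python
# Rough sketch: apply the starting-point thresholds above to a CSV export
# of the custom report. File and column names are assumptions - adjust to
# match your own export.
import pandas as pd

df = pd.read_csv("all_pages_report.csv")  # hypothetical export

worst = df[
    (df["pageviews"] > 50)
    & (df["avg_time_on_page"] < 30)   # seconds
    & (df["exit_rate"] > 0.80)        # exit rate expressed as a fraction
].sort_values("exit_rate", ascending=False)

# Aim for a shortlist of roughly 50-100 of the worst pages.
print(worst.head(100)[["page", "pageviews", "avg_time_on_page", "exit_rate"]])
```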
Hope that helps! Interested to see what other people do.
-Dan