Handling Deleted Pages
-
The backstory: My site has a fantasy sportsbook where users can place "bets" on pretty much anything. Each game has a unique matchup page, where the odds are displayed and any conversation about the game takes place. Example here.
About 95% of the games are automatically graded, but I have to manually grade the rest. So as soon as each game starts, I check whether any users have made a pick on it; if nobody has, I delete the page to reduce my workload.
The problem: About 15% of my search-driven traffic comes from queries for games that no longer exist, which makes sense: nobody bets on the super obscure games, and those games are very easy to rank for. I am currently redirecting those URLs to my 404 page, but I'm worried that all of these hits are hurting my reputation with the big G.
Would it be better to noindex all of these pages at first and take the noindex away as soon as I'm positive that the game will stay?
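A minimal sketch of the noindex-by-default idea, assuming a hypothetical template helper (the function name and the confirmation flag are illustrative, not part of any real stack):

```python
def robots_meta_tag(game_confirmed: bool) -> str:
    """Return the robots meta tag for a matchup page.

    New game pages carry a noindex tag by default; once the game is
    confirmed to stay on the site, the tag is dropped so the page
    becomes indexable.
    """
    if not game_confirmed:
        return '<meta name="robots" content="noindex">'
    return ""  # no tag: pages are indexable by default


# A brand-new game page is noindexed:
print(robots_meta_tag(False))  # <meta name="robots" content="noindex">
# Once picks have been placed and the game will stay, the tag goes away:
print(robots_meta_tag(True))
```

The point is that a page which never enters the index can be deleted later without leaving a broken search result behind.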
-
Thanks! You've confirmed my fears. The first link redirects to the 404 page because that game was deleted, which demonstrates the severity of the problem.
Here is a link for a game that wasn't deleted. I'm torn on possibly noindexing all of these pages unless a conversation has been started, like here, because they don't contain much content. In fact, when I put myself in the shoes of somebody searching for a game result, I find that I may be more likely to bounce from a matchup page than from a 404 page. Oh, the decisions we face!
-
Hi Patrick,
Your first link goes to your custom 404 as well, but I understand what you're describing. So yes, if it were me, I would noindex the pages by default when they are created (using the noindex meta tag, which is the most reliable method). When you decide that a page will remain on the site permanently, you can remove the noindex tag and let it be indexed.
Your custom 404 has everything you need to make sure there is a good chance the visitor will click through to another page in your site, so you are doing all you can to make sure that those people following a broken link don't bounce right back to the search engine.
I would say your logic is correct in that the only way to further improve the odds is to reduce the potential for broken links before they happen.
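As a side note on the existing broken links: rather than redirecting deleted games to the 404 page, the original URL can answer with a status code directly. A hedged sketch (the slug sets and handler name are hypothetical, not the poster's actual routing code) of serving 410 Gone for deleted games, which tells crawlers the removal is permanent:

```python
# Hypothetical stores of game slugs; a real site would query its database.
DELETED_GAMES = {"obscure-matchup-123"}
ACTIVE_GAMES = {"big-rivalry-456"}


def handle_game_request(slug: str) -> int:
    """Return the HTTP status code for a matchup URL.

    Deleted games answer 410 Gone on their own URL instead of
    redirecting to the 404 page; live games answer 200; unknown
    slugs answer 404.
    """
    if slug in DELETED_GAMES:
        return 410  # Gone: the page existed and was removed on purpose
    if slug in ACTIVE_GAMES:
        return 200
    return 404  # never existed


print(handle_game_request("obscure-matchup-123"))  # 410
```

A redirect to an error page can look like a soft 404 to crawlers, whereas a 410 (or a plain 404) served on the original URL is unambiguous.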
Hope that helps,
Sha