An immediate and long-term plan for expired Events?
-
Hello all, I've spent the past day scouring guides and walkthroughs and advice and Q&As regarding this (including on here), and while I'm pretty confident in my approach to this query, I wanted to crowd source some advice in case I might be way off base. I'll start by saying that Technical SEO is arguably my weakest area, so please bear with me. Anyhoozles, onto the question (and advance apologies for being vague):
PROBLEM
I'm working on a website that, in part, works with providers of a service to open their own programs/centers. Most programs tend to run their own events, which leads to an influx of Event pages, almost all of which are indexed. At my last count, there were approximately 800 indexed Event pages.
The problem? Almost all of these have expired, leading to a little bit of index bloat.
THINGS TO CONSIDER
-
A spot check revealed that traffic for each Event occurs for about a two-to-four week period then disappears completely once the Event expires.
-
About half of these indexed Event pages redirect to a new page. So the indexed URL will be /events/name-of-event but will redirect to /state/city/events/name-of-event.
QUESTIONS I'M ASKING
-
How do we address all these old events that provide no real value to the user?
-
What should a future process look like to prevent this from happening?
MY SOLUTION
Step 1: Add a noindex to each of the currently expired Event pages. Since some of these pages have link equity (one event had 8 unique links pointing to it), I don't want to just 404 all of them, and redirecting them doesn't seem like a good idea since one of the goals is to reduce the number of indexed pages that provide no value to users.
Step 2: Remove all of the expired Event pages from the Sitemap and resubmit. This is an ongoing process due to a variety of factors, so we'd wrap this up into a complete sitemap overhaul for the client. We would also be removing the Events from the website so there are no internal links pointing to them.
Step 3: Write a rule (well, have their developers write a rule) that automatically adds noindex to each Event page once it's expired.
Step 4: Wait for Google to re-crawl the site and hopefully remove the expired Events from its index.
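For reference, the kind of rule I'm imagining for Step 3 could be as simple as a date check in the page template. A rough sketch (the function and field names are made up, not the client's actual code):

```python
from datetime import date

def robots_meta_for_event(event_end: date, today: date) -> str:
    """Robots meta value an Event page should serve on a given day."""
    # Live (or same-day) events stay indexable; once the end date has
    # passed, serve noindex so Google drops the page on its next crawl.
    if today > event_end:
        return "noindex"
    return "index, follow"
```

Their developers would then render the returned value into the page's `<meta name="robots" content="...">` tag (or an X-Robots-Tag header) on every Event page.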
Thoughts? I feel like this is the simplest way to get things done quickly while preventing future expired events from being indexed. All of this is part of a bigger project involving the overhaul of the way Events are linked to on the website (since we wouldn't be 404ing them, I would simply suggest that they be removed entirely from all navigation), but ultimately, automating the process once we get this concern cleaned up is the direction I want to go.
Thanks. Eager to hear all your thoughts.
-
-
Great! Happy to help
-
Hi Robin, thanks for taking the time to write out such detailed and helpful responses. I think I've decided to go with the approach you're outlining above:
For those that are already indexed:
- Change the 302s to 301s (all of the expired events that are indexed are 302s for some reason)
- 404/410 those that don't have any equity
- Create a custom 404 page
- Wait for them to drop out of index
For future expired Events:
- Wait about one month, then apply a 404 with the custom page
- Redirect any that have backlinks
It'll require a little more work, but it is, I think, the right thing to do in this very bizarre situation.
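For anyone following along, here's a sketch of how that monthly triage could be automated. The 30-day grace window and the backlink check are assumptions on my part, not a definitive implementation:

```python
from datetime import date, timedelta

def status_for_expired_event(event_end: date, today: date,
                             backlink_count: int, grace_days: int = 30) -> int:
    """Pick the HTTP status an expired Event URL should return."""
    # Within the grace window the page keeps serving normally (200).
    if today <= event_end + timedelta(days=grace_days):
        return 200
    # After that: 301 to preserve equity if anything links in,
    # otherwise 410 so the page drops out of the index quickly.
    return 301 if backlink_count > 0 else 410
```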
-
To be honest it sounds like you already have your plan.
One thing I'd bear in mind is that a crawl you run of your site won't line up with the pages Google is visiting. For one thing, the tools we use try to approximate Google but won't be exactly the same. More importantly, once Google knows of a page it'll come back and check whether the content has changed; the only way you'll see that is by looking at your log files.
Yeah, there's no point making it "noindex, follow"; it's not that Google doesn't know what to do with the page, it's just that its attitude to the page will change over time.
In terms of the large number of redirects, there is some risk that Google could see a large number of 301s as spammy but, to be honest, I've never directly seen evidence of that being a problem. The way I see it, the choice is fairly simple; you could:
-
404/410: that's the way the internet is meant to work when something no longer exists, but you'll lose link equity.
-
301: preserves link equity, but you're essentially misusing the status code.
-
Do a monthly check: 301 any expired pages with discovered backlinks, 410 the rest. This is the best of both worlds but much more time-consuming.
I think you can probably get away with the 301s but it all comes down to your appetite for risk.
Good luck!
-
-
Thanks for the detailed response and the suggestion. The problem is, I think, a little more complicated than that. So there are two main concerns:
**1. What do we do with the current expired pages?**
So one thing that happens is that the event pages are effectively orphaned once the event has passed. All trace of them is removed from the website, and if my previous crawl is to be believed, they don't get crawled. Right now, the majority of these expired and indexed event pages are actually 302 redirects. So we're getting a temporary redirect to a page that is expired. Hardly a good user experience.
I do know that since it's a 302, Google is thinking "Hey, the page is coming back, so we're going to index that but send visitors to the new pages." This would be why the 302 URL is indexed. Am I correct in assuming that updating all of these to a 301 would result in the URL ultimately being removed? If so, then I think the best course of action would simply be to 301 redirect all of the current 302 URLs, as well as the actual expired event pages to the relevant event host / program pages.
Also, I did not know that _noindex_ was treated as noindex, nofollow after a while. Would it be beneficial to make them _noindex, follow_, or would that still be a redundancy that Google will ultimately ignore? I also do not think a pop-up is the way to go. These are very short-term events, so the issue is _less_ a user experience problem and more a matter of preventing them from clogging up the index. It would also just be more work for the client, and I'm trying to keep things as simple as possible.
**2. What do we do with the future expired pages so they don't end up getting indexed?**
This is probably a more pressing question. The main concern is that we want the Event pages to be indexed while they're live, then ultimately removed after they've expired. I'm okay with this process: write a script that auto-redirects, remove all internal links from the website, and simply be patient. My main concern is just having way too many 301 redirects in place.
I'm hoping that the 301s, combined with the complete orphaning of the pages, will mean they simply won't be crawled and will eventually drop out of the index, making them inaccessible to Google and users, but I'm still a little wary. Thoughts? Is there any room for adding anything to robots.txt?
Thanks again for your help. It is much appreciated.
-
Hi there, thanks for posting!
I think my main question here is around the decision to not 404 or 301 these pages. I totally understand that you want to reduce the number of indexed pages which aren't providing value but also don't want to lose equity. I know you mention you're not super technical, so I'm going to break down how I expect link equity to be passed around a site and therefore how I expect each of these techniques to impact the page.
Equity is passed from page to page via links, so these event pages will pass equity to other pages on your site: Google keeps a record of the page and its equity, then distributes that equity through the links it can follow. Google representatives have said recently that, after a period of time, noindex pages are treated as noindex, nofollow, at which point we can't rely on equity being passed along any of the outbound links from these pages.
-
noindex: removes the page from the index; after a period of time, no equity will be passed from the noindexed page. Initially Google will continue to crawl the page, but that will reduce over time.
-
404: the page doesn't exist so will be removed from the index after a period of time. No equity will be passed from the page. Google should stop crawling the page fairly quickly.
-
410: more definitive than 404. The page should drop out of the index more quickly. No equity will be passed from the page. Google should stop crawling the page fairly quickly.
-
301: we're telling Google that this address is no good any more and it should instead look at a different address. Again, the redirected page should drop out of the index, and some proportion of the redirected page's equity should be transferred to the target page. Google should stop crawling the page more quickly than the noindexed version, but probably not as quickly as with a 404/410.
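To keep those options straight, the breakdown above condenses into a small lookup — this is a paraphrase of expected behaviour, not anything official from Google:

```python
# Rough expectations per signal, paraphrasing the list above.
INDEXING_OUTCOME = {
    "noindex": "dropped from the index; equity stops passing after a while",
    "404": "dropped after a period; no equity passed",
    "410": "dropped faster than a 404; no equity passed",
    "301": "dropped from the index; some equity transferred to the target",
}

def indexing_outcome(signal: str) -> str:
    return INDEXING_OUTCOME.get(signal, "unknown signal")
```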
Based on all that I don't think noindex is necessarily your best option. You'll still have a bunch of defunct pages, which Google may still spend time crawling, and you can't rely on them passing equity.
A custom 404/410 page explaining to users that the event has passed is probably a pretty good user experience and would be the most expected behaviour for a situation where content isn't there any more, but won't help you with equity.
I think what you could do is automatically 301 redirect to a relevant category page with a pop-up message that explains to users what's happened. Doesn't sound like you expect the event pages to pop in and out of existence so the logic should be fairly simple.
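That redirect rule could be tiny. A sketch, assuming the /state/city/events/... path scheme mentioned elsewhere in the thread; the `expired` query parameter is made up, and the category page template would read it to decide whether to show the pop-up message:

```python
def expired_event_redirect(state: str, city: str, slug: str) -> str:
    """Build the 301 target for an expired event: the local events
    category page, flagged so the template can show an
    "this event has passed" message."""
    return f"/{state}/{city}/events/?expired={slug}"
```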
Hope that helps!
-