How do retailers phase out short term promotional pages for best SEO?
-
I have a high-level question and am open to all discussion.
I work for a large ecommerce site, and we are always creating new landing pages, product assortments, and promotions.
Our challenge is how to retire these pages once a promotion has ended.
What is the Moz community's opinion on the best way to retire them, e.g. a 301 redirect to the home page?
-
The guys all have some great thoughts here.
One more thought:
If a promotion or product is usually short-lived but comes back intermittently (e.g. once or twice a year), the URL can be kept live. A good example is rental real estate in a big city: the properties will come back on the market, but the URLs need to show visitors that the apartment is not available at the moment. Rightmove (UK real estate) does this with rentals it knows will come back on the market, e.g. http://www.rightmove.co.uk/property-to-rent/property-21907128.html
They re-use the pages when the property is live again - I've seen them do it with both flats I lived in, and they rank remarkably well. Questionable usability if a page ranks and is actually unavailable, but effective.
Clearly this is only relevant if you will re-use / open these short-term promotions in the future.
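That keep-the-URL-live pattern can be sketched as a handler that always returns a 200 but flags availability. A minimal illustration only; the listing structure and wording here are invented, not Rightmove's actual implementation:

```python
def render_listing(listing):
    """Serve a property/promotion URL with a 200 even when it is
    off the market, so the page keeps its rankings until it returns.

    `listing` is a hypothetical dict, e.g.
    {"title": "2-bed flat, Shoreditch", "available": False}
    """
    status = 200  # never 404/410 a URL that will come back
    if listing["available"]:
        body = "{title} - available now".format(**listing)
    else:
        body = "{title} - not currently available".format(**listing)
    return status, body
```

The design choice is simply that an unavailable listing is a content state, not an HTTP error, so the URL never drops out of the index between rentals.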
-
I'm in ecommerce as well. We 301 promotion-specific pages to the homepage once the promotion is over, as each one won't come back around for a year.
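That retirement rule can be sketched as a date-based check. A minimal illustration; the promotion URLs and end dates below are invented:

```python
from datetime import date

# Hypothetical registry of promotion URLs and their end dates.
PROMOTIONS = {
    "/promo/spring-sale": date(2014, 4, 30),
    "/promo/summer-clearance": date(2014, 8, 31),
}

def promo_response(path, today):
    """Serve live promotions normally; 301 expired ones to the homepage
    so any links they earned are consolidated there."""
    end = PROMOTIONS.get(path)
    if end is None:
        return 404, None   # unknown URL
    if today > end:
        return 301, "/"    # promotion over: permanent redirect home
    return 200, path       # promotion still live
```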
-
Yeah, if you can determine a set schedule with no overlap, I would try to aggregate promotions onto a common URL.
For instance, we host local events in Brooklyn, for which the content becomes obsolete once the event is over. We maintain one URL for our events:
http://www.uncommongoods.com/designs/events
And simply rotate the content. That way, the links to the page help build more specific authority for "design events". If we were to 301 redirect that content to the homepage, we'd lose this niche authority.
-
It depends on how quick the promotions are. If the promotions really are short-term, I like to create a "specials" or "promotions" page where they can all be aggregated along with some unique content. Pages like that can get crazy traffic and even links. Visitors love them.
Then you can have those individual promotion pages branch off of that main one, and 301 each to the promotions page once it ends. Or, if possible, skip creating brand-new pages for each promotion entirely and let promotions live ONLY on the main promotions page.
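The consolidation idea can be sketched as a simple redirect map. The URL names here are invented for illustration: retired promotion URLs 301 to the shared hub, and anything still live stays put.

```python
HUB = "/promotions"

# Hypothetical set of retired promotion URLs.
RETIRED = {
    "/promo/black-friday-2013",
    "/promo/holiday-gift-guide-2013",
}

def resolve(path):
    """301 retired promotion URLs to the /promotions hub so their links
    keep building authority for one topical page instead of scattering."""
    if path in RETIRED:
        return 301, HUB
    return 200, path
```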