Best way to handle an expired ad on a classifieds site
-
I don't think there is a definitive answer to this, but worth the discussion:
How to handle an expired ad in a classified / auction site?
Michael Gray mentioned you should 301 it to its category page, and I'm inclined to agree with him. But some analysts say you should return a "product/ad expired" page with a 404.
For the user I think the 404 approach is best, but from an SEO perspective that means I'm throwing link juice away.
What if I 301 them from the ad to the category page, and show a message explaining why they're seeing the listing page instead of the product page?
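As a rough sketch of that 301-with-a-message idea (hypothetical, framework-free; a real site would do this in its web framework), the handler could tag the redirect with a flag the category page can use to show the explanation:

```python
# Hypothetical handler returning (status, headers, body) tuples.
def handle_ad_request(ad_id, ads):
    """Serve an ad page; expired ads 301 to their parent category."""
    listing = ads.get(ad_id)
    if listing is None:
        return 404, {}, "Not found"
    if listing["expired"]:
        # Permanent redirect to the parent category. The ?expired flag
        # lets the category page explain why the visitor landed there.
        location = f"/category/{listing['category']}?expired={ad_id}"
        return 301, {"Location": location}, ""
    return 200, {}, listing["title"]
```

The category page then checks for the `expired` parameter and renders a short "that ad has expired, here are similar listings" notice above its normal content.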
Thoughts?
-
I would do #3.
-
Great inputs!
But what if, for legal reasons (price, pictures, etc.), the ad data has to be removed after it has expired? (A real case here.)
Ideas:
- Modify the ad page and return a 200? (remove ad data and add a message saying it's expired)
- Throw a friendly 404 page, saying the ad has expired and show other options for the user to navigate to
- 301 to its parent page
(3) is my favourite, but (2) may be the best option for users.
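The three options above could be sketched as one configurable handler (hypothetical code, not from any framework), which also makes it easy to A/B the policies:

```python
# Sketch of the three removal policies for an ad whose data must be deleted.
def handle_removed_ad(listing, policy):
    """Return (status, headers, body) per the chosen expiry policy."""
    if policy == "stripped_200":
        # Option 1: keep the URL live, data gone, explain the expiry.
        return 200, {}, "This ad has expired."
    if policy == "friendly_404":
        # Option 2: honest 404 (a 410 "Gone" is a one-line swap)
        # plus navigation hints so the visitor isn't stranded.
        body = ("This ad has expired. Browse similar listings: "
                f"/category/{listing['category']}")
        return 404, {}, body
    if policy == "redirect_301":
        # Option 3: pass any link equity to the parent category.
        return 301, {"Location": f"/category/{listing['category']}"}, ""
    raise ValueError(f"unknown policy: {policy}")
```

Since the page content must legally be removed either way, the only real difference between the options is the status code and where the visitor ends up.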
Thoughts?
-
Interesting...
I don't know how "private" selling prices are in your area, but maybe add a couple of pages on your site like these...
WHAT YOU CAN BUY IN YOURCITY FOR $100,000
This would be a point of reference for buyers and sellers. Where I live there is a huge divergence between askin' and sellin' prices. They ask for the moon but get something a lot less.
RECENT SALES PRICES IN YOURCITY...
Nosy people would love this.
-
I do the same thing with our real estate site. If a listing has expired, I keep the page active, but I put a note at the top saying, "This listing has sold! Contact us and we can find you similar listings in the city."
My expired listings bring in a lot of search traffic.
-
Who is going to bet against Michael Gray? I think that you should listen to him.
I would give his answer one tweak. He says....
If the product goes out of stock forever, you have a couple choices. You can leave the page up with a discontinued notice on the page. IMHO that’s not the best way to go for search engines. Ideally I’d like to not lose any link equity and 301 the product page to a similar product, category/department page, or home page.
I would do exactly what he says 99% of the time. However, if that page is pulling a lot of search engine traffic and the same manufacturer has a replacement product, or something close that substitutes, I would leave that page in place and use it to explain: "This product has been retired, but a new and improved widget is available..." (then give the sales pitch for the new model with a buy button). This approach would be especially valuable if the product is something like running shoes, where repeat customers with very high loyalty look to replace their favorite shoes up to several times per year.
When this shoe was replaced by Addiction, there was a mad scramble to buy up all existing stock... (I am probably the only person posting here old enough to have worn out a couple dozen pairs)
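That "retire the page but pitch the successor" logic might look like this (a hypothetical sketch in the same (status, headers, body) style; the product names are made up):

```python
# Discontinued product: 200 + pitch if a replacement exists, else 301.
def handle_discontinued(product, catalog):
    """Keep high-traffic retired pages alive to sell the replacement."""
    replacement_id = product.get("replacement")
    if replacement_id and replacement_id in catalog:
        new = catalog[replacement_id]
        # Keep the URL (and its rankings) and pitch the new model.
        body = (f"{product['name']} has been retired, but the new and "
                f"improved {new['name']} is available: "
                f"/product/{replacement_id}")
        return 200, {}, body
    # No substitute: fall back to passing link equity to the category.
    return 301, {"Location": f"/category/{product['category']}"}, ""
```

The fallback branch is Michael Gray's default; the first branch is the exception for pages that still earn traffic and have an obvious substitute.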