Best way to handle an expired ad on a classifieds site
-
I don't think there is a definitive answer to this, but it's worth discussing:
How to handle an expired ad in a classified / auction site?
Michael Gray mentioned you should 301 it to its category page, and I'm inclined to agree with him. But some analysts say you should return a "product/ad expired" page with a 404.
For the user, I think the 404 approach is best, but from an SEO perspective it means throwing away link juice.
What if I 301 the user from the ad to the category page, and show a message explaining why they're seeing the listing page instead of the product page?
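To make the trade-off concrete, here's a rough Flask sketch of both approaches (get_ad and category_url_for are hypothetical helpers, and the templates are assumed to exist):

```python
# A minimal sketch of the two approaches being debated.
# get_ad() and category_url_for() are hypothetical helpers, not a real API.
from flask import Flask, redirect, render_template

app = Flask(__name__)

@app.route("/ad/<int:ad_id>")
def show_ad(ad_id):
    ad = get_ad(ad_id)  # hypothetical lookup
    if ad is None:
        return render_template("not_found.html"), 404
    if ad.expired:
        # Option A: permanent redirect to the category page (keeps link juice)
        return redirect(category_url_for(ad), code=301)
        # Option B (instead): tell the user the ad expired, signal a 404
        # return render_template("ad_expired.html", ad=ad), 404
    return render_template("ad.html", ad=ad)
```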
Thoughts?
-
I would do #3: the 301 with the explanatory message.
-
Great inputs!
But what if, for legal reasons (price, pictures, etc.), the ad content has to be removed after it has expired? (Real case here.)
Ideas:
1. Modify the ad page and return a 200 (remove the ad data and add a message saying it's expired)
2. Throw a friendly 404 page saying the ad has expired, and show other options for the user to navigate to
3. 301 to its parent page
(3) is my favourite, but (2) may be the best option for users.
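If it helps, option (2) might look roughly like this (related_ads is a hypothetical helper, and the expired template would carry the message plus navigation options but none of the legally sensitive ad data):

```python
# Rough sketch of option (2): strip the ad data (satisfying the legal
# requirement), give the user somewhere to go next, and signal a 404.
# related_ads() is a hypothetical helper, not a real API.
from flask import render_template

def expired_ad_response(ad):
    suggestions = related_ads(ad.category, limit=5)  # hypothetical lookup
    # Template shows "this ad has expired" -- no price, no pictures
    return render_template("ad_expired.html", suggestions=suggestions), 404
```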
Thoughts?
-
Interesting...
I don't know how "private" selling prices are in your area, but maybe add a couple of pages on your site like these:
WHAT YOU CAN BUY IN YOURCITY FOR $100,000
This would be a point of reference for buyers and sellers. Where I live there is a huge divergence between askin' and sellin' prices. They ask for the moon but get something a lot less.
RECENT SALES PRICES IN YOURCITY...
Nosy people would love this.
-
I do the same thing with our real estate site. If a listing has expired, I keep the page active, but I put a note at the top saying, "This listing has sold! Contact us and we can find you similar listings in the city."
My expired listings bring in a lot of search traffic.
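In code terms, this is just a normal 200 response with a banner added once the listing sells. A sketch (get_listing is a hypothetical helper):

```python
# Sketch of the keep-the-page-live approach: the URL keeps returning 200,
# so it stays indexed and keeps earning search traffic.
# get_listing() is a hypothetical helper, not a real API.
from flask import Flask, render_template

app = Flask(__name__)

@app.route("/listing/<int:listing_id>")
def show_listing(listing_id):
    listing = get_listing(listing_id)  # hypothetical lookup
    banner = None
    if listing.sold:
        banner = ("This listing has sold! Contact us and we can find you "
                  "similar listings in the city.")
    return render_template("listing.html", listing=listing, banner=banner)
```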
-
Who is going to bet against Michael Gray? I think that you should listen to him.
I would give his answer one tweak. He says:
If the product goes out of stock forever, you have a couple choices. You can leave the page up with a discontinued notice on the page. IMHO that’s not the best way to go for search engines. Ideally I’d like to not lose any link equity and 301 the product page to a similar product, category/department page, or home page.
I would do exactly what he says 99% of the time. However, if the page is pulling a lot of search engine traffic and the same manufacturer has a replacement product (or something close enough to substitute), I would leave that page in place and use it to explain: "This product has been retired, but a new and improved widget is available..." (then give the sales pitch for the new model with a buy button). This approach would be especially valuable if the product is something like running shoes, where repeat customers with very high loyalty look to replace their favorite shoes up to several times per year.
When this shoe was replaced by the Addiction, there was a mad scramble to buy up all the existing stock... (I am probably the only person posting here old enough to have worn out a couple dozen pairs.)
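For what it's worth, that tweak might sketch out like this (find_replacement, monthly_search_visits, and the fallback URL helpers are all hypothetical, and the traffic threshold is just a placeholder):

```python
# Sketch of the tweak: keep a high-traffic page alive to pitch the
# replacement model, otherwise follow Michael Gray's 301 advice.
# All helper functions here are hypothetical, not a real API.
from flask import redirect, render_template

TRAFFIC_THRESHOLD = 500  # monthly search visits; define what "a lot" means for you

def discontinued_product_response(product):
    replacement = find_replacement(product)  # hypothetical lookup
    if replacement and monthly_search_visits(product) > TRAFFIC_THRESHOLD:
        # "This product has been retired, but a new and improved widget
        # is available..." -- with a buy button for the new model.
        return render_template("retired_product.html",
                               old=product, new=replacement)
    # Otherwise pass the link equity on: similar product, then the
    # category/department page, then the home page.
    target = similar_product_url(product) or category_url(product) or "/"
    return redirect(target, code=301)
```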
Related Questions
-
What is the best strategy to SEO Discontinued Products on Ecommerce Sites?
RebelsMarket.com is a marketplace for alternative fashion. We have hundreds of sellers who have listed thousands of products. Over 90% of the items do not generate any sales, and about 40% of the products have been on the website for over 3 years. We want to clean up the catalog and remove all the listings older than 2 years that do not generate any sales. What is the best practice for removing thousands of listings from an ecommerce site? Do we 404 these products and show similar items? Your help and thoughts are much appreciated.
White Hat / Black Hat SEO | JimJ
-
What are the best factors to increase DA/PA?
I want to increase DA/PA for our company website, Tandem NZ. How can I increase it?
White Hat / Black Hat SEO | Tandem-Digital
-
What is the best way to eliminate ghost traffic from Google Analytics?
Hey Mozzers, I just wanted to see how you all deal with eliminating ghost traffic sources from Google Analytics. I tried setting up a RegEx 'include' list before, but it seemed as though I was blocking potential traffic sources when I did (I'm probably missing something here). Anyway, I'm interested to read how you all have dealt with this issue in the past. Thanks for reading!
White Hat / Black Hat SEO | maxcarnage
-
I am launching an international site. What is the best domain strategy?
Hi guys, I am launching a site across the US, UK and UAE. Do I go test.com/uk, test.com/us, test.com/uae, or do I go uk.test.com, us.test.com, uae.test.com? Which is best for SEO?
White Hat / Black Hat SEO | Johnny_AppleSeed
-
Local Map Pack: What's the best way to handle twin cities?
Google is increasingly cracking down on bad local results. However, in many regions of the US there are twin cities, or cities that sit right next to each other, like Minneapolis-Saint Paul or Kansas City. According to Google guidelines, your business should only be listed in the city in which it is physically located. However, we've noticed that results just outside of the local map pack will still rank, especially for businesses that service the home. For example, let's say you have ACME Plumbing in Saint Paul, MN. If you were to perform a search for "Plumbing Minneapolis" you would typically see local Minneapolis plumbers, then Saint Paul outliers. Usually the outliers are in the next city or just outside of the Google map centroid. Are there any successful strategies to increase rank for these "Saint Paul outliers" that compete with local Minneapolis results, or are the results always going to lag behind in favor of perceived accuracy? We're having to compete against some local competitors that are using some very black-hat techniques to rank multiple sites locally (in the map results). They rank multiple sites for the same company, under different company names and UPS store addresses. It's pretty obvious, especially when you see a UPS store on the street view of the address! We're not looking to bend the rules, but rather to compete safely. Can anything be done in this service-based scenario?
White Hat / Black Hat SEO | AaronHenry
-
Thin content pages: Does adding more content really help?
Hello all, I have a website that was hit hard by Panda back in November 2012, and ever since, the traffic continues to decline week by week. The site doesn't have any major Moz errors (aside from too many on-page links). The site has about 2,700 articles and the text-to-HTML ratio is about 14.38%, so clearly we need more text in our articles and we need to relax a little on the number of pictures/links we add. We have increased the text-to-HTML ratio for all of the new articles we put out, but I was wondering how beneficial it would be to go back and add more text content to the 2,700 old articles that are just sitting there. Would this really be worth the time and investment? Could it help the drastic decline in traffic and maybe even help it grow?
White Hat / Black Hat SEO | WebServiceConsulting.com
-
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi All, I'll premise this by saying that we like to engage in as much white hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :). So, we are an IT and management training course provider. We have 34 locations across the US, and each of our 34 locations offers the same courses. Each of our locations has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings/dates for each individual topic and city. Right now, our pages are dynamic and being crawled and ranking well within Google. We conducted a very small-scale test of this in our Washington DC and New York areas with our SharePoint course offerings and it was a great success. We are ranking well on "sharepoint training in new york/dc" etc. for two custom pages. So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain - A LOT more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc., but with some varying components. This is from our engineer specifically: "Regarding pages with the specific topic areas, do you have a specific format for the Meta Description and the Custom Paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the Meta and Paragraph. For example, if we made the Paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area.' As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain depending on what you want customized. Another option is to have several standardized paragraphs, such as: 'Our [Topic Area] training is easy to find in the [City, State] area,' followed by other content specific to the location, or 'Find your [Topic Area] training course in [City, State] with ease,' followed by other content specific to the location. Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages." So, mozzers, my question to you all is: can we standardize with slight variations specific to that location and topic area without getting dinged for spam or duplicate content? Often times I ask myself, "If Matt Cutts was standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram
White Hat / Black Hat SEO | CSawatzky
-
Can anyone recommend a Google-friendly way of utilising a large number of individual yet similar domains related to one main site?
I have a client who has one main service website, on which they have local landing pages for some of the areas in which they operate. They have since purchased 20 or so domains (although in the process of acquiring more) for which the domain names are all localised versions of the service they offer. Rather than redirecting these to the main site, they wish to operate them all separately with the goal of ranking for the specific localised terms related to each of the domains. One option would be to create microsites (hosted on individual C class IPs etc) with unique, location specific content on each of the domains. Another suggestion would be to park the domains and have them pointing at the individual local landing pages on the main site, so the domains would just be a window through which to view the pages which have already been created. The client is aware of the recent EMD update which could affect the above. Of course, we would wish to go with the most Google-friendly option, so I was wondering if anyone could offer some advice about how would be best to handle this? Many thanks in advance!
White Hat / Black Hat SEO | AndrewAkesson