Stock lists - follow or nofollow?
-
A bit of a catch-22 position here that I could use some advice on, please!
We look after a few car dealership sites that have daily (some three times a day) stock feeds that add and remove cars from the site, which in turn creates and removes a page for each vehicle.
We all know how much search engines like sites with regularly updated content, but the frequency at which it happens on our sites means we are left with lots of indexed pages that no longer exist.
My question is: should I nofollow/disallow robots on all of the vehicle detail pages, so that the list pages are still updated daily with "new content", or allow Google to index everything and manage the errors by redirecting to relevant pages?
Is there a "best practice" way to do this, or is it really personal preference?
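The first option is usually implemented as a robots directive that varies by page type. A minimal sketch, assuming a simple path-based routing scheme (the paths and function name here are hypothetical, not from the site in question):

```python
# Decide a robots directive per URL path: vehicle detail pages expire
# with the stock feed, so keep them out of the index while still letting
# crawlers follow their links; list pages stay fully indexable.

def robots_directive(path: str) -> str:
    """Return the robots meta / X-Robots-Tag value for a URL path."""
    if path.startswith("/vehicles/detail/"):
        return "noindex, follow"
    return "index, follow"

print(robots_directive("/vehicles/detail/ford-mustang-12345"))  # noindex, follow
print(robots_directive("/vehicles/used-cars"))                  # index, follow
```

The returned value would be emitted either as a `<meta name="robots">` tag in the page head or as an `X-Robots-Tag` response header.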
-
I would take the "aggregation" route.
Instead of having lots of individual pages for each vehicle, I would make single pages that list all of the vehicles of a given make and model. These pages would be more substantive, permanent, impressive, useful, and competitive than a lot of skimpy single pages that appear and disappear from your website.
Competitors are probably not doing this because it is difficult rather than easy. Put checkboxes down the side of the page that visitors can "check to compare".
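The aggregation route can be sketched as a simple grouping step over the stock feed: many short-lived vehicle records roll up into one stable page per make/model. The field names in the feed records below are assumptions for illustration:

```python
# Group vehicle records from the stock feed into one listing page per
# (make, model), so the page URL stays stable even as individual cars
# come and go with each feed run.
from collections import defaultdict

def aggregate_stock(feed):
    """Return a dict mapping (make, model) to the vehicles on that page."""
    pages = defaultdict(list)
    for vehicle in feed:
        pages[(vehicle["make"], vehicle["model"])].append(vehicle)
    return dict(pages)

feed = [
    {"make": "Ford", "model": "Mustang", "id": 1},
    {"make": "Ford", "model": "Mustang", "id": 2},
    {"make": "Chevrolet", "model": "Camaro", "id": 3},
]
pages = aggregate_stock(feed)
print(len(pages[("Ford", "Mustang")]))  # 2 vehicles on one permanent page
```

When a car sells, it simply drops off its make/model page; the page itself, and its URL, survive the feed update.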
-
Redirecting a user to a car that might not match their search does not seem like a very user-friendly option. If they wanted a Mustang and clicked a search listing for it, but none were available and they were redirected to a Camaro page, the user would not be happy. That flow is built only for the site and not for the customer, IMO.
-
Hi Ben,
Is it possible to create a basic sold page with some dynamic info about the vehicle? After the vehicle is sold or no longer available, 301 the old page to the sold page, populated with the vehicle's info via parameters, along with some other possible buying choices.
For example:
siteblah.com/Make/Model/Year/short-car-description
When sold, 301 the page to: siteblah.com/Vehicles/Sold/Vehicle-Sold.php?listing-id='12222190'
The benefit here is that the old page passes its link juice to the new page, so you don't lose it. With the right content, the customer understands the car is sold and is given actionable options. The search engines learn about the new page and can treat it as such. Additionally, you'd only have to create one new page and plug in the parameters. Every three months or so you can probably remove the old pages and the 301 redirects, depending on server performance.
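The redirect side of this can be sketched as a small helper that builds the 301 `Location` for a sold vehicle. The URL pattern mirrors the example above; the function name is an assumption:

```python
# When a vehicle sells, its old detail URL should 301 to the single
# sold page, carrying the listing id as a query parameter so the sold
# page can render that vehicle's details dynamically.
from urllib.parse import urlencode

SOLD_PAGE = "/Vehicles/Sold/Vehicle-Sold.php"

def sold_redirect_target(listing_id: str) -> str:
    """Build the 301 Location header value for a sold vehicle's old page."""
    return f"{SOLD_PAGE}?{urlencode({'listing-id': listing_id})}"

print(sold_redirect_target("12222190"))
# /Vehicles/Sold/Vehicle-Sold.php?listing-id=12222190
```

Only one sold-page template is needed; every expired detail URL redirects into it with a different `listing-id`.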
-
Thanks for the response.
The two routes I was looking at are both for the user. I'm looking at either not allowing search engines to serve the content that can expire, or redirecting users to similar vehicles/relevant content within the site.
I was purely wondering which would have additional benefits with Google, as the first option is the easier of the two development-wise.
-
My thoughts are: instead of worrying about what is best for Google, think of what will give a user the best experience and go with that. While it is nice to have a lot of pages indexed, if by the time they reach Google the pages are gone, what good does that do a visitor who was searching for a specific term your site no longer offers? They are much more likely to leave, which will affect your whole site negatively as bounces from search go up.