Stock lists - follow or nofollow?
-
A bit of a catch-22 position here that I could use some advice on, please!
We look after a few car dealership sites that have daily (some three times a day) stock feeds that add and remove cars from the site, which in turn creates/removes pages for each vehicle.
We all know how much search engines like sites whose content is updated regularly, but the frequency it happens on our sites means we are left with lots of indexed pages that are no longer there.
Now my question is: should I nofollow/disallow robots on all the vehicle detail pages, meaning the list pages will still be updated daily for "new content", or allow Google to index everything and manage the errors by redirecting to relevant pages?
Is there a "best practice" way to do this, or is it really personal preference?
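For what it's worth, if you go the "keep the detail pages away from robots" route, a minimal robots.txt sketch might look like the following. The /vehicles/ path is purely an assumption; adjust it to the site's actual URL structure:

```
# Hypothetical robots.txt - the /vehicles/ path is an assumption
User-agent: *
# Stop crawlers fetching the short-lived vehicle detail pages
Disallow: /vehicles/
# List pages outside /vehicles/ stay crawlable, so fresh stock is still found
```

One caveat: Disallow only blocks crawling, and URLs Google already knows about can linger in the index. A noindex robots meta tag on the detail pages themselves is the more direct way to keep them out of search results.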
-
I would take the "aggregation" route.
Instead of having lots of separate pages for each make and model of vehicle, I would make a single page that lists all of the vehicles of a given make and model. These pages would be more substantive, permanent, impressive, useful, and competitive than a lot of skimpy single pages that appear and disappear from your website.
Competitors are probably not doing this because it is difficult rather than easy. Put checkboxes down the side of the page that visitors can "check to compare".
-
The idea of redirecting a user to a car that might not match their search does not seem like a very user-friendly option. If they wanted a Mustang and clicked a search listing for it, but none are available and they were redirected to a Camaro page, the user would not be happy. That flow is built only for the site and not the customer, IMO.
-
Hi Ben,
Is it possible to create a basic sold page with some dynamic info about the vehicle? After the vehicle is sold or no longer available, 301 the old page to the sold page, populated with the vehicle info via parameters and some other possible buying choices.
For example:
siteblah.com/Make/Model/Year/short-car-description
When sold, 301 the page to: siteblah.com/Vehicles/Sold/Vehicle-Sold.php?listing-id='12222190'
The benefit here is that the old page sends the new page its link juice, so you don't lose that. With content on the page, the customer understands the car is sold, and you provide them with actionable options. The search engines learn about the new page and can treat it as such. Additionally, you'd only have to create one new page and plug in the parameters. Every 3 months or so you can probably remove the old pages and the 301 redirects, depending on server performance.
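As a rough sketch of the redirect itself, using the example URLs above (this assumes Apache with mod_rewrite; the exact pattern and listing id are illustrative only):

```apache
# Assumes Apache mod_rewrite; the URL pattern and listing id are illustrative
RewriteEngine On
# When a vehicle sells, 301 its old detail page to the generic sold page,
# carrying the listing id so the sold page can pull that vehicle's details
RewriteRule ^Make/Model/Year/short-car-description$ /Vehicles/Sold/Vehicle-Sold.php?listing-id=12222190 [R=301,L]
```

In practice you would generate one rule per sold vehicle from the stock feed, or better, use a single lookup (a RewriteMap or a database) so the .htaccess file doesn't grow without bound.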
-
Thanks for the response.
The two routes I was looking at are both for the user: either not allowing search engines to serve the content that can expire, or redirecting them to similar vehicles/relevant content within the site.
I was purely wondering which would have additional benefits with Google, as the first option is the easier of the two development-wise.
-
My thoughts are: instead of worrying about what is best for Google, think of what will give a user the best experience and go with that. While it is nice to have a lot of pages indexed, if by the time visitors arrive from Google the pages are gone, what good does that do someone who was searching for a specific term your site no longer offers? They are much more likely to leave, which will affect your whole site negatively as bounces from search go up.
Related Questions
-
Is link equity / link juice lost to a blocked URL in the same way that it is lost to a nofollow link?
Hi, if there is a link on a page that goes to a URL that is blocked in robots.txt, is the link juice lost in the same way as when you add nofollow to a link on a page? Any help would be most appreciated.
Intermediate & Advanced SEO | Andrew-SEO
-
How to handle potentially thousands (50k+) of 301 redirects following a major site replacement
We are looking for the very best way of handling potentially thousands (50k+) of 301 redirects following a major site replacement, and I mean total replacement. Things you should know:
The existing domain has 17 years of history with Google, but rankings have suffered over the past year, and yes, we know why (and the bitch is we paid a good-sized SEO company for that ineffective and destructive work).
The URL structure of the new site is completely different, and SEO-friendly URLs rule. This means that there will be many thousands of historical URLs (mainly dynamic ones) that will attract 404 errors, as they will not exist anymore. Most are product profile pages, and the God Google has indexed them all. There are also many links to them out there.
The new site is fully SEO-optimised and is passing all tests so far; however, there is a way to go yet. So here are my thoughts on the possible ways of meeting our need:
1: Create 301 redirects for each and every page in the .htaccess file. That would be one huge .htaccess file, 50,000 lines plus; I am worried about the effect on site speed.
2: Create 301 redirects for each and every unused folder, and wildcard the file names. This would be a single redirect for each file in each folder to a single redirect page, so the 404 issue is overcome but the user doesn't open the precise page they are after.
3: Write some code to create a hard-copy 301 index.php file for each and every folder that is to be replaced.
4: Write code to create a hard-copy 301 .php file for each and every page that is to be replaced.
5: We could just let the pages all die and list them with Google to advise of their death.
6: We could have the redirects managed by a database rather than .htaccess or single redirect files. Probably the most challenging thing will be to load the data in the first place, but I assume this could be done programmatically, especially if the new URL can be inferred from the old.
Maybe I am missing another, simpler approach; please discuss.
Intermediate & Advanced SEO | GeezerG
-
How to outrank a directory listing with high DA but low PA?
My site is in 4th place; 3 places above it is a Gumtree (similar to Yell, Yelp) listing. How can you figure out how difficult it would be to outrank those pages? Obviously the pages have low PA and rank on the back of the site's high DA. This also goes back to keyword research and difficulty: when I'm doing keyword research and I see a Wikipedia page in the top 5, or a yell.com listing, or perhaps a forbes.com article outranking your site, the problem typically seems to be Google giving a lot of credit to these pages based on the high DA of the site rather than the PA of the pages. How would you gauge the difficulty of that keyword when the competition is pages with very high DA, which is impossible to compete with, but low PA? Thanks
Intermediate & Advanced SEO | magusara
-
I have rebuilt a website on a new domain and followed SEO protocol to maintain authority, but the results and rankings are declining.
We took over an account for a company called knightdoorservices.com, which specializes in doors and windows in Edmonton, Alberta. We built them a new website on a new domain: knightdoorsandwindows.com. We set up 301 redirects on all of the old URLs so that they now point to the new URLs, so most of the authority should transfer over. Additionally, each page has a properly optimized title, an h1 tag, a series of pertinent alt tags, and many instances of the focus keyword for that particular page. The website also loads quickly and has many high-authority inbound links pointing to the domain. We have done this for many other companies and have seen their rankings maintain their position or increase. Is there something that I am missing for this company in particular? Thanks so much!
Intermediate & Advanced SEO | Web3Marketing87
-
Ecommerce: a product in multiple categories with a canonical to create a 'cluster' in one primary category vs. a single listing at root level with dynamic breadcrumb.
OK, bear with me on this… I am working on some pretty large ecommerce websites (50,000+ products) where it is appropriate for some individual products to be placed within multiple categories/sub-categories. For example, a red polo T-shirt could be placed within:
Men's > T-shirts
Men's > T-shirts > Red T-shirts
Men's > T-shirts > Polo T-shirts
Men's > Sale > T-shirts
Etc. We're getting great organic results for our general T-shirt page (for example) by clustering creative content within its structure ("Top 10 tips on wearing a T-shirt"; obviously not, but you get the idea). My instinct tells me to replicate this with products too. So, of all the locations mentioned above, make sure all polo shirts (no matter what colour) have a canonical set within Men's > T-shirts > Polo T-shirts. The presumption is that this will help build the authority of the Polo T-shirts page; this obviously presumes "polo shirts" gets more search volume than "red T-shirts". Part of my presumption that this is the best option is that it is very difficult to manage, particularly with a large inventory; and, from experience, taking the time and being meticulous when it comes to SEO is the only way to achieve success. From an administration point of view, it is a lot easier to have all product URLs at root level and develop a dynamic breadcrumb trail, so all roads can lead to that one instance of the product. There's no need for canonicals; no need for ecommerce managers to remember which primary category to assign product types to; and keeping everything at root level also means there's no reason to worry about redirects if products move from sub-category to sub-category, etc. What do you think is the best approach? Do thousands of canonicals and redirects look 'messy' to a search engine over time? Any thoughts and insights greatly received.
Intermediate & Advanced SEO | AbsoluteDesign
-
Site-wide links - should they be nofollow or followed links?
Hi, we have a retail site and a blog that goes along with it. The blog is very popular, and the MD wanted a link from the blog back to the main retail site. However, as this is a site-wide link on the blog (it appears at the top of every page), am I right in thinking it really should be a nofollow link? Thanks in advance for any help.
Intermediate & Advanced SEO | Andy-Halliday
-
Is a dynamic online user list bad for SEO?
Hello everyone, I have a question that is currently puzzling me, and I hope you can help me with it. On musicianspage.com (one of our websites), we show a list of online users embedded within the page which, as you may expect, changes all the time according to who's online at that moment. That list appears on every page of the site, so at any given time every page on the site has different content and a different link profile (sometimes we have just a few users connected; other times we may have over 50 users connected at the same time). My question is: is such a "dynamically embedded" list bad, good or neutral from an SEO standpoint? If it is bad, what do you suggest we do? Put it inside a frame? Use AJAX? Any thoughts and suggestions are very welcome! Thanks in advance to anyone reading this. All the best, Fabrizio
Intermediate & Advanced SEO | fablau
-
NOINDEX listing pages: Page 2, Page 3... etc.?
Would it be beneficial to NOINDEX category listing pages except for the first page? For example, this site: http://flyawaysimulation.com/downloads/101/fsx-missions/ has lots of pages such as Page 2, Page 3, Page 4, etc.: http://www.google.com/search?q=site%3Aflyawaysimulation.com+fsx+missions Would there be any SEO benefit of NOINDEX on these pages? Of course, FOLLOW is the default, so links would still be followed and juice applied. Your thoughts and suggestions are much appreciated.
Intermediate & Advanced SEO | Peter264
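For reference, the tag that question describes would sit in the head of each paginated page from page 2 onward; a sketch:

```html
<!-- On paginated listing pages (page 2, 3, ...): keep the page out of the
     index, but let crawlers follow its links so link equity still flows -->
<meta name="robots" content="noindex, follow">
```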