Pages For Products That Don't Exist Yet?
-
Hi,
I have a client that makes accessories for other companies' popular consumer products. The product pages on my client's website rank for the other company's product name plus my client's product type. For a made-up example: "2011 Super Widget" plus my client's product, "Charger." So "Super Widget 2011 Charger" might be the type of term my client would rank for.
Everybody knows the 2012 Super Widget will be out in a few months, and my client's company will then offer the 2012 Super Widget Charger.
What do you think of launching pages now for the 2012 Super Widget Charger, even though it doesn't exist yet, in order to give those pages time to rank while the terms are half as competitive? By the time the 2012 model is available, these pages would have greater authority, age, and rank, instead of being a little late to the party.
The pages would be like "coming soon" pages, but still optimized for the main product search term.
About the only negative I see is that they'll have a higher bounce rate and lower time on page, since the 2012 doesn't even exist yet. That seems like less of a negative than the jump start on ranking.
What do you think? Thanks!
-
Hi Fellows,
Thanks for the thoughts Mike, Charles & Kieran. All good ideas!
Best...Mike
-
I would second / third this. And don't just put "coming soon": write a short spiel that says the 2012 Super Widget is coming soon, describes the expected color and size, and notes that you will have the charger when it comes out. You could even ask visitors what other accessories they think the widget should have.
-
Ditto, I think it is a very good idea. One additional thing I would do is create some sort of signup form so you can collect details of people interested in purchasing once it's released; you could then send them an email once you get the full page up and running.
Kind of the opposite of what you said, which I have also had great success with: when a product isn't made any more, don't delete its page. Keep the page and list alternative versions of the widget on it. People still search for years-old products.
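If it helps, the signup idea above can be sketched without any framework. Everything here (the function name, the loose email regex, the set-based storage) is a made-up illustration, not any particular tool's API:

```python
import re

# Hypothetical sketch: validate and deduplicate "notify me" email addresses
# so the list can be mailed when the 2012 product page goes live.
# The regex is a loose plausibility check, not full RFC 5322 validation.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def add_signup(signups: set, email: str) -> bool:
    """Normalize and store an email; return True only if newly stored."""
    email = email.strip().lower()
    if not EMAIL_RE.match(email):
        return False          # reject strings that don't look like emails
    if email in signups:
        return False          # already on the list
    signups.add(email)
    return True
```

In practice you would persist the list server-side (a database table rather than an in-memory set), but the validate-normalize-deduplicate flow is the same.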
-
I think this is a very common thing to do, and I'd recommend you do the same. I wouldn't do it crazy far in advance, but if a product is coming soon, why not give yourself an advantage?
Related Questions
-
Duplicate product content - from a manufacturer website, to retailers
Hi Mozzers, We're working on a website for a manufacturer who allows retailers to reuse their product information. This of course raises the issue of duplicate content. The manufacturer is the content owner and originator, but retailers will copy the information to their own sites without linking back (permitted by the manufacturer); the only reference to the manufacturer will be the brand name citation on the retailer website. How would you deal with the duplicate content issues this may cause, especially considering that the domain authority of a lot of the retailer websites is better than the manufacturer site's? Thanks!!
-
Do you get penalized for Keyword Stuffing in different page URLs?
If I have a website that provides law services in various towns, and we have pages for each town with unique content on each page, can the page URLs look like the following?

mysite.com/miami-family-law-attorney
mysite.com/tampa-family-law-attorney
mysite.com/orlando-family-law-attorney

Does this get penalized when being indexed?
-
Controlling crawl speed/delay through dynamic server-code and 503's
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts other sites' performance.

The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritized above bot traffic.

I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems.

Now I've come up with a custom-coded solution that dynamically serves 503 HTTP status codes to a portion of the bot traffic. The portion, per bot, can be calculated at runtime from total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I invent), some requests will be answered with a 503 while others get content and a 200.

The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will add indexing latency, but slow server response times also hurt rankings, which is even worse than indexing latency. I'm curious about your expert opinions...
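The throttling rule described above could be sketched roughly like this. All names and thresholds here are made up for illustration (the real load measure and percentages would come from your own server metrics), and the real response should also include a Retry-After header:

```python
import random

# Hypothetical sketch: probabilistically answer bot requests with a 503
# when overall server load is high, while always serving real users.
BOT_TOKENS = ("googlebot", "bingbot", "ahrefsbot")

def is_bot(user_agent: str) -> bool:
    """Crude user-agent check against known crawler tokens."""
    ua = user_agent.lower()
    return any(token in ua for token in BOT_TOKENS)

def throttle_probability(load_per_core: float) -> float:
    """Fraction of bot requests to reject, rising with server load."""
    if load_per_core < 0.7:
        return 0.0   # plenty of headroom: serve everything
    if load_per_core < 1.0:
        return 0.5   # getting busy: drop half the bot traffic
    return 0.9       # overloaded: drop most bot traffic

def respond(user_agent: str, load_per_core: float, rng=random.random) -> int:
    """Return the HTTP status code to send for this request."""
    if is_bot(user_agent) and rng() < throttle_probability(load_per_core):
        return 503   # ask the bot to come back later
    return 200
```

Because the decision is per-request and load-based, user traffic is never throttled and bots are only slowed when the server actually needs the headroom.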
-
301 redirect a set of pages to one landing page/URL?
I'm planning to redirect the following pages to one new URL/landing page.

Old URLs:
http://www.olddomain.com/folder/page/1
http://www.olddomain.com/folder/page/2
http://www.olddomain.com/folder/page/3
http://www.olddomain.com/folder/page/4
http://www.olddomain.com/folder/page/5
http://www.olddomain.com/folder/page/6

New URL:
http://www.newdomain.com/new-folder/new-page

Code in .htaccess that I will be using:

RedirectMatch 301 /folder/page/(.*) http://www.newdomain.com/new-folder/new-page

Let me know if this is correct. Thanks!
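One way to sanity-check the pattern before deploying is to emulate the matching with any regex tool; here is a rough sketch using Python's re module (this only checks the pattern, so still test the real .htaccess on a staging server). One caveat: Apache applies RedirectMatch patterns unanchored, so anchoring with ^ avoids accidentally catching URLs that merely contain /folder/page/ somewhere mid-path:

```python
import re

# Sketch: emulate the RedirectMatch rule from the question to check which
# request paths it would catch. The ^ anchor is added deliberately; Apache's
# RedirectMatch is unanchored by default, which can over-match.
TARGET = "http://www.newdomain.com/new-folder/new-page"
PATTERN = re.compile(r"^/folder/page/(.*)")

def redirect_target(path):
    """Return the 301 destination for a request path, or None if no match."""
    return TARGET if PATTERN.match(path) else None
```

Running each old path through redirect_target confirms they all map to the new landing page while unrelated paths are left alone.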
-
My site www.sriindustries.com dropped to the back pages after Penguin 2.1
My website dropped back to page 5 and beyond after Penguin 2.1. Can you help me recover from this? My head is breaking. Also, I would like to know how to rank on top for local business search (maps).
-
Do Backlinks from User Profile Pages Still Matter?
Though a lot of algorithmic changes and updates have happened, backlinks from quality sources and trusted sites have hardly lost their importance; that's why Open Site Explorer and Majestic SEO are still used to evaluate the quality, trust, and authority of backlinks. I need to know: does a dofollow backlink from a profile page still matter? And if so, does a dofollow link from a user profile page on Moz or any other authority site or forum still count toward the authority of your site?
-
Can anyone recommend a Google-friendly way of utilising a large number of individual yet similar domains related to one main site?
I have a client who has one main service website, on which they have local landing pages for some of the areas in which they operate. They have since purchased 20 or so domains (and are in the process of acquiring more), all of which are localised versions of the service they offer. Rather than redirecting these to the main site, they wish to operate them all separately, with the goal of ranking for the specific localised terms related to each of the domains.

One option would be to create microsites (hosted on individual C-class IPs etc.) with unique, location-specific content on each of the domains. Another suggestion would be to park the domains and point them at the individual local landing pages on the main site, so the domains would just be a window through which to view pages that have already been created.

The client is aware of the recent EMD update, which could affect the above. Of course, we wish to go with the most Google-friendly option, so I was wondering if anyone could offer some advice about how best to handle this? Many thanks in advance!
-
How to run SEO tests you don't want to be associated with
A client has a competitor who is ranking above them for a highly competitive term they shouldn't really be able to rank for. I think I know how the site got there, and I think I can replicate it myself with a quick test, but it's definitely grey hat, if not black hat, to do so. I do not want my own sites and company to be damaged by the test, but I'd like to let the client know for sure, and I'd also love to know myself. The test should take about a week to run; there is no hacking, password stealing, or anything damaging to others involved. How would you do such a test? I'm dubious about using my own server / site for it, but would a week really matter? Tom