Pages For Products That Don't Exist Yet?
-
Hi,
I have a client that makes accessories for other companies' popular consumer products. My client's products rank on their website for the other companies' product names, like (made-up example) "2011 Super Widget," combined with my client's product, "Charger." So "Super Widget 2011 Charger" might be the type of term my client would rank for.
Everybody knows the 2012 Super Widget will be out in a few months, and then my client's company will offer the 2012 Super Widget Charger.
What do you think of launching pages now for the 2012 Super Widget Charger, even though it doesn't exist yet, in order to give those pages time to rank while the terms are half as competitive? By the time the 2012 is available, these pages will have greater authority/age and rank, instead of being a little late to the party.
The pages would be like "coming soon" pages, but still optimized for the main product search term.
About the only negative I see is that they'll have a higher bounce rate/lower time on page since the 2012 doesn't even exist yet. That seems like less of a negative than the jump start on ranking.
What do you think? Thanks!
-
Hi Fellows,
Thanks for the thoughts Mike, Charles & Kieran. All good ideas!
Best...Mike
-
I would second/third this. And don't just put "coming soon" on the page; write a short spiel that says the 2012 Super Widget is coming soon, what color and size it should be, and that you'll have the charger when it comes out. Then ask readers: what other accessories do you think the widget should have?
-
Ditto, I think it is a very good idea. One additional thing I would do is create some sort of "signup" form so you can collect details of people interested in purchasing once it's released; you could then send them an email once you get the full page up and running.
Kind of the opposite of what you said, which I have also had great success with: when a product isn't made any more, don't delete it. Keep the page and put alternative versions of the widget on it. People still search for years-old products.
-
I think this is a very common thing to do, and I'd recommend you do the same. I wouldn't do it crazy far ahead, but if a product is coming soon, why not give yourself an advantage?
Related Questions
-
E-Commerce - Include the product creation DATE?
Hi there! I would like to ask if it's advisable to include the creation date of new products (in the product description) in an e-commerce shop. I mean, does it make a difference (in SEO terms, of course)? Thanks for the info and my best regards! Z.
White Hat / Black Hat SEO | zkouska
-
Controlling crawl speed/delay through dynamic server-code and 503's
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts other sites' performance. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritized higher than bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support that. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems. Now I've come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. What traffic portion, for which bots, can be calculated dynamically at runtime from total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about your expert opinions...
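The throttling scheme described in this question could be sketched roughly like this (a minimal sketch, not the poster's actual code: the bot names, thresholds, and the assumption of a server-wide load metric normalized to 0.0-1.0 are all hypothetical):

```python
import random

# Hypothetical per-bot rules: above this total server load (0.0-1.0),
# answer this fraction of the bot's requests with a 503.
THROTTLE_RULES = {
    "bingbot":   {"load_threshold": 0.6, "reject_fraction": 0.5},
    "ahrefsbot": {"load_threshold": 0.4, "reject_fraction": 0.8},
    "googlebot": {"load_threshold": 0.8, "reject_fraction": 0.3},
}

def should_serve_503(user_agent, server_load, rand=random.random):
    """Decide per request whether a bot gets a 503 instead of content.

    User traffic is never throttled; a bot is throttled only once
    total server load passes that bot's threshold.
    """
    ua = user_agent.lower()
    for bot, rule in THROTTLE_RULES.items():
        if bot in ua:
            if server_load >= rule["load_threshold"]:
                return rand() < rule["reject_fraction"]
            return False
    return False  # not a known bot: always serve content

def respond(user_agent, server_load):
    """Status code + extra headers for this request."""
    if should_serve_503(user_agent, server_load):
        # Retry-After hints to well-behaved bots when to come back.
        return 503, {"Retry-After": "3600"}
    return 200, {}
```

Because the rules live in one shared module keyed off total server load rather than per-site traffic, a sketch like this would address all three requirements at once, and the Retry-After header gives well-behaved crawlers an explicit back-off hint.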
White Hat / Black Hat SEO | internetwerkNU
-
Does Google crawl and index dynamic pages?
I've linked a category page (static) to my homepage and linked a product page (dynamic) to the category page. I tried to crawl my website from my homepage URL with Screaming Frog, using Googlebot 2.1 as the user agent. Based on the results, it can crawl the product page, which is dynamic. Here's a sample product page, which is a dynamic page (we're using product IDs instead of keyword-rich URLs for consistency): http://domain.com/AB1234567 Here's a sample category page: http://domain.com/city/area Here's my full question: does the spider result (from Screaming Frog) mean Google will properly crawl and index the property pages even though they are dynamic?
White Hat / Black Hat SEO | esiow2013
-
Off-page SEO and link building
Hi everyone! I work for a marketing company; for one of our clients' sites, we are working with an independent SEO consultant for on-page help (it's a large site) as well as off-page SEO. Following a meeting with the consultant, I had a few red flags about his off-page practices; however, I'm not sure if I'm just inexperienced and this is just "how it works," or if we should shy away from these methods. He plans to: guest blog, do press release marketing, and comment on blogs. He does not plan to consult with us in advance regarding the content that is produced, or where it is posted. In addition, he doesn't plan on producing a report of what was posted where. When I asked about these things, he told me they haven't encountered any problems before. I'm not saying it was spammy, but I'm not sure if these methods are leaning in the direction of "growing out of date" or the direction of "black hat, run away, dude." Any thoughts on this would be crazy appreciated! Thanks, Casey
White Hat / Black Hat SEO | CaseyDaline
-
Will aggregating external content hurt my domain's SERP performance?
Hi, We operate a website that helps parents find babysitters. As a small add-on we currently run a small blog on childcare and parenting. We are now thinking of introducing a new category to our blog called "best articles to read today". The idea is that we "re-blog" selected articles from other blogs that we believe are relevant for our audience. We have obtained permission from a number of bloggers to fully feature their articles on our blog. Our main aim in doing so is to become a destination site for parents. This obviously creates issues with regard to duplicated content. The question I have is: will including this duplicated content on our domain harm our domain's general SERP performance? And if so, how can this effect be avoided? It isn't important for us that these "featured" articles rank in SERPs, so we could potentially "noindex" them or make the rel canonical point to the original author. Any thoughts, anyone? Thx! Daan
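The two options mentioned at the end of the question, noindex and rel=canonical to the original author, both boil down to emitting the right tags in the re-blogged page's head. A minimal sketch of how a blog template helper might build them (the helper name and signature are made up for illustration, not any specific CMS API):

```python
def syndication_head_tags(original_url=None, index=True):
    """Build <head> tags for a re-blogged article.

    If we know the original author's URL, credit it with rel=canonical
    so ranking signals consolidate there; optionally also keep our
    copy out of the index entirely with a robots noindex meta tag.
    """
    tags = []
    if original_url:
        tags.append('<link rel="canonical" href="%s" />' % original_url)
    if not index:
        tags.append('<meta name="robots" content="noindex, follow" />')
    return "\n".join(tags)
```

With "follow" kept in the robots tag, links inside the featured articles can still be crawled even though the duplicate page itself stays out of the index.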
White Hat / Black Hat SEO | daan.loening
-
Website Vulnerability Leading to Doorway Page Spam. Need Help.
Keywords he is ranking for: houston dwi lawyer, houston dwi attorney, etc. The client was acquired in June, and since then we have done nothing but build high-quality links to the website. None of our clients were dropped/dinged or impacted by the Panda/Penguin updates in 2012 or updates previously published by Google, which proves we do quality SEO work. We went ahead and started duplicating links which worked for other legal clients, and 5 months later this client is either dropping or staying flat in local maps results, and we are performing very badly in organic results. Some more history: when he first engaged our company we switched his website from a CMS called Plone to WordPress. During our move I ran some searches to figure out which pages we needed to 301, and we came across many profile or member pages created on the client's CMS (Plone). These pages were very spammy and linked to other Plone sites using car make/model/year keywords (ex: jeep cherokee dealerships). I went through these sites to see if they were linking back and could not find any backlinks to my client's website. Obviously nobody authorized these pages; they all looked very hackish, and it seemed as though there was a vulnerability in his Plone installation which nobody caught. Fast forward 5 months, and the newest OSE update is showing me a good 50+ backlinks with unrelated anchor text. These anchor text links are the same color as the background and can only be found if you hover your mouse over certain areas of the site. All of these sites are built on Plone, and a lot of them belong to other businesses or community websites. These websites obviously have no clue they have been hacked or are being used for black hat purposes. There are dozens of unrelated anchor text links on external websites pointing back to our client's website.
Examples: "autex Isuzu," "Toyota service department ratings," "die cast BMW," etc. Obviously the first step is to use the disavow link tool, which will be completed this week. The second step is to get some feedback from the SEO community. It seems like these pages are automatically created using some type of bot, and it will be very tedious if we have to continually remove these links. I hope there is a way to notify Google that these websites are all Plone and have a vulnerability which black hats are using to harm the innocent... If I cannot get Google to handle this, then the only other option is to start fresh with a new domain name. What would you do in this situation? Your help is greatly appreciated. Thank you
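Since the linking sites are whole compromised Plone installs rather than individual pages, the disavow file can work at the domain level. A sketch of what such a file might look like (the domains here are invented placeholders, not the actual linking sites):

```text
# Hacked Plone sites linking with hidden, unrelated anchor text.
# Site owners appear unaware their installs are compromised.
domain:example-plone-dealer.com
domain:example-community-site.org
# A single spammy URL can also be listed on its own:
http://example-business.net/members/spam-profile
```

Lines starting with # are comments, a `domain:` prefix disavows every link from that domain, and a bare URL disavows just that page.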
White Hat / Black Hat SEO | waqid
-
Can our white hat links get a bad rap when they're alongside junk links busted by Panda?
My firm has been creating content for a client for years: video, blog posts and other references. This client's web vendor has been using bad links and link farms to bolster rank for key phrases, successfully. Until last week, when Google slapped them. They have been officially warned in WMT for possibly using artificial or unnatural links to build PageRank. They went from page one for the most popular term in Chicago for their industry, where they had been for over a year, to page 8 overnight. Other, less generic terms that we were working on felt the sting as well. I was aware of and had warned the client about the possibility of repercussions from these black hat tactics (http://www.seomoz.org/blog/how-google-makes-liars-out-of-the-good-guys-in-seo#jtc170969), but didn't go as far as to recommend they abandon them. Now I'm wondering if one of our legitimate sites (YoChicago.com), which has more than its share of the links into the client site, is being considered a bad link source. All of our links are legitimate, i.e., anchor text matches the description of the destination, and video links describe the entity that is linked to. Are we vulnerable? Any insight would be appreciated.
White Hat / Black Hat SEO | mikescotty
-
Campaign landing pages
Hi At our company we decided we wanted to reach out to a more global audience, so we bought a bank of domains for different countries, e.g. ".asia". Some are our company name; others are things like "barcelonaprivatejets.com." We then put up single-page websites for each of these domains, which link to our main .com site. However, I don't know if this is good for our SEO or bad. I've seen so many different things written, but I cannot find a definitive answer. The text will be different on all the pages, but with only one page each and the same "design," will we get penalized in some way? I've also added links to 2/3 of them in the footer of our main site, but now I'm reading that this is bad too; should I remove these? If anyone also has ideas on how we could better use these country-specific domains, I'd welcome suggestions on that too! I'm not really an SEO person, I'm a web developer, so this is all completely new to me. P.S. My name is Michael, not Andy.
White Hat / Black Hat SEO | JetBookMike