Correct strategy for long-tail keywords?
-
Hi,
We sell log houses on our website. Each log house is listed as a "product", and this "product" consists of many separate parts that are technically also products. For example, a log house product consists of doors, windows, and a roof, and all of these parts are technically also products with their own content pages.
The question is: should we let Google index these detail pages, or should we mark them as noindex?
These pages have no content other than the headline, which is great for long-tail SEO. We are probably the only manufacturer in the world with a separate page for "log house wood beam 400x400mm". Otherwise, though, these pages are empty.
My question is: what should we do? Should we let Google index them all (we have over 3,600 of them) and perhaps add an automatic FAQ section to each one to put more content on the page?
Or will 3,600 low-content pages hurt our rankings? Otherwise we rank quite well.
Thanks, Johan
-
Thank you very much, Philipp. We will change our website according to your suggestions.
Have a great week! Johan
-
Hi Johan!
Ah okay, then the answer is easier: if your visitors can't see those pages, Google shouldn't access them either. I also suggest not listing them in an XML sitemap, since only pages with value for users belong there. As you asked above, IMO these 3,600 low-quality pages would not have a positive impact; they might even hurt you.
Focus your efforts on the pages that are valuable to users and worth pushing in the search engines.
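To illustrate the idea, here is a minimal sketch (the page data and `internal_only` flag are hypothetical, not your actual system): internal-only detail pages get a noindex robots meta tag, and only user-facing pages are emitted into the sitemap.

```python
# Hypothetical sketch: decide per page whether to emit a noindex tag
# and whether to include the page in the XML sitemap.
pages = [
    {"url": "/log-houses/alpine-90", "internal_only": False},
    {"url": "/parts/wood-beam-400x400", "internal_only": True},  # pricing-system page
]

def robots_meta(page):
    # Internal-only detail pages get noindex; normal pages stay indexable.
    content = "noindex, follow" if page["internal_only"] else "index, follow"
    return f'<meta name="robots" content="{content}">'

def sitemap_urls(pages):
    # Only user-facing pages belong in the sitemap.
    return [p["url"] for p in pages if not p["internal_only"]]

print(robots_meta(pages[1]))   # -> <meta name="robots" content="noindex, follow">
print(sitemap_urls(pages))     # -> ['/log-houses/alpine-90']
```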
Cheers,
Phil
-
Hi, Philipp!
Thank you for the response. Visitors currently do not see these options anyway; they only exist in the system because of our custom pricing-quote system.
I think my main question is: should we hide these pages from Google, or should we list them in our sitemap? Our site visitors will never see these pages, but they do exist. We can set them to noindex if Google might view them unfavorably.
-
That amount is probably a bit too much. I would bundle your efforts into top-of-hierarchy overview pages instead. Keep in mind that users will hardly ever search for something as specific as "log house wood beam 400x400mm"...
Usually it's a better choice to have one landing page per product and let users customize it via a dropdown (to choose 400mm, 600mm, etc.). This reduces your number of pages and probably leads to a better user experience too.
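One common way to implement the single-landing-page approach is to put the size variants behind a query parameter and point every variant at the same canonical URL. A minimal sketch (the URL scheme here is hypothetical):

```python
# Hypothetical sketch: every size variant of a product lives on one
# landing page, selected via a query parameter; all variants share
# a single canonical URL so only one page competes in search.
from urllib.parse import urlparse

def canonical_url(url):
    # Strip the variant query string so every size shares one canonical.
    parts = urlparse(url)
    return f"{parts.scheme}://{parts.netloc}{parts.path}"

def canonical_link_tag(url):
    # The rel="canonical" tag emitted on every variant of the page.
    return f'<link rel="canonical" href="{canonical_url(url)}">'

print(canonical_url("https://example.com/wood-beams?size=400x400"))
# -> https://example.com/wood-beams
```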