20-30% of our ecommerce categories contain no extra content - could this be a problem?
-
Hello,
About 20-30% of our ecommerce categories have no content beyond the products that are in them. Could this be a problem with Panda?
Thanks!
-
It's not an exact science with regard to any one signal. However, yes: the more you can reinforce the topical focus of a category page, the less likely Panda is to find it weak.
-
No worries, Bob. Ignore my original suggestion then.
Alan has some good suggestions for you to follow.
-Andy
-
Thanks Alan, this is perfect.
So if we had at least a couple of good paragraphs on every category page, and a few highly relevant internal links pointing to each of those category pages, we would be in good shape as far as Panda and category strength are concerned. Correct?
-
Hi Andy,
Sorry for the confusion. This is an ecommerce site. I edited the original question to be clear.
-
I'm assuming that this is a WordPress site (more info would be useful). A common issue is category pages causing problems because they show the same post excerpts over and over; noindexing them gets around this.
If I have misread the type of issue this is, then of course this doesn't apply. With this being posted in blogging and content, that was my assumption.
A URL to look at would, I'm sure, confirm more of the problem.
-Andy
-
Andy,
Why would you noindex/follow category pages? That's like saying, "Hey, we have X products for this category, so it's a high-value, important page we deserve to rank for. Except we don't have the willingness to boost the trust signals on the category page itself, so don't bother."
That in turn negatively impacts the site's ability to gain maximum ranking signals for any products in those categories (at least in highly competitive fields).
So I'm curious why you'd take that path.
-
It could be, Bob. I always advise that category pages be set to noindex, follow to avoid these issues.
If you are using WordPress and Yoast, this is just a setting.
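If you want to verify the setting took effect, here's a minimal sketch (not Yoast's own code, and the URLs are placeholders) that spot-checks the robots meta tag on a few category pages:

```python
# Spot-check that category pages emit a "noindex, follow" robots meta tag.
# Requires the third-party "requests" and "beautifulsoup4" packages.
import requests
from bs4 import BeautifulSoup

CATEGORY_URLS = [
    "https://example.com/category/widgets/",   # placeholder category URLs
    "https://example.com/category/gadgets/",
]

for url in CATEGORY_URLS:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    robots = soup.find("meta", attrs={"name": "robots"})
    content = robots.get("content", "").lower() if robots else "(no robots meta tag)"
    print(f"{url} -> {content}")
```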
-Andy
-
If a category page has almost no content (other than photos and product names), then that's a potential "thin content" issue, though the way your question is worded, I'm not confident my interpretation is actually what you meant by "no content beyond".
If product names don't reference the category name, and if there's a lack of any descriptive content on the category page, that's likely even more of a problem - thin content and lack of topical reinforcement of the category itself.
A general rule (barring other issues or considerations) is to have at least a couple of paragraphs of unique, descriptive text that reinforces the topical focus of each category page. There are numerous ways to split that content across a category page, and in highly competitive categories more content may be needed if there aren't enough products in the category.
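As a rough illustration only (the CSS selector, word-count threshold, and URLs below are assumptions, not part of the advice above), a quick script like this can flag category pages that still lack that descriptive copy:

```python
# Flag potentially thin category pages by counting words in the descriptive copy.
# Requires "requests" and "beautifulsoup4"; adjust the selector to your template.
import requests
from bs4 import BeautifulSoup

CATEGORY_URLS = [
    "https://example.com/outdoor-furniture/",   # placeholder URLs
    "https://example.com/office-chairs/",
]
MIN_WORDS = 150  # roughly "a couple of paragraphs"; tune for your niche

for url in CATEGORY_URLS:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    block = soup.select_one(".category-description")  # assumed container class
    words = len(block.get_text(" ", strip=True).split()) if block else 0
    status = "OK  " if words >= MIN_WORDS else "THIN"
    print(f"{status} {words:4d} words  {url}")
```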
Other factors that can help mitigate this to a certain degree include (but aren't necessarily limited to):
- Hierarchical URL structure (nested URLs, so product detail pages are seen at the URL level as being "beneath" their category)
- Proper nested breadcrumbs to reinforce that hierarchical structure (a markup sketch follows this list)
- Strong internal linking: a) within categories, this includes pagination markup (rel="next"/rel="prev"); b) outside a category, this includes links from highly relevant content elsewhere on the site pointing to the category page.
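Here's a hypothetical sketch of the breadcrumb point: turning a nested category URL into BreadcrumbList structured data so the hierarchy is stated explicitly. The URL, slugs, and naming below are placeholders, not your site's actual structure.

```python
# Build BreadcrumbList JSON-LD from a nested ("hierarchical") URL path,
# so each product page explicitly declares the category it sits beneath.
import json
from urllib.parse import urlparse

def breadcrumbs_from_url(url: str) -> dict:
    parsed = urlparse(url)
    segments = [s for s in parsed.path.strip("/").split("/") if s]
    base = f"{parsed.scheme}://{parsed.netloc}"
    items, running = [], base
    for position, segment in enumerate(segments, start=1):
        running = f"{running}/{segment}"
        items.append({
            "@type": "ListItem",
            "position": position,
            "name": segment.replace("-", " ").title(),  # crude label from the slug
            "item": running + "/",
        })
    return {
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": items,
    }

# Example: a product detail page nested beneath its category.
print(json.dumps(breadcrumbs_from_url(
    "https://example.com/outdoor-furniture/teak-dining-table/"), indent=2))
```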