20-30% of our ecommerce categories contain no extra content. Could this be a problem?
-
Hello,
About 20-30% of our ecommerce categories have no content beyond the products that are in them. Could this be a problem with Panda?
Thanks!
-
It's not an exact science with regard to any one signal. However, yes: the more you can strengthen the topical focus of your category pages, the less likely Panda is to find them weak.
-
No worries, Bob. Ignore my original suggestion, then.
Alan has some good suggestions for you to follow.
-Andy
-
Thanks Alan, this is perfect.
So if we had at least a couple of good paragraphs on every category page, and a few additional, highly relevant internal links pointing to each of those category pages, then we would be in good shape as far as Panda and category strength are concerned. Correct?
-
Hi Andy,
Sorry for the confusion. This is an ecommerce site; I've edited the original question to make that clear.
-
I'm assuming this is a WordPress site (more info would be useful). A common issue is category pages causing problems because they show the same excerpts over and over; noindexing them gets around this.
If I have misread the type of issue this is, then of course this doesn't apply. With this being posted in blogging and content, that was my assumption.
A URL to look at would, I'm sure, confirm more of the problem.
-Andy
-
Andy,
Why would you noindex/follow category pages? That's like saying, "Hey, we have X products in this category, so it's a high-value, important page we deserve to rank for. Except we aren't willing to boost the trust signals on the category page itself, so don't bother."
That in turn negatively impacts the site's ability to gain maximum ranking signals for any products in those categories (at least in highly competitive fields).
So I'm curious why you'd take that path.
-
It could be, Bob. I always advise setting category pages to noindex/follow to avoid issues.
If you are using WordPress and Yoast, this is just a setting.
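Under the hood (a minimal sketch, assuming a standard Yoast setup), that setting simply outputs a robots meta tag in the head of each category page, along these lines:

```html
<!-- Roughly what the noindex/follow setting emits in the <head> of a category page;
     "follow" still lets crawlers pass link signals through the page's links -->
<meta name="robots" content="noindex, follow" />
```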
-Andy
-
If a category page has almost no content (other than photos and product names), that's a potential "thin content" issue, though given the way your question is worded, I'm not certain my interpretation matches what you meant by "no content beyond".
If product names don't reference the category name, and there's no descriptive content on the category page, that's likely even more of a problem: thin content plus a lack of topical reinforcement of the category itself.
A general rule (barring other issues or considerations) is to have at least a couple of paragraphs of unique, descriptive text that reinforce the topical focus of each category page. There are numerous ways to split that content out across a category page, and in highly competitive categories, more content may be needed if the category doesn't contain enough products.
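For example (a purely illustrative page structure, not taken from your site), the copy can be split above and below the product grid:

```html
<!-- Hypothetical category page: descriptive copy split around the product grid -->
<h1>Men's Leather Boots</h1>
<p>Opening paragraph describing the category and reinforcing its topic...</p>
<div class="product-grid">
  <!-- product tiles go here -->
</div>
<p>Further descriptive text: buying advice, materials, related terms, etc.</p>
```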
Other factors that can help mitigate this to a certain degree include (but aren't necessarily limited to):
- hierarchical URL structure (nested URLs, so product detail pages are seen at the URL level as sitting "beneath" their category)
- Proper nested breadcrumbs to reinforce that hierarchical structure
- Strong internal linking: a) within categories, this includes pagination markup (rel-next/rel-prev); b) outside a category, this includes links from highly relevant content elsewhere on the site pointing to the category page (a markup sketch of the pagination and breadcrumb pieces follows below)
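To make the pagination and breadcrumb points concrete, here is a minimal markup sketch. The /footwear/mens-boots/ URLs are hypothetical, chosen to also illustrate the nested URL structure from the first bullet:

```html
<!-- Page 2 of a hypothetical paginated category at /footwear/mens-boots/ -->
<head>
  <!-- rel-prev/rel-next tie the paginated pages together as one series -->
  <link rel="prev" href="https://example.com/footwear/mens-boots/" />
  <link rel="next" href="https://example.com/footwear/mens-boots/?page=3" />
</head>
<body>
  <!-- Nested breadcrumb mirroring the URL hierarchy -->
  <nav aria-label="breadcrumb">
    <a href="https://example.com/">Home</a> &gt;
    <a href="https://example.com/footwear/">Footwear</a> &gt;
    <span>Men's Boots</span>
  </nav>
</body>
```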