WordPress Category Archives - Indexed - but will this cause duplication?
-
Okay, something I am struggling with.
I'm using Yoast on a recipe blog.
The category archives are being optimized and indexed, because I am adding custom content to them and then listing the recipes below.
My question is: if I index the category archives, use them to add custom content above, and then allow the recipe excerpts from the category to be listed underneath, will those recipe excerpts be picked up as duplicate content?
-
This should be totally fine. It's common blog/WordPress practice to have excerpts on a category page and the full article/recipe on its own individual page.
Also, just to dispel a myth: there is no duplicate content "penalty", so nothing to fear from that standpoint anyway. Just try to make each page serve a distinct purpose, which yours do. The category page lets users browse all the recipes in that category and choose which to view; the recipe page lets users view the whole recipe and, hopefully, cook something tasty.
-
Often in a cookbook, there are different sections like breads, cakes, and cookies. Each might start with a page or two explaining general rules about choosing and preparing that type of food, followed by a listing of recipes specific to that section, with brief descriptions, so the reader can get an idea of what each recipe is for.
If that is what you are doing with your website, I wouldn't worry about duplicate content. If there is a good amount of original content at the top, followed by short excerpts explaining what the links are about, you should be fine. As Andy said, just be sure the pages themselves are good pages and that the amount of text you duplicate in your recipe descriptions is fairly short. [You could even write custom descriptions for the links rather than using excerpts - something to tempt readers to read more.]
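Keeping the duplicated text short is exactly what WordPress excerpts do: the full post is trimmed to a fixed word count (WordPress's `wp_trim_words()` helper works along these lines). As a rough illustration of the idea, here is a minimal Python sketch; the 20-word limit is an arbitrary example, not a recommendation:

```python
def trim_words(text: str, num_words: int = 20, more: str = "…") -> str:
    """Return roughly the first `num_words` words of `text`,
    appending `more` if anything was cut off. This mirrors the idea
    behind WordPress-style auto-excerpts: only a small, fixed slice
    of the full article is repeated on the category page."""
    words = text.split()
    if len(words) <= num_words:
        return text
    return " ".join(words[:num_words]) + more

# A 100-word "recipe" yields a 20-word teaser for the category page.
full_recipe = " ".join(f"word{i}" for i in range(100))
excerpt = trim_words(full_recipe, num_words=20)
```

A custom hand-written description, as suggested above, goes one step further: it duplicates nothing at all.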
-
Hi Kelly,
"will these recipe excerpts be picked up as duplicate content?"
Yes, it is likely that crawlers will see it as duplicate content, but that doesn't necessarily equate to an issue for you. How are you finding that they are being indexed? Are they appearing well in the SERPs? Is the additional content on the pages just there to satisfy Google, or is it genuinely useful?
You could also reduce the excerpt size so the levels of duplication aren't as high.
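One rough way to gauge those levels of duplication is to compare an excerpt against the full article and compute what share of the page it repeats; the shorter the excerpt, the smaller that share. This is a hypothetical illustration (crawlers use their own, more sophisticated methods), sketched here with Python's standard-library difflib:

```python
from difflib import SequenceMatcher

def duplication_ratio(excerpt: str, full_text: str) -> float:
    """Fraction of full_text that the excerpt duplicates:
    total length of matching material divided by the full text length."""
    matcher = SequenceMatcher(None, excerpt, full_text, autojunk=False)
    matched = sum(block.size for block in matcher.get_matching_blocks())
    return matched / max(len(full_text), 1)

recipe = "Preheat the oven to 180C. Mix flour, sugar and butter. " * 10
short_excerpt = recipe[:50]   # a short teaser
long_excerpt = recipe[:300]   # a long excerpt

short_ratio = duplication_ratio(short_excerpt, recipe)
long_ratio = duplication_ratio(long_excerpt, recipe)
# The shorter excerpt repeats a smaller share of the recipe page.
```

The same comparison run across a category page and its linked recipes would show how much trimming the excerpt length actually reduces the overlap.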
You also need to look at the pages and decide how and why you are optimising them. Is it just to gain more keywords and then funnel people to other articles? If so, you may fall foul of the doorway page penalty. I posted this in another question a short while ago:
Here are questions to ask of pages that could be seen as doorway pages:
- Is the purpose to optimize for search engines and funnel visitors into the actual usable or relevant portion of your site, or are they an integral part of your site’s user experience?
- Are the pages intended to rank on generic terms yet the content presented on the page is very specific?
- Do the pages duplicate useful aggregations of items (locations, products, etc.) that already exist on the site for the purpose of capturing more search traffic?
- Are these pages made solely for drawing affiliate traffic and sending users along without creating unique value in content or functionality?
- Do these pages exist as an "island"? Are they difficult or impossible to navigate to from other parts of your site? Are links to such pages from other pages within the site or network of sites created just for search engines?
If you answer yes to any of those, then it might not just be duplicate content that is your issue.
-Andy