WordPress Category Archives - Indexing - but will this cause duplication?
-
Okay, something I am struggling with.
I'm using Yoast on a recipe blog.
The category archives are being optimized and indexed, because I am adding custom content to them and then listing the recipes below.
My question: if I am indexing the category archives and adding custom content at the top, with the recipe excerpts from the category listed underneath, will those recipe excerpts be picked up as duplicate content?
-
This should be totally fine. It's pretty common blog/WordPress practice to have excerpts on a category page and then the full article/recipe on its own individual page.
Also, just to dispel a myth: there is no duplicate content "penalty", so nothing to fear from that standpoint anyway. Just try to make each page serve a distinct purpose, which yours do. The category page lets users browse all the recipes in that category and choose which to view; the recipe page lets users view the whole recipe and, hopefully, use it to cook something tasty.
-
Often in a cookbook there are different sections, like breads, cakes, and cookies. Each might start with a page or two explaining general rules about choosing and preparing that type of food, followed by a listing of the recipes specific to that section, with brief descriptions so the reader can get an idea of what each recipe is for.
If that is what you are doing with your website, I wouldn't worry about duplicate content. If there is a good amount of original content at the top, and then short excerpts explaining what the links are about, you should be fine. As Andy said, just be sure the pages themselves are good pages and the amount of text you duplicate in your recipe descriptions is fairly short. (You could even write custom descriptions for the links, rather than using excerpts: something to tempt readers to read more.)
-
Hi Kelly,
"will these recipe excerpts be picked up as duplicate content?"
Yes, it is likely that crawlers will see it as duplicate content, but that doesn't necessarily equate to an issue for you. How are you finding that they are being indexed? Are they appearing well in the SERPs? Is the additional content on the pages just there to satisfy Google, or is it genuinely useful?
You could also reduce the excerpt size so the levels of duplication aren't as high.
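If you want a rough measure of how much of each recipe an excerpt duplicates, you can compare word shingles between the excerpt and the full post. A minimal sketch in Python; the sample strings are placeholders, not your real content:

```python
# Rough duplicate-content check: what fraction of the excerpt's
# 5-word shingles also appear in the full post?

def shingles(text, n=5):
    """Return the set of n-word shingles in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(excerpt, full_post, n=5):
    """Fraction of the excerpt's shingles found in the full post."""
    e, f = shingles(excerpt, n), shingles(full_post, n)
    return len(e & f) / len(e) if e else 0.0

excerpt = "A quick weeknight chili that comes together in thirty minutes..."
full_post = ("A quick weeknight chili that comes together in thirty "
             "minutes. Start by browning the beef over medium heat...")

print(f"Excerpt overlap: {overlap(excerpt, full_post):.0%}")
```

The closer that figure is to 100%, the more of the excerpt is lifted verbatim; shortening excerpts or writing custom descriptions brings it down.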
You also need to look at the pages and decide how and why you are optimising them. Is it just to gain more keywords and then funnel people to other articles? If so, you may fall foul of the doorway pages penalty. I posted this in another question a short while ago:
Here are questions to ask of pages that could be seen as doorway pages:
- Is the purpose to optimize for search engines and funnel visitors into the actual usable or relevant portion of your site, or are they an integral part of your site’s user experience?
- Are the pages intended to rank on generic terms yet the content presented on the page is very specific?
- Do the pages duplicate useful aggregations of items (locations, products, etc.) that already exist on the site for the purpose of capturing more search traffic?
- Are these pages made solely for drawing affiliate traffic and sending users along without creating unique value in content or functionality?
- Do these pages exist as an “island”? Are they difficult or impossible to navigate to from other parts of your site? Are links to such pages from other pages within the site or network of sites created just for search engines?
If you answer yes to any of those, then it might not just be duplicate content that is your issue.
-Andy
Related Questions
-
My indexed site URL was removed from Google search without any message or Manual Actions?
White Hat / Black Hat SEO | newwaves
On August 2 or 3 (I'm not sure about the exact date), the main URL of my website, https://new-waves.net/, was completely removed from Google search results, without any message or Manual Action in Search Console. However, I can still find some of my site's subpages in search results and on Google local maps results. When I check on Google:
info:new-waves.net >> no results
site:new-waves.net >> the main URL only appears now because I have submitted it again and again to Google, but it may be dropped again today or tomorrow, as has happened over the last few days
The site URL has been removed from the results for 100% of all ranked keywords, though I can still see it on maps in some results. I have never received any penalty for my site in Google Search Console. I noticed drops on some keywords before this happened (in June and July), but those were all web design keywords local to Qatar; all the other keywords, related to SEO and digital marketing, did not change and stayed on top. My site ranked number 1 in Google search results for "digital marketing qatar" and some other keywords, but the main URL has been removed from 100% of all search results (you can still see it on the map only). I have tried submitting it to Google again and indexing it through the Search Console tool, but still get no results. Recently, based on Search Console, I found some new links, and I have no idea how they were added as links to my website:
essay-writing-hub.com - 9,710
tiverton-market.co.uk - 252
facianohaircare.com - 48
prothemes.biz - 44
worldone.pw - 2
slashdot.org - 1
onwebmarketing.com - 1
The problem is that all my high-PR real links have been deleted from Search Console as well, although those links still point to my site and are recognized by Moz and other tools! Can anyone help me work out the reason, and how I can solve this issue without losing my previously ranked keywords? Can I submit a direct message to Google support or customer service to find out the reason or get help with this issue? Thanks & Regards
-
Seeing URLs indexed that we don't want: how do we approach this?
White Hat / Black Hat SEO | edward-may
Hey guys, I have seen a few pages from my site appearing in the SERPs; some of these page URLs are actually ajax calls that refresh the buttons on our site... If these are important to our site but don't need to show up in the SERP results, can anyone recommend anything? Should I remove the URLs? Exclude them from the sitemap? Or noindex them? Any advice would be much appreciated, thanks
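For endpoints like this, noindex is usually the right tool rather than robots.txt alone: if robots.txt blocks a URL, Google can never crawl it to see the noindex. If you do add robots.txt rules, you can sanity-check them with Python's standard library; a minimal sketch, where the domain and ajax paths are placeholders:

```python
from urllib import robotparser

# Verify whether hypothetical ajax endpoints are blocked by robots.txt.
rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()  # fetches and parses the live robots.txt

for path in ["/wp-admin/admin-ajax.php", "/ajax/refresh-buttons"]:
    url = f"https://www.example.com{path}"
    verdict = "crawlable" if rp.can_fetch("Googlebot", url) else "blocked"
    print(f"{url} -> {verdict}")
```
-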
Site De-Indexed except for Homepage
White Hat / Black Hat SEO | emerald
Hi Mozzers,
Our site has suddenly been de-indexed from Google and we don't know why. All pages are de-indexed in Google Webmaster Tools (except for the homepage and sitemap), starting after 7 September:
7 Sept 2014 - 76 pages indexed in Google Webmaster Tools
28 Sept until current - 3-4 pages indexed in Google Webmaster Tools, including the homepage and sitemaps
Site is: (removed)
As a result, all rankings for child pages have also disappeared in the Moz Pro Rankings Tracker. Only the homepage is still indexed and ranking. It seems like a technical issue is blocking the site. I checked robots.txt, noindex, nofollow, canonicals, and ran a site crawl for 404 errors, but can't find anything. The site is online and accessible. No warnings or errors appear in Google Webmaster Tools.
One recent change: we moved from a shared to a dedicated server around 7 September (same host and location). Prior to the move our preferred domain was www.domain.com WITH the www. However, during the move the host set our domain as domain.tld WITHOUT the www. Running site:domain.tld vs site:www.domain.tld commands now finds pages indexed under the non-www version, but no longer under the www version. Could this be a cause of the de-indexing? Yesterday we had our host reset the domain to use www again and we resubmitted our sitemap, but there is no change yet to the indexing. What else could be wrong? Any suggestions appreciated. Thanks.
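The www/non-www switch is worth testing directly: fetch both hostnames and see where each one ends up. A minimal stdlib-only sketch in Python (domain.tld is a placeholder for the real domain):

```python
from urllib.request import urlopen

# Compare where the www and non-www hostnames resolve after redirects.
# Replace domain.tld with the real domain; these are placeholders.
for host in ["http://domain.tld/", "http://www.domain.tld/"]:
    with urlopen(host) as resp:  # urlopen follows redirects automatically
        print(f"{host} -> {resp.geturl()} (status {resp.status})")
```

If both hostnames answer 200 without redirecting to a single preferred version, add a site-wide 301 (and matching canonical tags) so Google consolidates the index on one.
-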
Server down - What will happen to the SERPs?
White Hat / Black Hat SEO | Sognando
Hi everybody, we have a lot of websites (about 100) on one server in Italy. This server crashed 5 days ago and should now go back online (I hope!). What will happen to the SERPs? What should I do to recover the rankings of every keyword? New links, new content, just wait... what? Thanks 😉
-
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
White Hat / Black Hat SEO | CSawatzky
Hi All, I'll premise this by saying that we like to engage in as much white-hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :).
So, we are an IT and management training course provider. We have 34 locations across the US, and each of our 34 locations offers the same courses. Each of our locations has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings and dates for each individual topic and city. Right now, our pages are dynamic and are being crawled and ranking well within Google. We conducted a very small-scale test of this in our Washington, DC and New York areas with our SharePoint course offerings and it was a great success. We are ranking well on "sharepoint training in new york/dc" etc. for two custom pages.
So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain, a lot more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc., but with some varying components. This is from our engineer specifically:
"Regarding pages with the specific topic areas, do you have a specific format for the Meta Description and the Custom Paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the Meta and Paragraph. For example, if we made the Paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area.' As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain, depending on what you want customized. Another option is to have several standardized paragraphs, such as:
'Our [Topic Area] training is easy to find in the [City, State] area.' Followed by other content specific to the location
'Find your [Topic Area] training course in [City, State] with ease.' Followed by other content specific to the location
Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages."
So, mozzers, my question to you all is: can we standardize, with slight variations specific to that location and topic area, without getting dinged for spam or duplicate content? Often I ask myself, "if Matt Cutts was standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram
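One way to get the stability the engineer describes, without pages shuffling their copy on every load, is to pick the standardized paragraph deterministically from the venue code. A minimal sketch in Python; the template copy, venue code, and city/topic values are made-up placeholders:

```python
import hashlib

# Hypothetical standardized paragraph templates (placeholder copy).
TEMPLATES = [
    "Our {topic} training is easy to find in the {city}, {state} area.",
    "Find your {topic} training course in {city}, {state} with ease.",
    "Looking for {topic} training near {city}, {state}? Start here.",
]

def intro_paragraph(venue_code, topic, city, state):
    """Pick a template deterministically from venue code + topic so a
    given page always renders the same paragraph across requests."""
    digest = hashlib.md5(f"{venue_code}:{topic}".encode()).hexdigest()
    template = TEMPLATES[int(digest, 16) % len(TEMPLATES)]
    return template.format(topic=topic, city=city, state=state)

# Example with made-up values:
print(intro_paragraph("DC01", "SharePoint", "Washington", "DC"))
```

Hashing rather than calling random.choice() means each city/topic page keeps a stable paragraph, which is friendlier to crawlers than copy that changes per request; the genuinely local details (directions, dates) remain what keeps the pages from being near-duplicates.
-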
Duplicate user reviews from a hotel database?
White Hat / Black Hat SEO | shehzad
Hello, I just got a new client who has a hotel comparison site. The problem is that the reviews and the hotel data are all pulled in from a database which is shared and used by other website owners. This obviously raises the issue of duplicate content and Panda. I read this post by Dr. Pete: http://www.seomoz.org/blog/fat-pandas-and-thin-content but am unsure what steps to take. Any feedback would be much appreciated. It's about 200,000 pages. Thanks, Shehzad
-
Ways to find private, non-indexed forums in a niche
White Hat / Black Hat SEO | ilyaelbert
I was wondering if there are ways to find non-indexed content in private forums/discussion boards. Is there a scalable 'footprint' that suggests a forum has a private section?
-
How to transfer posts from a specific category into a subdomain
White Hat / Black Hat SEO | Trigun
Hi guys, is there a way in WordPress to transfer or redirect a category and all the posts under it into a subdomain? Thanks in advance.
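For the redirect side, one common approach, assuming the subdomain already hosts the moved content and the main site runs on Apache, is a pair of mod_rewrite rules in the main site's .htaccess. A sketch only; "recipes" and example.com are placeholder names:

```apache
# .htaccess on the main site (Apache with mod_rewrite assumed).
RewriteEngine On

# Redirect the category archive to the new subdomain.
RewriteRule ^category/recipes/?$ https://recipes.example.com/ [R=301,L]

# If permalinks include the category (/%category%/%postname%/),
# this carries the individual posts across too.
RewriteRule ^recipes/(.+)$ https://recipes.example.com/$1 [R=301,L]
```

The posts themselves can be moved with the built-in WordPress export/import tools (the exporter lets you filter posts by category); the 301s just preserve the old URLs' equity.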