Are recipes excluded from duplicate content?
-
Does anyone know how recipes are treated by search engines? I know press releases are expected to have lots of duplicates out there, so they aren't penalized. Are recipes treated the same way? For example, if you Google "three cheese beef pasta shells", the first two results have identical content.
-
Even though the recipes are the same, I doubt they'd be penalized for duplicate content. If the search engine decides a page is a duplicate, it may filter out one or the other, or knock one page down in the search results — if your first page of results were all the same page from different domains, the search engine would have failed you. The duplicate content penalties you're thinking of are aimed more at spammers and spam content, where they're duplicating a lot of content and doing it intentionally.
In your example, there is a lot on each page besides the ingredients and the recipe itself, which should also mitigate some of the duplication, and the two pages are laid out a bit differently in the HTML — one with divs and one with tables.
Sources:
- Google Webmaster Central Blog '08
- High Rankings Advisor (Jill Whalen) '10
Related Questions
-
Best way to fix duplicate content issues
Another question for the Moz Community. One of my clients has 4.5k duplicate content issues. For example: http://www.example.co.uk/blog and http://www.example.co.uk/index.php?route=blog/blog/listblog&year=2017. Most of the issues are coming from product pages. My initial thoughts are to set up 301 redirects in the first instance and if the issue persists, add canonical tags. Is this the best way of tackling this issue?
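For parameter-driven duplicates like the listblog URL above, a canonical tag on the duplicate version is the usual alternative (or complement) to a 301. A minimal sketch, using the example URLs from the question:

```html
<!-- In the <head> of the parameterized duplicate page
     (index.php?route=blog/blog/listblog&year=2017) -->
<link rel="canonical" href="http://www.example.co.uk/blog" />
```

A 301 redirect and a canonical pointing at the same target don't conflict, but the redirect is the stronger signal when the duplicate URL serves no purpose of its own.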
Technical SEO | Laura-EMC
-
What is the Impact of Duplicate Content on Multiple Managed Property Domains?
Hi Moz Community! Our team is having an internal (and external) debate regarding the extent and implications of duplicate content for a hospitality client, and I would love some feedback. I unfortunately cannot divulge the brand/URL, but will give as much info as possible. The brand in question manages dozens of properties in the US and worldwide and has recently rolled all of the domains up under a single brand.com domain. Whereas the properties used to have their own domains (property1.com, property2.com, etc.), they are now housed in sub-folders (brand.com/property1, brand.com/property2, and so forth). Our concern is that they launched the new brand site with all of the property sites/content rolled up under brand.com, yet all of the individual property sites and their pages are still live as well. The canonicals on both brand.com and property1.com (property2.com, property3.com, etc.) are self-referencing — so the canonicals for brand.com/property1 and all of its sub-pages do not point to the still-live property1.com and its sub-pages, for example. On the brand side, they believe this is the best path forward as brand.com grows and gains authority, with the intent of eventually redirecting the individual property domains, but we are unclear on that timeline (though we think it is more likely months than days/weeks). So our questions for the community are:
1. What is the perceived impact of this state of limbo on the individual property sites? (They house the original content and have the history, but could Google still give preference to the brand.com/property URLs, and/or could both suffer in rank/search experience from the duplicate content and non-uniform presentation?)
2. Could brand.com be "dinged", so to speak, for launching with this much duplicate content? (And if so, could that affect how quickly normalization occurs after the property sites are finally redirected?)
3. Anything else we should consider? Any other feedback from the community?
Thank you all for your time and support!
Technical SEO | imiJoe
-
Duplicate content question...
I have a high duplicate content issue on my website, but I'm not sure how to handle or fix it. I have two different URLs landing on the same page content: http://www.myfitstation.com/tag/vegan/ and http://www.myfitstation.com/tag/raw-food/. In this situation, I cannot redirect one URL to the other, since in the future I will probably be adding posts to either the "vegan" tag or the "raw food" tag. What is the solution in this case? Thank you
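One option — assuming the two tag archives really do serve an identical list of posts today — is to pick one as the preferred version and canonicalize the other to it. A sketch with the URLs from the question:

```html
<!-- In the <head> of http://www.myfitstation.com/tag/raw-food/ -->
<link rel="canonical" href="http://www.myfitstation.com/tag/vegan/" />
```

The caveat is that once the tags diverge (new posts under only one of them), the canonical would misrepresent the page and should be removed, so a meta noindex on the less important archive may be the safer long-term choice.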
Technical SEO | myfitstation
-
Advice on Duplicate Page Content
We have many pages on our website, and they all use the same template (we use a CMS); at the code level they are 90% the same. But the page content, title, meta description, and image are different for each of them. For example:
http://www.jumpstart.com/common/find-easter-eggs
http://www.jumpstart.com/common/recognize-the-rs
We have many such pages. Does Google look at them all as duplicate page content? If yes, how do we deal with this?
Technical SEO | jsmoz
-
Duplicate content problem?
Hello! I am not sure if this is a problem or if I am just making something too complicated. Here's the deal: I took on a client who has an existing site in something called Homestead. Files cannot be downloaded, making it tricky to get out of Homestead. The way it is set up, new sites are developed on subdomains of homestead.com, and then your chosen domain points to that subdomain. The designer who built it has kindly given me access to her account so that I can edit the site, but this is awkward, and I want to move the site to its own account. However, to do so Homestead requires that I create a new subdomain and copy the files from one to the other. They don't have any way to redirect the prior subdomain to the new one; they recommend I do something in the HTML, since that is all I can access. Am I unnecessarily worried about the duplicate content consequences? My understanding is that I will now have two subdomains with the exact same content. True, over time I will be editing the new one. But you get what I'm sayin'. Thanks!
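Since only the HTML is editable here, a cross-domain canonical from each page on the old subdomain to its counterpart on the new one is about the only duplicate-content signal available. A sketch, with hypothetical subdomain and page names standing in for the real ones:

```html
<!-- In the <head> of each page on the old subdomain
     (e.g. oldsite.homestead.com/about.html — hypothetical names) -->
<link rel="canonical" href="http://newsite.homestead.com/about.html" />
```

This tells search engines which copy to index while both remain live, though it has to be added page by page.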
Technical SEO | devbook9
-
A problem with duplicate content
I'm kind of new at this. My crawl analysis says that I have a problem with duplicate content. I set the site up so that web sections live in a folder with an index page as the landing page for that section. The URL looks like: www.myweb.com/section/index.php. The crawl analysis says that both that URL and its root, www.myweb.com/section/, have been indexed, so I appear to have a situation where the page has been indexed twice and is a duplicate of itself. What can I do to remedy this? And what steps should I take to get the pages re-indexed so that this type of duplication is avoided? I hope this makes sense! Any help gratefully received. Iain
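One common fix for this particular pattern (the folder URL and its index file resolving to the same page) is a canonical on the page pointing at whichever form you prefer. A sketch using the example URLs from the question:

```html
<!-- In the <head> of the page served at both
     www.myweb.com/section/ and www.myweb.com/section/index.php -->
<link rel="canonical" href="http://www.myweb.com/section/" />
```

Since both URLs serve the same file, the one tag covers both: the index.php variant then declares the trailing-slash version as the copy to index.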
Technical SEO | iain
-
Duplicate Content - That Old Chestnut!!!
Hi Guys, hope all is well. I have a question if I may? We have written several articles, and I want to find the best way to post them, but I have a number of concerns. I am hoping to use the content to increase the kudos of our site by providing quality content and hopefully receiving decent backlinks.
1. In terms of duplicate content, should I only post each article in one place, or should I post it in several places? Also, where would you say the top 5 or 10 places would be? These are articles on XML, social media, and backlinks.
2. Can I post an article on another blog or article directory as well as on my website's blog, or is this a bad idea?
A million thanks for any guidance. Kind Regards, C
Technical SEO | fenwaymedia
-
Complex duplicate content question
We run a network of three local web sites covering three places in close proximity. Each site has a lot of unique content (mainly news), but there is a business directory that is shared across all three sites. My plan is for the search engines to only index each business's listing on the site for the place where that business is actually located, i.e. listing pages for businesses in Alderley Edge are only indexed on alderleyedge.com, and businesses in Prestbury only get indexed on prestbury.com — but every business has a listing page on all three sites. What would be the most effective way to do this? I have been using rel="canonical", but Google does not always seem to honour it. Would using meta noindex tags where appropriate be the way to go? Or would changing the URL structure to include the place name and using robots.txt be a better option? As an aside, my current URL structure is along the lines of: http://dev.alderleyedge.com/directory/listing/138/the-grill-on-the-edge. Would changing this have any SEO benefit? Thanks Martin
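If rel="canonical" is being ignored, a meta robots noindex on the out-of-area listings is a more forceful instruction. A sketch of what each out-of-town listing page could carry:

```html
<!-- In the <head> of, say, a Prestbury business listing
     when served on alderleyedge.com -->
<meta name="robots" content="noindex, follow" />
```

The "follow" keeps crawlers passing through the page's links; note that blocking these URLs in robots.txt instead would prevent crawlers from ever seeing the noindex tag at all.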
Technical SEO | mreeves