You're given 10,000 recipes and told to build a site: what would you do?
-
Say you were given a list of 10,000 recipes and asked to build an SEO-friendly site. Would you build a recipe search engine and index the search results (making sure, of course, that the IA and user-engagement metrics are great)?
Or, would you try to build static pages?
-
I also have a site in the kitchen niche: https://besttoasterovenguides.com/. Can someone check it? It isn't showing any links correctly.
-
I would use a tool like Copyscape (or any other plagiarism checker) to see whether the recipes (the exact text) are already available online, so you don't run into any duplicate-content issues.
With 10,000 recipes, that will certainly take some time.
If it's not copied material, then by all means go for a website.
Building static pages will take much more time than using a CMS, I'd guess.
-
There are some great responses on the SEO aspect of your project already from Keri and Matt. As far as building the site, I would build it on WordPress and use a custom post type for "Recipes" and custom taxonomies for "Ingredients", "Type", etc. Then you can use the default WP search function and taxonomy archives so users can easily find the right recipe.
-
I'd ask if the recipes were already on the web and see if I'm going to be fighting a huge duplicate content problem against an established site.
-
I don't know that they're mutually exclusive. I think you need to create a page for each of the recipes. Then I think you need a great search engine for it (search by ingredient, by course, by main protein, by what's in the pantry, etc.). You'll definitely want to get ideas from successful sites like AllRecipes.com, Taste.com.au, FoodNetwork.com and such.
Also, from an SEO/on-site perspective, you need to figure out how to get integrated hRecipe (schema/rich snippet) data into Google. Search "banana bread", click "More", then "Recipes"; at the top you'll see Search Tools > Ingredients. You need the ingredients for every one of those 10k recipes to show up in this part of Google. This is how food bloggers search, and they're going to be a HUGE part of your audience.
Make sure each recipe can be rated, and if I were doing this from scratch right now, I'd make sure everyone who submits UGC in the future has a place to put their rel=author on each recipe. If you can integrate ingredients, rel=author, and ratings, you'll be on the way to great food SEO.
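For concreteness, here's a rough sketch of what hRecipe markup with an hReview-aggregate rating block looks like. The class names follow the microformats spec as I recall it, and the recipe content is made up; treat the structure as illustrative rather than authoritative:

```html
<!-- hRecipe container: "hrecipe" on the wrapper, property class names inside. -->
<div class="hrecipe">
  <h1 class="fn">Banana Bread</h1>
  <p class="summary">A moist, easy banana bread.</p>
  <ul>
    <li class="ingredient">3 ripe bananas</li>
    <li class="ingredient">2 cups flour</li>
    <li class="ingredient">1 cup sugar</li>
  </ul>
  <div class="instructions">Mash the bananas, mix, and bake at 350&deg;F for 60 minutes.</div>
  <!-- Aggregate rating (hReview-aggregate), so stars can show in SERPs. -->
  <div class="hreview-aggregate">
    <span class="rating">4.5</span> stars from
    <span class="count">120</span> reviews
  </div>
</div>
```

The `ingredient` class on each list item is what feeds the ingredient filtering mentioned above, and the rating block is what enables star snippets.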
-
I would look at similar sites. Not sure where you are located, but here in Australia we have a great site called Taste (Taste.com.au).
There's got to be tonnes of great recipe sites like that - I'd just copy elements of what they are doing. Or at least use it as a starting point to do some significant research.
Related Questions
-
Can a duplicate page referencing the original page on another domain in another country using the 'canonical link' still get indexed locally?
Hi, I wonder if anyone could help me with a canonical link query/indexing issue. I have given an overview, intended solution, and question below. Any advice on this query will be much appreciated.
Overview: I have a client who has a .com domain that includes blog content intended for the US market, using the correct lang tags. The client also has a .co.uk site without a blog, but is looking at creating one. As the target keywords and content are relevant across both UK and US markets, and so as not to duplicate work, the client has asked whether it would be worthwhile centralising the blog, or for any other efficient blog-structure recommendations.
Suggested solution: As the domain authority (DA) on the .com/.co.uk sites is in the 60+ range, it would be risky moving domains/subdomains at this stage, and it would be a waste not to utilise the DA that has built up on both sites. I have suggested they keep both sites and share the same content between them using a content-curation WP plugin, with the canonical link referencing the original source (US or UK), so as not to get duplicate-content issues.
My question: Let's say I'm a potential customer in the UK, and I'm searching using a keyword phrase where the content that answers my query is on both the UK and US sites, although the US content is the original source. Will the US or UK version of the blog appear in UK SERPs? My gut says the UK blog will, as Google will try to serve me the most appropriate version of the content, and as I'm in the UK it will be this version, even though I have identified the US source using the canonical link.
Intermediate & Advanced SEO | JonRayner
-
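For reference, the two pieces of markup this question weighs look roughly like this (the URLs are made up for illustration). A cross-domain canonical asks Google to consolidate signals onto the one source page, whereas hreflang annotations tell it both pages are locale variants and each market should be served its own:

```html
<!-- Option discussed in the question: the UK copy points at the US original.
     Google will then generally index and rank only the US page. -->
<link rel="canonical" href="https://www.example.com/blog/some-post/" />

<!-- Alternative: keep both pages indexable and annotate them as locale variants,
     placed (reciprocally) on both the .com and .co.uk versions. -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/blog/some-post/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/blog/some-post/" />
```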
Should I better noindex 'scripted' files in our portfolio?
Hello Moz community, As a means of a portfolio, we upload these PowerPoint exports, which are converted into HTML5 to maintain interactivity and animations. Works pretty nicely! We link to these exported files from our product pages. (We are a presentation design company, so they're pretty relevant.) For example: https://www.bentopresentaties.nl/wp-content/portfolio/ecar/index.html
However, they keep coming up in the crawl warnings, as the exported HTML file doesn't contain text (just code), so we get errors for:
- thin content
- no H1
- missing meta description
- missing canonical tag
I could manually add the last two, but the first warnings are just unsolvable. Therefore I figured we should probably noindex all these files. They don't appear to contain any searchable content, and even then, the content of our clients' work is not relevant for our search terms etc. They're mere examples, just in the form of HTML files.
Am I missing something, or should I noindex these/such files? (And if so: is there a way to set a whole directory to noindex automatically, so I don't have to manually 'fix' all the HTML exports with a noindex tag in the future? I read that using disallow in robots.txt wouldn't work, as we will still link to these files as portfolio examples.)
Intermediate & Advanced SEO | BentoPres
-
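On the whole-directory question: one common approach (assuming an Apache host, which the WordPress-style export path suggests, with mod_headers enabled) is to drop an .htaccess file into the exports directory and send a noindex header on every response from it, so the individual HTML files never need editing. A minimal sketch:

```apache
# .htaccess placed in /wp-content/portfolio/
# Sends "X-Robots-Tag: noindex" with every file served from this directory,
# which keeps the files crawlable (portfolio links still work) but out of the index.
<IfModule mod_headers.c>
  Header set X-Robots-Tag "noindex"
</IfModule>
```

Unlike a robots.txt disallow, this lets Google crawl the pages and see the noindex instruction, which is what you want when the files are still linked.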
Baffled by this site's inability to rank
Hi guys, I've been working on a site for quite a while and it has a really good link profile, excellent content, and no errors or penalties (as far as I can tell), but for some reason it consistently ranks below a lot of thin, poor-quality websites with spammy EMDs and a few obviously paid links from old-skool business directories, etc. It has a significantly higher DA and more linking root domains than almost all of them. Also, it just bounces around from #40 to #28 to #35 to #40 to #28 on a weekly basis for many of our primary keywords. There just seems to be no logic to this, and it goes against everything I know and everything we're taught. (I should probably point out that I've been doing this quite a while and have a number of other sites ranking extremely well in quite a few different verticals.) Has anyone ever experienced anything like this, and what did you do? Before I throw in the towel it would be good to hear from others and try to understand why this happens, and whether there is anything else I can try to help my client and fix it. Many thanks in advance.
Intermediate & Advanced SEO | Blaze-Communication
-
Drip Feeding Free Top 10 Blog Sites for Link Building?
Is it a good move to pick 10 free blogging sites to build links, drip-feeding them? Let's say 10 blogging sites, irrespective of whether it's a sub-domain (as you get on WordPress) or a sub-folder blog (as you get on LiveJournal). I'd add articles related to my money website on those newly created blogs and build links from them, then drip-feed them by putting up 1 article a month at regular intervals, with anchors as links in each of them. Do you think it's a good move?
Intermediate & Advanced SEO | welcomecure
-
Shouldn't a Lower Bounce Rate Correlate with a Greater Click-Thru Rate for a Web Site?
Greetings: I run a real estate web site in New York City with about 650 pages, of which 330 are property listing pages. About 250 of those listing pages contain less than 150 words of content. In late August I set about 250 of the listing pages that generated the least traffic (generally corresponding to those with the least content) to "noindex, follow". Now Google has removed those pages from its index. The overall bounce rate for the site has been reduced from about 69% to about 64% since the removal of these low-quality listing pages. However, the click-thru rate has not improved and is stuck at about 2.2 pages per visitor. Shouldn't the click-thru rate improve if the bounce rate goes down? Am I missing something? Also, is a lower bounce rate something that Google will take into account when calculating rank? Thanks, Alan
Intermediate & Advanced SEO | Kingalan1
-
How to remove my site's pages from search results?
I have tested hundreds of pages to see if Google will properly crawl, index, and cache them. Now I want these pages to be removed from Google search, except for the homepage. What should the rule in robots.txt be? I use this rule, but I am not sure if Google will remove the hundreds of pages (from my testing):
User-agent: *
Disallow: /
Allow: /$
Intermediate & Advanced SEO | esiow2013
-
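For what it's worth, those rules do express "block everything except the exact homepage" under Google's documented matching behaviour: the longest matching pattern wins, Allow wins ties, and `$` anchors the end of the URL path. A toy matcher, just to illustrate that rule logic (this is a sketch of the precedence rules, not Google's actual implementation, and it only handles `*` and `$`):

```python
import re

def to_regex(pattern):
    """Translate a robots.txt path pattern to a regex: '*' is a wildcard, a trailing '$' anchors the end."""
    anchored = pattern.endswith('$')
    if anchored:
        pattern = pattern[:-1]
    body = '.*'.join(re.escape(part) for part in pattern.split('*'))
    return '^' + body + ('$' if anchored else '')

def is_allowed(path, rules):
    """rules: list of ('allow'|'disallow', pattern). Longest matching pattern wins; Allow wins ties."""
    matches = [(len(p), kind == 'allow') for kind, p in rules
               if p and re.match(to_regex(p), path)]
    if not matches:
        return True  # no rule matches: crawling is allowed by default
    return max(matches)[1]  # max() picks the longest pattern; True ('allow') beats False on ties

rules = [('disallow', '/'), ('allow', '/$')]
print(is_allowed('/', rules))          # True  -> homepage stays crawlable
print(is_allowed('/some-page', rules)) # False -> everything else blocked
```

Note that blocking crawling is not the same as removal from the index; pages already indexed may linger as URL-only results, so a noindex approach (crawlable pages with a noindex directive) is usually the more reliable way to get them dropped.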
Network Of Sites...
Hi guys, just wondering if anyone can help me out... We have recently been hit by the Google Penguin update, and I'm currently working through all the bad/spammy backlinks that previous SEO companies have built for us. I have come across one particular domain, www.justgoodcars.com; they seem to have a lot of different domain names:
http://www.justpulsarcars.com/nissan-pulsar-warranties/1/United_Kingdom/all.html
http://www.justpumacars.com/ford-puma-warranties/1/United_Kingdom/all.html
http://www.justpuntocars.com/dutch-site/fiat-punto-warranties/1/United_Kingdom/all.html?selectcountry1=United_Kingdom
http://www.justpuntocars.com/fiat-punto-warranties/1/United_Kingdom/all.html?selectcountry1=United_Kingdom
Now, all of these domain names have exactly the same IP address. The above is just a few; I would say there are hundreds of them. Do you think this could have an effect on us? Thanks, Scott
Intermediate & Advanced SEO | ScottBaxterWW
-
Getting 260,000 pages re-indexed?
Hey there guys, I was recently hired to do SEO for a big forum: to move the site to a new domain and to get it back up to its old ranks after the move. This all went quite well, except for the fact that we lost about a third of our traffic. Although I expected some traffic drop, this is quite a lot, and I'm wondering what caused it. The big keywords are still pulling the same traffic, but I suspect that a lot of the small threads on the forum have been de-indexed. Now, with a site with 260,000 threads, do I just take my loss and focus on new keywords? Or is there something I can do to get all these threads re-indexed? Thanks!
Intermediate & Advanced SEO | StefanJDorresteijn
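One lever worth trying before writing off the long tail is submitting XML sitemaps for the thread URLs in Search Console, split into files of at most 50,000 URLs each, so the indexed-vs-submitted counts show per batch how many of the 260,000 threads Google actually has. A minimal sketch, with made-up URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap-threads-1.xml: one batch of thread URLs (the protocol caps each file at 50,000). -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://forum.example.com/threads/example-thread-1/</loc>
  </url>
  <url>
    <loc>https://forum.example.com/threads/example-thread-2/</loc>
  </url>
</urlset>
```

With 260,000 threads you'd generate six such files and list them in a sitemap index file. It's also worth spot-checking that the old domain's thread URLs 301 to their new counterparts one-to-one, since a blanket redirect to the homepage is a classic cause of exactly this kind of long-tail traffic loss.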