You're given 10,000 recipes and told to build a site--what would you do?
-
Say you were given a list of 10,000 recipes and asked to build an SEO-friendly site. Would you build a recipe search engine and index the search results (making sure, of course, that the IA and user engagement metrics are great)?
Or, would you try to build static pages?
-
I also have a site in the kitchen niche: https://besttoasterovenguides.com/. Can someone check it? It's not showing any links correctly.
-
I would use a tool like Copyscape (or any other plagiarism checker) to see whether the recipes' exact text is already available online, so you don't run into any sort of duplicate content issues.
With 10,000 recipes, that will take some time for sure.
If it's not copied material, then definitely go for a website.
Building static pages will take much more time compared to using a CMS, I guess.
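Before paying for 10,000 Copyscape checks, you could also pre-screen the recipes against each other locally for near-duplicates. A minimal sketch using word shingles and Jaccard similarity, with hypothetical recipe text standing in for the real data:

```python
def shingles(text: str, k: int = 5) -> set:
    """Break text into overlapping k-word 'shingles' for near-duplicate detection."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def jaccard(a: str, b: str) -> float:
    """Jaccard similarity of two texts' shingle sets: 1.0 = identical, 0.0 = disjoint."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Hypothetical recipes: flag any pair above a similarity threshold for review.
recipe_a = "mix flour sugar and butter then bake at 350 for 20 minutes"
recipe_b = "mix flour sugar and butter then bake at 350 for 25 minutes"
recipe_c = "slice tomatoes add basil olive oil and salt serve chilled"

print(jaccard(recipe_a, recipe_b))  # high: near-duplicates
print(jaccard(recipe_a, recipe_c))  # near zero: unrelated
```

The threshold for "too similar" is a judgment call; this only catches duplication within your own set, not text copied from elsewhere on the web.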
-
There are some great responses on the SEO aspect of your project already from Keri and Matt. As far as building the site, I would build it off of WordPress and use a custom post type for "Recipes", and custom taxonomies for "ingredients", "type", etc. Then you can use the default WP search function and taxonomy lists for users to easily search for the right recipe.
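For illustration, the taxonomy-style lookup WordPress gives you (recipes grouped by ingredient) is essentially an inverted index. A minimal sketch with hypothetical data, not the actual WP internals:

```python
from collections import defaultdict

# Hypothetical recipe -> ingredients data, standing in for a "Recipes"
# custom post type tagged with an "ingredients" taxonomy.
recipes = {
    "Banana Bread": ["banana", "flour", "sugar", "egg"],
    "Omelette": ["egg", "butter", "cheese"],
    "Pancakes": ["flour", "egg", "milk", "sugar"],
}

# Build the taxonomy-style index: ingredient -> set of recipes using it.
by_ingredient = defaultdict(set)
for name, ingredients in recipes.items():
    for ing in ingredients:
        by_ingredient[ing].add(name)

def find(*ingredients: str) -> set:
    """Return recipes containing all of the given ingredients."""
    sets = [by_ingredient[i] for i in ingredients]
    return set.intersection(*sets) if sets else set()

print(sorted(find("egg", "flour")))  # ['Banana Bread', 'Pancakes']
```

With a CMS doing this for you, each ingredient term also gets its own archive URL, which is what makes the taxonomy approach SEO-friendly out of the box.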
-
I'd ask if the recipes were already on the web and see if I'm going to be fighting a huge duplicate content problem against an established site.
-
I don't know that they're mutually exclusive. I think you need to create a page for each of the recipes, and then you need a great search engine for it (search by ingredient, by course, by main protein, by what's in the pantry, etc.). You'll definitely want to get ideas from successful sites like AllRecipes.com, Taste.com.au, FoodNetwork.com and such.
Also, from an SEO/on-site perspective, you need to figure out how to get integrated hRecipe (schema/rich snippet) data into Google. Search "banana bread", click "More", then "Recipes": at the top you'll see Search Tools > Ingredients. You need the ingredients for every one of those 10k recipes to show up in this part of Google. This is how food bloggers search, and they're going to be a HUGE part of your audience.
Make sure each recipe can be rated, and if I were doing this from scratch right now, I'd make sure everyone who submits UGC in the future has a place to put their rel=author on each recipe. If you can integrate ingredients, rel=author and ratings, you'll be on your way to great food SEO.
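hRecipe was the microformat of the day; the same idea today is schema.org Recipe markup, usually emitted as JSON-LD on each recipe page. A hedged sketch generating it for one hypothetical recipe (the field names are real schema.org properties; the data is made up):

```python
import json

def recipe_jsonld(name, ingredients, rating_value, rating_count, author):
    """Build schema.org Recipe structured data (JSON-LD) for one recipe page."""
    return {
        "@context": "https://schema.org",
        "@type": "Recipe",
        "name": name,
        "recipeIngredient": ingredients,  # this is what powers ingredient filtering
        "aggregateRating": {
            "@type": "AggregateRating",
            "ratingValue": rating_value,
            "ratingCount": rating_count,
        },
        "author": {"@type": "Person", "name": author},
    }

# Hypothetical recipe; in practice you'd emit this inside a
# <script type="application/ld+json"> tag in the page <head>.
data = recipe_jsonld("Banana Bread", ["3 ripe bananas", "2 cups flour"], 4.7, 312, "Jane Doe")
print(json.dumps(data, indent=2))
```

Generating this programmatically for all 10k recipes is cheap once ingredients, ratings, and authors are stored as structured fields rather than free text.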
-
I would look at similar sites. Not sure where you are located, but here in Australia we have a great site called Taste (Taste.com.au).
There's got to be tonnes of great recipe sites like that - I'd just copy elements of what they are doing. Or at least use it as a starting point to do some significant research.