Using Programmatic Content
-
My company has been approached a number of times by computer-generated content providers (like Narrative Science and Comtex). They supply computer-generated content to a number of big-name websites.
Does anyone have any experience working with companies like this? We were burned by the first Panda update because we were using boilerplate forms for content.
-
Exactly. We were using what amounted to computer-generated mad-libs to turn out a high volume of content. Panda came along and killed 1/3 of our traffic. We were also sent a warning that we would be delisted.
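For context, the "mad-libs" approach described here amounts to something like this (a hypothetical sketch with made-up templates, not the actual system):

```python
import random

# Hypothetical "mad-libs" templates: fixed boilerplate with keyword slots.
TEMPLATES = [
    "Looking for {keyword} in {city}? Our {keyword} experts are ready to help.",
    "Find the best {keyword} deals in {city} today.",
]

def generate_page(keyword: str, city: str) -> str:
    """Fill a randomly chosen template: high volume, low uniqueness."""
    return random.choice(TEMPLATES).format(keyword=keyword, city=city)

# Every page differs only in the substituted slots, which is exactly
# the "auto-generated content" pattern Panda targets.
page = generate_page("plumbing", "Austin")
print(page)
```

The point of the sketch is that thousands of such pages differ only in the slot values, so the duplication is trivially detectable.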
-
Have you seen this article? http://www.wired.com/gadgetlab/2012/04/can-an-algorithm-write-a-better-news-story-than-a-human-reporter/
Sounds like this is legit for now.
"If you click that “little or no original content” link, it takes you to a page with some examples. Among them: thin affiliate sites, doorway pages, scraped content and auto-generated content.
Wait a minute. Auto-generated content?
Auto-generated content: Content generated programmatically. Often this will consist of random paragraphs of text that make no sense to the reader but that may contain search keywords.
Well, I don’t think Google had something like Narrative Science in mind when they came up with that, but it poses an interesting question: just how does Google feel about this kind of content? On the one hand, it is “content generated programmatically”. On the other hand, it’s not going to “consist of random paragraphs of text that make no sense to the reader.”"
Related Questions
-
Duplicate Content Product Descriptions - Technical List Supplier Gave Us
Hello, our supplier gives us a small paragraph and a list of technical features for our product descriptions. My concern is duplicate content. Here's my current plan:
1. Write as much unique content (rewriting the paragraph and adding to it) as there are words in the technical description list, so the page is half unique content, half duplicate content.
2. Reword the technical descriptions (though this is not always possible).
3. Use a custom H1, title tag, and meta description.
My question is: is the list of technical specifications going to create a duplicate content issue? In other words, how much unique content has to be on the page so that a list that is the same across the internet does not hurt us? Or do we need to rewrite every technical list? Thanks.
White Hat / Black Hat SEO | BobGW -
Duplicate content - multiple sites hosted on same server with same IP address
We have three sites hosted on the same server with the same IP address. For SEO reasons (to avoid duplicate content) we need to redirect the IP address to the site, but there are three different sites. If we use rel="canonical" on the websites, these tags will be duplicated too, as the websites are mirrored versions of the IP-address versions, e.g. www.domainname.com/product-page and 23.34.45.99/product-page. What's the best way to solve these duplicate content issues in this case? Many thanks!
White Hat / Black Hat SEO | Jade -
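One common fix for the IP-vs-hostname duplication described above is a sitewide 301 redirect from the raw IP to the canonical hostname, so the IP versions never get indexed at all. A minimal sketch, assuming an Apache server (the IP and domain are the example values from the question; each site would need its own rule):

```apache
# .htaccess sketch: 301-redirect any request that arrives via the raw IP
# (23.34.45.99 in the question's example) to the canonical hostname.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^23\.34\.45\.99$
RewriteRule ^(.*)$ http://www.domainname.com/$1 [R=301,L]
```

With the redirect in place, the rel="canonical" tags on each site only need to point at that site's own hostname, so they are no longer duplicated across the IP versions.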
Can I Use Meta NoIndex to Block Unwanted Links?
I have a forum thread on my site that is completely user generated, not spammy at all, but it is attracting about 45 backlinks from really spammy sites. Usually when this happens, the thread is created by a spammer and I just 404 it. But in this instance, the thread is completely legit, and I wouldn't want to 404 it because users could find it useful. If I add a meta noindex, nofollow tag to the header, will the spammy PageRank still be passed? How best can I protect myself from these low-quality backlinks? I don't want to get slapped by Penguin! (Note: I cannot find contact information for the spam sites, and they're in a foreign language.)
White Hat / Black Hat SEO | TMI.com -
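For reference, the tag being discussed looks like this. Note that noindex removes the page from the index and nofollow stops the page passing equity through its own outbound links; neither one blocks equity flowing in from the spammy inbound links, which is what Google's disavow tool is for:

```html
<!-- Placed in the page <head>: keeps this page out of the index and
     stops it passing equity through its own outbound links. -->
<meta name="robots" content="noindex, nofollow">
```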
Content optimized for old keywords and G Updates
Hi, we've got some old content, about 50 pages' worth in an e-commerce site, that is optimized for keywords that aren't the subject of the page; these keywords occur about 8 times (2 keywords per page) in the old content. We are going through these 50 pages and changing the title, H1, and meta description tag to match the exact subject of the page so that we will increase in rankings again, since the updates have been lowering our rankings. Do we need to completely rewrite the content for these 50 pages, or can we just sprinkle in the one keyword that is the subject of the page where needed? The reason I'm asking is that our rankings keep dropping and these 50 pages seem to be part of the problem. We're in the process of updating these 50 pages. Thanks.
White Hat / Black Hat SEO | BobGW -
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi all, I'll premise this by saying that we like to engage in as much white hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize. :)
So, we are an IT and management training course provider. We have 34 locations across the US, and each of our 34 locations offers the same courses. Each of our locations has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings/dates for each individual topic and city. Right now, our pages are dynamic and being crawled and ranking well within Google. We conducted a very small-scale test on this in our Washington, DC and New York areas with our SharePoint course offerings and it was a great success: we are ranking well on "sharepoint training in new york/dc" etc. for two custom pages.
So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain, a lot more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, H1, H2, etc., but with some varying components. This is from our engineer specifically:
"Regarding pages with the specific topic areas, do you have a specific format for the meta description and the custom paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the meta and paragraph. For example, if we made the paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area.' As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain depending on what you want customized. Another option is to have several standardized paragraphs, such as: 'Our [Topic Area] training is easy to find in the [City, State] area,' followed by other content specific to the location, or 'Find your [Topic Area] training course in [City, State] with ease,' followed by other content specific to the location. Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages."
So, Mozzers, my question to you all is: can we standardize with slight variations specific to each location and topic area without getting dinged for spam or duplicate content? Often I ask myself, "If Matt Cutts were standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram
White Hat / Black Hat SEO | CSawatzky -
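The engineer's randomized-template proposal can be sketched like this (hypothetical code with the two example templates from the question, not their actual system). One detail worth getting right: seeding the choice by venue code and topic keeps each page's paragraph stable between crawls, rather than changing on every request:

```python
import random

# The two standardized paragraph templates proposed in the question.
TEMPLATES = [
    "Our {topic} training is easy to find in the {city}, {state} area.",
    "Find your {topic} training course in {city}, {state} with ease.",
]

def intro_paragraph(venue_code: str, topic: str, city: str, state: str) -> str:
    """Pick a template deterministically per (venue, topic), so the same
    page always renders the same text instead of flickering per request."""
    rng = random.Random(f"{venue_code}:{topic}")
    template = rng.choice(TEMPLATES)
    return template.format(topic=topic, city=city, state=state)

# Same inputs always yield the same paragraph.
p1 = intro_paragraph("DC01", "SharePoint", "Washington", "DC")
p2 = intro_paragraph("DC01", "SharePoint", "Washington", "DC")
print(p1)
```

With only two templates across 700+ pages the variation is cosmetic, which is why the location-specific content (directions, course dates) mentioned in the question is doing the real differentiation work.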
Competitor using "unnatural inbound links" not penalized??!
Since Google's latest updates, I think it would be safe to say that building links is harder. But I also read that Google applies their latest guidelines retroactively. In other words, if you have built your linking profile on a lot of unnatural links with spammy anchor text, you will get noticed and penalized. In the past, I used to use SEO-friendly directories and "suggest URL" submissions to build backlinks with keyword/phrase anchor text, but I thought that this technique was frowned upon by Google these days. So, what is safe to do? Why is Google not penalizing the competitor? And, bottom line, what is considered to be "unnatural link building"?
White Hat / Black Hat SEO | | bjs20101 -
How to Not Scrape Content, but Still Be a Hub
Hello SEOmoz members. I'm relatively new to SEO, so please forgive me if my questions are a little basic. One of the sites I manage is GoldSilver.com. We sell gold and silver coins and bars, but we also have a very important news aspect to our site. For about 2-3 years now we have been a major hub as a gold and silver news aggregator. About 1.5 years ago (before we knew much about SEO), we switched from linking to the original news site to scraping their content and putting it on our site. The chief reason for this was that users would click outbound to read an article, see an ad for a competitor, then buy elsewhere. We were trying to avoid this (a relatively stupid decision in hindsight). We have realized that the search engines are penalizing us, which I don't blame them for, for having this scraped content on our site. So I'm trying to figure out how to move forward from here. We would like to remain a hub for news related to gold and silver and not be penalized by search engines, but we also need to sell bullion and would like to avoid losing clients to competitors through ads on the news articles. One of the solutions we are thinking about is using an iframe to display the original URL, but within our experience. An example is how trap.it does this (see attached picture). This way we can still control the experience somewhat, but are still remaining a hub. Thoughts? Thank you, Nick
White Hat / Black Hat SEO | nwright -
IP-Based Content on Homepage?
We're looking to redesign one of our niche business directory websites, and we'd like to place local content on the homepage catered to the user based on IP. For instance, someone from Los Angeles would see local business recommendations in their area. The majority of the page would be this kind of content. Is this considered cloaking, or in any way a bad idea for SEO? Here are some examples of what we're thinking: http://www.yellowbook.com http://www.yellowpages.com/ I've seen some sites redirect to a local version of the page, but I'm a little worried Google will index us with localized content and the homepage would not rank for any worthwhile keywords. What's the best way to handle this? Thanks.
White Hat / Black Hat SEO | newriver
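One conservative pattern for the question above is to serve a generic default page whenever the visitor's location can't be resolved, so crawlers see stable, indexable content instead of content localized to the crawler's own IP. A minimal sketch with a hypothetical IP-to-city table (a real site would use a geolocation database or service):

```python
# Sketch: IP-based homepage personalization with a generic fallback.
# The lookup table is hypothetical and uses documentation-range IPs.
IP_TO_CITY = {
    "203.0.113.7": "Los Angeles",
    "198.51.100.9": "New York",
}

DEFAULT_CONTENT = "Browse top-rated businesses across the US."

def homepage_content(ip: str) -> str:
    city = IP_TO_CITY.get(ip)
    if city is None:
        # Unknown IPs (including crawlers) get the same default page,
        # so the indexed homepage is not tied to any one locale.
        return DEFAULT_CONTENT
    return f"Recommended businesses near {city}."

print(homepage_content("203.0.113.7"))
print(homepage_content("192.0.2.1"))
```

The key design choice is that the crawler and any unlocatable user receive identical content, which is what keeps the personalization on the right side of Google's cloaking guidelines.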