Looking at creating some auto-generated pages - duplicate content?
-
Hi Everyone!
We just launched a new version of our research site, and the main CTA on each page sends users to a subdomain that's blocked by robots.txt. The subdomain pages are our PPC landing pages, and they would be duplicate content for every car model.
We're also looking at a new content stream of deals pages on the main domain. The thinking was that we could rank these pages for terms like "Volkswagen Golf deals" and also use them as the canonical URLs from the PPC pages, so that Panda doesn't get mad at us for sending hundreds of links to a blocked subdomain.
It's going to take us a lot of time to write the copy for the deals pages, so if we auto-generate it by pulling a paragraph of copy from the car review plus numerical stats about that model, will it be classed as duplicate, and is there any downside to doing it?
Review Page: http://www.carwow.co.uk/car-reviews/Ford/Fiesta
Deals Page: http://www.carwow.co.uk/deals/Ford/Fiesta
PPC Landing Page: http://quotes.carwow.co.uk/buy/Ford/Fiesta
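To illustrate the canonical idea (a hypothetical sketch; the URLs are the real ones above, but whether our PPC template can emit this tag is an assumption on my part):

```html
<!-- Hypothetical: on the PPC landing page
     http://quotes.carwow.co.uk/buy/Ford/Fiesta, point search engines
     at the main-domain deals page as the canonical version -->
<link rel="canonical" href="http://www.carwow.co.uk/deals/Ford/Fiesta" />
```

One caveat worth flagging: if quotes.carwow.co.uk stays blocked in robots.txt, Googlebot never fetches those pages, so it would never see a canonical tag on them anyway.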
I can't help but feel this may all be a bit overkill, and perhaps it makes more sense to build one central deals page per model with unique content that we can also send the PPC traffic to, then lift any block on the quotes subdomain. But that will take time, and we'd also like a quick solution. I'd also question whether it's even an issue to link to a blocked subdomain: Google adds the quotes URLs to the index but can't crawl them, which I've been told is bad, but is it bad enough to do something about?
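For context, my understanding of why the blocked URLs still end up indexed (an illustrative sketch, not our actual robots.txt):

```text
# robots.txt on quotes.carwow.co.uk (illustrative)
User-agent: *
Disallow: /

# A Disallow only blocks crawling. URLs can still be indexed as bare,
# title-only results when external links point at them. To keep them
# out of the index entirely, the pages would need to be crawlable and
# return a noindex signal instead, e.g. the HTTP header:
#   X-Robots-Tag: noindex
```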
Thanks,
JP
-
This feels like looking for permission to take a shortcut, and I personally can't justify that. I would highly recommend taking the time to build the pages correctly and not worrying about duplicate content or Panda at all. None of us can tell you whether it will "work", whether it'll be seen as duplicate content, or when it'll be hit by Panda, but if it's auto-generated, I can promise it'll be hit at some point.
That said, the least you can do is test it. Take the top 10% of make/model deals pages and create them the way you're describing, then see how they perform. I'm not talking a few weeks; I mean a few months. All the while, keep developing unique content for some of those pages.
This way, if it works, meaning people like the pages and they stay relevant when produced the short way, you can produce the other pages the same way. By that point, however, your top make/model pages will have unique content ready to put up. So a few months later you'll have 10% of your deals pages with unique content and the rest automated. Then start working on the next 10%, and keep going until they're all done.
-
First of all, I think your landing pages look awesome. The review page in particular is probably one of the most visual, engaging pages I've seen in a long time. The design and content are excellent, and you should be really pleased with the result.
However, I do think the deals page looks quite thin, especially as the review pages are _so_ strong. Branded3 published a piece last year on how voucher code websites were impacted by Panda (Source), so I'd be hesitant to rely on these deals pages. As you suggest, rolling all of this content up into one central page per model would probably be the best outcome for the consumer, even though it's the more time-consuming route.