Auto-generated content problem?
-
Hi all,
I operate a Dutch website (sneeuwsporter.nl), a database of European ski resorts and accommodations (hotels, chalets, etc.). We launched about a month ago with roughly 1700+ accommodations. For every accommodation we collected general information such as which village it is in, how far it is from the village centre, and how many stars it has. This information is shown in a list on the right of each page (e.g. http://www.sneeuwsporter.nl/oostenrijk/zillertal-3000/mayrhofen/appartementen-meckyheim/). In addition, a text about the accommodation is auto-generated based on some of the properties that also appear in that list (distance, stars, etc.).
Below the paragraph about the accommodation is a paragraph about the village the accommodation is located in; this is a general text that is identical for all the accommodations in that village. Below that is a general text about the resort area, which is likewise identical on all the accommodation pages in that area. So these village and area texts are reused across many different pages.
Things went well at first: every day we got more Google traffic and more indexed pages. But a few days ago our organic traffic took a near-100% dive; we are hardly listed anymore, and where we are, it is at very low positions. We suspect Google gave us a penalty, for two reasons:
- our auto-generated texts vary only slightly from page to page
- we re-use the content about villages and areas on many pages
We quickly removed the village and resort-area texts, because we are pretty sure that is something Google does not want. We are less sure about the auto-generated content: is this something we should remove as well? These are normal, readable texts; they just happen to be structured in more or less the same way on every page. Finally, once we have made these and perhaps some other fixes, what is the best and quickest way to let Google see us again and show that we have improved?
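To show the kind of overlap we are worried about, here is a minimal sketch (the two sample descriptions are invented, not from our site) using Python's difflib to measure how similar two templated texts end up being:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a 0..1 ratio of how much two texts overlap."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Two invented auto-generated descriptions that differ only in the
# property values filled into the template.
text_a = ("Appartementen Meckyheim is a 3-star accommodation in Mayrhofen, "
          "located 500 metres from the village centre.")
text_b = ("Hotel Alpenhof is a 4-star accommodation in Mayrhofen, "
          "located 200 metres from the village centre.")

print(f"similarity: {similarity(text_a, text_b):.2f}")
```

Even with every property value swapped out, the two texts remain largely identical, which is exactly what we fear Google is picking up on.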
Thanks in advance!
-
The page that you have linked to has three sentences of text. When I search Google for "Appartementen Meckyheim" it looks like there is a lot of competition. Three sentences of text are not going to add a lot of quality to a page.
But I do think there is more than just a poor-ranking issue. I searched through six pages of results and didn't see your page at all. It's still in the index, but it's not ranking.
Also, I'm concerned that the Trail Map and Accessibility pages may look like duplicate content to Google. Google can really only evaluate what it can crawl, so in its eyes this page likely looks the same as every other listing you have.
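One rough way to see this the way a crawler might: measure what share of each page's text consists of blocks that repeat verbatim across listings. A minimal sketch (all sample texts here are invented for illustration):

```python
def boilerplate_share(page_text: str, shared_blocks: list[str]) -> float:
    """Rough share of a page's text made up of blocks that also
    appear verbatim on other pages."""
    if not page_text:
        return 0.0
    shared = sum(len(block) for block in shared_blocks if block in page_text)
    return min(shared / len(page_text), 1.0)

# Invented example: one unique sentence plus the village and
# resort-area texts that repeat on every listing in the area.
village = "Mayrhofen is a lively village in the Zillertal valley."
resort = "The Zillertal 3000 area offers 200 km of slopes."
page = "Appartementen Meckyheim has 3 stars. " + village + " " + resort

print(f"boilerplate share: {boilerplate_share(page, [village, resort]):.0%}")
```

If most of a page's text is shared with other pages, the tiny unique portion is unlikely to carry the page on its own.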
I am suspicious that there may have been a Panda update in the last few days. Sometimes Google doesn't announce them right away.
Thin content like you have shown us, as well as duplicate content, is exactly what Panda goes after.
I'm guessing that you ranked well until the Panda filter detected thin and duplicate content. It's possible that removing the duplicated pages will be enough, but I suspect you'll need substantially more content, such as a thorough review of each place, to get back to ranking again.
If I am right and there was a Panda update, then you may not see a recovery after beefing up the content until Panda runs again.
-
Google has been treating sites with lots of page-to-page duplication this way for at least five or six years.
You get indexed, ranked, and start getting traffic, but once Google figures out that your site was made with a cookie cutter, most of your pages will be filtered from the SERPs.
In my opinion this is different from a penalty. It's simply not showing dupes in the SERPs.
I used to have a lot of autogenerated content. Entire sites with hundreds of thousands of pages dedicated to it. They were kickass for a few weeks to a few months and then tanked hard.
I found that auto-generated content (where it is mainly boilerplate or duplicated) is a continuous expense. (Get killed and replace it, get killed and replace it.)
However, genuine authorship can be an investment that might continue to pay after I am dead. (I wouldn't say that if I were twenty years old, because strong competitors are popping up in every niche... but since I am one of the older people posting here, I can say it with a little more certainty.)