Auto-generated content problem?
-
Hi all,
I operate a Dutch website (sneeuwsporter.nl), the website is a a database of European ski resorts and accommodations (hotels, chalets etc). We launched about a month ago with a database of about 1700+ accommodations. Of every accommodation we collected general information like what village it is in, how far it is from the city centre and how many stars it has. This information is shown in a list on the right of each page (e.g. http://www.sneeuwsporter.nl/oostenrijk/zillertal-3000/mayrhofen/appartementen-meckyheim/). In addition a text of this accomodation is auto generated based on some of the properties that are also in the list (like distance, stars etc).
Below the paragraph about the accommodation is a paragraph about the village the accommodation is located in; this text is identical for all accommodations in that village. Below that is a general text about the resort area, which is likewise identical on all accommodation pages in that area. As a result, these village and area texts are reused on many different pages.
Things went well at first: every day we got more Google traffic and more indexed pages. But a few days ago our organic traffic took a near-100% dive; we are hardly listed any more, and where we are, it is at very low positions. We suspect Google gave us a penalty, for two reasons:
-
we have auto-generated texts that vary only slightly per page
-
we reuse the content about villages and areas on many pages
We quickly removed the village and resort-area content, because we are fairly sure that it is something Google does not want. We are less sure about the auto-generated content: should we remove that as well? These are normal, readable texts; they just happen to be structured more or less the same way on every page. Finally, once we have made these and perhaps some other fixes, what is the best and quickest way to show Google that we have improved?
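For anyone wondering about the mechanics of "removing" content without deleting pages: a common pattern (not something discussed in this thread, just a general sketch) is to keep thin pages online but out of the index while they are improved, using a robots noindex meta tag:

```html
<!-- Placed in the <head> of a thin, auto-generated accommodation page. -->
<!-- On its next crawl Google drops the page from the index, but "follow" -->
<!-- means it can still discover the links to the village and area pages. -->
<meta name="robots" content="noindex, follow">
```

Once a page has substantial unique content, removing the tag lets it be indexed again on a subsequent crawl.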
Thanks in advance!
-
-
The page you have linked to has three sentences of text. When I search Google for "Appartementen Meckyheim" it looks like there is a lot of competition, and three sentences of text are not going to add much quality to a page.
But I do think there is more to this than a poor-ranking issue. I searched through six pages of results and didn't see your page at all. It's still in the index, but it's not ranking.
Also, I'm concerned that the Trail Map and Accessibility pages may look like duplicate content to Google. Google can really only evaluate what it can crawl, so in its eyes this page likely looks the same as every other listing you have.
I am suspicious that there may have been a Panda update in the last few days. Sometimes Google doesn't announce them right away.
Thin content like you have shown us, as well as duplicate content, is exactly what Panda goes after.
I'm guessing that you ranked well until the Panda filter detected the thin and duplicate content. It's possible that removing the duplicated pages will be enough, but I suspect you'll need substantially more content, such as a thorough review of each place, to get back to ranking again.
If I am right and there was a Panda update, then you may not see a recovery after beefing up the content until Panda runs again.
-
Google has been treating sites with lots of page-to-page duplication this way for at least five or six years.
You get indexed, ranked, and start getting traffic, but once Google figures out that your site was made with a cookie cutter, most of your pages are filtered from the SERPs.
In my opinion this is different from a penalty; Google is simply not showing dupes in the SERPs.
I used to have a lot of autogenerated content. Entire sites with hundreds of thousands of pages dedicated to it. They were kickass for a few weeks to a few months and then tanked hard.
I found that autogenerated content (where it is mainly boilerplate or duplicated) is a continuous expense. (Get killed and replace it, get killed and replace it.)
However, genuine authorship can be an investment that might continue to pay after I am dead. (I wouldn't say that if I were twenty years old, because strong competitors are popping up in every niche... but since I am one of the older people posting here, I can say it with a little more certainty.)
Related Questions
-
Unique Contextual Content
Let's say you have a page on your website that displays the current discounts available for iPhones. The page is a list of deals with buttons to reveal a promo code. Would adding contextual content to these pages improve rankings? If the main keywords are already on the page, e.g. "Save 20% on iPhone 5 with this great iPhone coupon code" (where "iPhone coupon code" is the target keyword), does it still make sense to put 500+ words of contextual content on the page, even when that content isn't really something the visitor cares about? I've noticed websites doing this and ranking well, and I wanted to know whether this is a significant ranking factor or just a coincidence.
Technical SEO | poke10
Duplicate Page Content
Hello, after crawling our site Moz is flagging high-priority duplicate page content on our product and article listing pages. For example, http://store.bmiresearch.com/bangladesh/power and http://store.bmiresearch.com/newzealand/power are listed as duplicate pages even though they have separate URLs, page titles, and H1 tags. They list the same products, but I would have thought the differentiation in those other areas would be enough for them not to be deemed duplicates. Is this issue likely to be impacting our search rankings? If so, are there any recommendations for overcoming it? Thanks
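If the two listing pages really are interchangeable, the standard remedy (a general pattern, not a confirmed fix for this particular site) is to pick one version as the canonical and reference it from the duplicates:

```html
<!-- In the <head> of http://store.bmiresearch.com/newzealand/power, -->
<!-- telling Google which version of the near-identical listing to index -->
<!-- and to consolidate ranking signals onto. -->
<link rel="canonical" href="http://store.bmiresearch.com/bangladesh/power">
```

If, on the other hand, each country page is meant to rank in its own right, canonicalizing one to the other is the wrong tool; the fix then is genuinely differentiated body content on each page.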
Technical SEO | carlsutherland0
Index problems
“The website http://www.vaneyckshutters.com/nl/ does not show in Google's index (site:vaneyckshutters.com/nl/), yet it should be the homepage for the Netherlands. Previously, www.vaneyckshutters.com was redirected to /nl/; that page is now accessible with a canonical tag pointing to http://www.vaneyckshutters.com/nl/, in the hope of getting /nl/ indexed. When we look at the SERPs for the keyword ‘shutters’, the page http://www.vaneyckshutters.com/ is shown at #32 in Google.nl and at #3 in Belgium. Problem and question: why has /nl/ not been indexed properly, and why do we rank for ‘shutters’ with http://www.vaneyckshutters.com instead of the /nl/ page?”
Technical SEO | Happy-SEO1
Is the content on my website garbage?
I received a mail from Google Webmasters saying that my website has low-quality content. Website: nowwhatmoments.com
Technical SEO | Green.landon0
SEO for User Authenticated Content
Hi everyone, I have a potential client seeking SEO for a site where about 95% of the content is only accessible behind user authentication. Does anyone have tips for getting this indexed without having to open it up to the public? I was considering adding "snippets" via robots.txt or creating an additional page with snippets linking to the login page. I'd appreciate any thoughts! Thanks!
Technical SEO | manutx0
Set base-href to subfolders - problems?
A customer is using the <base> tag in an odd way: <base href="http://domain.com/1.0.0/1/1/">. My own theory is that the subfolders were added as the root because of revision control. CSS, images, and internal links are all resolved relative to this base. I ran a test with Xenu Link Sleuth and found many broken links on the site, but I can't say whether they are due to the base tag. I have read that the base tag may cause problems in some browsers, but is this usage of the base tag also bad from an SEO perspective? I have a lot of problems with this customer and I want to know if the base tag is part of them.
Technical SEO | Vivamedia
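To illustrate what that base tag does to every relative URL on the page (the base href is from the question; the link targets below are hypothetical):

```html
<!-- With this base, the browser resolves all relative URLs against it: -->
<base href="http://domain.com/1.0.0/1/1/">

<!-- "style.css" is fetched as http://domain.com/1.0.0/1/1/style.css -->
<link rel="stylesheet" href="style.css">

<!-- This internal link resolves to http://domain.com/1.0.0/1/1/contact.html, -->
<!-- which breaks if the real page lives at http://domain.com/contact.html -->
<a href="contact.html">internal link</a>
```

That resolution behaviour would neatly explain the broken links Xenu found: any relative link written as if the site root were the base now points into the versioned subfolder instead.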
Crawling and indexing content
If a page element (a div, for example) is initially hidden and only shown on hover or via a JavaScript call, will Google crawl and index its content?
Technical SEO | Mont0
Different TLDs, same content - duplicate content? And a problem in foreign Googles?
Hi, operating from the Netherlands with customers throughout Europe, we serve the same content for some countries. Dutch is spoken in the Netherlands and Belgium, and German is spoken in Germany and Switzerland, so each of those country pairs gets the same content. Does Google see this as duplicate content? Could a German customer searching on the German Google get the Swiss website as a result? Thank you for your assistance! Kind regards, Dennis Overbeek Dennis@acsi.eu
Technical SEO | SEO_ACSI0
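The standard signal for same-language content aimed at different countries (a general pattern; the URLs below are hypothetical, not from the question) is hreflang annotation, which tells Google which variant to show in each country's results:

```html
<!-- In the <head> of every variant of a Dutch-language page, listing all -->
<!-- country versions. Google.nl then prefers the nl-NL URL and Google.be -->
<!-- the nl-BE URL, treating the shared text as localized variants rather -->
<!-- than plain duplicates. -->
<link rel="alternate" hreflang="nl-NL" href="http://example.nl/campings/">
<link rel="alternate" hreflang="nl-BE" href="http://example.be/campings/">
<!-- The German-language pair gets the same treatment: -->
<link rel="alternate" hreflang="de-DE" href="http://example.de/campingplaetze/">
<link rel="alternate" hreflang="de-CH" href="http://example.ch/campingplaetze/">
```

The annotations must be reciprocal: each listed URL has to carry the same set of hreflang links back, or Google may ignore them.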