Thin/Duplicate Content
-
Hi Guys,
So here's the deal: my team and I just acquired a new site that was built with some questionable tactics. Only about 5% of the entire site is actually written by humans; the rest of its 40k+ pages (and it's growing by 1-2k auto-generated pages a day) are all auto-generated, thin content. I'm trying to convince the powers that be that we cannot continue to do this.
Now, I'm aware of the issue, but my question is: what is the best way to deal with it? Should I noindex these pages at the directory level? Should I 301 them to the most relevant section where actual valuable content exists? So far it doesn't seem like Google has caught on to this yet, and I want to fix the issue without raising any more red flags in the process.
Thanks!
-
Each page is about 100 words, and they are all exact duplicates except where the "keyword" for that page is changed.
So like "Keyword" in California / "Keyword" in Nevada
and so on.
Yeah, the long-term goal is to get rid of these pages altogether, but in the meantime I'd feel much better if our real-to-auto-generated ratio were 1:0 instead of the current 1:1,000. Simply blocking them in robots.txt would make 95% of the site a 404. So far my best bet is to noindex, follow the pages, which buys me time to actually fix the site's internal linking. I'm just not sure if I should do all the pages at once or roll it out slowly over time?
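For reference, here's roughly what I mean by handling it at the directory level. This is a minimal sketch only: it assumes a Flask app and a made-up /keyword-pages/ prefix for the auto-generated section, neither of which is our actual setup; the same idea could just as well live in server config or the CMS itself.

```python
from flask import Flask, request

app = Flask(__name__)

# Hypothetical URL prefix for the auto-generated thin pages.
AUTOGEN_PREFIX = "/keyword-pages/"

@app.after_request
def noindex_autogen_pages(response):
    # Serve an X-Robots-Tag header so every page under the prefix is kept
    # out of the index while its internal links are still followed.
    if request.path.startswith(AUTOGEN_PREFIX):
        response.headers["X-Robots-Tag"] = "noindex, follow"
    return response
```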
-
Do these pages have incoming links? If not, there is nothing to gain by 301ing them. Excluding them in robots.txt will cause link juice leaks when you have internal links pointing to them. You can use a noindex, follow meta tag instead; this allows link juice to flow into and back out of the non-indexed pages, saving link juice.
But one would ask: why have the pages at all if they are not in the index?
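If you do go the noindex, follow route, it is worth spot-checking that the tag (or an equivalent X-Robots-Tag header) is actually being served on the auto-generated pages. A rough audit sketch, with hypothetical URLs standing in for a sample of the real ones:

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical sample of auto-generated URLs; swap in a handful of real ones.
URLS = [
    "https://example.com/keyword-in-california/",
    "https://example.com/keyword-in-nevada/",
]

for url in URLS:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    meta_value = meta.get("content", "") if meta else "(no robots meta tag)"
    header_value = resp.headers.get("X-Robots-Tag", "(no X-Robots-Tag header)")
    print(f"{url}\n  meta robots: {meta_value}\n  X-Robots-Tag: {header_value}")
```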
Related Questions
-
From: http://www. to https://
Hi all, I am changing my hosting for legal and SEO reasons from http://www to https://. Now I hear different stories on the redirects: 1. Should I try to change my backlinks? 2. Internally, all links will be 301 redirected at first; then I want to change them manually. It's within WordPress, so there should be a plugin for this. Tips? 3. Will it affect my rankings, and for what period? What I know now is that rankings will drop a little at first, but eventually you rank higher than before. Thanks so much in advance! Tymen
Technical SEO | Tymen
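A purely illustrative sketch for this one, assuming a hypothetical example.com domain and a few sample paths: spot-check that each old http:// URL answers with a single 301 straight to its https:// twin rather than a redirect chain.

```python
import requests

DOMAIN = "example.com"                 # hypothetical; use the real domain
PATHS = ["/", "/blog/", "/contact/"]   # a small sample of real paths

for path in PATHS:
    resp = requests.get(f"http://{DOMAIN}{path}", allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and location == f"https://{DOMAIN}{path}"
    print(f"http://{DOMAIN}{path} -> {resp.status_code} "
          f"{location or '(no Location header)'} {'OK' if ok else 'CHECK'}")
```
-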
Duplicate Content due to CMS
The biggest offender of our website's duplicate content is an event calendar generated by our CMS. It creates a page for every day of every year, up to the year 2100. I am considering some solutions: 1. Include code that stops search engines from indexing any of the calendar pages 2. Keep the calendar but re-route any search engines to a more popular workshops page that contains better info. (The workshop page isn't duplicate content with the calendar page). Are these solutions possible? If so, how do the above affect SEO? Are there other solutions I should consider?
Technical SEO | ycheung
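A sketch of option 2, for illustration only: the real CMS has its own routing, so this Flask route with made-up /calendar/ and /workshops/ paths just shows the shape of a permanent redirect from each thin day page to the page with real content.

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical calendar URL pattern; the real CMS paths will differ.
@app.route("/calendar/<int:year>/<int:month>/<int:day>/")
def calendar_day(year, month, day):
    # Permanently point every auto-generated day page at the workshops
    # page that actually has useful content.
    return redirect("/workshops/", code=301)
```
-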
Duplicate Content of Reseller Product?
There is a particular product/service that I resell through an API. There are quite a few of them, and each one requires a lot of content. The company provides web content for each product, but I'm wondering about the SEO implications of using it. Obviously the content will not be unique, so I won't be able to rank (easily, at least) for these products. Are there any negative results I could get from using this content, though? If I simply don't rank for those products, it's not an issue, since I get traffic elsewhere. Thanks!
Technical SEO | reliabox
-
Duplicated content on subcategory pages: how do I fix it?
Hello everybody,
I manage an e-commerce website and we have a duplicated content issue for subcategories. The scenario is like this:
/category1/subcategory1
/category2/subcategory1
/category3/subcategory1
A single subcategory can fit multiple categories, so we have three different URLs for the same subcategory with the same content (except for the navigation links). What are the best practices to avoid this issue? Thank you!
Technical SEO | uMoR
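The usual fix here is a rel=canonical tag on every category path pointing at one preferred URL for the subcategory. Below is a rough sketch, with made-up example.com paths, that checks whether all the variants already declare the same canonical:

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical duplicate paths for one subcategory; replace with real URLs.
VARIANTS = [
    "https://example.com/category1/subcategory1/",
    "https://example.com/category2/subcategory1/",
    "https://example.com/category3/subcategory1/",
]

canonicals = set()
for url in VARIANTS:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    link = soup.find("link", rel="canonical")
    canonicals.add(link["href"] if link else "(missing canonical)")

print("Consistent canonical:" if len(canonicals) == 1 else "Mismatch:", canonicals)
```
-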
Duplicate Content on 2 Sites - Advice
We have one client who has an established eCommerce site and has created another site with the exact same content, which is about to be launched. We want both sites to be indexed but not be penalised for duplicate content. The sites have different domains. The sites have the same host. We want the current site to take priority, so the new site would not rank higher in the SERPs. Any advice on setting up canonical tags, author tags, alternate link tags, etc.? Thanks, Rich
Technical SEO | SEOLeaders
-
Getting multiple errors for domain.com/xxxx/xxxx/feed/feed/feed/feed...
A recent SEOMoz crawl report is showing a bunch of 404s and duplicate page content on pages with URLs like http://domain.com/categories/about/feed/feed/feed/feed/feed and on and on. This is a WordPress install. Does anyone know what could be causing this, or why SEOMoz would be trying to read these non-existent feed pages?
Technical SEO | Brandtailers
-
Duplicate Content Caused By Blog Filters
We are getting some duplicate content warnings based on our blog. Canonical URLs can work for some of the pages, but most of the duplicate content is caused by blog posts appearing on more than one URL. What is the best way to fix this?
Technical SEO | Marketpath
-
CGI Parameters: should we worry about duplicate content?
Hi, my question is about CGI parameters. I was able to dig up a bit of content on this, but I want to make sure I understand the concept of CGI parameters and how they can affect the indexing of pages. Here are two pages:
No CGI parameter appended to the end of the URL: http://www.nytimes.com/2011/04/13/world/asia/13japan.html
CGI parameter appended to the end of the URL: http://www.nytimes.com/2011/04/13/world/asia/13japan.html?pagewanted=2&ref=homepage&src=mv
Questions: Can we safely say that CGI parameters = URL parameters appended to the end of a URL, or are they different? And given that you have rel canonical implemented correctly on your pages, will search engines go ahead and index only the URL specified in that tag?
Thanks in advance for giving your insights. I look forward to your response. Best regards, Jackson
Technical SEO | jackson_lo
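To illustrate the distinction, a small standard-library sketch (treating ref and src as tracking-only is an assumption based on the example URL): CGI/URL parameters are just the query string, and collapsing the purely presentational ones shows why several URLs can carry the same content; rel=canonical is what tells search engines which form to index.

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Parameters assumed to be tracking/presentation only (taken from the example URL);
# pagewanted is kept because it actually changes the content that is shown.
IGNORABLE = {"ref", "src"}

def collapse(url):
    """Drop ignorable query parameters so duplicate variants map to one URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORABLE]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(collapse("http://www.nytimes.com/2011/04/13/world/asia/13japan.html"
               "?pagewanted=2&ref=homepage&src=mv"))
# -> http://www.nytimes.com/2011/04/13/world/asia/13japan.html?pagewanted=2
```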