Can't get auto-generated content de-indexed
-
Hello and thanks in advance for any help you can offer me!
Customgia.com, a costume jewelry e-commerce site, has two types of product pages - public pages that are internally linked, and private pages that are only accessible via direct URL. Every item on Customgia is created with an online design tool. Users can register for a free account and save the designs they create, even if they don't purchase them. Before saving a design, the user is required to enter a product name and choose "public" or "private" for that design. The page title and product description are auto-generated.
Since the site launched in October '11, the number of products has grown steadily as more users designed jewelry items. Most users chose to show their designs publicly, so the number of products in the store swelled to nearly 3000. I realized many of these designs were similar to each other and occasionally exact duplicates. So over the past 8 months, I've made 2300 of these designs "private" - no longer accessible unless the designer logs into their account (though these pages can still be linked to directly).
When I realized that Google had indexed nearly all 3000 products, I entered URL removal requests in Webmaster Tools for the designs I had changed to "private". I started this about 4 months ago. At the time, I did not have NOINDEX meta tags on these product pages (obviously a mistake), so it appears that most of these product pages were never removed from the index - or if they were removed, they were added back in after the 90 days were up.
Of the 716 products currently showing (the ones I want Google to know about), 466 have unique, informative descriptions written by humans. The remaining 250 have auto-generated descriptions that read coherently but are somewhat similar to one another. I don't think these 250 descriptions are the big problem right now, but these product pages can be hidden if necessary.
I think the big problem is the 2000 product pages that are still in the Google index but shouldn't be. The following Google query tells me roughly how many product pages are in the index: site:Customgia.com inurl:shop-for
Ideally, it should return just over 716 results, but instead it's returning 2650 results. Most of these extra 1900 product pages have bad product names and highly similar, auto-generated descriptions and page titles. I wish Google had never crawled them.
Last week, NOINDEX tags were added to all 1900 "private" designs, so currently the only product pages that should be indexed are the 716 showing on the site. Unfortunately, over the past ten days the number of product pages in the Google index hasn't changed.
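For reference, this is the tag that now sits in the <head> of each private product page (as noted further down in the thread, we used both noindex and nofollow):

```html
<!-- Added to all 1900 "private" design pages last week -->
<meta name="robots" content="noindex, nofollow">
```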
One solution I initially thought might work is to re-enter the removal requests, because now, with the NOINDEX tags in place, these pages should be removed permanently. But I can't determine which product pages need to be removed, because Google doesn't let me see that deep into the search results. If I look at the removal request history, it says "Expired" or "Removed", but these labels don't seem to correspond in any way to whether that page is currently indexed. Additionally, Google is unlikely to re-crawl these "private" pages (and so see the new NOINDEX tags) because they are orphaned - no longer linked from any public pages of the site, and with no external links either.
Currently, Customgia.com averages 25 organic visits per month (branded and non-branded) and close to zero sales. Does anyone think de-indexing the entire site would be appropriate here? Start with a clean slate, then let Google re-crawl and index only the public pages - would that be easier than battling with Webmaster Tools for months on end?
Back in August, I posted a similar problem that was solved using NOINDEX tags (de-indexing a different set of pages on Customgia): http://moz.com/community/q/does-this-site-have-a-duplicate-content-issue#reply_176813
Thanks for reading through all this!
-
I don't think there's any harm in submitting a new/full list, even if it duplicates past lists. The URLs haven't been removed, and you did fix the tags. This isn't like disavowing links - it's more of a technical issue. Worst case, from what I've seen, it simply doesn't work.
-
Thanks for helping me with this.
You are correct that all the product pages are in the same folder regardless of whether they are public or private, so unfortunately removing an entire folder isn't an option at this point.
When I go to Webmaster Tools and view past removal requests, each one shows as either "Expired" or "Removed". WMT only allows me to resubmit a removal request if the label is "Expired". Going back past 90 days, many are still labeled "Removed", but the further back I go, the more say "Expired". There are too many requests to try to determine whether or not each page is indexed - so I think our best bet is to resubmit every expired private product page removal request and then monitor removal. Does this make sense?
Back in August, a Moz crawl showed tons of duplicates for the designer pages (the pages where the user actually designs the jewelry). Using NOINDEX tags and removal requests (credit to Dr. Pete and Everett Sizemore) the number of designer pages in the index dropped from 5K to exactly 8 - so it worked.
Our XML sitemap is dynamic and doesn't list private product pages.
-
It honestly sounds like you're on the right track - you do need to explicitly mark those (and META NOINDEX should be fine). Could you just request removal for all private pages? Worst case, Google removes some that aren't in the index, or attempts to. Since the public/private setting can be changed, you can't really put the private pages all in one folder (real or virtual) - that would make life easier, long-term, but probably isn't useful/appropriate for your case.
I'd also recommend having a clean XML sitemap with just the public entries (updated dynamically). That won't deindex the other pages, but it's one more cue Google can use. You want all of the signals you're sending to be consistent.
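If it helps, here's a minimal sketch of that kind of dynamic filtering. The product list and field names are hypothetical placeholders - in practice you'd pull these from your database:

```python
# Minimal sketch: generate a sitemap that lists only public products.
from xml.sax.saxutils import escape

# Hypothetical data - substitute your own product source.
products = [
    {"url": "http://customgia.com/shop-for/silver-bead-earrings", "public": True},
    {"url": "http://customgia.com/shop-for/my-design-123", "public": False},
]

entries = "\n".join(
    f"  <url><loc>{escape(p['url'])}</loc></url>"
    for p in products
    if p["public"]  # private designs never appear in the sitemap
)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + entries + "\n</urlset>"
)
print(sitemap)
```

The key point is just that public/private is evaluated at generation time, so flipping a design to private drops it from the sitemap automatically.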
I agree with Doug, though - this is really tricky, because ideally you would want people to share these pages, and if you NOINDEX then you're losing out on that. My gut feeling is that, until your site is stronger, you probably can't support 3K near duplicates (and counting). If you want to get sophisticated, though, you could NOINDEX dynamically - only noindexing products that have very little content or are obvious dupes. As people fill out or share a product, you could remove the NOINDEX.
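A rough sketch of that conditional logic - the field names and word-count threshold below are hypothetical, so tune them to your own data:

```python
# Sketch of the "dynamic NOINDEX" idea: only let substantial, public
# designs into the index. Field names and threshold are hypothetical.
MIN_DESCRIPTION_WORDS = 50

def robots_meta_tag(design):
    substantial = (
        not design["auto_generated"]
        and len(design["description"].split()) >= MIN_DESCRIPTION_WORDS
    )
    if design["public"] and substantial:
        return '<meta name="robots" content="index, follow">'
    # Thin or auto-generated pages stay out of the index; "follow"
    # still lets link equity flow through to your public pages.
    return '<meta name="robots" content="noindex, follow">'

# Example: a thin, auto-generated design gets noindexed
print(robots_meta_tag(
    {"public": True, "auto_generated": True, "description": "Silver earrings"}
))
```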
-
Hi Doug,
Thanks for the quick response. I will do my best to answer each of your points.
In Webmaster Tools, under Index Status, it shows 1781 pages indexed, with a high of 6515 on June 2, 2013. Not sure that helps to clarify anything but it's another piece of Google data to consider.
We continually monitor WMT and Analytics. I'm addressing this issue specifically because search impressions on our product pages average less than 5 impressions/day despite continuous improvements over the last 12 months - keyword research, better page titles/product names, and longer, more informative descriptions. These 500 or so product pages are vastly better today than they were 12 months ago - but impressions have not improved at all.
Every design, public or private, has social/sharing buttons. As I mentioned above, these designs can all be linked to directly from any external website.
I think the category pages are sufficient. There is some fine-tuning that could be done in terms of how products are organized within categories but overall it's pretty solid and probably not an issue.
Our initial strategy was to attract long-tail traffic with user-generated content but the problem is most users gave their products personal, irrelevant (and possibly spammy) product names. There were other problems with the user generated designs as well - like one user who designed 15 earrings that looked exactly the same except for one bead which she changed to a different color for each design. Anyway, we left all these designs public for over 12 months - as more and more designs were added to the site, organic search traffic actually fell.
-
I agree with Doug.
Create better category pages - make sure each product page is under a category.
The user-generated products are great and should be indexed.
-
Hey Richard,
First, note that the number of results that query reports is only an estimate, and it gets refined the deeper you go into the search results. On page one, the totals tend to be wildly inaccurate.
If you go all the way to the end (page 13) and then repeat the process with omitted results included, you still get to page 13 and a total of 123 pages. (Somewhat better than the 2k+ results.)
That's less than the 716 pages you mention, so maybe you've got the opposite problem? What do you see if you check your Google Analytics and Webmaster Tools? Which pages are getting organic traffic from Google? Which pages are showing in the search results (Webmaster Tools, Impressions)?
What are the pages you want to appear in search and what are the keywords you're targeting?
My first thought is - if you're allowing people to design their own jewellery, are you also letting them easily share their creations on social media, etc.? Have you got embed codes so that they can put their designs on their blog? If not, I think you're missing a trick.
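Even something as simple as a copy-paste snippet on each design page would do. The markup below is purely hypothetical - the URLs and image path are made up to illustrate the idea:

```html
<!-- Hypothetical embed code a user could paste into their blog;
     the link sends visitors (and link equity) back to the design page -->
<a href="http://customgia.com/shop-for/my-design-123">
  <img src="http://customgia.com/images/designs/my-design-123.jpg"
       alt="My custom earrings, designed on Customgia" width="300">
</a>
```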
All of these individual items, designed by users, will (or should) be linking back to the specific category pages (or other landing pages), increasing the authority of those pages. Make sure your category/landing pages have good, unique content that communicates both the value proposition and the products you've got available.
If you don't have these category pages, then it might be worth looking at your site architecture/hierarchy and think about creating them.
Your individual product pages might get long-tail traffic (and having lots of different variations, described in real people's own words, might actually work to your advantage here), but your category pages should be the ones targeting head terms.
I notice you've no-indexed and no-followed the product pages in question. This means that if these pages are shared, any inbound authority/link equity/link juice is simply discarded. Are you sure you want to do that?
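If you want these pages out of the index but don't want to throw that equity away, one option is to keep the noindex and drop the nofollow:

```html
<!-- Keeps the page out of Google's index, but links on it still
     pass authority through to your public and category pages -->
<meta name="robots" content="noindex, follow">
```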
I don't think you need to worry too much about Google's index at this point, and I certainly wouldn't consider deindexing the whole site.