Community Discussion - What's the ROI of "pruning" content from your ecommerce site?
-
Happy Friday, everyone! This week's Community Discussion comes from Monday's blog post by Everett Sizemore.
Everett suggests that pruning underperforming product pages and other content from your ecommerce site can provide the greatest ROI available to a larger site in 2016. Do you agree or disagree? While the "pruning" tactic here is suggested for ecommerce and for larger sites, do you think you could implement a similar protocol on your own site with positive results? What would you change? What would you test?
-
I don't think there is a one-size-fits-all recommendation to make here, which is why that post has so much detail about how to do the research necessary to determine what the best route is for your business.
I agree that improving content is better than simply noindexing it, but I also think noindexing it is better than leaving it up unimproved long-term. And the reality is that many businesses with tens of thousands or hundreds of thousands of product pages, and most blogs with thousands of posts, aren't going to be able to economically scale rewriting all of it. The best solution for them, in my opinion, is to get rid of the pages that are dragging them down - or at least get them out of the index.
They can always be reintroduced once they're improved.
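That "prune now, reintroduce once improved" protocol can be sketched as a simple triage pass over page-level stats. This is a minimal illustration, not anything from the post: the thresholds and field names are made-up assumptions you would tune from your own analytics.

```python
# Hypothetical pruning triage. Thresholds (100 sessions/month) and the
# stats fields are illustrative assumptions, not recommendations.
def triage(page):
    """Classify a page as 'keep', 'improve', or 'noindex' from basic stats."""
    if page["monthly_sessions"] >= 100 or page["conversions"] > 0:
        return "keep"
    if page["backlinks"] > 0:
        return "improve"  # has equity worth salvaging: rewrite, don't remove
    return "noindex"      # no traffic, no conversions, no links: prune it

pages = [
    {"url": "/p/widget-a", "monthly_sessions": 450, "conversions": 3, "backlinks": 2},
    {"url": "/p/widget-b", "monthly_sessions": 4, "conversions": 0, "backlinks": 5},
    {"url": "/p/widget-c", "monthly_sessions": 0, "conversions": 0, "backlinks": 0},
]
plan = {p["url"]: triage(p) for p in pages}
```

Pages classified "noindex" can be reintroduced to the index later simply by improving them and dropping the noindex directive.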
-
Matt,
I totally agree.
In my former life (er, job) I wrote thousands of ecomm product descriptions for some of the world's biggest brands. It was a painful process at first, in large part because EVERY company felt that more was always better - until I was able to show them that it wasn't.
At first, each description was bloated with text.
Then I imposed a strict word count (50) and character count (220) for descriptions, and conversions improved dramatically. Customer service calls and complaints also diminished considerably.
Here's why: Customers visiting a specific product are more likely to know something about it, so they don't need a bunch of details. They ask friends and family members, read reviews, etc., so they're educated to a degree when they visit the page. Also, if they do have questions, it's better to have a Q&A-style setup on the page, similar to what REI does.
For folks who aren't as educated but who have landed on the page for a specific product, the diminutive descriptions mean they have enough info to whet their appetites (i.e., to read reviews and conduct research) but not so much as to confuse them.
RS
-
If you are slapped with a Panda problem, you'd better be pruning or noindexing. I don't care to disagree with the advice given by some prominent Googlers, but if you've got poison on your site, I think you are better off pruning it than allowing it to be consumed by their crawlers a thousand times every month.
-
I think it's better to add more valuable content than to remove content you feel is underperforming. Unless it's wickedly harmful, you should just leave it and move on to making better stuff.
Related Questions
-
Change Google's version of Canonical link
Hi, my website has millions of URLs, and some of them have duplicate versions. We did not set canonicals all these years. Now we want to implement them and fix all the technical SEO issues. I want to consolidate and redirect all the variations of a URL to the highest-pageview version and use that as the canonical, because all of these variations have the same content. While doing this, I found in Google Search Console that Google has already selected another variation of the URL as canonical, not the highest-pageview version. My questions:
1. I have millions of URLs for which I have to set up 301s and canonicals. How can I find all the canonical URLs that Google has auto-selected? Search Console has a daily quota of 100 or so.
2. Is it possible to override Google's version of the canonical? Meaning, if I set a variation as canonical and it is different from what Google has already selected, will it change over time in Search Console?
3. Should I just 301 to the highest-pageview variation of the URL and not set canonicals at all? This way, the canonical that Google auto-selected would get redirected to the highest-pageview variation of the URL.
Any advice or help would be greatly appreciated.
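The consolidation step the asker describes - group duplicate variants, then 301 everything to the highest-pageview variant - can be sketched in a few lines. The URLs, the dedup key, and the pageview numbers below are invented for illustration; at millions of URLs you would feed this from your analytics export.

```python
from collections import defaultdict

# Hypothetical data: each variant carries a dedup key (e.g. a normalized
# path or content hash) and its pageview count from analytics.
variants = [
    {"url": "/shoes?color=red", "key": "shoes", "pageviews": 120},
    {"url": "/shoes/", "key": "shoes", "pageviews": 900},
    {"url": "/Shoes", "key": "shoes", "pageviews": 45},
]

groups = defaultdict(list)
for v in variants:
    groups[v["key"]].append(v)

# Pick the highest-pageview variant as the 301/canonical target.
redirect_map = {}
for key, vs in groups.items():
    target = max(vs, key=lambda v: v["pageviews"])["url"]
    for v in vs:
        if v["url"] != target:
            redirect_map[v["url"]] = target  # emit these as 301s in server config
```

On question 3: a 301 is itself a strong canonicalization signal, so once the redirects are in place, Google's auto-selected canonical typically follows them over time; the two approaches reinforce rather than conflict with each other.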
Intermediate & Advanced SEO | | SDCMarketing0 -
Competing Pages on Ecommerce Site - Very Frustrating
We have multiple issues with this situation. We rank #1 for "Lace Fabric", #3 for "Lace Trim", and #80 for "Lace". We also rank for "Lace Ribbon", and "Lace Appliques". The Lace Fabric and Lace Trim pages have plenty of backlinks, wherein may lie the problem. We have a similar issue for "Satin". "Silk Satin", "Polyester Satin", "Satin Trim", "Satin Ribbon", etc. This is a very annoying and common pattern. Our backlink profile is sterling, and our competitors with inferior backlink profiles and branded search are outranking us. We outrank them across the board for 2 word terms. Based on my evaluation of TF/CF, PA/DA, Content, etc., we should be on page 1 for "Lace". IMHO, these pages are competing for the head term. Any ideas on how to eliminate this issue to rank for head terms?
Intermediate & Advanced SEO | | GWMSEO0 -
I'm updating content that is out of date. What is the best way to handle if I want to keep old content as well?
So here is the situation: I'm working on a site that offers "Best Of" top-10-list-type content. They have a list that ranks very well but is out of date. They'd like to create a new list for 2014 but keep the old list live. Ideally, the new list would replace the old list in search results. Here's what I'm thinking, but let me know if you think there's a better way to handle this:
1. Put a "View New List" banner on the old page
2. Make sure all internal links point to the new page
3. Add a rel=canonical tag on the old list pointing to the new list
Does this seem like a reasonable way to handle this?
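The rel=canonical step is the one worth verifying after deployment. A stdlib-only sketch of such a check - the page HTML and URLs here are made-up examples, not the asker's site:

```python
from html.parser import HTMLParser

# Minimal canonical-tag extractor (Python stdlib only), to confirm an old
# list page points at its replacement. The HTML below is illustrative.
class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

old_page = (
    '<html><head>'
    '<link rel="canonical" href="https://example.com/best-of-2014">'
    '</head><body>old list</body></html>'
)
finder = CanonicalFinder()
finder.feed(old_page)
```

In practice you would fetch each old URL and assert `finder.canonical` matches the new list's URL before trusting the consolidation to take effect.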
Intermediate & Advanced SEO | | jim_shook0 -
"noindex, follow" or "robots.txt" for thin content pages
Does anyone have any testing evidence on which is better to use for pages with thin content that are nevertheless important to keep on a website? I am referring to content shared across multiple websites (such as e-commerce, real estate, etc.). Imagine a website with 300 high-quality pages indexed and 5,000 thin product-type pages that would not generate relevant search traffic. The question: does the internal-linking value achieved by "noindex, follow" outweigh the negative of Google having to crawl all those "noindex" pages? With robots.txt, Google's crawling focuses on just the important pages that are indexed, and that may give rankings a boost. Any experiments with insight into this would be great. I do get the story about "make the pages unique", "get customer reviews and comments", etc., but the above question is the important question here.
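The two mechanisms in the question work at different stages, which is worth making concrete. The paths below are illustrative assumptions:

```python
# Two ways to keep thin pages out of the index; the paths are made up.
thin_paths = ["/product/", "/tag/"]

# Option A: meta robots on each thin page. The page IS still crawled,
# which is the crawl-budget cost, but "follow" means its internal links
# keep passing value.
meta_tag = '<meta name="robots" content="noindex, follow">'

# Option B: robots.txt. Crawling stops, saving crawl budget, but a
# blocked page's noindex can never be read by Google, and blocked URLs
# can still appear in the index if external sites link to them.
robots_txt = "User-agent: *\n" + "\n".join(f"Disallow: {p}" for p in thin_paths)
```

That last point is the key trade-off: the two directives don't combine, because a robots.txt-blocked page never gets its meta tag seen, so you pick one per page, not both.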
Intermediate & Advanced SEO | | khi50 -
Removing content from Google's Indexes
Hello Mozers, my client asked a very good question today. I didn't know the answer, hence this question. When you submit a 'Removing content for legal reasons' report (https://support.google.com/legal/contact/lr_legalother?product=websearch), will the person(s) owning the website containing this inflammatory content receive any communication from Google? My clients have already had the offending URL removed by a court order, which was sent to the offending company. However, the site has now been relocated, and the same content is glaring out at them (and their potential clients) with the title "Solicitors from Hell + Brand name" immediately under their SERP entry. I'm going to follow the advice of the forum and try to get the URL removed via Google's report system, alongside the rearguard action of increasing my clients' SERP entries via social + content. However, I need to be able to firmly tell my clients the implications of submitting a report. They are worried that if they rock the boat, this URL (with open access for reporting of complaints) will simply become more inflammatory! By rocking the boat, I mean Google informing the owners of this "Solicitors from Hell" site that they have been reported for hosting defamatory content. I'm hoping that Google wouldn't inform such a site, and that the only indicator would be an absence of visits. Is this the case, or am I being too optimistic?
Intermediate & Advanced SEO | | catherine-2793880 -
Best-of-the-web content in steep competition, ecommerce site
Hello, I'm helping my client write a long, comprehensive, best-of-the-web piece of content. It's a boring e-commerce niche, but on the informational side the top 10 competitors for the most-linked-to topic are all big players with huge domain authority. There aren't a lot of links in the industry. Should I try to top the big competitors through better content (somehow), pictures, illustrations, slideshows with audio, and by being more thorough than these very good competitors? Or should I go for a topic that's less linked to (maybe 1/5 as many people linking to it) but easier? Or both? We're on a short timeline of 3 1/2 months until we need traffic, and our budget is not huge.
Intermediate & Advanced SEO | | BobGW1 -
Will implementing a 'Scroll to Div Anchor' cause a duplicate content issue?
I have just been building a website for a client with pages that contain a lot of text content. To make things easier for site visitors, I have created a menu bar that sticks to the top of the page and scrolls to different areas of content (i.e., different div id anchors). Having done this, I have just had the thought that this might inadvertently introduce a duplicate content issue. Does anyone know if adding an #anchor to the end of a URL will cause a duplicate content error in Google? For example, would the following URLs be treated as different?
http://www.mysite.co.uk/services
http://www.mysite.co.uk/services#anchor1
http://www.mysite.co.uk/services#anchor2
http://www.mysite.co.uk/services#anchor3
http://www.mysite.co.uk/services#anchor4
Thanks.
Intermediate & Advanced SEO | | AdeLewis0 -
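The short answer is no: the fragment (#anchor) is handled entirely client-side and is stripped before a request is ever made, so crawlers see one URL, not five. Python's stdlib demonstrates the normalization directly, using the URLs from the question:

```python
from urllib.parse import urldefrag

# Fragments are client-side only: stripping them shows that all of these
# anchor variants resolve to a single URL, so no duplicate content exists.
urls = [
    "http://www.mysite.co.uk/services",
    "http://www.mysite.co.uk/services#anchor1",
    "http://www.mysite.co.uk/services#anchor2",
    "http://www.mysite.co.uk/services#anchor3",
    "http://www.mysite.co.uk/services#anchor4",
]
defragged = {urldefrag(u).url for u in urls}  # collapses to one entry
```

Because every variant defrags to the same URL, there is nothing for Google to treat as duplicate content here.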
An Infrastructure Change for a Large eCommerce Site - Any advice?
Hello Mozers, we're currently undergoing quite a large infrastructure change to our website, and I wanted to hear your thoughts on the kinds of things we should be careful of. We currently have close to 4,000 individual products, each with its own page. The SEO work is then driven behind certain pages which house a catalog display of groups of products, grouped by style. For example, we have a page called "Style A" which displays 8 different colours of Style A. We then SEO the Style A page, and the individual items receive minimal SEO work. The change would involve having one product page per style, on which the user could purchase the different colours/variations via menus. This will result in approximately a 70% reduction in the size of our site (as several products will no longer be published). The things we are currently concerned with are:
1. The loss of equity to those unwanted 'Style A' pages - I think a series of carefully planned 301s will be the solution.
2. Possible loss of long-tail traffic to the individual products, which might not be caught by one individual page per style.
3. Internal link structure will need to be monitored to make sure that we're still highlighting the most important pages.
Sorry for the long post; it's a difficult change to explain without revealing the client's name. Any other things we should be thinking about would be greatly appreciated! Thanks, Nigel
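Concern 1, the 301 plan, is mostly a mapping exercise: every retired variant URL needs a deterministic target on its style page. A sketch under invented URL patterns (nothing here is from Nigel's actual site):

```python
# Hypothetical migration mapping: colour-variant product URLs 301 to their
# consolidated style page. All URL patterns are invented for illustration.
products = [
    ("/product/style-a-red", "style-a"),
    ("/product/style-a-blue", "style-a"),
    ("/product/style-b-green", "style-b"),
]
style_pages = {"style-a": "/style/style-a", "style-b": "/style/style-b"}

redirects = {url: style_pages[style] for url, style in products}

# Concern 2's safety net: keep each colour name visible on the style page
# (variant selector text, structured data) so long-tail queries such as
# "style a red" still have on-page text to match after consolidation.
```

Checking that `len(redirects)` equals the count of retired products before launch is a cheap way to guarantee no variant URL 404s after the switch.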
Intermediate & Advanced SEO | | NigelJ0