Deleting low quality content
-
Hi there. I have a question about deleting low-quality content pages, and I'm hoping someone can share feedback.
We have a B2C e-commerce store, and product pages are our target landing pages from search. Over the years we've built many informational pages related to different products, each linked to the relevant product pages.
The problem is that many of them lack quality content, in both volume and substance, and they aren't helping. Organic traffic has been declining since early this year, after peaking in February.
So I'm considering deleting the pages that we (and Moz) flag as low quality and that aren't receiving any search traffic.
Firstly, is that a good idea? Secondly, how should I go about it? Should I just delete them and redirect the deleted URLs to related pages, or even to the homepage?
Looking forward to any expert input.
-Yuji -
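As background on the redirect mechanics mentioned in the question: removed URLs are usually 301-redirected individually to their closest related page (redirecting everything to the homepage tends to be treated as a soft 404). A minimal sketch, assuming Apache-style `.htaccess` rules and entirely hypothetical URLs, that generates the rules from a mapping:

```python
# Hypothetical mapping of thin pages slated for deletion to their
# closest related pages (all URLs here are invented examples).
redirect_map = {
    "/guides/old-thin-page": "/products/widget-a",
    "/guides/another-thin-page": "/products/widget-b",
}

def apache_redirect_rules(mapping):
    """Render one 'Redirect 301' line per deleted URL (Apache mod_alias syntax)."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())
    )

print(apache_redirect_rules(redirect_map))
```

The same mapping could just as easily be rendered as nginx `return 301` rules; the point is one explicit old-to-new pair per deleted page.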
You do need to obtain SEO advice, but often we don't advise deleting the page; we advise improving it substantially.
If you have duplicated content, remove it and replace it with well-written, white-hat, high-quality content. This is how we've improved many businesses' local SEO: by improving on-page SEO rather than deleting pages completely.
-
It would be best to talk to an SEO agency for advice before you delete any blog posts or main pages.
-
Thanks for your advice. Yes, we will definitely be careful deleting pages. Thanks a lot!
-
That's a really good idea! Cut what you have to manage down to the essentials, and then spend more time on those pages. Make sure you run some kind of ranking or traffic audit against all the pages first, though. You don't want to delete the versions of each page that have some SEO power, even if it's small; you want to target the ones Google isn't using.
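One way to run that kind of audit, assuming you've exported page-level performance data from Search Console as a CSV (the file layout and URLs here are hypothetical), is a short script that flags pages below a click threshold:

```python
import csv
import io

# Hypothetical Search Console performance export: one row per page
# with its click count over, say, the last twelve months.
sample_export = io.StringIO(
    "page,clicks\n"
    "/products/widget-a,120\n"
    "/guides/old-thin-page,0\n"
    "/guides/another-thin-page,2\n"
)

def zero_click_pages(csv_file, min_clicks=1):
    """Return pages below the click threshold: candidates to merge or remove."""
    reader = csv.DictReader(csv_file)
    return [row["page"] for row in reader if int(row["clicks"]) < min_clicks]

print(zero_click_pages(sample_export))
```

Anything this flags is only a candidate; you'd still want to check rankings and backlinks per page before deciding.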
-
Thanks a lot for your feedback; it was helpful. I think we may need to remove duplicate pages, keep only the unique ones, and update their content to be more valuable. Thanks!
-
This is, generally speaking, **not the right mindset** to succeed.
When Google says (through decreasing ranking positions) that you haven't put in enough effort, usually deleting a poor attempt garners no favour in the ranking results. Think about it. Google are saying "you don't have enough quality content" and your answer is to delete content, thus having less than before. Does that seem like a genuine attempt to comply with the increasing stringency of Google's guidelines?
Deleting stuff is the easy way out. Think about it as if you wrote an essay in college and Google were the examiner. They give you a D- for your essay and mark certain areas of your work as needing improvement. If you deleted those paragraphs, did nothing else, and re-submitted the essay, would you honestly expect a better grade?
Google want to see effort, unique content, value-add for end users. _Real_ hard graft.
If you have high volumes of pages which are identical other than one tiny tab of information or a variable price, then maybe streamlining your architecture by removing pages is the answer. If most of the pages are unique in function (e.g. factually different products, not just parameter-based URL variants), then it's more a comment on the lack of invested effort, and you must tackle your mindset if you want to rank.
N.B: By effort I don't mean your personal effort. I could also be alluding to the fact that budget was too low when producing content. I'm describing the site - not you personally!
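On the parameter-variant case above: rather than deleting those URLs, they are usually consolidated with a rel=canonical pointing at one version. A minimal sketch of the underlying normalization, assuming the variants differ only by query string or fragment (the URLs are invented):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url):
    """Strip the query string and fragment so parameter-based variants
    (sort orders, tracking tags, etc.) collapse to one canonical URL."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

# Hypothetical variant URLs that should all declare one canonical page.
variants = [
    "https://example.com/products/widget?sort=price",
    "https://example.com/products/widget?utm_source=mail",
    "https://example.com/products/widget",
]
print({canonical_url(u) for u in variants})
```

The resulting URL is what you'd emit in each variant's `<link rel="canonical" href="...">` tag; real sites often need to keep some meaningful parameters, so treat the blanket strip as an assumption.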