Is this the best way to get rid of low quality content?
-
Hi there, after getting hit by the Panda bear (30% loss in traffic) I've been researching ways to get rid of low quality content. From what I could find, the best advice seemed to be a recommendation to use Google Analytics to find your worst performing pages (go to Traffic Sources - Google organic - view by landing page). Any page that hasn't been viewed more than 100 times in 18 months should be a candidate for deletion. Using this report we identified over 3,000 low quality pages out of more than 5,000, which I've begun exporting to Excel for further examination.
However, starting with the worst pages (according to Analytics) I'm noticing some of our most popular pages are showing up here. For example: /countries/Panama is showing up as zero views, but the correct version (with the trailing slash), /countries/Panama/, is showing up as having over 600 views. I'm not sure how Google even found the former version of the link, but I'm even less sure how to proceed now (the webmaster was going to put a nofollow on any crap pages, but this is now making him nervous about the whole process).
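One way to sanity-check the export before flagging anything for deletion is to normalize the paths so the slash and slash-less variants of each URL are counted as one page. A minimal sketch in Python, assuming the GA export gives you (landing page, pageviews) pairs; the function name and sample paths here are hypothetical:

```python
from collections import defaultdict

def merge_slash_variants(rows):
    """Aggregate pageviews so '/x' and '/x/' count as one page."""
    totals = defaultdict(int)
    for path, views in rows:
        # Normalize every path to the trailing-slash form before summing.
        key = path if path.endswith("/") else path + "/"
        totals[key] += int(views)
    return dict(totals)

# Rows as they might appear in the exported landing-page report:
sample = [
    ("/countries/Panama", 0),
    ("/countries/Panama/", 600),
    ("/countries/Chile/", 42),
]
print(merge_slash_variants(sample))
# {'/countries/Panama/': 600, '/countries/Chile/': 42}
```

Run against the full export, this would keep a genuinely popular page from landing on the deletion list just because its traffic is split across two URL variants.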
Some advice on how to proceed from here would be fantastico and danke
-
Hi! I've asked another associate with more Panda experience than I have to come in and comment on this question.
Byork, knowing a little more about your trailing slash issue could help out. Do you have trailing slash redirects in place for all of your pages? Were they put in at a particular time, where you might be able to look at data from just after that date?
If the trailing slashes are in place correctly and always have been, and it's just some weird artifact of GA that is causing these pages to show up with 0 visits, can you ignore those pages that don't have the trailing slash and focus just on the metrics for those with the trailing slash?
-
rel=canonical is more for when there are parameters on your URLs that you can't really do anything about. When you know one URL is being served but should be another, you should use a 301 redirect. So in this case, you should pick which URL you like better, either with or without the trailing slash, and redirect one to the other. Google treats these as two completely separate pages, which is why you're seeing views on one and not the other. If you can't configure the redirect, then you could resort to rel=canonical.
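In practice this redirect is usually a single rewrite rule in Apache or nginx rather than application code, but the decision it makes can be sketched in a few lines of Python (the paths here are hypothetical):

```python
def resolve(path, canonical_paths):
    """Return (status, location): 301 the slash-less variant of a
    page to its canonical trailing-slash URL, 404 everything else."""
    if path in canonical_paths:
        return (200, path)
    with_slash = path + "/"
    if not path.endswith("/") and with_slash in canonical_paths:
        return (301, with_slash)
    return (404, None)

canonical = {"/countries/Panama/", "/countries/Chile/"}
print(resolve("/countries/Panama", canonical))   # (301, '/countries/Panama/')
print(resolve("/countries/Panama/", canonical))  # (200, '/countries/Panama/')
```

Once every slash-less request answers with a 301 like this, Google should fold the two variants into one page and the split metrics problem goes away on its own.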
If you have pages with similar content but not a lot of views, then 301 redirecting each of those pages to another page with more views would be fine. That'll pass its PageRank along, and it's good for people who find the original URL later, because they'll land on an actual page instead of your 404 page.
-
Great question.
I'd appreciate a pro SEO opinion on this, but here's what I am doing on our website.
To rel=canonical or 301? That is the question for /countries/Panama vs. /countries/Panama/ and the other examples like that.
On the other pages, what about moving the best part of the content from a low view page to a similar content higher view page and then 301 the old page to the better page?
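If you go the consolidation route, it helps to keep the old-to-new mapping in one place and generate the server rules from it. A small sketch that emits Apache mod_alias `Redirect 301` lines; the URLs in the map are hypothetical examples, not the poster's real pages:

```python
# Hypothetical map: low-view pages -> the stronger page absorbing their content.
redirect_map = {
    "/countries/old-panama-guide": "/countries/Panama/",
    "/countries/panama-tips": "/countries/Panama/",
}

def to_htaccess(mapping):
    """Emit one 'Redirect 301 old new' line per consolidated page."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())
    )

print(to_htaccess(redirect_map))
# Redirect 301 /countries/old-panama-guide /countries/Panama/
# Redirect 301 /countries/panama-tips /countries/Panama/
```

Keeping the map in version control also gives you a record of what was merged where, which is handy if a consolidated page needs to be split back out later.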