Is this the best way to get rid of low quality content?
-
Hi there, after getting hit by the Panda bear (30% loss in traffic) I've been researching ways to get rid of low-quality content. The best advice I could find was to use Google Analytics to identify your worst-performing pages (go to Traffic Sources > Google organic > view by landing page). Any page that hasn't been viewed more than 100 times in 18 months should be a candidate for deletion. Out of over 5,000 pages, this report identified over 3,000 low-quality pages, which I've begun exporting to Excel for further examination.
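For reference, here's a rough sketch of the filter I'm applying to the exported report (Python with pandas; the file name and column names are my assumptions, not necessarily the exact headers GA exports):

```python
# Rough sketch: flag landing pages with fewer than 100 organic visits
# over the 18-month export. Column names are assumed, not the exact
# headers Google Analytics uses in its CSV export.
import pandas as pd

df = pd.read_csv("ga_organic_landing_pages_18_months.csv")

low_traffic = df[df["Visits"] < 100].sort_values("Visits")
print(f"{len(low_traffic)} of {len(df)} pages had fewer than 100 visits")

# Export the candidates for manual review before deleting anything.
low_traffic.to_csv("low_quality_candidates.csv", index=False)
```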
However, starting with the worst pages (according to Analytics), I'm noticing some of our most popular pages showing up here. For example: /countries/Panama is showing up as zero views, but the correct version (with the trailing slash), /countries/Panama/, is showing up as having over 600 views. I'm not sure how Google even found the former version of the link, but I'm even less sure how to proceed now (the webmaster was going to put a nofollow on any crap pages, but this is making him nervous about the whole process).
Some advice on how to proceed from here would be fantastic. Thanks!
-
Hi! I've asked another associate with more Panda experience than I have to come in and comment on this question.
Byork, knowing a little more about your trailing-slash issue would help. Do you have trailing-slash redirects in place for all of your pages? Were they put in at a particular time, such that you could look at data from just after that date?
If the trailing slashes are in place correctly and always have been, and it's just some weird artifact of GA that is causing these pages to show up with 0 visits, can you ignore those pages that don't have the trailing slash and focus just on the metrics for those with the trailing slash?
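One way to sanity-check that would be to fold the slash and no-slash variants of each URL together before judging traffic. Here's a rough sketch in Python with pandas, assuming you export the landing-page report to CSV (the file name and column names below are guesses, not the exact GA headers):

```python
# Sketch: treat /countries/Panama and /countries/Panama/ as the same page
# by normalizing trailing slashes, then sum the visits per normalized URL.
# Column names are assumptions about the exported GA report.
import pandas as pd

df = pd.read_csv("ga_organic_landing_pages_18_months.csv")

# Normalize: strip any trailing slash, then add one back.
df["Landing Page"] = df["Landing Page"].str.rstrip("/") + "/"
combined = df.groupby("Landing Page", as_index=False)["Visits"].sum()

still_low = combined[combined["Visits"] < 100]
print(f"{len(still_low)} pages remain under 100 visits after combining variants")
```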
-
rel=canonical is more for when there are parameters on your URLs that you can't really do anything about. When you know one URL is being served but it should be another, you should use a 301 redirect. So in this case, pick which URL you like better, either with or without the trailing slash, and redirect one to the other. Google treats these as two completely separate pages, which is why you're seeing views on one and not the other. If you can't configure the redirect, then you could resort to rel=canonical.
If you have pages with similar content but not a lot of views, then 301 redirecting those pages to another page with more views would be fine. That'll pass their PageRank along, and it's good for people who find the original URL later, because they'll land on an actual page instead of your 404 page.
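If it helps, here's a quick way to spot-check a handful of URLs: a minimal sketch in Python using the requests library (the domain below is a placeholder, and the rel=canonical check is just a naive string search, not a full HTML parse):

```python
# Minimal sketch: check whether the no-slash URL 301s to the trailing-slash
# version, and whether the page that is served carries a rel=canonical tag.
# The domain is a placeholder; swap in your own URLs.
import requests

def check_url(url):
    # Don't follow redirects so we can inspect the raw response.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code in (301, 308):
        print(f"{url} -> {resp.headers.get('Location')} (permanent redirect, good)")
    elif resp.status_code in (302, 307):
        print(f"{url} redirects only temporarily ({resp.status_code}); "
              "consider making it a 301")
    else:
        # No redirect: both versions can get indexed as separate pages,
        # so look for a rel=canonical tag as the fallback signal.
        has_canonical = 'rel="canonical"' in resp.text
        print(f"{url} returned {resp.status_code}; "
              f"rel=canonical present: {has_canonical}")

check_url("https://www.example.com/countries/Panama")
check_url("https://www.example.com/countries/Panama/")
```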
-
Great question.
I'd appreciate a pro SEO opinion on this, but here's what I'm doing on our website.
To rel=canonical or to 301? That is the question for /countries/Panama to /countries/Panama/ and the other examples like that.
On the other pages, what about moving the best part of the content from a low-view page to a similar higher-view page, and then 301ing the old page to the better page?
Related Questions
-
Meta robots on every page rather than robots.txt for blocking crawlers? How would pages get indexed if we block crawlers?
Hi all, the suggestion to use a meta robots tag rather than a robots.txt file is to make sure pages do not get indexed if their hyperlinks are available anywhere on the internet. I don't understand how the pages would be indexed if the entire site is blocked. Even if links to the pages are available, will Google really index them? One of our sites has been blocked via robots.txt, while its internal links have been available on the internet for years, and those pages have not been indexed. So technically the robots.txt file is quite enough, right? Please clarify and guide me if I'm wrong. Thanks
Algorithm Updates | vtmoz
-
The evolution of Google's 'Quality' filters - Do thin product pages still need noindex?
I'm hoping that Mozzers can weigh in with any recent experiences with eCommerce SEO. I like to assume (perhaps incorrectly) that Google's 'Quality' filters (formerly known as Panda) have evolved with some intelligence since Panda first launched and started penalising eCommerce sites for having thin product pages. On that basis I'd expect that the filters are now less heavy-handed and recognise that product pages with little or no product description are still a quality user experience for people who want to buy that product. Therefore my question is this: do thin product pages still need noindex, given that more often than not they are a quality search result for those using a product-specific search query? Has anyone experienced a penalty recently (in the last 12 months) on an ecommerce site because of a high number of thin product pages?
Algorithm Updates | QubaSEO
-
How much content is it safe to change?
I have read that it is unsafe to change more than 20% of your site's content in any update. The rationale is that "Changing too much at once can flag your site within the Google algorithm as having something suspicious going on." Is this true? Has anyone had any direct experience of this or something similar?
Algorithm Updates | GrouchyKids
-
What is the best permalink structure for WordPress: /category/postname or /postname?
Hello, I have a question regarding the permalink structure in WordPress. I am trying to figure out the best structure for blog post links. Until now I was using example.com/postname, and I have changed the structure to example.com/category/postname, 301-redirecting the old links to the new ones. I would really appreciate it if you could tell me what is best to do from an SEO point of view. Regards,
Algorithm Updates | anitawapa
-
Duplicate content on a subdomain
I have a domain, www.hairremoval.com, and a subdomain, www.us.hairromoval.com. Both sites have virtually the same content apart from around 8 pages; the subdomain is focused on US customers, so the spellings are different, and it is also hosted in the States. Would this be classed as duplicate content? (The URLs are made up for the question, but the format is correct.)
Algorithm Updates | Nettitude
-
How to get Yahoo visitors to my site
I get great traffic from Google but Yahoo is at about a 20 to 1 ratio on visitors. Is there anything I should do to increase Yahoo traffic? I bought a Yahoo Directory listing about 3 months ago but it did no good. Thanks, Boo
Algorithm Updates | Boodreaux
-
How to get global search results on Google? Also, is it possible to get results based on some other geographic location?
I don't want results based on my geographic location. When I am in India, I don't want local search results; in fact, I want results that are not dependent on my current location. Also, can I change my current location to some other city, and will it affect the results? For example: while I am in London, can my search results be modified as if I were sitting in New York?
Algorithm Updates | EricMoore
-
SEOmoz suddenly reporting duplicate content with no changes???
I am told the crawler has been updated and wanted to know if anyone else is seeing the same thing I am. SEOmoz reports showed many months of no duplicate content problems. As of last week, though, I get a little over a thousand pages reported as duplicate content errors. Checking these pages, I find similar content (which hasn't changed) with keywords that are definitely different. Many of these pages rank well in Google, but SEOmoz is calling them out as duplicate content. Is SEOmoz attempting to closely imitate Google's perspective in this matter, and therefore telling me that I need to seriously change the similar content? Is anyone else seeing something like this?
Algorithm Updates | Corp