Is this the best way to get rid of low quality content?
-
Hi there, after getting hit by the Panda bear (30% loss in traffic) I've been researching ways to get rid of low-quality content. From what I could find, the best advice seemed to be to use Google Analytics to find your worst-performing pages (go to Traffic Sources > Google Organic > view by landing page). Any page that hasn't been viewed more than 100 times in 18 months should be a candidate for deletion. Out of over 5,000 pages, this report identified over 3,000 low-quality pages, which I've begun exporting to Excel for further examination.
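To make that "under 100 views in 18 months" cut reproducible before anything gets deleted, the filtering step over the GA export can be sketched in a few lines of Python. This is a minimal sketch: the column names and sample rows are assumptions standing in for the real report, so adjust them to match your actual export.

```python
import csv
import io

# Hypothetical GA landing-page export; headers and rows are illustrative only.
ga_export = io.StringIO(
    "Landing Page,Pageviews\n"
    "/countries/Panama/,600\n"
    "/countries/Panama,0\n"
    "/widgets/blue-widget,42\n"
    "/about/,15000\n"
)

THRESHOLD = 100  # views over the 18-month window, per the advice above

low_traffic = [
    row["Landing Page"]
    for row in csv.DictReader(ga_export)
    if int(row["Pageviews"]) < THRESHOLD
]

print(low_traffic)  # candidates for manual review, NOT automatic deletion
```

Note that this naive version flags `/countries/Panama` even though its slashed twin is popular, which is exactly the problem described below.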
However, starting with the worst pages (according to Analytics), I'm noticing some of our most popular pages showing up here. For example, /countries/Panama shows zero views, but the correct version (with the trailing slash), /countries/Panama/, shows over 600 views. I'm not sure how Google even found the former version of the link, and I'm even less sure how to proceed now (the webmaster was going to put a nofollow on any crap pages, but this is making him nervous about the whole process).
Some advice on how to proceed from here would be fantastico, and danke!
-
Hi! I've asked another associate, who has more Panda experience than I do, to come in and comment on this question.
Byork, knowing a little more about your trailing-slash issue would help. Do you have trailing-slash redirects in place for all of your pages? Were they put in at a particular time, so that you could look at data from just after that date?
If the trailing slashes are in place correctly and always have been, and it's just some odd GA artifact causing these pages to show up with 0 visits, can you ignore the pages that lack the trailing slash and focus on the metrics for those that have it?
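One way to act on this suggestion is to merge the slash and non-slash variants before judging any page, so a URL like /countries/Panama doesn't look dead when /countries/Panama/ is carrying the traffic. A minimal sketch, with made-up rows standing in for the GA export:

```python
from collections import Counter

# Hypothetical (url, views) rows as they might come out of a GA export.
rows = [
    ("/countries/Panama", 0),
    ("/countries/Panama/", 600),
    ("/countries/Chile/", 80),
]

def normalize(url: str) -> str:
    # Treat /foo and /foo/ as the same page by enforcing a trailing slash.
    return url if url.endswith("/") else url + "/"

views = Counter()
for url, count in rows:
    views[normalize(url)] += count

print(views["/countries/Panama/"])  # 600 - the slashless duplicate no longer looks "dead"
```

Only after aggregating like this would a page's low view count mean anything for the deletion decision.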
-
rel=canonical is more for when there are parameters on your URLs that you can't really do anything about. When you know one URL is being served but another should be, use a 301 redirect. So in this case, pick which URL you prefer, with or without the trailing slash, and redirect one to the other. Google treats these as two completely separate pages, which is why you're seeing views on one and not the other. If you can't configure the redirect, then you can fall back to rel=canonical.
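For the trailing-slash case, this 301 can usually be handled server-wide rather than page by page. A hedged sketch for Apache with mod_rewrite (the exact rule is an assumption about your setup; test it on a staging copy, and adapt it if you run nginx or something else):

```apache
RewriteEngine On
# Skip real files (e.g. /logo.png), then 301 anything missing a trailing
# slash to the slashed version: /countries/Panama -> /countries/Panama/
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```

If the redirect genuinely can't be configured, the fallback described above is a `<link rel="canonical" href="https://www.example.com/countries/Panama/">` tag in the head of the page, pointing at whichever variant you chose.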
If you have pages with similar content but not many views, then 301-redirecting each of those pages to a stronger page with more views would be fine. That passes its PageRank along, and it's better for people who find the original URL later, because they'll land on an actual page instead of your 404 page.
-
Great question.
I'd appreciate a pro SEO's opinion on this, but here's what I'm doing on our website.
To rel=canonical or to 301? That is the question for /countries/Panama versus /countries/Panama/ and other examples like it.
For the other pages, what about moving the best content from a low-view page to a similar, higher-view page and then 301-redirecting the old page to the better one?
Related Questions
-
Does changing text content on a site affect SEO?
Hi, I have changed some H1s and H2s, changed and added paragraphs, fixed plagiarism and grammar, and added some pics with alt text. I just did this today, and I'm currently ranking on the second page. Question 1: will this affect my two months of SEO effort? Question 2: do I have to submit the sitemap to Google again? Question 3: does changing content on the site frequently hurt SEO?
Algorithm Updates | Sam09schulz
-
Page content is not very similar but the topic is the same: will Google consider the rel=canonical tags?
Hi Moz community, We have multiple pages on our different sub-domains covering the same topics. These pages even rank in the SERPs for related keywords. Now we are planning to show only one of the pages in the SERPs. Unfortunately, we cannot redirect, so we are planning to use rel=canonical tags. But the page content is not the same: only about 20% is similar and 80% is different, though the context is the same. If we use rel=canonical, will Google accept this? If not, what should I do? Would making the header tags similar work? How does Google respond if the content doesn't match: does it just ignore the tag, or is there a negative effect? Thanks
Algorithm Updates | vtmoz
-
Meta robots on every page rather than robots.txt for blocking crawlers? How will pages get indexed if we block crawlers?
Hi all, The suggestion to use the meta robots tag rather than the robots.txt file is to make sure pages do not get indexed if hyperlinks to them are available anywhere on the internet. I don't understand how the pages would be indexed if the entire site is blocked. Even though links to the pages are available, will Google really index them? One of our sites has been blocked by its robots.txt file, yet internal links have been available on the internet for years and the pages have not been indexed. So technically the robots.txt file is quite enough, right? Please clarify and guide me if I'm wrong. Thanks
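The distinction being asked about can be shown concretely. A robots.txt `Disallow` blocks crawling, but a blocked URL can still end up in the index (shown without a title or snippet) if enough external links point at it, because Google knows the URL exists without ever fetching it. A meta `noindex` reliably removes a page, but only if crawlers are allowed to fetch the page and see the tag. A minimal robots.txt fragment (the path is made up for illustration):

```
# robots.txt: stops compliant crawlers from fetching these pages, but
# does NOT guarantee the URLs stay out of the index if other sites
# link to them.
User-agent: *
Disallow: /private/
```

To actually deindex a page, leave it crawlable and put `<meta name="robots" content="noindex">` in its `<head>`; combining that tag with a robots.txt block is self-defeating, because the crawler can never fetch the page to read the tag.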
Algorithm Updates | vtmoz
-
If we have all products on-site for indexing, do we get dinged by Google for not transacting on-site?
I am trying to do research on the SEO impact of having an off-site transactional website. For example, Pepsi.com lists all product information on their site but guides visitors to transact on Amazon or Walmart. What impact, if any, does guiding the customer to a separate transactional site have on SEO? In short, if we have all products on-site for indexing, do we get dinged by Google for not transacting on-site?
Algorithm Updates | KaylaV
-
Best way to geotag
Hi guys, this question isn't strictly SEO; it's more of a programming one. I am currently building a directory with a number of different retail shops listed in it. At the moment I have a Google map installed, but I have to drag and drop the pin rather than Google searching the address I input and placing it automatically. Can anyone point me to documentation on how to get this to work properly? Whether or not I can get the above to work, could someone tell me how to use the Google dropped pin to get the coordinates and correctly add them to the page header? Finally, I want to geotag all images on the page with the same coordinates as the dropped pin (as the pictures are taken on the premises). Can anyone recommend software that might be able to do this en masse, automatically? Thanks in advance, Alex
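For the image-geotagging part of this question: EXIF GPS tags store latitude and longitude as degrees/minutes/seconds rationals, with the hemisphere kept in separate `GPSLatitudeRef`/`GPSLongitudeRef` tags, so any batch-tagging tool needs this conversion somewhere. A minimal sketch of just the math in Python (the sample coordinate is an illustration; actually writing the values into image files would need an EXIF library on top of this):

```python
from fractions import Fraction

def deg_to_dms_rationals(decimal_deg: float):
    """Convert a decimal coordinate to the (degrees, minutes, seconds)
    rational triple that EXIF GPS tags expect. Hemisphere (N/S, E/W)
    is handled separately, so only the magnitude is converted here."""
    value = abs(decimal_deg)
    degrees = int(value)
    minutes_full = (value - degrees) * 60
    minutes = int(minutes_full)
    seconds = round((minutes_full - minutes) * 60, 4)
    return (
        Fraction(degrees, 1),
        Fraction(minutes, 1),
        Fraction(seconds).limit_denominator(10000),
    )

# Roughly 8.9824 N (an illustrative latitude near Panama City).
print(deg_to_dms_rationals(8.9824))
```

The same triple, plus the reference letter, is what geocoding a dropped pin ultimately has to produce for each photo.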
Algorithm Updates | socialgrowth
-
Frequency & Percentage of Content Change to get Google to Cache Every Day?
What is the frequency at which your homepage (for example) would have to update, and what percentage of the page's content would need to change, to get it cached every day? What are your opinions on other factors?
Algorithm Updates | bozzie311
-
Does anyone know what it takes to get your Google Plus statuses to show up under the Knowledge Graph?
I've been looking into G+ and how to get information and status updates into the Knowledge Graph for small companies, and have not been able to. Does anyone know exactly how to do it?
Algorithm Updates | DragonSearch
-
What is the best way for a local business site to come up in the SERPs for a town that they are not located in?
At our agency, we work with many local small business owners who often want to come up in multiple towns near their business where they do not have a physical address. We explain to them again and again that with the recent changes Google in particular has made to its algorithms, it is very difficult to come up in the new "blended" organic and Places results for a town where you don't have a physical address. However, many of these towns are within 2 or 3 miles of the physical location and well within driving distance for potential new clients. Google, in its infinite wisdom, doesn't seem to account for areas of the country, such as New Jersey, where these limitations can seriously affect a business' bottom line. What we would like to know is: what are other SEOs doing to help their clients come up in neighboring towns that is both organic and white hat?
Algorithm Updates | Mike-i