Taking a good number of existing landing pages offline because of low traffic, cannibalization, and thin content
-
Hello Guys,
I decided to take about 20% of my existing landing pages offline (about 50 of 250, launched about 8 months ago).
Reasons are:
-
These pages received no organic traffic at all in these 8 months
-
Often really similar landing pages exist (just minor keyword targeting differences, and I would call the content "thin")
-
Moreover, I had some Panda issues in October. Basically I ranked with multiple landing pages for the same keyword in the top ten, and in October many of these pages dropped out of the top 50. I also noticed that for some keywords, when one landing page dropped out of the top 50, another landing page climbed from 50 into the top 10 in the same week; the next week the new landing page dropped to 30, the week after that out of the top 50, and the old landing page came back to the top 20, but not the top ten... This all happened in October. Did anyone observe such things as well?
These are the reasons why I came to the conclusion to take these pages offline and integrate some of the good content into the other, similar pages, to target more broadly with one page instead of two. I hope my remaining landing pages will benefit from this. I hope you all agree?
Now to the real question:
Should I redirect all the pages I take offline? Basically they send no traffic at all, and none of them should have external links, so I would not give away any link juice. Or should I just remove the URLs in Google Webmaster Tools and then take them offline? Like I said, the pages are basically dead, and personally I see no reason for these 50 redirects.
Cheers,
Heiko
-
-
If you remove a URL and allow it to 404, you can either remove it in GWT as well or wait for them to update it. I would remove it in GWT as well, just to be sure.
There is no difference whether you keep the files on the server or not, unless the redirect comes down someday for a while (even for an hour), which could result in all of those pages being reindexed. Another potential issue is if the site is available on another domain or sub-domain that points to the same folder, in which case your redirects might not work on that other domain, depending on how they were written.
For these reasons, I would go ahead and remove the files from the server just to be safe. You can back them up locally, or somewhere on the server outside the public HTML folder.
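After removing the files, it's worth sanity-checking that the old URLs really do return 404 and that no stray redirects were left behind. A minimal sketch, assuming you export (url, status) pairs from whatever crawler you use — the URLs and statuses here are made up:

```python
# Hypothetical crawl export of supposedly removed URLs -- replace with real data.
crawl_results = [
    ("https://example.com/lp-removed-1", 404),
    ("https://example.com/lp-removed-2", 301),  # stray redirect left behind
    ("https://example.com/lp-removed-3", 200),  # file still on the server!
]

def removal_problems(results):
    """Flag any supposedly removed URL that does not return 404 or 410."""
    return [(url, status) for url, status in results
            if status not in (404, 410)]

for url, status in removal_problems(crawl_results):
    print(f"still live or redirecting ({status}): {url}")
```

Anything this flags either still serves content or redirects, which is exactly the situation described above where pages can get reindexed.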
-
Thanks Everett for your response; the changes are in progress and I will implement them this week. But it would be even better to also remove the non-redirected URLs in Webmaster Tools, right?
Technical question to the redirected URLs: Is there any difference if I leave the redirected webpages on the server or if I delete them?
-
I've done this many times with good results. If the page has no traffic and no external links just remove it, and allow it to 404 so the URLs get removed from the index. If the page has traffic and/or external links, 301 redirect it to the most appropriate page about the topic. In either case remove/update internal links, including those within sitemaps.
Simple as that.
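The rule of thumb above can be sketched as a small triage script. This is only an illustration, assuming a per-page export from your analytics and a link tool; the dict field names and redirect targets are my own invention, not any standard format:

```python
def disposition(page):
    """No traffic and no external links -> just let it 404;
    otherwise 301 it to the most relevant remaining page."""
    if page["visits"] == 0 and page["external_links"] == 0:
        return ("404", None)
    return ("301", page.get("redirect_target", "/most-relevant-page"))

# Hypothetical export combining analytics visits and external link counts.
pages = [
    {"url": "/lp-dead", "visits": 0, "external_links": 0},
    {"url": "/lp-linked", "visits": 0, "external_links": 3,
     "redirect_target": "/consolidated-topic-page"},
]

for p in pages:
    action, target = disposition(p)
    print(p["url"], action, target or "(let it 404)")
```

The output is just a worksheet: the 404 rows are pages to delete outright, the 301 rows are the redirect map to implement on the server.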
-
It all makes sense.
-
-
Well, yes, I expect that the other pages will benefit from it, because I can basically carry over the good content to the similar pages. Moreover, I can point more internal links at the pages that are actually ranking and generating traffic. Of course, I could just remove all internal links to the dead pages, but I see no sense in their existence any more.
-
I know that you don't get a penalty for duplicate content. But I think it makes more sense to have one (improved) page for a topic/keyword than two pages, one of which is basically dead from a traffic perspective. In their whole structure the pages are just too similar apart from the "content", and even if this cannot trigger manual actions, it can lead to Panda/Hummingbird issues you will never recognize.
-
Yeah, this action has nothing to do with the dead pages, you are right; I just wanted to mention it because I interpreted it as Google testing similar pages against each other on performance, which can lead to long-term decreases. That was just another reason for me to merge similar pages and think more in terms of "topical hubs". I am talking about really similar pages, e.g. three-word keyword phrases where only the last word differs and the content is unique but basically tells the user the same thing as the other page...
-
-
Question. If the fluctuations were due to the different pages competing with each other, shouldn't you see the different pages exchange places, one goes up, the other far down, then swap places and keep dancing?
-
Yes, makes sense. It's also what the people at Koozai describe in the link Sheena posted.
Yet my personal SEO religion so far has dictated that I never remove pages; every time I asked myself whether I should, I came to the conclusion it was better not to.
Let me re-check your motivation to do so:
- These pages received no organic traffic at all in these 8 months
That's horrible, but will removing them improve anything else? Maybe, or maybe not. You can only find out by trying it (testing).
- Often really similar landing pages exist (just minor keyword targeting differences, and I would call the content "thin")
If you are worried about a duplicate content penalty, there's no such thing: Google doesn't penalize duplicate content, it just makes a choice, picking one among the duplicate pages to rank. Matt Cutts on that here.
If you have multiple landing pages for similar keywords with thin content, improve the content. You can find authoritative voices advocating multiple, interlinked landing pages for related keywords as a perfectly white-hat LSI SEO strategy.
- Moreover I had some Panda Issues in Oct, basically I ranked with multiple landing pages for the same keyword in the top ten and in Oct many of these pages dropped out of the top 50..
I doubt your algorithmic penalization is due to those zero-traffic landing pages mentioned above; remove them and see what happens, but I bet it won't change anything. Instead, I would look honestly at the whole website and ask myself: what spammy, keyword-stuffing, nasty dirty little things did I do in the past?
-
Yes, I checked; these pages have no external backlinks and only receive link juice through internal linking. As I will change the internal linking, and the pages I take down will not get any more internal links, this shouldn't make any difference...
I just want to avoid any redirect that is not necessary, to make sure that only pages that have a relevant similar page get a redirect. Makes sense, right?
-
Have you checked with OSE and other tools to see the page juice/authority they may have?
-
Thanks for your opinions!
There are no manual actions against the pages, so I shouldn't have to worry about that! Like I said, most of them are generating no traffic at all (for those I cannot see a good reason to redirect rather than just delete them from the index and take them down), and some URLs are just competing against each other with quite high ranking fluctuations, so I want to merge these competing pages.
I guess I will redirect the pages that still have relevant similar pages left, but not redirect pages that basically had no traffic at all in 8 months and for which no really similar page exists.
-
This article is about removing blog posts, but I think it's still relevant: http://www.koozai.com/blog/search-marketing/deleted-900-blog-posts-happened-next/
The 'removals/redirects' & 'lessons learnt' sections are particularly important to consider.
-
It's possible, but it sounds like the ranking fluctuations are likely from multiple URLs competing for the same search queries ("Often really similar landing pages exist - just minor keyword targeting difference and I would call it "thin" content") rather than poor link profiles. He didn't mention any manual penalties either.
I agree that you would not want all 50 URLs redirecting to one or even just a few URLs. Only redirect the ones that are really related to the content of the remaining pages and let the rest drop off. Also make sure you have a killer 404 page that helps users get to the right pages.
-
I'm not so sure.
Common sense tells me that pages without any Page Authority, or those that may have been penalised (or indeed not indexed) for having spammy, thin content, etc., will only pass these **negative** signals on through a 301 redirect?
Also, surely if there are as many as 250 potential landing pages all redirecting (maybe even to one single URL), it'd raise alarm bells for a crawler?
-
What you're really doing is consolidating 'orphan SEO pages' into fewer, higher-value pages - which is a specific example Google provides as a "good reason to redirect one URL to another." I would 301 the pages to their most relevant, consolidated landing pages that remain.
Hope this helps!
-
Why not redirect? If you don't, you will keep seeing them as errors in WMT, which is not a good thing. Returning 410 is in theory an option too, but I tried it in the past and WMT ignores it.
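If you do want to experiment with 410 alongside the 301s, the status choice itself is simple to wire up. A minimal sketch of the decision logic; the paths, the retired set, and the redirect map are all made up for illustration:

```python
from http import HTTPStatus

# Hypothetical data: pages retired outright vs pages consolidated elsewhere.
RETIRED_PATHS = {"/lp-old-1", "/lp-old-2"}
REDIRECT_MAP = {"/lp-merged": "/consolidated-topic-page"}

def status_for(path):
    """Pick a status for a request path: 301 when we have a consolidation
    target, 410 when the page is deliberately gone, 200 otherwise."""
    if path in REDIRECT_MAP:
        return HTTPStatus.MOVED_PERMANENTLY
    if path in RETIRED_PATHS:
        return HTTPStatus.GONE
    return HTTPStatus.OK

for path in ("/lp-old-1", "/lp-merged", "/keep-me"):
    print(path, int(status_for(path)))
```

410 ("Gone") signals an intentional removal, whereas 404 just says "not found"; as noted above, though, how Webmaster Tools actually treats the two may not differ in practice.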