Taking a good number of existing landing pages offline because of low traffic, cannibalization and thin content
-
Hello Guys,
I have decided to take about 20% of my existing landing pages offline (about 50 of 250, which were launched about 8 months ago).
Reasons are:
-
These pages have sent no organic traffic at all in these 8 months
-
Often really similar landing pages exist (just minor keyword-targeting differences, and I would call the content "thin")
-
Moreover I had some Panda issues in October. Basically I ranked with multiple landing pages for the same keyword in the top ten, and in October many of these pages dropped out of the top 50. I also noticed that for some keywords, when one landing page dropped out of the top 50, another landing page climbed from 50 into the top 10 in the same week; the next week the new landing page dropped to 30, the week after that out of the top 50, and the old landing page came back to the top 20, but not the top ten. This all happened in October. Did anyone observe such things as well?
These are the reasons why I came to the conclusion to take these pages offline and to integrate some of the good content into the other, similar pages, targeting more broadly with one page instead of two. I hope my remaining landing pages will benefit from this. I hope all agree?
Now to the real question:
Should I redirect all the pages I take offline? Basically they send no traffic at all, and none of them should have external links, so I will not give away any link juice. Or should I just remove the URLs in Google Webmaster Tools and then take them offline? Like I said, the pages are basically dead, and personally I see no reason for these 50 redirects.
Cheers,
Heiko
-
-
If you remove a URL and allow it to 404, you can either remove it in GWT as well or wait for them to update it. I would remove it in GWT as well, just to be sure.
It makes no difference whether you keep the files on the server or not, unless the redirect comes down someday for a while (even for an hour), which could result in all of those pages being reindexed. Another potential issue is if you have the site available on another domain or subdomain that points to the same folder, in which case your redirects might not work on the other domain, depending on how they were written.
For these reasons, I would go ahead and remove the files from the server just to be safe. You can back them up locally, or somewhere on the server above the "Public HTML" folder.
-
Thanks, Everett, for your response; the changes are in process and I will implement them this week. But it would be even better to also remove the non-redirected URLs in Webmaster Tools, right?
A technical question about the redirected URLs: does it make any difference whether I leave the redirected pages on the server or delete them?
-
I've done this many times with good results. If a page has no traffic and no external links, just remove it and allow it to 404 so the URL gets removed from the index. If a page has traffic and/or external links, 301 redirect it to the most appropriate page about the topic. In either case, remove or update internal links, including those within sitemaps.
Simple as that.
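For a larger batch of pages, that decision rule can be sketched as a small script. This is only an illustration; the URLs, field names and input data below are made up, not part of any real tool:

```python
# Everett's rule, sketched: 404 pages with no traffic and no external
# links, 301 the rest to the closest remaining page. All field names
# and example URLs here are hypothetical.

def plan_removal(pages):
    """pages: list of dicts with 'url', 'organic_visits', 'external_links',
    and 'closest_match' (most relevant remaining URL, or None).
    Returns {url: (action, redirect_target_or_None)}."""
    plan = {}
    for p in pages:
        if p["organic_visits"] == 0 and p["external_links"] == 0:
            plan[p["url"]] = ("404", None)  # let it drop out of the index
        elif p["closest_match"]:
            plan[p["url"]] = ("301", p["closest_match"])
        else:
            plan[p["url"]] = ("404", None)  # nothing relevant to redirect to
    return plan

pages = [
    {"url": "/lp/red-widgets", "organic_visits": 0, "external_links": 0,
     "closest_match": "/lp/widgets"},
    {"url": "/lp/blue-widgets", "organic_visits": 120, "external_links": 3,
     "closest_match": "/lp/widgets"},
]
print(plan_removal(pages))
```

The output of such a plan can then feed whatever redirect mechanism the site uses (server rewrite rules, a CMS plugin, etc.).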
-
It all makes sense.
-
-
Well, yes, I expect that the other pages will benefit from it, because I can basically carry the good content parts over to the similar pages. Moreover, I can point more internal links at the pages which are actually ranking and generating traffic. Of course, I could just remove all internal links to the dead pages, but I no longer see any sense in their existence.
-
I know that you don't get a penalty for duplicate content. But I think it makes more sense to have one (improved) page for a topic/keyword than to have two pages, one of which is basically dead from a traffic perspective. In their whole structure the pages are just too similar apart from the "content", and even if this cannot trigger manual actions, it can lead to Panda/Hummingbird issues you will never recognize.
-
Yeah, this action has nothing to do with the dead pages, you are right; I just wanted to mention it because I interpreted it as Google testing the performance of similar pages against each other, which can lead to long-term decreases. For me that was just another reason to merge similar pages and think more in "topical hubs". I'm talking about really similar pages, e.g. three-word keywords where just the last word differs and the content is unique but basically tells the user the same thing as the other page...
-
-
Question: if the fluctuations were due to the different pages competing with each other, shouldn't you see the pages exchange places, one going up while the other drops far down, then swapping places and continuing to dance?
-
Yes, that makes sense. It's also what the people at Koozai describe in the link Sheena posted.
Yet my personal SEO religion has so far dictated that I never remove pages; every time I asked myself whether I should, I came to the conclusion it was better not to.
Let me re-examine your motivations for doing so:
- These pages have sent no organic traffic at all in these 8 months
That's horrible, but will removing them improve anything else? Maybe, or maybe not. You can only find out by trying (testing).
- Often really similar landing pages exist (just minor keyword-targeting differences, and I would call the content "thin")
If you are worried about a duplicate content penalty: there is no such thing. Google doesn't penalize duplicate content; it just makes a choice, picking one among the duplicate pages to rank. Matt Cutts on that here.
If you have multiple landing pages for similar keywords with thin content, improve the content. You can find authoritative voices advocating multiple interlinked landing pages for related keywords as a perfectly white-hat LSI SEO strategy.
- Moreover I had some Panda issues in October; basically I ranked with multiple landing pages for the same keyword in the top ten, and in October many of these pages dropped out of the top 50.
I doubt your algorithmic penalization is due to those zero-traffic landing pages mentioned above; remove them and see what happens, but I bet it won't change anything. Instead I would look honestly at the whole website and ask myself: what spammy, keyword-stuffing, nasty dirty little things did I do in the past?
-
Yes, I checked: these pages don't have external backlinks and only receive link juice through internal linking. As I will change the internal linking and the pages I take down will not get any more internal links, this shouldn't make any difference...
I just want to avoid any redirect that is not necessary, and make really sure that only pages which have a relevant similar counterpart get a redirect. Makes sense, right?
-
Have you checked with OSE and other tools to see what page juice/authority they may have?
-
Thanks for your opinions!
There are no manual actions against the pages, so I shouldn't care about that! Like I said, most of them are generating no traffic at all (for those I cannot see a good reason to redirect rather than just delete them from the index and take them down), and some URLs are just competing against each other; the ranking fluctuations are quite high, and therefore I want to merge these competing pages.
I guess I will redirect the pages which still have a relevant similar page left, but not redirect pages which basically had no traffic at all in 8 months and for which no really similar page exists.
-
This article is about removing blog posts, but I think it's still relevant: http://www.koozai.com/blog/search-marketing/deleted-900-blog-posts-happened-next/
The 'removals/redirects' & 'lessons learnt' sections are particularly important to consider.
-
It's possible, but it sounds like the ranking fluctuations are more likely from multiple URLs competing for the same search queries ("Often really similar landing pages exist - just minor keyword targeting difference and I would call it "thin" content") than from poor link profiles. He didn't mention any manual penalties either.
I agree that you would not want all 50 URLs redirecting to one or even just a few URLs. Only redirect the ones that are really related to the content of the remaining pages and let the rest drop off. Also make sure you have a killer 404 page that helps users get to the right pages.
-
I'm not so sure.
Common sense tells me that pages without any Page Authority, or those that may have been penalised (or indeed not indexed) for having spammy, thin content, etc., will only pass these **negative** signals on through a 301 redirect.
Also, surely if there are as many as 250 potential landing pages all redirecting (maybe even to one single URL), it would raise alarm bells for a crawler?
-
What you're really doing is consolidating 'orphan SEO pages' into fewer, higher-value pages, which is a specific example Google provides as a "good reason to redirect one URL to another." I would 301 the pages to their most relevant, consolidated landing pages that remain.
Hope this helps!
-
Why not redirect? If you don't, you will keep seeing them as errors in WMT, which is not a good thing. Returning a 410 is in theory an option, but I tried it in the past and WMT ignores it.
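Whichever mix of 301s, 404s and 410s you settle on, it's worth verifying after launch that each removed URL really returns the status you intended. A rough sketch of such a check; the paths and redirect target are hypothetical, and a tiny local server stands in for the live site:

```python
import threading
import urllib.request
import urllib.error
from http.server import BaseHTTPRequestHandler, HTTPServer

def check_url(url):
    """Fetch url WITHOUT following redirects; return (status, Location or None)."""
    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # surface the 301/302 itself instead of following it
    opener = urllib.request.build_opener(NoRedirect)
    try:
        with opener.open(url, timeout=10) as resp:
            return resp.status, None
    except urllib.error.HTTPError as e:
        return e.code, e.headers.get("Location")

# --- tiny demo server standing in for the real site ---
class Demo(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old-landing-page":
            self.send_response(301)  # consolidated into a remaining page
            self.send_header("Location", "/consolidated-page")
            self.end_headers()
        elif self.path == "/dead-page":
            self.send_response(410)  # "gone", the alternative to 404
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()
    def log_message(self, *args):
        pass  # keep the demo quiet

server = HTTPServer(("127.0.0.1", 0), Demo)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

r1 = check_url(base + "/old-landing-page")
r2 = check_url(base + "/dead-page")
print(r1, r2)  # expect (301, '/consolidated-page') and (410, None)
server.shutdown()
```

Running this against the list of removed URLs after the change goes live would catch redirects that were written for the wrong domain or never deployed.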