Taking a large share of existing landing pages offline because of low traffic, cannibalization and thin content
-
Hello Guys,
I have decided to take about 20% of my existing landing pages offline (roughly 50 of 250, which were launched about 8 months ago).
Reasons are:
-
These pages received no organic traffic at all in these 8 months
-
Often really similar landing pages exist (just minor keyword targeting differences, and I would call it "thin" content)
-
Moreover, I had some Panda issues in October. Basically, I ranked with multiple landing pages for the same keyword in the top ten, and in October many of these pages dropped out of the top 50. I also noticed that for some keywords, when one landing page dropped out of the top 50, another landing page climbed from 50 into the top 10 in the same week; the next week the new landing page dropped to 30, the week after that out of the top 50, and the old landing page came back to the top 20, but not the top ten. This all happened in October. Did anyone else observe something like this?
Those are the reasons why I came to the conclusion to take these pages offline and integrate some of the good content into the other similar pages, targeting more broadly with one page instead of two. I hope my remaining landing pages will benefit from this. I hope you all agree?
Now to the real question:
Should I redirect all the pages I take offline? They send basically no traffic at all, and none of them should have external links, so I would not be giving away any link juice. Or should I just remove the URLs in Google Webmaster Tools and then take the pages offline? Like I said, the pages are basically dead, and personally I see no reason for these 50 redirects.
Cheers,
Heiko
-
-
If you remove a URL and allow it to 404, you can either remove it in GWT as well or wait for Google to drop it. I would remove it in GWT as well just to be sure.
There is no difference whether you keep the files on the server or not, unless the redirect comes down someday for a while (even for an hour), which could result in all of those pages being reindexed. Another potential issue is if the site is available on another domain or subdomain pointing to the same folder, in which case your redirects might not work on that domain, depending on how they were written.
For these reasons, I would go ahead and remove the files from the server just to be safe. You can back them up locally, or somewhere on the server outside the public HTML folder.
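A minimal sketch of that backup step in Python, assuming the retired pages live in a `landing/` folder under the site root (the paths and the function name are hypothetical; adjust to your host):

```python
# Archive the retired landing-page folder before deleting it from the
# live site, so the pages can be restored if the consolidation backfires.
import shutil
from pathlib import Path

def archive_landing_pages(site_root, archive_name="retired-landing-pages"):
    """Zip the landing-page folder and return the archive's path."""
    src = Path(site_root) / "landing"
    # make_archive appends ".zip" to archive_name automatically
    return shutil.make_archive(archive_name, "zip", root_dir=str(src))
```

Only delete the live folder once the archive has been verified.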
-
Thanks, Everett, for your response; the changes are in process and I will implement them this week. But it would be even better to also remove the non-redirected URLs in Webmaster Tools, right?
A technical question about the redirected URLs: is there any difference between leaving the redirected pages on the server and deleting them?
-
I've done this many times with good results. If a page has no traffic and no external links, just remove it and allow it to 404 so the URL gets removed from the index. If a page has traffic and/or external links, 301 redirect it to the most appropriate page on the topic. In either case, remove or update internal links, including those in sitemaps.
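That rule can be summarized in a few lines of Python; the inputs (per-URL organic sessions and external backlink counts) are my own hypothetical field names, assuming you have exported them from analytics and a link tool:

```python
# Decide how to retire a landing page: dead pages are left to 404,
# pages with traffic or links are 301-redirected to the closest match.
def retirement_action(organic_sessions, external_links, best_remaining_url=None):
    if organic_sessions == 0 and external_links == 0:
        return ("404", None)  # let the URL drop out of the index
    return ("301", best_remaining_url)  # preserve traffic/links via redirect
```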
Simple as that.
-
It all makes sense.
-
-
Well, yes, I expect that the other pages will benefit from it, because I can basically carry the good content over to the similar pages. Moreover, I can point more internal links at the pages that are actually ranking and generating traffic. Of course, I could just remove all internal links to the dead pages, but I no longer see any sense in their existence.
-
I know that you don't get a penalty for duplicate content. But I think it makes more sense to have one (improved) page per topic/keyword than to have two pages, one of which is basically dead from a traffic perspective. Apart from the "content", the pages' whole structure is just too similar, and even if this cannot trigger manual actions, it can lead to Panda/Hummingbird issues you will never recognize.
-
Yeah, this action has nothing to do with the dead pages, you are right. I just wanted to mention it because I interpreted it as Google testing the performance of similar pages, which can lead to long-term decreases. For me that was just another reason to merge similar pages and think more in terms of "topical hubs". I'm talking about really similar pages, e.g. three-word keywords where only the last word differs and the content is unique but basically tells the user the same thing as the other page...
-
-
A question: if the fluctuations were due to the different pages competing with each other, shouldn't you see the pages exchange places, one going up, the other dropping far down, then swapping and continuing to dance?
-
Yes, that makes sense. It's also what the people at Koozai describe in the link Sheena posted.
Yet my personal SEO religion has so far dictated that I never remove pages; every time I asked myself whether I should, I came to the conclusion that it was better not to.
Let me re-examine your motivations for doing so:
- These pages received no organic traffic at all in these 8 months
That's horrible, but will removing them improve something else? Maybe, or maybe not. You can only find out by trying (testing).
- Often really similar landing pages exist (just minor keyword targeting differences, and I would call it "thin" content)
If you are worried about a duplicate content penalty, there is no such thing; Google does not penalize duplicate content, it just makes a choice, picking one among the duplicate pages to rank. Matt Cutts on that here.
If you have multiple landing pages for similar keywords with thin content, improve the content. You can find authoritative voices advocating multiple landing pages for related keywords, interlinked, as a perfectly white-hat LSI SEO strategy.
- Moreover I had some Panda issues in Oct; basically I ranked with multiple landing pages for the same keyword in the top ten, and in Oct many of these pages dropped out of the top 50.
I doubt your algorithmic penalization is due to those zero-traffic landing pages mentioned above. Remove them and see what happens, but I bet it won't change anything. Instead, I would look honestly at the whole website and ask myself: what spammy, keyword-stuffing, nasty dirty little things did I do in the past?
-
Yes, I checked: these pages don't have external backlinks and receive link juice only through internal linking. As I will change the internal linking so that the pages I take down no longer get any internal links, this shouldn't make any difference...
I just want to avoid any redirect that isn't necessary, to really make sure that only pages with a relevant similar counterpart get a redirect. Makes sense, right?
-
Have you checked with OSE and other tools to see what page juice/authority they may have?
-
Thanks for your opinions!
There are no manual actions against the pages, so I shouldn't worry about that. Like I said, most of them generate no traffic at all (for those I cannot see a good reason to redirect rather than simply delete them from the index and take them down), and some URLs are just competing against each other with quite high ranking fluctuations, which is why I want to merge these competing pages.
I guess I will redirect the pages that still have a relevant similar counterpart left, but not redirect pages that had basically no traffic at all in 8 months and have no real similar page.
-
This article is about removing blog posts, but I think it's still relevant: http://www.koozai.com/blog/search-marketing/deleted-900-blog-posts-happened-next/
The 'removals/redirects' & 'lessons learnt' sections are particularly important to consider.
-
It's possible, but it sounds like the ranking fluctuations are likely from multiple URLs competing for the same search queries ("Often really similar landing pages exist - just minor keyword targeting difference and I would call it 'thin' content") rather than from poor link profiles. He didn't mention any manual penalties either.
I agree that you would not want all 50 URLs redirecting to one or even just a few URLs. Only redirect the ones that are really related to the content of the remaining pages and let the rest drop off. Also make sure you have a killer 404 page that helps users get to the right pages.
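For the "killer 404 page", a single `.htaccess` line is usually enough on Apache; the filename here is a hypothetical example:

```apache
# Serve a custom, helpful 404 page instead of the server default
ErrorDocument 404 /helpful-404.html
```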
-
I'm not so sure.
Common sense tells me that pages without any Page Authority, or those that may have been penalised (or indeed not indexed) for having spammy, thin content, etc., will only pass these **negative** signals on through a 301 redirect.
Also, surely if there are as many as 250 potential landing pages all redirecting (maybe even to one single URL), it would raise alarm bells for a crawler?
-
What you're really doing is consolidating 'orphan SEO pages' into fewer, higher-value pages, which is a specific example Google provides as a "good reason to redirect one URL to another." I would 301 the pages to their most relevant, consolidated landing pages that remain.
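On Apache, such page-by-page 301s could look like this in `.htaccess` (the URLs are made-up examples, not the actual pages in question):

```apache
# One 301 per retired page, each pointing to its consolidated counterpart
Redirect 301 /landing/internal-decorating-commercial/ /landing/internal-decorating/
Redirect 301 /landing/exterior-painting-cheap/ /landing/exterior-painting/
```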
Hope this helps!
-
Why not redirect? If you don't, you will keep seeing them as errors in WMT, which is not a good thing. Returning 410 is in theory also an option, but I tried it in the past and WMT ignored it.
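For reference, returning a 410 on Apache is a one-liner in `.htaccess` (the URL is hypothetical):

```apache
# Tell crawlers the page is gone for good (HTTP 410)
Redirect gone /landing/dead-page/
```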