Taking a good number of existing landing pages offline because of low traffic, cannibalization and thin content
-
Hello Guys,
I decided to take about 20% of my existing landing pages offline (roughly 50 of 250, which were launched about 8 months ago).
Reasons are:
-
These pages have received no organic traffic at all in these 8 months
-
Often very similar landing pages exist (just minor keyword-targeting differences, and I would call the content "thin")
-
Moreover I had some Panda issues in October; basically I ranked with multiple landing pages for the same keyword in the top ten, and in October many of these pages dropped out of the top 50. I also noticed that for some keywords, when one landing page dropped out of the top 50, another landing page climbed from position 50 into the top 10 in the same week; the next week the new landing page dropped to 30, the week after that it fell out of the top 50, and the old landing page came back into the top 20, but not into the top ten... This all happened in October. Did anyone else observe something like this?
Those are the reasons why I came to the conclusion to take these pages offline and integrate some of the good content into the remaining similar pages, so that one page targets more broadly instead of two. I hope my remaining landing pages will benefit from this. I hope you all agree?
Now to the real question:
Should I redirect all the pages I take offline? Basically they receive no traffic at all, and none of them should have external links, so I will not give away any link juice. Or should I just remove the URLs in Google Webmaster Tools and then take them offline? Like I said, the pages are basically dead, and personally I see no reason for these 50 redirects.
Cheers,
Heiko
-
-
If you remove a URL and allow it to 404, you can either remove it in GWT as well, or wait for Google to update the index. I would remove it in GWT as well, just to be sure.
There is no difference whether you keep the files on the server or not, unless the redirect comes down someday for a while (even for an hour), which could result in all of those pages being reindexed. Another potential issue is if the site is available on another domain or subdomain that points to the same folder, in which case your redirects might not work on that other domain, depending on how they were written.
For these reasons, I would go ahead and remove the files from the server just to be safe. You can back them up locally, or in a directory above the "Public HTML" folder on the server.
-
Thanks, Everett, for your response. The changes are in process and I will implement them this week. But it would be even better to also remove the non-redirected URLs in Webmaster Tools, right?
A technical question about the redirected URLs: is there any difference between leaving the redirected pages on the server and deleting them?
-
I've done this many times with good results. If the page has no traffic and no external links, just remove it and allow it to 404 so the URL gets removed from the index. If the page has traffic and/or external links, 301 redirect it to the most appropriate page on the topic. In either case, remove/update internal links, including those within sitemaps.
Simple as that.
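For anyone who wants to double-check the result afterwards, here's a minimal sketch of an audit script (assuming the `requests` library is available; all URLs and the redirect map below are hypothetical examples, not the poster's actual pages):

```python
# Minimal sketch: audit removed/redirected URLs after the cleanup.
# All URLs below are hypothetical examples.
import requests

# Old URLs that were simply taken down -- these should now return 404 (or 410).
removed = [
    "http://www.example.com/landing-page-a",
    "http://www.example.com/landing-page-b",
]

# Old URLs that were consolidated -- these should 301 to their new target.
redirected = {
    "http://www.example.com/landing-page-c": "http://www.example.com/landing-page-consolidated",
}

for url in removed:
    resp = requests.head(url, allow_redirects=False, timeout=10)
    status = "OK" if resp.status_code in (404, 410) else "CHECK"
    print(f"{status}  {url} -> {resp.status_code}")

for url, target in redirected.items():
    resp = requests.head(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and location == target
    print(f"{'OK' if ok else 'CHECK'}  {url} -> {resp.status_code} {location}")
```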
-
It all makes sense.
-
-
Well, yes, I expect that the other pages will benefit from it, because I can basically carry the good content parts over to the similar pages. Moreover, I can point more internal links at the pages that are actually ranking and generating traffic. Of course, I could just remove all internal links from the dead pages, but I no longer see any point in their existence.
-
I know that you don't get a penalty for duplicate content. But I think it makes more sense to have one (improved) page for a topic/keyword than to have two pages, one of which is basically dead from a traffic perspective. In their overall structure the pages are just too similar apart from the "content", and even if this cannot trigger manual actions, it can lead to Panda/Hummingbird issues you will never recognize.
-
Yeah, this action has nothing to do with the dead pages, you are right; I just wanted to mention it, because I interpreted it as Google testing similar pages on their performance, which can lead to long-term decreases. For me that was just another reason for putting similar pages together and thinking more in terms of "topical hubs". I'm talking about really similar pages, like three-word keyword phrases where only the last word differs and the content is unique but basically tells the user the same thing as the other page...
-
-
Question: if the fluctuations were due to the different pages competing with each other, shouldn't you see the pages exchange places, one going up, the other dropping far down, then swapping places and continuing to dance?
-
Yes, that makes sense. It's also what the people at Koozai describe in the link Sheena posted.
Yet my personal SEO religion so far has dictated that I never remove pages; every time I asked myself whether I should, I came to the conclusion that it was better not to.
Let me re-check your motivation to do so:
- These pages have received no organic traffic at all in these 8 months
That's bad, but is removing them going to improve anything else? Maybe, or maybe not. You can only find out by trying it (testing).
- Often very similar landing pages exist (just minor keyword-targeting differences, and I would call the content "thin")
If you are worried about a duplicate content penalty, there's no such thing: Google doesn't penalize duplicate content, it simply makes a choice, picking one of the duplicate pages to rank. Matt Cutts on that here.
If you have multiple landing pages for similar keywords with thin content, improve the content. You can find authoritative voices advocating multiple interlinked landing pages for related keywords as a perfectly white-hat LSI SEO strategy.
- Moreover I had some Panda issues in October; basically I ranked with multiple landing pages for the same keyword in the top ten, and in October many of these pages dropped out of the top 50.
I doubt your algorithmic penalty is due to those zero-traffic landing pages mentioned above; remove them and see what happens, but I bet it won't change anything. Instead, I would look honestly at the whole website and ask myself: what spammy, keyword-stuffing, nasty dirty little things did I do in the past?
-
Yes, I checked: these pages don't have external backlinks and only receive link juice through internal linking. Since I will change the internal linking and the pages I take down will no longer receive any internal links, this shouldn't make any difference...
I just want to avoid any redirect that isn't necessary, and make sure that only pages with a relevant, similar counterpart get a redirect. Makes sense, right?
-
Have you checked with OSE and other tools to see what link juice/authority those pages may have?
-
Thanks for your opinions!
There are no manual actions against the pages, so I shouldn't need to worry about that! Like I said, most of them generate no traffic at all (for those I cannot see a good reason to redirect rather than just removing them from the index and taking them down), and some URLs are just competing against each other with quite high ranking fluctuations, which is why I want to merge these competing pages.
I guess I will redirect the pages that still have a relevant, similar page left, but not redirect pages that basically had no traffic at all in 8 months and for which no really similar page exists.
-
This article is about removing blog posts, but I think it's still relevant: http://www.koozai.com/blog/search-marketing/deleted-900-blog-posts-happened-next/
The 'removals/redirects' & 'lessons learnt' sections are particularly important to consider.
-
It's possible, but it sounds like the ranking fluctuations are likely from multiple URLs competing for the same search queries ("Often really similar landing pages exist - just minor keyword targeting difference and I would call it "thin" content") rather than poor link profiles. He didn't mention any manual penalties either.
I agree that you would not want all 50 URLs redirecting to one or even just a few URLs. Only redirect the ones that are really related to the content of the remaining pages and let the rest drop off. Also make sure you have a killer 404 page that helps users get to the right pages.
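If it helps, here's a minimal sketch of what a "helpful" 404 page could look like, assuming a Flask-served site; the remaining landing page paths and the fuzzy-matching logic are just illustrative assumptions, not a definitive implementation:

```python
# Minimal sketch of a 404 handler that suggests the closest remaining landing pages.
# Assumes a Flask-served site; the paths below are hypothetical examples.
from difflib import get_close_matches

from flask import Flask, request

app = Flask(__name__)

REMAINING_PAGES = [
    "/landing/blue-widgets",
    "/landing/red-widgets",
    "/landing/widget-accessories",
]

@app.errorhandler(404)
def helpful_not_found(error):
    # Suggest remaining pages whose paths look most like the requested one.
    suggestions = get_close_matches(request.path, REMAINING_PAGES, n=3, cutoff=0.4)
    links = "".join(f'<li><a href="{p}">{p}</a></li>' for p in suggestions)
    body = f"<h1>Page not found</h1><p>Maybe you were looking for:</p><ul>{links}</ul>"
    return body, 404
```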
-
I'm not so sure.
Common sense tells me that pages without any Page Authority, or those that may have been penalised (or indeed not indexed) for having spammy, thin content, etc., will only pass these **negative** signals on through a 301 redirect?
Also, if there are as many as 250 potential landing pages all redirecting (maybe even to one single URL), surely that would raise alarm bells for a crawler?
-
What you're really doing is consolidating 'orphan SEO pages' into fewer, higher-value pages, which is a specific example Google provides as a "good reason to redirect one URL to another." I would 301 the pages to their most relevant, consolidated landing pages that remain.
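For illustration, here's a minimal sketch of such a consolidation map, assuming a Flask-served site; the URL paths are hypothetical examples:

```python
# Minimal sketch: 301 old landing pages to their consolidated counterparts.
# Assumes a Flask-served site; the paths in the map are hypothetical examples.
from flask import Flask, redirect, request

app = Flask(__name__)

# Old landing page -> the most relevant consolidated page that remains.
CONSOLIDATION_MAP = {
    "/landing/blue-widgets-cheap": "/landing/blue-widgets",
    "/landing/blue-widgets-best": "/landing/blue-widgets",
}

@app.before_request
def consolidate_old_landing_pages():
    target = CONSOLIDATION_MAP.get(request.path)
    if target:
        # Permanent redirect so search engines transfer the old URL's signals.
        return redirect(target, code=301)
```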
Hope this helps!
-
Why not redirect? If you don't, you will keep seeing them as errors in WMT, which is not a good thing. Returning a 410 is also an option in theory, but I tried that in the past and WMT ignored it.
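If you do want to try the 410 route anyway, here's a minimal sketch of what that could look like, assuming a Flask-served site (the paths are made-up examples):

```python
# Minimal sketch: return 410 Gone (instead of 404) for deliberately removed pages.
# Assumes a Flask-served site; the paths below are hypothetical examples.
from flask import Flask, abort, request

app = Flask(__name__)

REMOVED_PATHS = {
    "/landing/old-page-with-no-replacement",
    "/landing/another-dead-page",
}

@app.before_request
def gone_for_removed_pages():
    if request.path in REMOVED_PATHS:
        # 410 tells crawlers the page was removed on purpose and won't return.
        abort(410)
```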