301 or 404 question for thin-content location pages we want to remove
-
Hello All,
I have a hire website with many categories and individual location pages for each of the 70 depots we operate. However, because these pages are generated dynamically, we have thousands of thin-content pages.
We have decided to concentrate only on our best-performing locations and get rid of the rest, as it's physically impossible to write unique content for every location page in every category.
Therefore my question is: would having too many 301s for the location pages I am going to redirect cause me problems (I was only going to send these back to the parent category page)? Or should I just 404 all those location pages and, at some point in the future when we are in a position to concentrate on these locations, redo them with new content? In terms of URL numbers, it would affect a few thousand 301s or 404s, depending on people's thoughts.
Also, does anyone know what percentage of thin content on a site is acceptable? I know none is best in an ideal world, but it would be easier if we could get away with a small percentage.
We have been affected by Panda, so we are trying to tidy things up as best as possible.
Any advice greatly appreciated!
thanks
Peter
-
Many thanks, Travis,
A good, detailed answer.
Thanks for your help, I will look at doing this.
Pete
-
Matt Cutts says Google treats 404 and 410 codes nearly the same. (I'll get to the 301 candidates in a bit.)
If the pages are going to be gone for a long time, if not permanently, I would go ahead and serve 410 codes for those pages. A few 404 results are okay, but serving up thousands of 404s doesn't sound like it's going to do the site any favors.
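Just to make that concrete, here's a minimal sketch of how retired location URLs could return a 410 if the site were running on something like Python/Flask. The route pattern and depot slugs here are invented for illustration, not taken from your site:

```python
from flask import Flask, abort

app = Flask(__name__)

# Hypothetical set of retired location slugs; in practice this would
# come from a database or a config file.
RETIRED_LOCATIONS = {"depot-oldtown", "depot-riverside"}

@app.route("/hire/<category>/<location>")
def location_page(category, location):
    if location in RETIRED_LOCATIONS:
        # 410 Gone signals the removal is deliberate, which tends to get
        # pages dropped from the index a bit faster than a plain 404.
        abort(410)
    # Placeholder for the real page render.
    return f"Hire {category} in {location}"
```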
If a page has some good links, enough traffic or conversions, and there's a relevant, related 'good page' you're going to keep, 301 redirect it to that 'good page'.
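For the 301 side, a hedged sketch along the same lines (again with made-up paths): keep a deliberately short, explicit map of the redirect-worthy URLs, each pointing at its parent category page, rather than blanket-redirecting everything.

```python
from flask import Flask, redirect, request

app = Flask(__name__)

# Hypothetical map: only URLs with real links, traffic, or conversions
# earn a 301, and each points at its closest relevant parent category.
REDIRECT_MAP = {
    "/hire/diggers/depot-oldtown": "/hire/diggers",
    "/hire/scaffolding/depot-riverside": "/hire/scaffolding",
}

@app.before_request
def legacy_redirects():
    target = REDIRECT_MAP.get(request.path)
    if target is not None:
        # A 301 passes most of the old page's link equity to the target.
        return redirect(target, code=301)
```

The point of the explicit map is that every redirect is a decision, not a blanket rule.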
If you serve up thousands of 301 results, you're likely wasting crawl budget. The major bots only have so much bandwidth they're going to spend crawling the site, so rather than having the 'good pages' and new pages frequently and thoroughly crawled, you could be inhibiting discovery and indexation. Considering we're talking thousands of redirects, and the site in question probably isn't Zappos, it's probably best to 410 the chaff/thin pages. Googlebot will still come back to see if the pages are really gone, but at least you won't be wasting anyone's time (yours or Googlebot's) in the near future.
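Whichever mix you land on, it's worth spot-checking that the URLs actually return the codes you intend. A minimal audit with Python's requests library (the example.com URLs are placeholders for a sample of your retired and redirected pages) could look like:

```python
import requests

# Placeholder URLs; swap in a sample of your retired and redirected pages.
urls = [
    "https://example.com/hire/diggers/depot-oldtown",
    "https://example.com/hire/scaffolding/depot-riverside",
]

for url in urls:
    # allow_redirects=False so we see the 301/410 itself, not the target.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(resp.status_code, url, resp.headers.get("Location", ""))
```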
There isn't really any hard-and-fast percentage for what's thin and what isn't. But I can say that if you're looking at a page and it just feels 'thin', you can supplement it with other types of content: videos, images, real original reviews, just to name a few possibilities.
At the end of the day, if it's not worth your time to do a page justice, why should search engines (or people, for that matter) bother with the site? If five out of 10 people in your target market wouldn't find the page useful, or couldn't easily fulfill a need with it, it's probably best not to make the page at all.
Related Questions
-
Removing a massive number of noindex,follow pages that are not crawled
Hi, we have stackable filters on some of our pages (i.e.: ?filter1=a&filter2=b&etc.). Those stacked-filter pages are "noindex, follow". They were created in order to facilitate the indexation of the items listed in them. After analysing the logs, we know that the search engines do not crawl those stacked-filter pages. Would blocking those pages (by loading their links in AJAX, for example) help our crawl rate or not? In other words, does removing links that are already not crawled help the crawl rate of the rest of our pages? My assumption here is that search engines see those links but discard them because those pages are too deep in our architecture, and that by removing them we would help search engines focus on the rest of our pages. We don't want to waste our efforts removing those links if there will be no impact. Thanks
Intermediate & Advanced SEO | Digitics
-
Do search engines consider this duplicate or thin content?
I operate an eCommerce site selling various equipment. We get product descriptions and various info from the manufacturers' websites offered to dealers. Part of that info is in the form of user guides and operational manuals written by the manufacturer, downloaded in PDF format and then uploaded to our site. We also embed and link to videos hosted on the manufacturers' respective YouTube or Vimeo channels. This is useful content for our customers.
My questions are: Does this type of content help our site by offering useful info, or does it hurt our SEO due to being thin and/or duplicate content? Or do the original content publishers get all the benefit? Is there any benefit to us publishing this at all? What exactly is considered "thin content"?
Intermediate & Advanced SEO | MichaelFactor
-
Should I remove pages to concentrate link juice?
So our site is database-powered and used to have up to 50K pages in Google's index 3 years ago. After a redesign, that number is down to about 12K currently. Legacy URLs that now generate 404s have mostly been redirected to appropriate pages (some 13K 301 redirects currently). Trafficked content accounts for about 2K URLs in the end, so my question is, in the context of concentrating link juice on the most valuable pages, should I:
1. Remove non-important / least-trafficked pages from the site and just have them show 404?
2. Noindex non-important / least-trafficked pages but still have them visible?
3. Do 1 or 2 above, plus remove them from the index via Webmaster Tools?
4. None of the above, but rather something else?
Thanks for any insights/advice!
Intermediate & Advanced SEO | StratosJets
-
Interlinking from a unique-content page to a limited-content page
I have a page (page 1) with a lot of unique content which may rank for "Example for sale". On this page I interlink to a page (page 2) with very limited unique content, but one I believe is better for the user, with the anchor "See all Example for sale". In other words, the 1st page is more like a guide with items for sale mixed in, whereas the 2nd page is purely a "for sale" page with almost no unique content, but very engaging for users. Questions:
1. Is it risky to interlink to a page with limited unique content using "Example for sale", as I risk not being able to rank either of these 2 pages?
2. Would it make sense to "noindex, follow" page 2, as it has limited unique content and is actually a page that exists across the web on other websites in different formats (it is real estate MLS listings), while still keeping the "Example for sale" link to page 2 without risking page 1's ranking for the "Example for sale" keyword phrase?
I am basically trying to work out the best solution to rank for "Keyword for sale", and the dilemma is that page 2 is best for users but not a very unique page, while page 1 is very unique and OK for users but mixes writing, pictures, and properties for sale.
Intermediate & Advanced SEO | khi5
-
Want to merge high-ranking niche websites into a new mega site, but don't want to lose authority from old top-level pages
I have a few older websites that SERP well, and I am considering merging some or all of them into a new related website that I will be launching regardless. My old websites display real estate listings and not much else. Each website is devoted to showing homes for sale in a specific neighborhood. The domains are all in the form of Neighborhood1CityHomes.com, Neighborhood2CityHomes.com, etc. These sites SERP well for searches like "Neighborhood1 City homes for sale" and also "Neighborhood1 City real estate", where some or all of the query is in the domain name. Google simply points to the top of each domain, although each site has a few interior pages that are rarely used. There is next to zero backlinking to the old domains, but each links to the others with anchor text like "Neighborhood1 Cityname real estate"; that's pretty much the extent of the link profile.
The new website will be a more comprehensive search portal where many neighborhoods and cities can be searched. The domain name is a nonsense-word .com not related to actual keywords. The structure will be like newdomain.com/cityname/neighborhood-name/, where the neighborhood real estate listings that would replace the old websites live, and I'd 301 the old sites to the appropriate internal directories of the new site. The content on the old websites is all on the home page of each, at least the content for the searches that matter to me and rank well, and I read an article suggesting that Google assigns additional authority to top-level pages (can I link to that here?). I'd be 301-ing each old domain from a top-level page to a 3rd-level interior page like www.newdomain.com/cityname/neighborhood1/. The new site is better than the old sites by a wide margin, especially on mobile, but I don't want to lose all my top positions for some tough phrases. I'm not running analytics on the old sites in question, but each of the old sites has an extensive past history with AdWords (which I don't run any more), so in theory Google knows these old sites are good quality.
Intermediate & Advanced SEO | Gogogomez
-
Serving different content based on IP location
I have a city-centric website. For the sake of simplicity, say I only have 2 cities: City A and City B. Depending on a user's IP address, they will either get City A or City B. Users can change their location through JavaScript on pages, but there is no cross-linking between cities. By this, I mean that unless you can read or execute JavaScript, there is no way to get from City A to City B. My concern is this: Googlebot comes to my site, and we serve it City A. How does City B get discovered if Googlebot doesn't read JavaScript? We have an XML sitemap plus plenty of backlinks to City B. Is this sufficient? Should I provide a static link to City B (and vice versa) on the homepage for crawling purposes?
Intermediate & Advanced SEO | ChatterBlock
-
Is 404'ing a page enough to remove it from Google's index?
We set some pages to 404 status about 7 months ago, but they are still showing in Google's index (as 404s). Is there anything else I need to do to remove these?
Intermediate & Advanced SEO | nicole.healthline
-
Should I 301 Redirect Old Pages to Newer Ones?
I know there is value in having lots of unique content on our websites, but I'm wondering how long it should be kept for, and whether there is any value in 301 redirecting it. For example, we have a number of pages on our website that are dedicated to single products (blue widget x, blue widget y, red widget x, red widget y): nice unique content, with some (but not many) links. These products are no longer available, though, and have been replaced. So I'm faced with three choices:
1. Leave it as it is, and hope it adds to the overall site authority (by virtue of being another page) and perhaps mops up a few longer-tail keywords; add a link to the replacement product on these pages.
2. 301 redirect these pages to the replacement products to give those a bit of a boost, and lose the content.
3. 301 redirect these pages to the replacement products and move all the old content to new 'blue widgets archive' and 'red widgets archive' pages.
Would appreciate everyone's thoughts!
Intermediate & Advanced SEO | BigMiniMan