301 or 404 Question for thin content Location Pages we want to remove
-
Hello All,
I run a hire website with many categories and an individual location page for each of the 70 depots we operate. However, because these pages are generated dynamically, we have thousands of thin-content pages.
We have decided to concentrate only on our best-performing locations and get rid of the rest, as it's practically impossible to write unique content for every location page in every category.
So my question is: would it cause me problems to have that many 301s for the location pages I'm going to redirect (I was only going to point these back to the parent category page), or should I just 404 all of those location pages and, at some point in the future when we're in a position to concentrate on those locations, relaunch them with new content? In terms of URL counts, it would affect a few thousand 301s or 404s, depending on people's thoughts.
Also, does anyone know what percentage of thin content on a site is acceptable? I know none is best in an ideal world, but it would be easier if we could get away with a small percentage.
We have been affected by Panda, so we are trying to tidy things up as best as possible.
Any advice greatly appreciated!
Thanks,
Peter
-
Many thanks, Travis.
A good, detailed answer.
Thanks for your help; I will look at doing this.
Pete
-
Matt Cutts says Google treats 404 and 410 codes nearly the same. (Getting to the 301 candidates in a bit.)
If the pages are going to be gone for a long time, if not permanently, I would go ahead and serve 410 codes for those pages. A few 404 results are okay, but serving up thousands of 404s doesn't sound like it's going to do the site any favors.
If a page has some good links, enough traffic or conversions, and a relevant/related 'good page' you're going to keep, 301 redirect it to that 'good page'.
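That triage can be sketched as a simple rule. This is a hypothetical helper, not anything from Google; the threshold values are illustrative assumptions you'd tune against your own link and analytics data:

```python
# Hypothetical triage helper: 301 vs. 410 for a thin location page.
# MIN_LINKS and MIN_MONTHLY_VISITS are made-up thresholds for illustration.

MIN_LINKS = 3            # external links pointing at the page
MIN_MONTHLY_VISITS = 50  # organic visits per month

def choose_status(links: int, monthly_visits: int, conversions: int,
                  has_relevant_parent: bool) -> int:
    """Return 301 if the page has earned equity worth preserving and a
    relevant parent category page to receive it; otherwise return 410."""
    earned_equity = (links >= MIN_LINKS
                     or monthly_visits >= MIN_MONTHLY_VISITS
                     or conversions > 0)
    if earned_equity and has_relevant_parent:
        return 301
    return 410
```

Run every candidate URL through a rule like this once, and you end up with two clean lists: a short one to redirect and a long one to retire.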
If you serve up thousands of 301 results, you're likely wasting crawl budget. The major bots only have so much bandwidth they're going to use crawling the site. So rather than having the 'good pages' and new pages frequently and thoroughly crawled, you could be inhibiting discovery and indexation. Considering we're talking thousands of redirects, and the site in question probably isn't Zappos, it's probably best to 410 the chaff/thin pages. Googlebot will still come back to see if the pages are really gone, but at least you won't be wasting everyone's time (yours and Googlebot's) in the near future.
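If you do go the 410 route, it's worth spot-checking that the server actually serves the codes you think it does. A rough standard-library sketch (the URLs and expected codes in the usage comment are placeholders, not from this thread):

```python
# Rough status-code spot check using only the Python standard library.
# http.client does not follow redirects, so a 301 or 410 is reported as-is.
import http.client
from urllib.parse import urlparse

def fetch_status(url: str) -> int:
    """Issue a HEAD request and return the raw HTTP status code."""
    parts = urlparse(url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=10)
    try:
        conn.request("HEAD", parts.path or "/")
        return conn.getresponse().status
    finally:
        conn.close()

def verdict(expected: int, actual: int) -> str:
    """Human-readable pass/fail line for one URL check."""
    return "OK" if actual == expected else f"MISMATCH (got {actual})"

# Example usage (placeholder URLs):
# checks = {"https://example.com/old-location-page": 410,
#           "https://example.com/kept-location-page": 301}
# for url, expected in checks.items():
#     print(url, verdict(expected, fetch_status(url)))
```

Running a check like this against a sample of the retired URLs catches the common misconfiguration where the CMS quietly serves a soft 200 or a 404 instead of the 410 you intended.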
There isn't really any hard and fast percentage for what's thin and what isn't. But I can say that if you're looking at a page and it just feels 'thin', you can supplement it with other types of content: videos, images, real original reviews, just to name a few possibilities.
At the end of the day, if it's not worth your time to do a page justice, why should search engines - or people for that matter - bother with the site? If five out of 10 people in your target market wouldn't find the page useful, or easily fulfill a need, it's probably best not to make the page at all.