Redirecting thin content city pages to the state page, 404s or 301s?
-
I have a large number of thin content city-level pages (possibly 20,000+) that I recently removed from a site. Currently, I have it set up to send a 404 header when any of these removed city-level pages are accessed. But I'm not sending the visitor (or search engine) to a site-wide 404 page. Instead, I'm using PHP to redirect the visitor to the corresponding state-level page for that removed city-level page.
Something like:
if ($cityPageIsRemoved) { // pseudocode: however I detect a removed city page
    header("HTTP/1.0 404 Not Found");
    header("Location: http://example.com/state-level-page");
    exit();
}

Is it problematic to send a 404 header and still redirect to a category-level page like this? By doing this, I'm sending any visitors to removed pages to the next most relevant page. Does it make more sense to 301 all the removed city-level pages to the state-level page?
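If I went the 301 route instead, I'm assuming it would look something like this (just a rough sketch; the $stateUrl lookup is hypothetical):

if ($cityPageIsRemoved) {
    // Hypothetical: look up the corresponding state-level URL for this removed city page
    $stateUrl = "http://example.com/state-level-page";
    header("Location: " . $stateUrl, true, 301); // send a 301 Moved Permanently
    exit();
}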
Also, these removed city-level pages collectively have very few to no inbound links from other sites. I suspect that any inbound links to these removed pages are from low-quality scraper-type sites anyway.
Thanks in advance!
-
Hello BarrelRoll42,
You should easily be able to find out whether Google is indexing them by doing a site:yourdomain.com search on Google. But to answer your question, it sounds like you should probably delete them and let them 404. If Google HAS indexed them, you may also need to use the URL Removal Tool in Google Webmaster Tools.
One last thing. Please do start a thread for your own question next time, as we try to keep it to one question per thread.
Thanks!
-
I'm dealing with a similar situation: thousands of low-content city pages. There is almost no traffic to these pages and almost no links pointing at them, and no human would ever navigate to them. In this case, would it be best to just delete them? Do they need to return a 404? I'm not sure if Google is even indexing them.
-
Hi Daniel,
I am very happy I could be of help to you.
Sincerely,
Thomas
-
Thanks, I've removed the redirects. I appreciate the advice!
-
Hi Daniel,
When setting up a 404 page, you should have it return a 404 status, never a 200, and make sure nothing else happens on that page, for instance redirecting somebody somewhere else.
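In PHP, a minimal sketch of that clean 404 handling might look like this (the 404-template.php include is a hypothetical placeholder for your own error page):

if ($cityPageIsRemoved) {
    header("HTTP/1.0 404 Not Found"); // return a real 404 status, with no Location header
    include "404-template.php";       // hypothetical path to your site's 404 template
    exit();
}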
To answer your question directly, I would eliminate the redirect. I hope this has been of help,
Thomas