I know I'm missing pages with my page level 301 re-directs. What can I do?
-
I am implementing page level re-directs for a large site but I know that I will inevitably miss some pages. Is there an additional safety net root level re-direct that I can use to catch these pages and send them to the homepage?
-
It really depends on the platform you're on and the way the page-level redirects are set up, but if you list all the rules for the existing pages first, you can always add a catch-all redirect at the very end. If implemented properly, anything left over should just be caught by that rule.
The alternative is to build a custom 404 handler that actually implements a 301-redirect to the new site.
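On Apache, for example, the catch-all approach can be sketched in .htaccess roughly like this (a sketch, assuming mod_rewrite is enabled; the old-page/new-page paths are hypothetical stand-ins for your real page-level rules):

```apache
RewriteEngine On

# Existing page-level rules come first (hypothetical example)
RewriteRule ^old-page\.html$ /new-page/ [R=301,L]

# Catch-all safety net: any request that doesn't map to a real
# file or directory gets 301'd to the homepage
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^ / [R=301,L]
```

Because the conditions exclude real files and directories, existing pages and the homepage itself aren't affected; only genuinely missing URLs fall through to the final rule.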
I'd agree with this post, though - if the content really is dead, in some cases, it's better to let it 404 -
http://www.seroundtable.com/archives/022739.html
If you're really starting over, and for pages that aren't very active (no links, very little traffic), it can make more sense to clean things up. There's no one-size-fits-all answer - it depends a lot on the scope of the site and the nature of the change.
-
So there is not a way to put some kind of catch all re-direct without doing it at the root and redirecting everything to the homepage?
-
ErrorDocument 404 /
In the .htaccess. Or spider the frick out of the site with Screaming Frog. There's an article on this:
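One hedged note on that directive: as I read the Apache docs, with a local path the server serves the homepage body but keeps the 404 status, so it's a cleanup signal rather than a redirect and passes no link equity; with a full URL, Apache issues a client redirect instead, but a 302 rather than a 301 (example.com is a placeholder):

```apache
# Serves the homepage body, but the response status stays 404 -
# search engines treat the page as gone, not redirected
ErrorDocument 404 /

# With a full URL, Apache sends a client redirect instead -
# but it is a 302, not a 301
ErrorDocument 404 http://www.example.com/
```

If the goal is to pass equity to the homepage, a mod_rewrite catch-all that returns a real 301 is probably the safer net than either form of ErrorDocument.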
http://www.seomoz.org/blog/8-ways-to-find-old-urls-after-a-failed-site-migration-whiteboard-friday
Related Questions
-
Disallowing WP 'author' page archives
Hey Mozzers. I want to block my author archive pages, but not the primary page of each author. For example, I want to keep /author/jbentz/ but get rid of /author/jbentz/page/4/. Can I do that in robots.txt by using a * where the author name would be populated? So, basically... my robots file would include something like this... Disallow: /author/*/page/ Will this work for my intended goal... or will this just disallow all of my author pages?
Technical SEO | Netrepid
Will really old links have any benefit being 301'd
I have a client who, when they rebuilt their site, never had any of their old links 301'd - I've now managed to locate a few of these links and am going to redirect them. The site was rebuilt in 2006/07 - and it ranked page one and #1 for lots of relevant keywords. If I redirect these to the current pages, will the rankings still carry?
Technical SEO | lauratagdigital
Is page rank lost through a 301 redirect?
Hi everyone. I'd really appreciate your help with this one 🙂 I've just watched Matt Cutts' video 'What percentage of PageRank is lost through a 301 redirect?' and I am confused. I had taken this to mean that a redirect would always lose you PageRank, but watching it again I am not so sure. He says that the amount of PageRank lost through a 301 redirect is the same as any other link. Does this mean that no PageRank at all is lost during site migrations? Or is it the case that first PageRank would be lost from the original link and then more PageRank would be lost from any subsequent redirects? watch?v=Filv4pP-1nw
Technical SEO | RG_SEO
Https-pages still in the SERP's
Hi all, my problem is the following: our CMS (self-developed) produces https versions of our "normal" web pages, which means duplicate content. Our IT department put the noindex,nofollow meta tag on the https pages; that was like 6 weeks ago. I check the number of indexed pages once a week and still see a lot of these https pages in the Google index. I know that I may hit a different data center and that these numbers aren't 100% valid, but still... sometimes the number of indexed https pages even moves up. Any ideas/suggestions? Wait for a longer time? Or take the time and go to Webmaster Tools to kick them out of the index? Another question: for a nice query, one https page ranks No. 1. If I kick the page out of the index, do you think that the http page replaces the No. 1 position? Or will the ranking be lost? (sends some nice traffic :-))... thanx in advance 😉
Technical SEO | accessKellyOCG
What can be the cause of my inner pages ranking higher than my home page?
If you do a search for my own company name or products we sell, the inner pages rank higher than the homepage, and if you do a search for exact content from my home page, my home page doesn't show in the results. My homepage shows when you do a site: search, so I'm not sure what is causing this.
Technical SEO | deciph22
What can i do to let google know we are a lifestyle magazine
Hi, my site http://www.in2town.co.uk/ is a lifestyle magazine. Before we changed templates and moved Joomla from 1.0 to 1.5, our lifestyle magazine was getting around 10,000 visitors a day and we were number one in Google for the search term lifestyle magazine. In the past few months we have been number five for that search term, and since yesterday we have dropped to ten. I have a serious problem: being an online lifestyle magazine, I do not have the luxury of lots of text telling people and Google that it is a lifestyle magazine. What I mean is, if this was a website about Benidorm, then you would have an introduction about Benidorm so Google would understand what the site is about. What I would like help on is how to get back to the top, and where I can put a nice introduction which does not look out of place, under the search term lifestyle magazine. Any help would be great.
Technical SEO | ClaireH-184886
What can I do about missing Meta Description for category pagest etc.?
On all my campaigns I'm returning high levels of 'Missing Meta Description Tags'. The problem with fixing this is they're all for category, tag and author pages. Is there a way to add a meta description to these pages (there are hundreds) or will it not really have any ranking effect?
Technical SEO | SiliconBeachTraining
Why this page doesn't get indexed?
Hi, I've just taken over development and SEO for a site and we're having difficulty getting some key pages indexed. They are two clicks away from the homepage, but still not getting indexed. They are recently created pages with unique content on. The architecture looks like this: Homepage >> Car page >> Engine-specific page. Whenever we add a new car, we link to its 'Car page' and it gets indexed very quickly. However, the 'Engine pages' for that car don't get indexed, even after a couple of weeks. An example of one of these index pages is - http://www.carbuzz.co.uk/car-reviews/Volkswagen/Beetle-New/2.0-TSI
So, things we've checked -
1. Yes, it's not blocked by robots.txt
2. Yes, it's in the sitemap (http://www.carbuzz.co.uk/sitemap.xml)
3. Yes, it's viewable to search spiders (e.g. the link is present in the html source)
This page doesn't have a huge amount of unique content. We're a review aggregator, but it still does have some. Any suggestions as to why it isn't indexed? Thanks, David
Technical SEO | soulnafein