I know I'm missing pages with my page-level 301 redirects. What can I do?
-
I am implementing page-level redirects for a large site, but I know that I will inevitably miss some pages. Is there an additional safety-net, root-level redirect that I can use to catch these pages and send them to the homepage?
-
It really depends on the platform you're on and the way the page-level redirects are set up, but if you list all the rules for the existing pages first, you can always add a catch-all redirect at the very end. If implemented properly, anything left over should just hit that rule.
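As a rough sketch - assuming an Apache server with mod_rewrite enabled, and with made-up old/new paths standing in for the real rules - the ordering in an .htaccess file might look something like this:

RewriteEngine On

# Specific page-level redirects come first (these old/new paths are placeholders)
RewriteRule ^old-page\.html$ /new-page/ [R=301,L]
RewriteRule ^old-category/(.*)$ /new-category/$1 [R=301,L]

# Catch-all safety net: anything left over that doesn't map to a real file
# or directory gets 301-redirected to the homepage
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^.*$ / [R=301,L]

The two RewriteCond lines keep real files and directories (including the homepage itself) out of the catch-all, which avoids redirect loops.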
The alternative is to build a custom 404 handler that actually implements a 301-redirect to the new site.
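On Apache, for instance, that usually means pointing ErrorDocument at a script of your own - the script name below is a hypothetical placeholder - which would look up the requested URL and send its own 301 to the best match on the new site:

# Hand anything Apache can't find to a custom handler (hypothetical script name);
# the script itself would map the old URL to its new equivalent and respond with a 301
ErrorDocument 404 /redirect-handler.php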
I'd agree with this post, though - if the content really is dead, in some cases it's better to let it 404 -
http://www.seroundtable.com/archives/022739.html
If you're really starting over, and for pages that aren't very active (no links, very little traffic), it can make more sense to clean things up. There's no one-size-fits-all answer - it depends a lot on the scope of the site and the nature of the change.
-
So there isn't a way to put some kind of catch-all redirect in place without doing it at the root and redirecting everything to the homepage?
-
ErrorDocument 404 /
In the .htaccess. Or spider the frick out of the site with Screaming Frog. There's an article on this:
http://www.seomoz.org/blog/8-ways-to-find-old-urls-after-a-failed-site-migration-whiteboard-friday
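One caveat on that directive, going by how Apache handles ErrorDocument rather than anything specific to this site: with a local path like "/", Apache serves the homepage content but keeps the 404 status, so it isn't a true 301; with a full URL, Apache sends a redirect, but a 302 rather than a 301. A minimal sketch of both forms:

# Serves the homepage content for missing URLs, but the response status stays 404
ErrorDocument 404 /

# Pointing at a full URL (example.com is a placeholder) makes Apache send a
# redirect instead - a 302, not a 301
# ErrorDocument 404 http://www.example.com/

That's why the mod_rewrite catch-all shown earlier is usually the cleaner way to get a genuine 301.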
Related Questions
-
Geo IP filtering / Subdomain can't be crawled
My client "load balances" site traffic in the following way:
domain: www.example.com
traffic from a US IP is redirected to usa.example.com
traffic from a non-US IP is redirected to www2.example.com
The reason for doing this is that the content on www2 contains herbal medicine info banned by the FDA; usa.example.com is a "cleaned" site. Using an HK IP, when I google an English keyword, I can see that www.example.com is indexed. When googling a Chinese keyword, nothing is indexed - neither the domain nor the www2 subdomain. Google Search Console shows a Dell SonicWall geo IP filtering alert for www2 (Connection initiated from country: United States). GSC data also confirms that www2 has never been indexed by Google. Questions: Is geo IP filtering the very reason why www2 isn't indexed? What should I do in order to get www2 indexed? Thanks guys!
Technical SEO | | irene7890 -
Unused URL 'A' contains a frameset - can it damage the other site B?
Client has an old, unused site 'A' which I've discovered during my backlink research. It contains the source code below, which frames the client's 'proper' site B inside the old unused URL A in the browser address bar. Quick question - will Google penalise website B, which is the one I'm optimising? Should the client be using a redirect instead?
<frameset border='0' frameborder='0' framespacing='0'>
<frame src="http://www.clientwebsite.co.ukB" frameborder="0" noresize="noresize" scrolling="yes">
<noframes>Please go to http://www.clientwebsite.co.ukB</noframes>
</frameset>
Thanks, Lu.
Technical SEO | | Webrevolve0 -
HTTPS pages still in the SERPs
Hi all, my problem is the following: our CMS (self-developed) produces HTTPS versions of our "normal" web pages, which means duplicate content. Our IT department put noindex,nofollow on the HTTPS pages about 6 weeks ago. I check the number of indexed pages once a week and still see a lot of these HTTPS pages in the Google index. I know that I may hit different data centers and that these numbers aren't 100% valid, but still... sometimes the number of indexed HTTPS pages even moves up. Any ideas/suggestions? Wait for a longer time? Or take the time and go to Webmaster Tools to kick them out of the index? Another question: for a nice query, one HTTPS page ranks No. 1. If I kick the page out of the index, do you think that the HTTP page would replace it at No. 1? Or will the ranking be lost? (It sends some nice traffic :-))... thanx in advance 😉
Technical SEO | | accessKellyOCG0 -
How can I prevent duplicate content between www.page.com/ and www.page.com
SEOMoz's recent crawl showed me that I had an error for duplicate content and duplicate page titles. This is a problem because it found the same page twice because of a '/' on the end of one URL, e.g. www.page.com/ vs. www.page.com. My question is: do I need to be concerned about this? And is there anything I should put in my .htaccess file to prevent this from happening? Thanks!
Technical SEO | | onlineexpression
Karl0 -
If a page isn't linked to or directly submitted to a search engine, can it get indexed?
Hey Guys, I'm curious if there are ways a page can get indexed even if the page isn't linked to or hasn't been submitted to a search engine. To my knowledge the following page on our website is not linked to and we definitely didn't submit it to Google - but it's currently indexed: takelessons.com/admin.php/adminJobPosition/corp Anyone have any ideas as to why or how this could have happened? Hopefully I'm missing something obvious 🙂 Thanks, Jon
Technical SEO | | TakeLessons0 -
If you only want your home page to rank, can you use rel="canonical" on all your other pages?
If you have a lot of pages with 1 or 2 inbound links, what would be the effect of using rel="canonical" to point all those pages to the home page? Would it boost the rankings of the home page? As I understand it, your long-tail keyword traffic would start landing on the home page instead of finding what they were looking for. That would be bad, but might be worth it.
Technical SEO | | watchcases0 -
Website Page Structuring and URL Rewriting - need helpful resources
Hello, I am not technically very sound and I need some good articles that teach me how to think about and go about website page structuring and URL rewriting that is SEO friendly. I will be most obliged if some of you great SEOmoz-ers can pitch in with help. Regards, Talha ZigZag Solutions
Technical SEO | | TopGearMedia0 -
How do I prove to a client that their important category-level pages are 301-ing?
I've cut and pasted Webmaster Tools reports showing the webmaster the "301" permanently-moved status; however, they still believe the pages are up and running normally, returning 200s. It's been tough to get them to acknowledge the problem, but I'm certain it's negatively affecting results. Any help would be greatly appreciated, asap!
Technical SEO | | ankurv0