Does it really matter to maintain 301 redirects after the old URLs have been de-indexed?
-
Today I was reading the latest post on the SEOmoz blog: Uncrawled 301s - A Quick Fix for When Relaunches Go Too Well.
It's a very interesting study of 301 redirects and how they help maintain traffic. I work on an eCommerce website and have done similar things on my site, but I'm still quite confused about how to manage my 301 redirects.
My website generates new URLs as a result of the following actions:
- Rewriting dynamic URLs.
- Relaunching the entire website on a different eCommerce platform (osCommerce to Magento Commerce).
- Renaming categories.
- Transferring a product from one category to another.
I manage my 301 redirects the old-fashioned way: I export the URL data from Google Webmaster Tools to an Excel sheet and map each old URL to a specific new URL. By now I have 8.5K redirects in .htaccess, and I'm starting to think that's too much.
Can I remove the old 301 redirects from .htaccess or not? That is the big question for me, because not all of those pages are linked from external websites. Google has already de-indexed the old URLs and indexed the new ones. So is it still necessary to maintain the 301 redirects once Google has processed the change?
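For context, my .htaccess is basically thousands of one-to-one lines like the sketch below (the paths are made-up examples, not my real URLs); only occasionally does a single pattern rule cover a whole batch:
# One explicit rule per old URL - this is how a redirect map grows to 8.5K lines.
# The paths below are hypothetical placeholders, not real URLs from this site.
Redirect 301 /old-category/blue-widget-p-123.html /garden/blue-widget.html
Redirect 301 /old-category/red-widget-p-124.html /garden/red-widget.html
# Where old and new URLs share a predictable pattern, one RedirectMatch
# can stand in for hundreds of individual lines (requires mod_alias):
RedirectMatch 301 ^/old-category/(.+)-p-[0-9]+\.html$ /garden/$1.html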
-
I always use a 301 redirect.
2K is a lot of pages. If I could redirect them with a couple of lines of .htaccess code, I would do it. If I had to write 2K individual lines and have that huge file scanned for every one of thousands of visitors, I might not do it, provided I was fairly sure there was very little traffic to those pages.
I use lots of folders on my site and that makes these problems easy to solve.
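For example (the folder names below are just placeholders, not my actual structure), a single rule can cover an entire folder:
# Hypothetical example: send everything that lived under /old-folder/
# to the matching path under /new-folder/ with a single 301 rule.
RedirectMatch 301 ^/old-folder/(.*)$ /new-folder/$1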
-
Hi EGOL. I am migrating a site with 6K pages. About 2K pages are useless old syndication articles with no incoming links, and about 50 are old CID versions of forms. These pages won't exist on the new site. How do I delete them: by not setting up 301 redirects and not including them in any sitemap? Would they become 404s for a while and then disappear from Google's index?
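One option I'm considering (just my own guess, not something recommended above) is to leave those URLs unredirected and return an explicit 410 Gone so search engines drop them a little faster, roughly like this:
# Hypothetical example: mark retired syndication articles as permanently gone.
# The /syndication/ path is a placeholder for wherever they actually live; requires mod_alias.
RedirectMatch 410 ^/syndication/.*$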
-
My eCommerce website [www.vistastores.com] doesn't have problems caused by changing URLs; I rarely change category page or product page URLs. But I am facing an issue with narrow-by-search (faceted navigation). If an attribute is removed from narrow-by-search, a hundred or so pages turn into 404s because of the dynamic URL structure. That's why I set up 301 redirects from all of those dynamic pages to the category page URLs.
I would like to reduce the number of 301 redirects and broken links on the website, but I can't stop them from appearing. My worry is that Google may restrict my organic performance because of the continual broken links and the growing pile of loosely associated 301 redirects.
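For reference, my rules look roughly like this sketch (the attribute and category names are simplified placeholders, not my real URL structure; requires mod_rewrite):
# Hypothetical example: catch a retired narrow-by-search (faceted) filter
# and 301 it back to the parent category page, dropping the query string.
RewriteEngine On
RewriteCond %{QUERY_STRING} (^|&)finish=discontinued-bronze(&|$) [NC]
RewriteRule ^patio-umbrellas/?$ /patio-umbrellas/? [R=301,L]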
-
I have 301 redirects on my sites. Every one that I have ever set up is still there in .htaccess. I am not taking chances on how search engines handle these.
Other than deleting useless pages, I rarely change URLs. If I don't change URLs I don't have to worry about this stuff.
I have not emailed other sites to change the URL in their links. I have only changed the URLs on my own sites. I would worry that asking someone to edit a link might result in a loss... so I am happy with the redirected link.
-
No, I don't want to start a romance between my website and 404s. EGOL and you have both made the same point about visitors and backlinks. I feel much better after reading that you maintain 20K redirects; 8.5K is not so big for me after all.
-
In my view it is important to keep the old redirects, even after Google has replaced the old URLs with the new ones in its index. With the 301, the relevance earned by the old page is passed along to the new page. The 301 is not only for the Google index; it also serves whatever backlinks your page has.
I don't consider 8.5K to be that big; I have sites with 20K redirects and have never had any problems or restrictions.
Anyway, if you do want to remove the old ones, at least make the 404 page beautiful, which is always a good idea. :D
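By the way, pointing Apache at a custom 404 page is a single line (the path here is just an example, not a real file):
# Hypothetical example: serve a friendly custom page for any URL that 404s.
ErrorDocument 404 /errors/404.html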
-
Oh Jesus... That is a strong argument, and exactly why I've been a big fan of yours ever since my days as a member of the SEO Chat forum. I get your point. Or we could ask the external websites to edit their hyperlinks by emailing them... right?
-
Let's say that my website has ten links to your old URLs that deliver hundreds of visitors per day.
What will happen to all of those visitors if you remove the 301 redirects?
Related Questions
-
Temporary redirect from 302 to 301 for a PNG file?
#302HTTP #temporaryredirect
Technical SEO | Damian_Ed
Hi everyone, I've recently run into a crawl issue with the media images on my website. For example, this URL https://intreface.com/wp-content/uploads/2022/12/Horion-screen-side-2.png returns a 302 HTTP status, and the recommendation is to change it to a 301. I have read the article on temporary redirects here:
https://moz.com/learn/seo/redirection?_ga=2.45324708.1293586627.1702571936-916254120.1702571936
but it doesn't explain how to 301 a single image URL rather than the landing page. I messaged Moz Support, but they recommended asking the Moz Community instead. Could you assist me with this issue please? I can reach the HTML of the relevant page and change what I need for a permanent redirect, but first I need to understand how to do it correctly.
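Would something like this in .htaccess be the right direction? I'm guessing here, and the destination path is only a placeholder for wherever the image should actually live:
# Hypothetical example: replace the existing 302 on a single image URL with a 301.
# The target path is a placeholder - point it at the image's real final location.
Redirect 301 /wp-content/uploads/2022/12/Horion-screen-side-2.png /wp-content/uploads/new-location/Horion-screen-side-2.png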
301 redirecting a previously abused URL
Technical SEO | Salience_Search_Marketing
A client previously had their most important landing page at domain.com/example.htm. They carried out the sort of link building that was commonplace a few years back (exact-match anchors, paid blog links, etc.) targeting this URL, but they also earned a bunch of legitimate, decent-quality links there. I believe they had a number of issues when the link-quality algorithm updates rolled out, so rather than try to get links removed and go through the disavow process, they decided to abandon this URL, let it 404, and start afresh at domain.com/example.html, updating all internal navigation, XML sitemaps, etc. So, fast forward to today: what is best practice for this URL now? Is it possible to 301 domain.com/example.htm > domain.com/example.html and recover whatever value may be left? The argument against doing so is that you could pass over the negative metrics associated with the old URL, but wouldn't that be handled by the real-time Penguin update, with the poor links simply devalued rather than actively harmful? And could this just be tested, i.e. add the 301, monitor the impact, and if things don't go the way we want, remove the 301 again? Would be keen to get a few opinions on this. TIA
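Edit: for clarity, the redirect we would be testing is this single line of .htaccess (example.htm / example.html stand in for the real page, and Apache is assumed):
# Hypothetical example: recover whatever value is left in the abandoned .htm URL.
Redirect 301 /example.htm /example.html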
Not All Submitted URLs in Sitemap Get Indexed
Hey guys, I just noticed that about 20% of the URLs submitted in my sitemap don't get indexed, at least according to Webmaster Tools: there is roughly a 20% gap between submitted and indexed URLs. However, as far as I can see, Webmaster Tools doesn't tell me which specific URLs from the sitemap are not indexed, right? So I checked every single page in the sitemap manually by searching site:"URL" in Google, and every single page shows up. So in reality every page should be indexed, but why does Webmaster Tools show something different? Thanks for your help on this 😉 Cheers
Technical SEO | _Heiko_
De-indexing millions of pages - would this work?
Technical SEO | TalkInThePark
Hi all, we run an e-commerce site with a catalogue of around 5 million products. Unfortunately, we have let Googlebot crawl and index tens of millions of search URLs, the majority of which are very thin on content or duplicates of other URLs. In short: we are in deep. Our bloated Google index is hampering our real content's ability to rank; Googlebot doesn't bother crawling our real content (product pages specifically) and hammers the life out of our servers. Since having Googlebot crawl and de-index tens of millions of old URLs would probably take years (?), my plan is this:
1. 301 redirect all old SERP URLs to a new SERP URL.
2. If the new URL should not be indexed, add a meta robots noindex tag on the new URL.
3. When it is evident that Google has indexed most of the "high quality" new URLs, disallow crawling of the old SERP URLs in robots.txt.
4. Then remove all old SERP URLs directory-style in the GWT URL Removal Tool.
This would be an example of an old URL:
www.site.com/cgi-bin/weirdapplicationname.cgi?word=bmw&what=1.2&how=2
This would be an example of a new URL:
www.site.com/search?q=bmw&category=cars&color=blue
I have two specific questions: Would Google both de-index the old URL and not index the new URL after 301 redirecting the old URL to the new URL (which is noindexed), as described in point 2 above? And what risks are associated with removing tens of millions of URLs directory-style in the GWT URL Removal Tool? I have done this before, but then I removed "only" some 50,000 useless "add to cart" URLs, and Google itself says that you should not remove duplicate/thin content this way and that using the tool this way "may cause problems for your site". And yes, these tens of millions of SERP URLs are the result of a faceted navigation/search function let loose for far too long. And no, we cannot wait for Googlebot to crawl all these millions of URLs in order to discover the 301s. By then we would be out of business.
Best regards,
TalkInThePark
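P.S. For step 1, I was picturing something along these lines (the parameter handling is a guess on my part, based only on the example URLs above; mod_rewrite assumed):
# Hypothetical example: 301 the old cgi-bin search URLs to the new /search
# handler, carrying the search term over. Parameter names are guesses taken
# from the example URLs above.
RewriteEngine On
RewriteCond %{QUERY_STRING} (^|&)word=([^&]+) [NC]
RewriteRule ^cgi-bin/weirdapplicationname\.cgi$ /search?q=%2 [R=301,L]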
Looking for some help adding a 301 redirect for my Site
Technical SEO | debeenus
Hi there, I am trying to eliminate the 'www' using a 301 redirect script, as I have duplicate page titles for both versions (with and without the 'www'). I checked the page authority and found the pages without the 'www' rank higher, so I believe it is wise to go with that version. I already have an .htaccess file; all I need is the code and I should be OK 🙂 Thanks!
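Edit: the snippet I've seen suggested elsewhere looks roughly like this (example.com is a placeholder for my real domain, the http/https scheme would need checking, and I haven't tested it yet):
# Hypothetical example: 301 every www request to the non-www hostname.
# Replace example.com with the actual domain; requires mod_rewrite.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]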
Non-existent URLs being generated in the index
Hi all, I have a pretty big problem with my site at the moment which I'm worried will have an impact on my rankings. I've just had a crawl test done, and for some reason it returns a load of URLs that don't actually exist. For example, I am getting URLs like these in my crawl test and XML sitemap:
www.applicablejobs.com/jobs/add/android-designer/android-designer/android-designer/android-developer/android-developer/
www.applicablejobs.com/jobs/add/android-designer/android-designer/android-designer/android-developer/iphone-designer/
All the URLs seem to start with www.applicablejobs.com/jobs/, and there is an entry for every conceivable combination of slugs. I can only assume that if the crawl test and an XML sitemap generator are picking up these URLs, then Google and other search engines probably are too. Does anyone have any idea what might be causing this issue, and what can I do to remove them from Google's index if they are indexed? Thanks
Technical SEO | Benji87
301 redirects inside sitemaps
I am in the process of trying to get Google to follow a large number of old links on site A through to site B. Currently I have 301 redirects as well as cross-domain canonical tags in place. My issue is that Google is not following the links from site A to site B, since the links no longer exist on site A. I went ahead and added the old site A links to site A's sitemap. Unfortunately Google returns this message in Webmaster Tools: "When we tested a sample of URLs from your Sitemap, we found that some URLs redirect to other locations. We recommend that your Sitemap contain URLs that point to the final destination (the redirect target) instead of redirecting to another URL." However, I do not understand how adding the redirected links to site A's sitemap will remove the old links. Obviously Google can see the 301 redirect and the canonical tag, but this isn't defined in the sitemap as a direct correlation between site A and B. Am I missing something here?
Technical SEO | jmsobe
Why does Google index my IP URL?
Technical SEO | DarwinChinaSEO
Hi guys, a question please. If you search site:112.65.247.14, you can see that Google has indexed our website's IP address, which could create duplicates of our darwinmarketing.com content pages. I am not quite sure why Google indexes the IP pages alongside the domain pages; I understand this could be because of backlinks, internal links, etc., but I don't see any obvious issues there. I have also submitted a request to the Google team to remove the IP address from the index, but with no luck so far. Do you have any other suggestions? I tried to use the Change of Address setting in Google Webmaster Tools, but it wasn't allowed, as it said "Restricted to root level domains only". Any ideas? Thank you! Boson
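P.S. One thing I considered trying in .htaccess is the following, though I'm not sure it's the right fix for our setup (this is only my own sketch, assuming Apache with mod_rewrite and that the IP serves the same site):
# Hypothetical example: redirect requests made to the raw IP address to the
# canonical domain so the IP-based duplicates drop out of Google's index.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^112\.65\.247\.14$
RewriteRule ^(.*)$ http://www.darwinmarketing.com/$1 [R=301,L]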