How to de-index old URLs after redesigning a website?
-
Thank you for reading.
After redesigning my website (5 months ago), my crawl reports (Moz, Search Console) still show tons of 404 pages, which all seem to be URLs from my previous website (same root domain).
It would be nonsense to 301 redirect them, as there are too many URLs. (Or would it be nonsense?)
What is the best way to deal with this issue?
-
Thank you, Clever PhD - really valuable insights!
-
I completely agree with all of the above - and her point matches my own experience: receiving thousands of 404 errors for pages that haven't existed for many months just gets annoying!
-
I respectfully disagree with all of the above. Please repeat after me: 404s are not bad, they are diagnostic. 404s are not bad, they are diagnostic. 404s are not bad, they are diagnostic.
> After redesigning my website (5 months ago), my crawl reports (Moz, Search Console) still show tons of 404 pages, which all seem to be URLs from my previous website (same root domain).
**Part 1 - Internal links that 404 (Moz crawl):** The 404s that show up in the Moz crawl can only come from internal links on your website. The Moz crawl only looks at internal links, not links from other websites. In other words, if you see 404s in your Moz crawl, that means that somewhere you are linking to those pages, and that is why the 404s are showing up. Download the CSV and you will find them in your Moz crawl. Other tools such as Screaming Frog, Botify, and DeepCrawl will show you a similar analysis.
Simple solution: go through your code and remove the internal links on your site that direct the Moz crawler to those pages, and the 404s will go away. (FYI, this same approach will work for any internal 301s.) These 404 errors in the Moz report are great diagnostic signals for where to fix your site. It is bad for users to click on a link within your website and get sent to a page that does not exist.
**Part 2 - External links (Search Console):** The 404s that show up in Search Console can come from internal links on your site AND external links from other sites. Google will keep trying to crawl these URLs as long as other sites link to those pages or your own internal links point at them. For internal link fixing, see the suggestion above. For external links you need a different approach.
Look at the external links: where are they coming from? Are they from quality websites? Do they point to formerly important pages on your website (i.e. pages that were good converters)? If so, use a 301 redirect to send them to the correct replacement page (and this is not always the home page). You get users to the correct page, and any link equity is passed along as well, which can help your site's rankings. If a link points to a former page on your site that was not any good to start with, and the links coming into it are poor quality, then just let the page 404. Tools such as Moz Open Site Explorer, Ahrefs, or Majestic can help with this assessment - but usually you can just look at a site linking to you and tell if it is crap or not.
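If only a handful of old URLs deserve redirects, a minimal .htaccess sketch (assuming an Apache server; the paths below are hypothetical examples, not anyone's actual URLs) would be:

Redirect 301 /old-page.html /new-section/new-page/
Redirect 301 /old-category/widget.html /products/widget/

Each line maps one old path to its closest replacement on the new site, so you only write rules for the URLs that are actually worth saving instead of redirecting everything.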
You need to consider the above regardless of whether you want to get the 404ing pages in question out of the Google index, because even if you get Google to remove a page from the index, it will then see the internal link on your site and find the 404 again. If you have removed the links to the 404 pages on your site, Google will eventually stop crawling them and they will drop out of the index.
Important note regarding the use of robots.txt: blocking Google from crawling the 404s will not remove the pages from the index; Google will just stop crawling them. Google has to be able to crawl the URL to see the 404, recognize that it is a dead page, and then remove the page from the index. Blocking with robots.txt stops Google from doing that. As soon as you take the page out of robots.txt, Google will recrawl it and the 404 shows up again. Robots.txt treats a symptom and is a red herring; allowing the 404 to occur takes care of the issue permanently.
Dead pages are a natural part of the web. Let Google see the 404 (if it truly is a page that should 404 and has no link equity that should be passed along with a 301). Google will crawl the 404 several times, and you will see it in Search Console several times. It is OK. You are not penalized for X number of 404s. You may lose rankings if you 404 a page that Google used to rank well, but this is just because Google will not keep a page highly ranked that does not exist :-). Help Google out by cleaning up your internal link structure, so that when it sees you no longer link to the page, that is a signal that the page should 404. Google knows that, due to the nature of the web, pages will time out on occasion and show an error, so it will continue to recrawl a page just to make sure - it wants to give you the benefit of the doubt. Therefore, you have to give clear directives by not linking to dead pages, so that after Google double- and triple-checks a page, it will finally drop it. You will see the 404 in your Search Console for several months, and then it will eventually go away.
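If you want to be extra explicit that a page is gone for good, one optional alternative (a plain 404 works fine too) is to return a 410 Gone status, which Google treats as a slightly stronger "permanently removed" signal. On an Apache server, mod_alias can do this; the path below is a hypothetical example:

Redirect gone /old-section/

With this in place, requests for anything under /old-section/ return a 410 instead of a 404, and the URLs stay crawlable so Google can confirm they are dead and drop them from the index.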
Hope that makes sense. Good luck!
-
Hey Lana, if you really think that a 301 does not make sense in this case, you can always add the URLs to the robots.txt file, and once Google recrawls your website it will de-index those pages.
Another thing you can do is use the URL removal feature in Google Webmaster Tools. You can do that by going into GWT, then Optimization > Remove URLs, and submitting the URLs accordingly.
Hope this helps!
-
I see the point. Thanks, Liam. As most of our 404 pages start with /en-GB/, I will do it like this:
Disallow: /en-GB/
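Note that a Disallow rule only takes effect as part of a record that starts with a User-agent line, so the minimal complete robots.txt entry would be:

User-agent: *
Disallow: /en-GB/

And per the discussion above, this stops bots from crawling /en-GB/ but does not by itself remove already-indexed URLs.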
-
Hi Lana,
I've been having the same problem on one of our websites. I've 301 redirected over 5,000 URLs but still receive a lot of 404 errors. One of the main reasons these 404 errors keep appearing is other bots, such as Bingbot, still crawling the old URLs.
To resolve this, I would just block them in your robots.txt file. We blocked our old product URLs that were under a "product" directory like this:
User-agent: *
Disallow: /product/