How do I get rid of all the 404 errors in Google Webmaster Tools after building a new website under the same domain?
-
I recently launched my new website under the same domain as the old one. I did all the important 301 redirects, but it seems like every URL that was in Google's index is still there, only now with a 404 error code. How can I get rid of this problem? For example, if you google my company name 'romancing diamonds', half the links under the name are 404 errors. Look at my Webmaster Tools and you'll see the same thing. Is there any way to remove all those previous URLs from Google's index and start anew?
- Shawn
-
Gotcha. Well, depending on the crawl rate (PR, update frequency, etc.) and the number of pages, Googlebot should be back to check it out; if it takes more than a month, I would start to wonder about other issues.
But, hey, as Matt said above, if you can gain something from keeping those pages, 301 them to an equivalent page on the new site.
If you know that those pages were low value or had no links (except internal ones), you could go the Remove URL route.
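The 301 route described above can be sketched in Apache .htaccess terms; the old and new paths here are invented placeholders, not the site's real URLs:

```apache
# Permanently redirect each old URL to its equivalent page on the new site.
# Replace the placeholder paths with the real old/new pairs.
Redirect 301 /old-engagement-rings.html https://www.example.com/engagement-rings/
Redirect 301 /old-contact.php https://www.example.com/contact/
```

Server setups vary (nginx uses `return 301` inside a `location` block instead), so treat this as a pattern rather than a drop-in config.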
Brian
-
recently is a week ago
-
How recently is recently? The next crawl should clear most of that up. But if the URLs are 404ing right now, you could always use the Webmaster Tools Remove URL feature (Site Configuration > Crawler Access > Remove URL tab) to request that a URL be removed from the index.
Brian
-
You want to re-submit the new sitemap.
It's also a good idea to redirect ALL URLs, not just the "important ones": many of those URLs may have had links or references elsewhere, and if you don't redirect a URL, not only does Google find the dead link and attribute it to your domain, but you also miss out on passing that link's authority on to your new site.
It's a pain in the ass, but trust me: the more time you spend fully redirecting the old URLs to the appropriate new pages, the happier you'll be.
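When there are hundreds of old URLs, maintaining the mapping as data and generating the server rules from it keeps the "redirect everything" advice manageable. A minimal sketch in Python; every path and domain below is an invented placeholder:

```python
# Generate Apache "Redirect 301" rules from an old-to-new URL mapping.
# The URLs here are hypothetical placeholders, not the asker's real pages.

def build_redirect_rules(url_map):
    """Return one 'Redirect 301' line per old path, sorted for stable output."""
    return ["Redirect 301 {} {}".format(old, new)
            for old, new in sorted(url_map.items())]

url_map = {
    "/old-rings.html": "https://www.example.com/rings/",
    "/old-about.html": "https://www.example.com/about/",
}

for rule in build_redirect_rules(url_map):
    print(rule)
```

Pasting the generated lines into .htaccess (or adapting them for your server) covers every legacy URL instead of just the handful you remember.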
Related Questions
-
Creating two websites from one and building up traffic to the new domain quickly
A client has an existing successful website that sells niche products - they are well known in their marketplace. They have two sets of key customers, let's call them (a) and (b), that need addressing in different ways to maximise sales. (a) is the more specialist end of the market, where people have complex needs - there are fewer of them but repeat business is likely, and we can talk to them in more technical language. (b) is the layman's end of the market - there is a vast pool of potential customers but they'll be more casual buyers and need to be addressed more in layman's terms. So what they want to do is to take their existing website, and essentially split it into two different websites, one for each market. The one that will use the existing domain, with all the links that have built up over the years pointing to it, will be the site for the more specialist end of the market (a). The domain name suits it better, which is why he wants to use the existing domain with that site and not the other. (b) will be a brand new domain. The client will write new product descriptions across the board so that the two sets of product information are not duplicate. I'd rather he didn't do this at all, because of the risk involved, and the difficulty of building up the traffic to the new site, which is after all the one with the best chance of mass market sales. But given that the client has decided that this is definitely what he wants, does anyone have any thoughts on what the action plan should be?
Intermediate & Advanced SEO | helga73
-
New Website SEO Implications
Hi Moz Community, A client of mine has launched a new website. The new website is well designed, mobile friendly, fast loading and offers a far better UX than the old site. It has similar content but is 'less wordy'. The old website was tired, slow, not mobile responsive etc, but still ranked well. The domain has market-leading authority and link metrics. Since the launch, the rankings for virtually every word have plummeted. Even previously ranked #1 words have disappeared to page 3 or 4. New pages have different URLs (301s from the old URLs are working fine) and still score the same 98% (using the Moz page optimiser tool). Is it usual to experience some short-term pain, or is this rankings drop an indication that something else is missing? My theory is that the new URLs are being treated like new pages, and that those new pages don't have the engagement data which is used for ranking. Thus, despite having the same authority as the old pages, as far as user data is concerned, they are new pages and therefore not ranking well - yet. That theory would make logical sense, but I'm hoping some experts here can help. Any suggestions welcome. Here's a quick checklist of things I have already done:
- Complete 301 redirect list
- New sitemap
- Submitted to console
- Created internal links from within their large blog
- Optimised all the new pages (img alts, H1s etc)
Extra info:
- Platform changed from WordPress to ExpressionEngine
- Target pages now on level 3, not level 2 (extra subfolder used)
- Fewer words used (average word count per page from 400+ to 250)
Thanks in advance 🙂
Intermediate & Advanced SEO | I.AM.Strategist
-
Setting Up Hreflang and not getting return tag errors
I've set up a dummy domain (not SEO'd, I know) in order to get some input on whether I'm doing this correctly. Here's my setup, and https://technicalseo.com/seo-tools/hreflang/ is saying it's all good. I'm self-referencing, there's a canonical, and there are return tags. https://topskiphire.com - US & international English-speaking version; https://topskiphire.com/au/ - English language in Australia. The Australian version is on a subdirectory. We want it this way so we get full value from our domain and so we can expand into other countries eventually, e.g. the UK. Q1. Should I be self-referencing, or should I have only a canonical for the US site? Q2. Should I be using x-default if we're only targeting the English language? Q3. We previously failed when we had errors come back saying 'return tags not found' on a separate site, even though the tags were on both sites. Was this because our previous site was new and Google didn't crawl it as often as our main domain?
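For the two-version setup this question describes, the annotations would look something like the sketch below (it uses the question's own URLs; pointing x-default at the root version is an assumption, not something the asker confirmed):

```html
<!-- The same hreflang set must appear on BOTH versions of the page -->
<link rel="alternate" hreflang="en-us" href="https://topskiphire.com/" />
<link rel="alternate" hreflang="en-au" href="https://topskiphire.com/au/" />
<link rel="alternate" hreflang="x-default" href="https://topskiphire.com/" />
<!-- Plus a self-referencing canonical on each page, e.g. on the AU page: -->
<link rel="canonical" href="https://topskiphire.com/au/" />
```

The "return tags not found" error usually means one version lists the other but the other does not list it back, so the key is that every URL in the set carries the identical group of hreflang links.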
Intermediate & Advanced SEO | cian_murphy
-
Google favoring old site over new site...
Hi, I started a new site for a client: www.berenjifamilylaw.com. His old site: www.bestfamilylawattorney.com was too loaded up with bad links. Here's the weird part: when you Google: "Los Angeles divorce lawyer" you see the old site come up on the 21st page, but Google doesn't even show the new site (even though it is indexed). It's been about 2 weeks now and no change. Has anyone experienced something like this? If so, what did you do (if anything). Also, I did NOT do a 301 redirect from old to new b/c of spammy links. Thanks.
Intermediate & Advanced SEO | mrodriguez1440
-
Experience with Google Disavow Tool and discovering bad backlinks
Hi Community, does anyone have any experience to share here with the disavow tool from Google? Any reviews? Has it helped recover sites hit by Penguin or penalized after a WMT unnatural link building message? Which tools and methods do you use to find bad backlinks to submit to the disavow tool? Thanks for your feedback,
Intermediate & Advanced SEO | Braumueller
-
How to get content to index faster in Google.....pubsubhubbub?
I'm curious to know what tools others are using to get their content to index faster (other than an HTML sitemap, Ping-O-Matic, Twitter, etc.). Would installing the WordPress PubSubHubbub plugin help, even though it uses Ping-O-Matic? http://wordpress.org/extend/plugins/pubsubhubbub/
Intermediate & Advanced SEO | webestate
-
Google Disavow Tool - Waste of Time
My humble opinion is that Google's disavow tool is an utter waste of your time! My site, http://goo.gl/pdsHs, was penalized over a year ago after the SEO we hired used black hat techniques to increase ranking. Ironically, while we had visibility, Google itself had become a customer. (I guess the site was high quality, trustworthy and user friendly enough for Google employees to purchase from.) Soon enough the message about detecting unnatural links showed up in Webmaster Tools and, as expected, our rankings sank out of view. For a year we contacted webmasters, asking them to remove links pointing back to us (90% didn't respond, the other 10% complied). Work on our site continued, adding high quality, highly relevant unique content. Rankings never recovered, and neither did our traffic or business… Earlier this month, we learned about Google's "link disavow tool" and were excited! We hoped that by following the cleanup instructions and using the disavow tool, we would get a chance at recovery! We watched Matt Cutts' video, read the various forums/blogs/topics written about it, and then felt comfortable enough to use it. We went through our backlink profile, determined which links were spammy, seemed a result of black hat practices, or had been added by a third party possibly interested in our demise, and added them to a .txt file. We submitted the file via the disavow tool and followed with another reconsideration request. The result came a couple of weeks later: the same cookie-cutter email in WMT suggesting that there are "unnatural links" to the site. Hope turned to disappointment and frustration. It looks like the big box companies will continue to populate the top 100 results of ANY search, and the rest will help Google's shareholders… If your site has gotten in the algorithm's crosshairs, you have a better chance of recovering by changing your URL than messing around with this useless tool.
Intermediate & Advanced SEO | Prime85
-
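For reference, the disavow file discussed in this thread is a plain text file with one entry per line; a minimal sketch with made-up domains:

```
# Lines starting with # are comments.
# Disavow every link from an entire domain:
domain:spammy-directory-example.com
# Disavow a single linking page:
http://link-farm-example.net/page-linking-to-us.html
```

The file is uploaded through the Search Console disavow links tool; `domain:` entries cover all links from that domain, while bare URLs cover only the specific page.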
How does a competing website with clearly black hat style SEO tactics, have a far higher domain authority than our website that only uses legitimate link building tactics?
Through the SEOmoz link analysis tools, we looked at a competing website's external followed links and discovered a large number of links going to blog pages on domains with authorities in the 90s (the blog pages' authorities were between 40 and 60); however, the single blog post written by this website was exactly the same in every instance and had been posted in August 2011. Some of these blog sites had 160 or so links linking back to this competing website, whose domain authority is 49 while ours is 28; their MozTrust is 5.43 while ours is 5.18. Examples of the blogs that link to the competing website: http://advocacy.mit.edu/coulter/blog/?p=13 http://pest-control-termite-inspection.posterous.com/ However, many of these links are "nofollow" and yet still show up on Open Site Explorer as some of this competing website's top linking pages. Admittedly, they have 584 linking root domains while we have only 35, but if most of them are the kind of websites posted above, we don't understand how Google is rewarding them with a higher domain authority. Our website is www.anteater.com.au. Are these tactics now the only way to get ahead?
Intermediate & Advanced SEO | Peter.Huxley59