404 broken URLs coming up in Google
-
When we search for our brand, we get the following results in google.com.au (see image attachment).
As outlined in red, there are listings in Google that lead to 404 Page Not Found URLs.
What can we do to prompt Google to recrawl the site, or to ensure that these broken URLs are no longer listed in Google?
Thanks for your help here!
-
Apologies for the delay in responding here. Thanks Andreas and Mike. We ended up doing just that: we redirected the 404 URLs and then requested a recrawl. Worked great! Thanks for your help.
-
Agreed. Go to Search Console, see what 404 errors Google is throwing your way, 301-redirect anything from that list that can and should be redirected to its most relevant equivalent on the live site, and then fetch and submit the site for a recrawl.
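For illustration, here's a minimal sketch of that redirect step in Python (Flask); the URL map and routes are hypothetical placeholders, and the same idea applies equally to .htaccess rules or any other server config:

```python
from flask import Flask, redirect, abort

app = Flask(__name__)

# Hypothetical map of dead URLs (from the Search Console 404 report)
# to their most relevant live equivalents.
REDIRECT_MAP = {
    "/old-test-page": "/products/current-page",
    "/retired-promo": "/promotions",
}

@app.route("/<path:path>")
def legacy_redirect(path):
    target = REDIRECT_MAP.get("/" + path)
    if target:
        # 301 (permanent) tells Google to transfer the listing
        # and its link equity to the new URL.
        return redirect(target, code=301)
    abort(404)  # anything unmapped stays a genuine 404
```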
OR, since the links you posted were for a test site: if that test version needs to stay up for internal testing purposes, you can potentially noindex the pages, resubmit them for crawling so the bots see the noindex, and then, after they've dropped out of the SERPs, update your robots.txt to disallow the folder those pages sit in. (Not sure if there's a better/quicker way to get them out of the SERPs if the site still needs to be live.)
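If the test site has to stay live, the noindex step could look like this sketch (Python/Flask; the /test/ folder is a hypothetical path): serve an X-Robots-Tag: noindex header on the test folder, which Google treats the same as a robots meta tag. Note the ordering described above: disallow the folder in robots.txt only after the pages have dropped out of the SERPs, because a disallow stops Googlebot from ever seeing the noindex.

```python
from flask import Flask, request

app = Flask(__name__)

@app.after_request
def noindex_test_area(response):
    # Hypothetical test folder; the header is equivalent to
    # <meta name="robots" content="noindex"> but needs no page-head access.
    if request.path.startswith("/test/"):
        response.headers["X-Robots-Tag"] = "noindex"
    return response
```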
-
You can 301-redirect them to the page that is now the best answer for the search query. That fixes the problem for the user immediately.
You can then go to Search Console and have the bot crawl the URL again (Fetch as Google); it will see the redirect, and you can submit it to Google.
Make sure your redirect target really is the best answer.
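A quick way to confirm a redirect behaves as intended before resubmitting, sketched with Python's requests library (the URL is a placeholder):

```python
import requests

# Placeholder URL for one of the broken listings.
resp = requests.get("https://www.example.com/broken-page",
                    allow_redirects=False, timeout=10)
print(resp.status_code)              # expect 301, not 302 or 200
print(resp.headers.get("Location"))  # expect the chosen replacement page
```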
Related Questions
-
If we migrate our URLs from HTTP to HTTPS, do I need to request inclusion in Google News again?
Hi, if we migrate our URLs from HTTP to HTTPS, do I need to request inclusion in Google News again? Thanks, Roy
Intermediate & Advanced SEO | kadut1
-
Changing URL to a subdomain?
Hi there, I have a website, www.footballshirtcollective.com, that has been live since July. It contains both content and eCommerce. I am now separating out the content so that:
1. The master domain is www.footballshirtcollective.com (content), pointing to a new site.
2. The subdomain is store.footballshirtcollective.com (eCommerce), pointing to the existing site.
What do you advise I do to minimise the impact on my search? Many thanks, Mike
Intermediate & Advanced SEO | mjmaxwell0
-
Google penalty was lifted when an SSL was added, only to come back. Have you seen this?
We took on a client who is under penalty (for various reasons). The solution was a ground-up new website with fresh content, domain, everything. This client has everything "right" in Google. Everything you would want a client to do, he's done: great reviews, great offsite engagement, video, etc. Recently we updated his SSL, and for about a week he came off penalty, only to go back on penalty (top three for every major term in a very competitive market, then back down to around page three). Do you have any experience with this? If so, I'd love to hear your advice and rationale for why this occurred and what it means.
Intermediate & Advanced SEO | mgordon0
-
When a site's entire URL structure changes, should we update the inbound links pointing to the old URLs?
We're changing our website's URL structure, which means all our site URLs will change. After this is done, do we need to update the old inbound external links to point to the new URLs? Yes, the old URLs will be 301-redirected to the new URLs too. Many thanks!
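Whatever you decide about the inbound links themselves, it's worth auditing that each old URL returns a single 301 hop to its mapped new URL after the migration. A rough sketch with Python's requests library, using placeholder URLs:

```python
import requests

# Placeholder old-to-new mapping from the URL-structure migration.
OLD_TO_NEW = {
    "https://www.example.com/old/page-a": "https://www.example.com/new/page-a",
    "https://www.example.com/old/page-b": "https://www.example.com/new/page-b",
}

for old, new in OLD_TO_NEW.items():
    r = requests.head(old, allow_redirects=False, timeout=10)
    hop = r.headers.get("Location")
    status = ("ok" if r.status_code == 301 and hop == new
              else f"check: {r.status_code} -> {hop}")
    print(f"{old}: {status}")
```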
Intermediate & Advanced SEO | Jade1
-
Ranking on a specific Google domain
Hi folks, a website with a .com TLD is hosted in the USA. The content is all in English, but now we want to focus on a specific Google property, such as google.co.uk. What must be done to rank better there? Is it enough to buy a .co.uk domain and set up the nameservers, or do we need British hosting? Thanks in advance. Mike
Intermediate & Advanced SEO | KillAccountPlease0
-
Page URL keywords
Hello everybody, I've read that it's important to put your keywords at the front of your page title, meta tags, etc., but my question is about the page URL. Say my target keywords are exotic, soap, natural, and organic. Will placing the keywords further back in the URL affect the SEO ranking? If so, how many of the first words does Google consider? For example, www.splendidshop.com/gift-set-organic-soap vs. www.splendidshop.com/organic-soap-gift-set: will the first be any less effective than the second simply because the keywords come later in the URL?
Intermediate & Advanced SEO | ReferralCandy0
-
Indexed nonexistent pages: the problem appeared after we 301-redirected url.com/index to url.com/.
I recently read that if a site has two live pages such as http://www.url.com/index and http://www.url.com/, they will come up as duplicates, and that it's best to 301-redirect http://www.url.com/index to http://www.url.com/. This is said to help avoid duplicate content and keep all the link juice on one page. We did the 301 for one of our clients and got about 20,000 errors for pages that do not exist. The errors are for pages that are indexed but do not exist on the server. We assume these indexed (nonexistent) pages are somehow linked to http://www.url.com/index. The links return 200 OK. We removed the 301 redirect from the http://www.url.com/index page; however, we are now back to having two identical pages, www.url.com/index and http://www.url.com/. What is the best way to solve this issue?
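One alternative sometimes suggested when a redirect misbehaves like this is to leave both paths live and declare a canonical via the HTTP Link header, which Google honors like an in-page rel=canonical. A sketch assuming a Flask-style app, with the question's placeholder domain and a placeholder handler body:

```python
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/")
@app.route("/index")
def home():
    resp = make_response("<html>...home page...</html>")  # placeholder body
    # Both URLs keep serving, but Google consolidates signals on the root.
    resp.headers["Link"] = '<http://www.url.com/>; rel="canonical"'
    return resp
```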
Intermediate & Advanced SEO | Bryan_Loconto0
-
Googlebot vs. Google mobile bot
Hi everyone 🙂 I seriously hope you can come up with an idea for a solution to the problem below, because I am kinda stuck 😕

Situation: A client of mine has a webshop located on a hosted server. The shop is built in a closed CMS, meaning I have very limited options for changing the code: limited access to the page head, and within the CMS I can only use JavaScript and HTML. The only place I have access to a server-side language is in the root, where a Default.asp file redirects the visitor to the specific folder where the webshop is located. The webshop has two "languages"/store views: one for normal browsers and Googlebot, and one for mobile browsers and the mobile Googlebot. In Default.asp (ASP Classic) I test the user agent and redirect the user to either the main domain or the mobile subdomain. All good, right?

Unfortunately not. Now we arrive at the core of the problem. Since the mobile shop was added at a later date, Google already had most of the pages from the shop in its index, and apparently uses them as entrance pages to crawl the site with the mobile bot. Hence it never sees Default.asp (or outright ignores it), and this causes, as you might have guessed, a huge pile of duplicate content.

Normally you would just place some user-agent detection in the page head and throw Google either a 301 or a rel=canonical. But since I only have access to JavaScript and HTML in the page head, this cannot be done. I'm kinda running out of options quickly, so if anyone has an idea as to how the BEEP! I get Google to index the right domains for the right devices, please feel free to comment. 🙂 Any and all ideas are more than welcome.
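For reference, the user-agent routing that Default.asp performs amounts to something like the following sketch, shown in Python rather than ASP Classic; the domain names and user-agent tokens here are assumptions, not the client's actual values:

```python
MOBILE_TOKENS = ("Mobile", "Android", "iPhone", "Googlebot-Mobile")

def pick_store(user_agent: str) -> str:
    # Mobile browsers and the mobile Googlebot get the mobile subdomain;
    # everything else gets the main shop folder.
    if any(token in user_agent for token in MOBILE_TOKENS):
        return "http://m.example-shop.com/shop/"
    return "http://www.example-shop.com/shop/"

print(pick_store("Googlebot/2.1 (+http://www.google.com/bot.html)"))
print(pick_store("Googlebot-Mobile/2.1"))
```

The catch described above is that this logic only runs when a visitor enters through the root, so a bot entering directly on an indexed deep URL bypasses it entirely.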
Intermediate & Advanced SEO | ReneReinholdt0