404s in Google Webmaster Tools
-
I redesigned my website with new URLs over six months ago, but Google Webmaster Tools still shows my old URLs with a 404 response code and still crawls those pages.
How do I make sure they no longer appear in Webmaster Tools and no longer get crawled?
Or should I set up redirects?
Thank you,
-
I agree with John's response, but I'd add two points:
1. You can also set redirects manually via a .htaccess file if you're using Joomla (which runs on Apache). The format is very simple:
redirect 301 /oldpage.html /newpage.html
It takes some time, but I'm generally of the opinion that it's worth it when these appear in GWMT.
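To expand on that format a little: a sketch of what a small .htaccess might look like, assuming Apache with mod_alias enabled (Joomla on Apache reads .htaccess by default). The paths here are hypothetical examples, not anyone's actual URLs:

```apache
# One-off redirects for individual moved pages
Redirect 301 /oldpage.html /newpage.html
Redirect 301 /about-us.html /about.html

# Pattern-based redirect for a whole moved section
# (regex capture: /old-blog/anything.html -> /blog/anything.html)
RedirectMatch 301 ^/old-blog/(.*)$ /blog/$1
```

The one-liners handle individual pages; RedirectMatch is worth knowing when an entire directory moved, so you don't need a rule per page.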
2. There's a feature in Webmaster Tools now to mark the errors as 'resolved'. That can help clean things up for you. If the errors aren't really resolved, Google states in the help documentation that they will eventually reappear for you to find again.
-
If you have new pages that are similar to your old pages, the old pages no longer need to exist. Usually this involves putting redirects in your server configuration file. I've never used Joomla, but I believe that file would live outside of it (e.g. in your Apache configuration).
-
Your 404s should filter out over time, as long as you are not linking to them or listing them in your XML sitemap. It can take quite a while. 301 them if you have proper pages to redirect to; if you don't, just let it play out.
Google's stance on this is that 404s are natural: they are more of an awareness alert than a huge issue. If you see a spike, make sure it is supposed to be happening.
-
That is what I thought, but how do I redirect those pages, given that they no longer exist on my Joomla website?
-
You should do redirects for these pages if you can. The probable reason they keep showing up and getting crawled is that people linked to those pages before you redid your site, and Google keeps coming across those links. If you 301 redirect them to their new counterparts, people who click those links will reach the content they're looking for, and the link juice from those links will pass to the new pages, so the new pages should also rank better.
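If the old site had many URLs, writing each redirect rule by hand gets tedious. As a rough sketch (not Joomla-specific; the old-to-new mapping below is hypothetical, something you'd build yourself from your old XML sitemap or the 404 report in Webmaster Tools), a few lines of Python can generate the .htaccess rules:

```python
# Generate Apache "Redirect 301" rules from a mapping of old paths to new ones.
# The url_map below is a made-up example -- replace it with your own mapping.

def generate_redirects(mapping):
    """Return .htaccess lines, one 'Redirect 301 old new' rule per entry."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())
    )

url_map = {
    "/oldpage.html": "/newpage.html",
    "/old-contact.html": "/contact.html",
}

print(generate_redirects(url_map))
```

Paste the output into the .htaccess file at your site root; Apache picks the rules up without a restart.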
Related Questions
-
Any submission tool available?
Hello, is there any submission tool in Moz that can help submit a website to different search engines like Google, Yahoo, and Bing? Please reply. Thanks in advance.
Intermediate & Advanced SEO | Sanjayth
-
Google and PDF indexing
It was recently brought to my attention that one of the PDFs on our site wasn't showing up when searching for a particular phrase within the document. The user was trying to search only within our site. Once I removed the site restriction, I noticed that there was another site using the exact same PDF. It appears Google is indexing that PDF but not ours. The name, title, and content are the same. Is there any way to get around this? I find it interesting, as we use GSA, and within GSA it shows up for the phrase. I have to imagine Google is saying that it already has the PDF and is therefore ignoring our copy. Any tricks to get around this? BTW, both sites rightfully should have the PDF. One is a client site, and they are allowed to host the PDFs created for them. However, I'd like Mathematica to also be listed.
Query with no site restriction (notice that Teach For America comes up #1 and Mathematica is not listed): https://www.google.com/search?as_q=&as_epq=HSAC_final_rpt_9_2013.pdf&as_oq=&as_eq=&as_nlo=&as_nhi=&lr=&cr=&as_qdr=all&as_sitesearch=&as_occt=any&safe=images&tbs=&as_filetype=pdf&as_rights=&gws_rd=ssl#q=HSAC_final_rpt_9_2013.pdf+"Teach+charlotte"+filetype:pdf&as_qdr=all&filter=0
Query with site restriction (notice that it doesn't find the phrase and falls back to any of the words): https://www.google.com/search?as_q=&as_epq=HSAC_final_rpt_9_2013.pdf&as_oq=&as_eq=&as_nlo=&as_nhi=&lr=&cr=&as_qdr=all&as_sitesearch=&as_occt=any&safe=images&tbs=&as_filetype=pdf&as_rights=&gws_rd=ssl#as_qdr=all&q="Teach+charlotte"+site:www.mathematica-mpr.com+filetype:pdf
Intermediate & Advanced SEO | jpfleiderer
-
Google Webmaster Tools smartphone errors fix
I have certain URLs that I had fixed before in Google Webmaster Tools. With the smartphone addition, they started appearing again. How can I fix the Google Webmaster Tools errors for smartphones?
Intermediate & Advanced SEO | csfarnsworth
-
Blocked from Google
Hi, I used to get a lot of traffic from Google, but suddenly there was a problem with the website and it seems to be blocked. We are also in the middle of changing the root domain because we are making a new webpage. I have looked at Webmaster Tools and corrected all the errors, but the page is still not visible in Google. I have also ordered a new crawl. Anyone have any tricks? Do I lose a lot when I move the domain name, or is this a good thing in this matter? The old one is smakenavitalia.no. The new one is Marthecarrara.no. Best regards, Svein Økland
Intermediate & Advanced SEO | sveinokl
-
Google's Structured Data Testing Tool? No Data
I'm stumped as to why some of the pages on my website return no data from Google's Structured Data Testing Tool while other pages work fine and return the appropriate data. My home page http://www.parkseo.net returns no data, while many inner pages do: http://www.parkseo.net returns no data, but http://www.parkseo.net/citation-submission.html does return data. I have racked my brain trying to figure out why some pages return data and others don't. Any help on this issue would be greatly appreciated. Cheers!
Gary Downey
Intermediate & Advanced SEO | YMD
-
How to Block Google Preview?
Hi, our site works very well for JavaScript-on users; however, many pages are loaded via AJAX and are inaccessible with JS off. I'm looking to make this content available with JS off so search engines can access it, but we don't have the dev time to make it 'pretty' for JS-off users. The idea is to make the pages accessible with JS off, but when requested by a user with JS on, forward the user to the 'pretty' AJAX version. The content (text, images, links, videos, etc.) is exactly the same, but it's an enormous amount of effort to make the JS-off version 'pretty', and I can't justify the development time to do this. The problem is that Googlebot will index this page and show a preview of the ugly JS-off page in its results, which isn't good for the brand. Is there a method or meta tag that can be used to stop the preview but still have the page cached? My current options are to use meta noarchive or "Cache-Control" content="no-cache" to ask Google to stop caching the page completely, but I wanted to know if there was a better way of doing this. Any ideas, guys and girls? Thanks, FashionLux
Intermediate & Advanced SEO | FashionLux
-
Google badge extracted to SERPs
A while ago I read (or thought I read) the following about the Google badge. The implementation guide is at https://developers.google.com/+/plugins/badge/; however, I was under the impression that the Google badge could thereafter be extracted into SERPs so the user could follow, etc., directly from the search results. I can't find anything confirming this. I think it might clash with authorship data, which does a similar job, but where a site page is not relevant to authorship at all, I would have thought linking back to the Google+ page from SERPs was a sensible option. Can anyone confirm the Google badge can appear in SERPs?
Intermediate & Advanced SEO | richcowley