Mystery 404s
-
I have a large number of 404s that all share a similar structure: www.kempruge.com/example/kemprugelaw. "kemprugelaw" keeps getting stuck on the end of URLs. While I created www.kempruge.com/example/, I never created the www.kempruge.com/example/kemprugelaw page or edited the permalinks to put "kemprugelaw" at the end of the URL. Any idea how this happens, and what I can do to make it stop?
Thanks,
Ruben
-
One by one is fine with me. I'd much prefer that to screwing up the site.
Thanks again,
Ruben
-
Hi Ruben
I'm glad that has helped you.
There is one way you could do multiple updates, BUT I would not recommend it, as doing it wrong could screw up your site. You could do it via your site's hosting control panel by querying your MySQL database in phpMyAdmin and running a bulk search-and-replace for all references to www.kempruge.com that don't have http:// in front, replacing www.kempruge.com with http://www.kempruge.com.
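As a rough sketch of the kind of statement this involves (table and column names assume a default WordPress install with the wp_ prefix; take a full database backup before trying anything like it):

```sql
-- Sketch only: replace scheme-less self-links in post bodies.
-- Assumes the default wp_posts table / post_content column; links stored
-- elsewhere (widgets, menus, custom fields) would need their own statements.
UPDATE wp_posts
SET post_content = REPLACE(
        post_content,
        'href="www.kempruge.com',
        'href="http://www.kempruge.com'
    );
```

Anchoring the search on `href="` keeps the replace from touching plain-text mentions of the domain, but it will still miss links with unusual markup, which is one more reason the manual page-by-page fix is safer.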
Although it is a pain, I know, the best way is to fix the errors one by one in the pages themselves and leave the redirects running until you are sure that Google, Bing and Yahoo have updated their indexes; then you can remove them.
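On the worry about 200-odd individual redirects: when the bad URLs all share the same shape (a good URL with "kemprugelaw" stuck on the end), one pattern rule can stand in for many one-off Redirect 301 lines while you fix the pages. A sketch, assuming Apache with mod_rewrite available and rules placed above WordPress's own rewrite block in the root .htaccess:

```apache
# Sketch only: send any URL ending in /kemprugelaw back to its parent URL
# with a single 301, instead of one Redirect line per page.
RewriteEngine On
RewriteRule ^(.*)/kemprugelaw/?$ /$1/ [R=301,L]
```

Test a rule like this on a staging copy first; a pattern that is too broad can redirect pages you didn't intend.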
If you copy http:// onto your Mac/PC clipboard, it will make it quicker to open the link dialog and paste it at the start of each URL.
Peter
-
Peter,
You're a genius! I'm almost certain that's it, because I can't remember adding "http://". Is there a way to get rid of those pages? I just 301-redirected them to where they are supposed to go, but I have a lot of redirects. When I say a lot, I mean a lot relative to how many pages we have: 500-something indexed pages and probably 200-something redirects. I know that many redirects slow our site down. I'd like to know if there's a better option than the 301s, if I can't just delete them.
Thanks,
Ruben
-
Hi Ruben
You mentioned: In GWT, the 404s are slightly different. They are www.kempruge.com/example/www.kempruge.com
I have seen this type of thing, or something similar, before when an absolute link has been entered in anchor text, or by itself, without http:// in front of it.
So the link has been entered as www.mydomain.com, which causes the error, but it should be entered as http://www.mydomain.com.
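A quick sketch of why this happens: browsers and crawlers resolve a scheme-less href as a path relative to the current page, which reproduces exactly the malformed URLs above. Python's urljoin follows the same resolution rules (RFC 3986), so it can demonstrate the effect:

```python
from urllib.parse import urljoin

# href="www.kempruge.com" on a page under /example/ is treated as a
# relative path, so it gets appended to the current directory:
broken = urljoin("http://www.kempruge.com/example/", "www.kempruge.com")
print(broken)  # http://www.kempruge.com/example/www.kempruge.com

# With the scheme included, the same link resolves as intended:
fixed = urljoin("http://www.kempruge.com/example/", "http://www.kempruge.com")
print(fixed)   # http://www.kempruge.com
```

The same logic explains the other variant: a link entered as "kemprugelaw" with no scheme or leading slash would resolve to /example/kemprugelaw.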
Your issue may be something completely different, but I thought I would post this as a possible solution.
Peter
-
In GWT, the 404s are slightly different. They are www.kempruge.com/example/www.kempruge.com
In BWT, it's the www.kempruge.com/example/kemprugelaw
In GWT, they say the 404s are coming from my site, but I couldn't find where it says that in BWT.
Any thoughts? And thanks for helping out; this has been bothering me for a while.
Ruben
-
It says it in Webmaster Tools; does that matter? I'm going to check where they come from now. Also, I know my sitemap 404s, but I can't figure out what happened. If you go to http://www.kempruge.com/category/news/feed/, that's my sitemap. How it got changed to that, I have no idea. Plus, I can't find that page in the WP backend to change the URL back to the old one.
I tried redirecting the proper sitemap name to the one that works, but that didn't seem to help.
-
I crawled your site and didn't see the 404 errors.
I did notice that the sitemap referenced in your robots.txt 404s, so you may want to take a look at that.
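For reference, the robots.txt sitemap line looks like the sketch below; the filename here is a placeholder, so match it to whatever URL your sitemap plugin actually serves (Yoast, for example, generates /sitemap_index.xml):

```
# robots.txt - hypothetical sitemap location; point this at the URL
# your SEO/sitemap plugin really generates.
Sitemap: http://www.kempruge.com/sitemap.xml
```

The Sitemap directive must be a full absolute URL, and search engines will simply ignore it if the target returns a 404.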
-
Are you seeing these 404s in Webmaster Tools or when crawling the site?
If WMT, where does it say the 404 is linked from? Click on the URL with the 404 error in WMT and select the "Linked from" tab.
Crawl the site with Screaming Frog with your user agent set to Googlebot. See if the same 404 errors are picked up; if so, click on them and select the "In Links" tab to see which page the 404 is being found on.
I checked the source code of some of the pages on www.kempruge.com and didn't see any relative links, which usually create problems like this. My bet is on another site scraping yours and creating 404 errors when it links back to your site.