Mystery 404s
-
I have a large number of 404s that all have a similar structure: www.kempruge.com/example/kemprugelaw. "kemprugelaw" keeps getting stuck on the end of URLs. While I created www.kempruge.com/example/, I never created the www.kempruge.com/example/kemprugelaw page or edited permalinks to put kemprugelaw at the end of the URL. Any idea how this happens, and what I can do to make it stop?
Thanks,
Ruben
-
One by one is fine with me. I'd much prefer that to screwing up the site.
Thanks again,
Ruben
-
Hi Ruben
I'm glad that has helped you.
There is one way you could do multiple updates, BUT I would not recommend it, as doing it wrong could screw up your site. You could do it via the control panel in your site's hosting by querying your MySQL database in phpMyAdmin and doing a bulk search-and-replace for all references to www.kempruge.com that don't have http:// in front, replacing www.kempruge.com with http://www.kempruge.com.
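If you ever did go that route, take a full database backup first. A very rough sketch of what such a query might look like is below; it assumes a standard WordPress install with the default wp_ table prefix and that the bad links sit in post content as href="www.kempruge.com..." attributes, so treat it as an illustration rather than something to run as-is.

```sql
-- Sketch only: back up the database before running anything like this.
-- Assumes the default WordPress "wp_" table prefix; adjust if yours differs.
-- Rewrites protocol-less links such as href="www.kempruge.com/..."
-- into absolute links href="http://www.kempruge.com/..." in post content.
-- Links that already start with http:// are left untouched.
UPDATE wp_posts
SET post_content = REPLACE(
    post_content,
    'href="www.kempruge.com',
    'href="http://www.kempruge.com'
)
WHERE post_content LIKE '%href="www.kempruge.com%';
```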
Although it is a pain, I know, the best way is to fix the errors one by one in the pages themselves and leave the redirects running until you are sure that Google, Bing and Yahoo have updated their indexes; then you can remove them.
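Each of those redirects is just an ordinary 301 rule. As a rough sketch (assuming your redirects live in an Apache .htaccess file with mod_alias enabled, rather than in a plugin, and using one of the bad URLs from this thread as a stand-in), a single rule might look like this:

```apache
# Sketch only: assumes Apache with mod_alias enabled.
# Sends the hypothetical bad URL to the page it should have pointed at.
Redirect 301 /example/kemprugelaw http://www.kempruge.com/example/
```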
If you copy http:// onto your Mac/PC clipboard, it will be quicker to open the link dialog and paste it at the start of each URL.
Peter
-
Peter,
You're a genius! I'm almost certain that's it, because I can't remember adding "http://". Is there a way to get rid of those pages? I just 301 redirected them to where they are supposed to go, but I have a lot of redirects. When I say a lot, I mean a lot relative to how many pages I have. We have 500-something indexed pages and probably 200-something redirects. I know that many redirects slow our site down. I'd like to know if there's any better option than the 301s, if I can't just delete them.
Thanks,
Ruben
-
Hi Ruben
You mentioned: "In GWT, the 404s are slightly different. They are www.kempruge.com/example/www.kempruge.com"
I have seen this type of thing, or something similar, before when a link meant to be absolute has been entered into anchor text (or by itself) without adding http:// before it, so the browser treats it as a relative URL and appends it to the current page's path.
So the link has been entered as www.mydomain.com - which causes the error - but it should be entered as http://www.mydomain.com
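As a quick illustration (www.mydomain.com is just the placeholder domain from the example above), the difference in the markup looks like this:

```html
<!-- Wrong: no scheme, so the browser treats this as a relative path.
     On http://www.kempruge.com/example/ it resolves to
     http://www.kempruge.com/example/www.mydomain.com and 404s. -->
<a href="www.mydomain.com">My link</a>

<!-- Right: the scheme makes it an absolute URL. -->
<a href="http://www.mydomain.com">My link</a>
```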
Your issue may be something completely different, but I thought I would post this as a possible solution.
Peter
-
In GWT, the 404s are slightly different. They are www.kempruge.com/example/www.kempruge.com
In BWT, it's www.kempruge.com/example/kemprugelaw.
In GWT, they say the 404s are coming from my site, but I couldn't find where it says that in BWT.
Any thoughts? Thanks for helping out; this has been bothering me for a while.
Ruben
-
It says it in Webmaster Tools; does that matter? I'm going to check where they're coming from now. Also, I know my sitemap 404s, but I can't figure out what happened. If you go here, http://www.kempruge.com/category/news/feed/, that's my sitemap. How it got changed to that, I have no idea. Plus, I can't find that page in the WordPress backend to change the URL back to the old one.
I tried redirecting the proper sitemap name to the one that works, but that didn't seem to work.
-
I crawled your site and didn't see the 404 errors.
I did notice that the sitemap referenced in your robots.txt 404s, so you may want to take a look at that.
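For reference, the sitemap is normally declared in robots.txt with a single line like the one below; the URL here is only a placeholder, so point it at whichever sitemap address actually loads on your site:

```
# Sketch only: the sitemap URL below is a placeholder.
Sitemap: http://www.kempruge.com/sitemap.xml
```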
-
Are you seeing these 404s in Webmaster Tools or when crawling the site?
If it's WMT, where does it say the 404 is linked from? Click on the URL with the 404 error in WMT and select the "Linked from" tab.
Crawl the site with Screaming Frog with your user agent set to Googlebot. See if the same 404 errors are being picked up; if so, you can click on them and select the "In Links" tab to see which page each 404 is being linked from.
I checked the source code of some of the pages on www.kempruge.com and didn't see any relative links, which usually create problems like this. My bet is on a scraper site copying your content and creating 404 errors when it links back to your site.