4XX client error with email address in URL
-
I have an unusual situation I have never seen before, and I did not set up the server for this client. The 4XX errors are a list of about 74 URLs similar to this:
http://www.websitename.com/about-us/info@websitename.com
I will be contacting the server host as well to troubleshoot this issue. Any ideas?
Thanks
-
Hi EliteVenu! I'm so glad Ryan pointed you in the right direction. If that turns out to fix the problem, would you mind marking one or both of his responses as a "Good Answer"?
-
Great! Glad I could help.
-
That gave me the right direction to look in! A social icon plugin didn't require the mailto: prefix in its dashboard settings (the field only said "enter your email address here"), and the theme wrote the bare address into an href in its code. I had looked at the source code but overlooked this small detail. I've removed the social icon email, so I'll see if that helps.
Thanks for the response!
-
Hi there! Tawny from the Help Team here - I think I can help provide a little bit of insight!
If you take a look at the Site Crawl report for this site's campaign and look at just the 4XX client errors, you'll see a Linking Page column in the table below the graph. That's the page from which our crawler arrived at the 404 page, and is where you can start looking for what went wrong.
I'd recommend taking a peek at that Linking Page's source code and searching for the email address - that's likely where you'll find the issue.
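If you'd rather do that search programmatically, here's a minimal Python sketch. The HTML snippet and address are hypothetical stand-ins based on this thread; point it at your own Linking Page's source:

```python
import re

# Hypothetical fragment of the Linking Page's source; the second
# href is missing the "mailto:" scheme, which is the bug we're hunting.
html = '''
<a href="mailto:sales@websitename.com">Sales</a>
<a href="info@websitename.com">Email us</a>
'''

# Find href values containing "@" that do NOT start with "mailto:"
bare_emails = re.findall(r'href="(?!mailto:)([^"]*@[^"]*)"', html)
print(bare_emails)  # ['info@websitename.com']
```

Any address that turns up in that list is being treated as a relative link rather than an email link.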
I hope this helps! Feel free to write in to us at help@moz.com if you still have questions and we'll do our best to help you out!
-
If that's what you're seeing, it looks like someone used a relative href link instead of a mailto: link for the email addresses on the about-us page.
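The mechanics behind that: an href with no scheme is resolved as a relative URL against the current page, both by browsers and by crawlers. A quick sketch with Python's standard library, using the hypothetical URLs from this question:

```python
from urllib.parse import urljoin

page = "http://www.websitename.com/about-us/"

# <a href="info@websitename.com"> has no scheme, so it resolves
# relative to the page, producing the 404 URL from the crawl report:
print(urljoin(page, "info@websitename.com"))
# http://www.websitename.com/about-us/info@websitename.com

# <a href="mailto:info@websitename.com"> has a scheme and is left alone:
print(urljoin(page, "mailto:info@websitename.com"))
# mailto:info@websitename.com
```

That's why the crawl report shows roughly one bad URL per page that carries the broken link.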