:443 - 404 error
-
I get strange :443 errors in my 404 monitor on WordPress:
https://www.compleetverkleed.nl:443/hoed-al-capone-panter-8713647758068-2/
https://www.compleetverkleed.nl:443/cart/www.compleetverkleed.nl/feestkleding
https://www.compleetverkleed.nl:443/maskers/
I have no idea where these come from :S
-
These are normal 404 errors - the :443 after the URL just indicates that it's using the HTTPS port (you can add :443 after each of your HTTPS URLs - they redirect to the normal ones). I don't know why WordPress is adding the :443.
Check whether your site has internal links to these 404 pages (try Screaming Frog or another crawl tool). These pages seem to have existed in the past - check this example: http://webcache.googleusercontent.com/search?q=cache:_M5jZnMoFmcJ:https://www.compleetverkleed.nl/hoed-al-capone-panter-8713647758068-2/+&cd=2&hl=nl&ct=clnk&gl=nl
If no internal or external links to these pages exist, this problem should disappear after a while.
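A quick way to verify that first point is a small sketch like the one below - purely an illustration, assuming Python 3 with the requests library installed (neither is mentioned in the thread). It fetches each reported URL with and without the :443 suffix and prints the status code plus the final URL after redirects:

# Sketch: confirm that the :443 URLs and the plain https URLs behave the same.
# Assumes Python 3 and the third-party "requests" library (pip install requests).
import requests

reported = [
    "https://www.compleetverkleed.nl:443/hoed-al-capone-panter-8713647758068-2/",
    "https://www.compleetverkleed.nl:443/cart/www.compleetverkleed.nl/feestkleding",
    "https://www.compleetverkleed.nl:443/maskers/",
]

for url in reported:
    plain = url.replace(":443", "", 1)  # :443 is the default HTTPS port, so this is the same resource
    for candidate in (url, plain):
        try:
            response = requests.get(candidate, allow_redirects=True, timeout=10)
            print(f"{response.status_code}  {candidate}  ->  {response.url}")
        except requests.RequestException as exc:
            print(f"ERROR  {candidate}  ({exc})")

If both variants of a URL come back with the same status (a 404, or a redirect to a live page), the :443 is cosmetic and the only real question is where the 404 link comes from.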
rgds,
Dirk
Related Questions
-
403 Errors Issue
Hi, all! I've been working with a WordPress site that I inherited that gets little to no organic traffic, despite being content-rich, optimized, etc. I know there's something wrong on the back end but can't find a satisfactory culprit. When I emulate Googlebot, most pages give me a 403 error. Also, Google will not index many URLs, which makes sense and is a massive headache. All advice appreciated! The site is https://www.diamondit.pro/ - it is specific to WP Engine, using GES (Global Edge Security) and WPWAF.
Technical SEO | SimpleSearch0
60,000 404 errors
Do 404 errors on a large scale really matter? I'm just aware that I now have over 60,000 and was wondering if the community thinks that I should address them by putting 301 redirects in place. Thanks
Technical SEO | the-gate-films0
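If the answer ends up being "yes, redirect the ones that still matter", one rough way to handle that volume is to generate the rules from a mapping file rather than by hand. The sketch below is only an illustration - it assumes Python 3, an Apache server that honours .htaccess, and a hypothetical redirects.csv with old_path,new_url rows; none of that comes from the question itself:

# Sketch: turn a CSV of old-path -> new-URL mappings into Apache "Redirect 301" lines.
# The file names and the CSV layout are assumptions for illustration only.
import csv

with open("redirects.csv", newline="") as src, open("redirects.htaccess", "w") as out:
    for old_path, new_url in csv.reader(src):
        # mod_alias syntax, e.g. "Redirect 301 /old-product/ https://www.example.com/category/"
        out.write(f"Redirect 301 {old_path} {new_url}\n")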
Manual Webspam Error. Same Penalty on all sites on Webmaster Tools account.
My URL is: www.ebuzznet.com. Today when I checked Webmaster Tools under the manual spam section, I got a manual spam action and the reason was "Thin content with little or no added value". Then I checked the other sites in the same Webmaster Tools account; there are 11 sites, and all of them received the same manual action. I never received any mail, and there was no notification in the site messages section regarding this manual action. I just need confirmation whether it is something to do with an error in Webmaster Tools, or whether all of the sites really received manual spam actions. Most of the articles on the sites are above 500 words of quality content (not spun or copied). Looking for suggestions, answers.
Technical SEO | ndroidgalaxy0
404 errors in Webmaster - should I 301 all pages?
Currently working on a retail site that shows over 1200 404 errors coming from URLs for products that were on the site but have now been removed, as they are seasonal/out of stock. What is the best way of dealing with this situation ongoing? I am aware of the fact that these 404s are being marked as URL errors in Google Webmaster. Should I redirect these 404s to a more appropriate live page, or should I leave them as they are and not redirect them? I am concerned that Google may give the site a penalty as these 404s are growing (the site is an online retail store and has products removed from its page results regularly). I thought Google was able to recognise 404s and, after a set period of time, would push them out of the error report. Also, is there a tool out there that I can run all the 404 URLs through en masse to see their individual page strength and the number of links that point at each one? Thanks.
Technical SEO | Oxfordcomma0
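As a rough sketch of the "run them all through something" idea - assuming Python 3 with the requests library and a hypothetical 404-list.txt exported from the error report; page strength and link counts would still need a backlink tool such as Moz or Ahrefs - a bulk status check could look like this:

# Sketch: bulk-check the HTTP status of every URL in a text file (one URL per line).
# Assumes Python 3 + the "requests" library; the input filename is hypothetical.
import requests

with open("404-list.txt") as handle:
    urls = [line.strip() for line in handle if line.strip()]

for url in urls:
    try:
        response = requests.head(url, allow_redirects=True, timeout=10)
        print(f"{response.status_code}  {url}  ->  {response.url}")
    except requests.RequestException as exc:
        print(f"ERROR  {url}  ({exc})")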
Massive Increase in 404 Errors in GWT
Last June, we transitioned our site to the Magento platform. When we did so, we naturally got an increase in 404 errors for URLs that were not redirected (for a variety of reasons: we hadn't carried the product for years, Google no longer got the same string when it did a "search" on the site, etc.). We knew these would be there and were completely fine with them. We also got many 404s due to the way Magento had implemented their site map (putting in products that were not visible to customers, including all the different file paths to get to a product even though we use a flat structure, etc.). These were frustrating, but we did custom work on the site map and let Google resolve those many, many 404s on its own. Sure enough, a few months went by and GWT started to clear out the 404s. All the poor, nonexistent links from the site map and missing links from the old site started disappearing from the crawl notices, and we slowly went from some 20k 404s to 4k 404s. Still a lot, but we were getting there. Then, in the last 2 weeks, all of those links started showing up again in GWT and reporting as 404s. Now we have 38k 404s (way more than ever reported). I confirmed that these bad links are not showing up in our site map or anything, and I'm really not sure how Google found these again. I know, in general, these 404s don't hurt our site. But it just seems so odd. Is there any chance Google bots just randomly crawled a big ol' list of outdated links it hadn't tried for a while? And does anyone have any advice for clearing them out?
Technical SEO | Marketing.SCG0
My 404 page shows in the report as an error.
How can I make my actual 404 page not show up as a 404 error in the report?
Technical SEO | LindseyNewman0
4xx error - but no broken links found by Xenu
In my SEOmoz crawl report I get multiple 4xx errors reported, and they are all on the same type of link: www.zylom.com/nl/help/contact/9/ - they only differ in the number at the end and the language. But if I look in the source code, it simply says:
<a class="bigbuttonblue" style="float:right; margin-left:10px;" href="/nl/help/contact/9/?sid=9&e=login" onfocus="blur()" title="contact">contact</a>
I already tested the little helpful tool Xenu, but this also doesn't give any broken links for the URLs which I found in the 4xx error report. Could somebody give me a suggestion why these 4xx errors keep coming? Could it be that the SEOmoz crawlers break the '?sid=9&e=login' part off the URL? Because if you want to enter the link, you first get a pop-up to fill in a login screen. Thanks for your answers already.
Technical SEO | Letty0
Disappeared from Google within 2 hours of Webmaster Tools error
Hey guys, I'm trying not to panic but... we had a problem with Google indexing some of our secure pages; hitting those pages made browsers fire up security warnings, so I asked our web dev to have a look at it. He made the changes below, and within 2 hours the site has dropped off the face of Google:
"In Webmaster Tools I asked it to remove any https://freestylextreme.com URLs"
"I cancelled that before it was processed"
"I then set up the robots.txt to respond with a disallow all if the request was for an https URL"
"I've now removed robots.txt completely"
"and resubmitted the main site from Webmaster Tools"
I've read a couple of blog posts and they all say to remain calm, test the fetch bot in Webmaster Tools (which is all good) and just wait for Google to reindex. Do you guys have any further advice? Ben
Technical SEO | elbeno1