Very weird pages: 2,900 403 errors in a crawl of a site that only has 140 pages.
-
Hi there,
I just ran a crawl of one of my clients' websites with the Moz crawl tool.
I got 2,900 403 errors, and there are only 140 pages on the website.
Here are some examples of the URLs the crawl errors give me:
http://www.mysite.com/en/www.mysite.com/en/en/index.html#?lang=en
http://www.mysite.com/en/www.mysite.com/en/en/en/index.html#?lang=en
http://www.mysite.com/en/www.mysite.com/en/en/en/en/index.html#?lang=en
http://www.mysite.com/en/www.mysite.com/en/en/en/en/en/index.html#?lang=en
http://www.mysite.com/en/www.mysite.com/en/en/en/en/en/en/index.html#?lang=en
http://www.mysite.com/en/www.mysite.com/en/en/en/en/en/en/en/en/en/en/en/en/index.html#?lang=en
http://www.mysite.com/en/www.mysite.com/en/en/en/en/en/en/en/en/en/en/en/en/en/index.html#?lang=en
There are 2,900 pages like this.
I have tried visiting the pages and they load, but they are bare HTML pages without any CSS.
Can you guys help me figure out what the problem is? We have experienced huge drops in traffic since September.
-
Thank you so much for your response!
Yes. Could you please email me at eliotostiguy@gmail.com? I can give you the URL via email.
-
Almost right, but not quite: the 403 error is only served once a URL is accessed. The content may not be accessible (as it's forbidden), but the URL itself still is, so the crawler has to request each one. Whilst it's unlikely that these URLs would ever be indexed, there's still an infinite loop in the link architecture, which could impact crawl allowance and site health metrics.
I'd get it sorted out!
-
But 403 is a Forbidden error, so those pages wouldn't be getting accessed by Google. Google can't access them, which in this case is a good thing, right?
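One thing worth checking, though, since the original poster says these pages load fine in a browser: the server may be returning 200 to browsers but 403 to crawler user agents. A minimal way to compare, assuming Python with the requests library (the URL and user-agent strings here are illustrative placeholders, not exact bot signatures):
```python
import requests

url = "http://www.mysite.com/en/www.mysite.com/en/en/index.html#?lang=en"

# Compare what a browser-like client and a crawler-like client receive.
for label, ua in [
    ("browser", "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"),
    ("crawler", "Mozilla/5.0 (compatible; rogerbot/1.0; +http://moz.com/help)"),
]:
    resp = requests.get(url, headers={"User-Agent": ua}, timeout=10)
    print(f"{label}: HTTP {resp.status_code}")
```
If the two statuses differ, the 403s are coming from user-agent filtering (a firewall or security rule) rather than from the pages themselves.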
-
This is almost assuredly a link-based architectural error. It will be something similar to this:
- You load a page on EN
- You click the EN flag or language icon
- Instead of just reloading the page you are already on (since you're already on EN), the link is coded wrong and adds another /en/ layer to the URL
- Once the new URL loads, the problem repeats
- This creates an infinite number of URLs on your site
- That's bad for Google, and for Moz's crawler
I'd bet it's something like that. If you give me the exact URL, I might even be able to find the flaw and detail it for you via email.
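For illustration, here's a hypothetical sketch of those mechanics in Python (the site's actual markup isn't shown, so the hrefs below are assumptions): a link written without a scheme or leading slash resolves relative to the current directory, which reproduces exactly the nesting pattern in the crawl report.
```python
from urllib.parse import urljoin

page = "http://www.mysite.com/en/index.html"

# A link coded as "www.mysite.com/en/index.html" (missing "http://")
# is treated as a relative path, so it nests under the current /en/ directory:
broken = urljoin(page, "www.mysite.com/en/index.html")
print(broken)
# -> http://www.mysite.com/en/www.mysite.com/en/index.html

# Following a relative "en/index.html" link from that page nests again,
# matching the crawl report's pattern:
print(urljoin(broken, "en/index.html"))
# -> http://www.mysite.com/en/www.mysite.com/en/en/index.html

# A root-relative href (leading slash) can't loop, no matter where it's clicked:
print(urljoin(broken, "/en/index.html"))
# -> http://www.mysite.com/en/index.html
```
Rewriting the language links as root-relative paths or fully qualified URLs would break the loop.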
-
Hi there,
Thanks so much for reaching out - Sam from Moz's Help Team here!
I've taken a look into your campaign and crawl, and I'm going to reach out to you directly from help@moz.com about this. I'll be in touch soon!