Hi all,
I am facing a crawl errors issue on a huge site (over 1 million pages) for which I am doing an on-page audit.
- 404 errors: >80,000
- Soft 404 errors: 300
- 500 errors: 1,600
All of the above are reported in GWT.
Many of the error URLs are simply not present on the pages they are reported as "linked from". I investigated a sample of those pages (and their HTML source) looking for any trace of the error links and found nothing.
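For context, this is roughly how I checked the sample, a minimal sketch only: the CSV file name and column names are my own assumptions, not any standard GWT export format.

```python
# Sketch: check whether reported broken URLs actually appear in the HTML
# source of the pages they are supposedly "linked from".
# Assumes a hypothetical "crawl_errors.csv" with columns: broken_url, linked_from_url.

import csv
import urllib.request
from urllib.parse import urlparse

def fetch_html(url, timeout=10):
    """Download the raw HTML of a page; return '' on any error."""
    try:
        req = urllib.request.Request(url, headers={"User-Agent": "audit-check/1.0"})
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.read().decode("utf-8", errors="replace")
    except Exception:
        return ""

def main():
    with open("crawl_errors.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            broken = row["broken_url"]
            source = row["linked_from_url"]
            html = fetch_html(source)
            # Check both the absolute URL and the bare path, since links
            # in the page source are often relative.
            path = urlparse(broken).path
            found = broken in html or (path and path in html)
            print(f"{source} -> {broken}: {'FOUND' if found else 'NOT FOUND'}")

if __name__ == "__main__":
    main()
```

Even with a check like this, the links simply do not show up in the delivered HTML.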
What would be the right way to address this issue from an SEO perspective? Clearly, I am not able to investigate the root cause myself, since I can only see the generated HTML and not what is behind it.
So my question is: Generally, what is the appropriate way of handling this?
- Telling the client that they have to investigate it themselves (I did my best to at least report the errors), or
- Engaging my firm further and getting a developer from my side to investigate?
Thanks in advance!!