Sitemap issue - Tons of 404 errors
-
We've recreated a client's site in a subdirectory (mysite.com/newsite) of his domain, and when it was ready to go live we added code to the .htaccess file in order to display the revamped website on the main URL. These are the directions that were followed: http://codex.wordpress.org/Giving_WordPress_Its_Own_Directory and http://codex.wordpress.org/Moving_WordPress#When_Your_Domain_Name_or_URLs_Change. This has worked perfectly except that we are now receiving a lot of 404 errors, and I'm wondering if this is the root of our evil.
This is a self-hosted WordPress website and we are actively using the WordPress SEO plugin, which creates multiple sitemap files with only 50 links in each. The sitemap_index.xml file tests well in Google Analytics but is pulling a number of links from the subdirectory folder.
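For reference, the approach described on those codex pages boils down to something like the following in the root .htaccess, which hands every request for the main URL over to the install in /newsite (a sketch only — the domain is a placeholder and the exact rules on the live server may differ):
# Root .htaccess: serve the WordPress install in /newsite at the main URL
# (illustrative sketch; mysite.com is a placeholder)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.com$
RewriteCond %{REQUEST_URI} !^/newsite/
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ /newsite/$1
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.com$
RewriteRule ^(/)?$ newsite/index.php [L]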
I'm wondering if it really is the manner in which we made the site live that is our issue or if there is another problem that I cannot see yet. What is the best way to attack this issue? Any clues?
The site in question is www.atozqualityfencing.com
-
Thanks again for the awesome help. I really appreciate your time and effort!!
-
I don't think it would snowball. It should be the end of the issue, as I think Google will have found all of the pages it is going to find. You might have a few more pop up, like tag pages and things like that, but nothing major. I don't know if your webmaster is letting you see Webmaster Tools or not, but it shows the date when it last detected each error. It should look like this: http://screencast.com/t/5a9lpC6o. You can then click on the link and pull this window up: http://screencast.com/t/boyAdXGoOLl. From there you can see whether the links triggering the 404 pages were internal or external. It could very well be that external backlinks were triggering them. If they are internal links, to be safe I would search the source of the pages for the links.
Also, Moz's crawler should pick up the 404 errors and let you know if they are still being caused by links on the site. The 301 redirects will handle the issue if the links were from the old site, but if they are caused by broken internal links on the new site, I would find them with Moz's or Raven's crawler and fix them.
-
Thank you for your insight, Lesley! If we do as you suggest, will that be the end of the issue, or could it snowball? Wouldn't you think that if there were changes to the site after Google indexed it, the next crawl would correct it? Is there a way to get Google to crawl it immediately? Probably not, huh? lol
-
It is really difficult to tell what has actually gone wrong with this one. I am thinking there might have been changes to the site between when Google first indexed it and where it is now. I went to the Internet Archive and I could not see many of the pages, so I do not really know.
The fix, however, is to write 301 redirects for all of the pages that are returning a 404 but still have a live page that represents them. It looks like some of the pages might have had a URL change and others might have been done away with.
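In .htaccess terms that can be as simple as one rule per changed URL, placed above the existing WordPress rewrite block — a sketch only, since each target has to be whichever live page now represents the old one (the targets below are assumptions, not confirmed pages):
# One 301 per 404ing URL, above the WordPress rules in the root .htaccess
# (target URLs are placeholders for whatever the live equivalents are)
Redirect 301 /newsite/feed/ http://www.atozqualityfencing.com/feed/
Redirect 301 /faq/wood-fencing-gallery http://www.atozqualityfencing.com/gallery/wood-fencing/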
-
Thanks for your reply, Lesley. I am checking with the developer as to which exact steps she took to make the site live from a subdirectory. Some of the 404 pages include:
http://www.atozqualityfencing.com/newsite/feed/
http://www.atozqualityfencing.com/fencing-styles/
http://www.atozqualityfencing.com/fence-materials/conact
http://www.atozqualityfencing.com/newsite/conact/
http://www.atozqualityfencing.com/faq/wood-fencing-gallery
http://www.atozqualityfencing.com/faq/vinyl-fencing-gallery
http://www.atozqualityfencing.com/faq/structures-gallery
http://www.atozqualityfencing.com/faq/horse-fencing-gallery
http://www.atozqualityfencing.com/faq/horse-shelter-gallery
http://www.atozqualityfencing.com/conact
http://www.atozqualityfencing.com/author/aaron-smith/wood-fencing-gallery
There are a total of 210 of them.
What other information can I provide to help get this figured out?
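For what it's worth, most of these seem to fall into two patterns — the leftover /newsite/ prefix and the misspelled "conact" URLs — which could be caught with pattern rules rather than 210 individual redirects. A rough .htaccess sketch, assuming the /newsite/ URLs have root-level equivalents and that /contact/ is the live contact page (neither is confirmed here):
# Pattern-based 301s, placed above the WordPress rules in the root .htaccess
RewriteEngine On
# Anything still requested under /newsite/ goes to the same path at the root
RewriteRule ^newsite/(.*)$ /$1 [R=301,L]
# The misspelled "conact" URLs go to the (assumed) contact page
RewriteRule ^(.*/)?conact/?$ /contact/ [R=301,L]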
-
It is really hard to tell without seeing the errors. Are the pages at the same addresses as the previous pages? Did you redirect them? Is there something internally wrong that is hard to spot? It would be easier to diagnose if we could see a list of the 404 pages.
Related Questions
-
404 Errors for Form Generated Pages - No index, no follow or 301 redirect
Hi there, I wonder if someone can help me out and provide the best solution for a problem with form-generated pages. I have blocked the search results pages from being indexed by using the noindex tag, and I wondered if I should take this approach for the following pages. I have seen a huge increase in 404 errors since the new site structure went live and forms started being filled in, because every time a form is filled in it generates a new page, which Google Search Console is reporting as a 404. Whilst some 404s can be explained and resolved, I wondered what is best to prevent Google from crawling pages like this: mydomain.com/webapp/wcs/stores/servlet/TopCategoriesDisplay?langId=-1&storeId=90&catalogId=1008&homePage=Y
1. Implement 301 redirects using rules, which will mean that all these pages redirect to the homepage. Whilst in theory this will protect any linked-to pages, it does not resolve the issue of why GSC is recording them as 404s in the first place. It could also come across to Google as 100,000+ redirected links, which might look spammy.
2. Place a noindex tag on these pages too, so they will not get picked up, in the same way the search result pages are not being indexed.
3. Block them in robots.txt - this will prevent any 'result' pages being crawled, which will improve the crawl time currently being taken up. However, I'm not entirely sure if the block will be possible? I would need to block anything after domain/webapp/wcs/stores/servlet/TopCategoriesDisplay?. Hopefully this is possible?
The noindex tag will take time to set up, as it needs to be scheduled in with the development team, but the robots.txt change will be a quicker fix as this can be done in GSC. I really appreciate any feedback on this one. Many thanks
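For reference, the robots.txt block in option 3 is possible, because Disallow works on a simple prefix match — a minimal sketch using the servlet path from the example URL above:
User-agent: *
# Blocks every URL starting with this path, including all query-string variants
Disallow: /webapp/wcs/stores/servlet/TopCategoriesDisplay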
-
Expired domain 404 crawl error
I recently purchased an expired domain at auction, and after I started my new site on it I am noticing 500+ "not found" errors in Google Webmaster Tools, which are generated from the previous owner's content. Should I use a redirection plugin to redirect those non-existent posts to new post(s) on my site, should I use a 301 redirect, or should I leave them as they are without taking further action? Please advise.
-
Cannot work out why a bunch of urls are giving a 404 error
I have used the Crawl Diagnostic reports to greatly reduce the number of 404 errors, but there is a bunch of 16 URLs that were all published on the same date and have the same referrer URL, and I cannot see the wood for the trees as to what is causing the error. The 404 error links have the structure:
http://www.domainname.com/category/thiscategory/page/thiscategory/this-is-a-post
The referrer structure is:
http://www.domainname.com/category/thiscategory/page/2/
Any suggestions as to how to unravel this would be appreciated.
-
Wordpress Website + 404 Errors
Hi everyone, I like to do a bit of auditing for our clients using SEOmoz. One client that's using a WordPress website had reported over a couple hundred 404 errors. However, when checking out the links, all the webpages (that I've tested) loaded just fine. Does anyone know why this would be the case? I thought, perhaps, the website might have gone down while it was being crawled, but I have no evidence to back this up.
-
Does anyone know a sitemap generation tool that updates your sitemap based on changes on your website?
We have a massive site with thousands of pages which we update every day. Is there a sitemap generator that can create Google sitemaps on the fly and change only based on changes to the site? Our site is much too large to create new sitemaps on a regular basis. Is there a tool that will run on the server and do this automatically?
-
Why would SEOMoz and GWT report 404 errors for pages that are not 404ing?
Recently, I've noticed that nearly all of the 404 errors (not soft 404s) reported in GWT actually resolve to a legitimate page. This was weird, but I thought it might just be old info, so I would go through the process of checking and marking them as fixed as necessary. However, I noticed that SEOmoz is picking up on these 404 errors in the site's diagnostics as well, and now I'm concerned about what the problem could be. Anyone have any insight into this? Rich
-
Differences in Sitemaps SEO wise?
I'm a bit confused about sitemaps. I'm just learning SEO, so forgive me if this is a basic question. I've submitted my site to Google Webmaster Tools using http://pro-sitemaps.com and the sitemap generator it creates. I've also seen sites do this: http://www.johnlewis.com/Shopping/ProductList.aspx and http://www.thesafestcandles.com/site-map.html, so I did something similar for my site (www.ldnwicklesscandles.com). You figure if you see everyone do it, you might as well try it too and hope it works. 😉 So I've done both: 1) the XML sitemap and 2) the HTML sitemap page. Which sitemap is best for SEO purposes, or should I do both? Is there any format that should or shouldn't be used for option 2? Any site examples of good practice would be helpful.
-
403 forbidden error website
Hi Mozzers, I have a question about a new website for a new customer, http://www.eindexamensite.nl/. There is a 403 forbidden error on it, and I can't find what the problem is. I have checked it on http://gsitecrawler.com/tools/Server-Status.aspx and the result is:
URL=http://www.eindexamensite.nl/ Result code: 403 (Forbidden / Forbidden)
When I delete the .htaccess from the server there is a 200 OK :-). So it is in the .htaccess. The .htaccess code:
ErrorDocument 404 /error.html

RewriteEngine On
RewriteRule ^home$ / [L]
RewriteRule ^typo3$ - [L]
RewriteRule ^typo3/.*$ - [L]
RewriteRule ^uploads/.*$ - [L]
RewriteRule ^fileadmin/.*$ - [L]
RewriteRule ^typo3conf/.*$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-l
RewriteRule .* index.php

# Start rewrites for static file caching
RewriteRule ^(typo3|typo3temp|typo3conf|t3lib|tslib|fileadmin|uploads|screens|showpic.php)/ - [L]
RewriteRule ^home$ / [L]

# Don't pull *.xml, *.css etc. from the cache
RewriteCond %{REQUEST_FILENAME} !^.*\.xml$
RewriteCond %{REQUEST_FILENAME} !^.*\.css$
RewriteCond %{REQUEST_FILENAME} !^.*\.php$

# Check for Ctrl-Shift reload
RewriteCond %{HTTP:Pragma} !no-cache
RewriteCond %{HTTP:Cache-Control} !no-cache

# NO backend user is logged in
RewriteCond %{HTTP_COOKIE} !be_typo_user [NC]

# NO frontend user is logged in
RewriteCond %{HTTP_COOKIE} !nc_staticfilecache [NC]

# We only redirect GET requests
RewriteCond %{REQUEST_METHOD} GET

# We only redirect URIs without query strings
RewriteCond %{QUERY_STRING} ^$

# We only redirect if a cache file actually exists
RewriteCond %{DOCUMENT_ROOT}/typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html -f
RewriteRule .* typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html [L]
# End static file caching

DirectoryIndex index.html

The CMS is TYPO3. Any ideas? Thanks!
Maarten